Search Results

Search found 60391 results on 2416 pages for 'data generation'.


  • Send data to PHP with AJAX - Internal Server Error (500)

    - by user1277467
    I am trying to send my data to PHP with AJAX, but I am running into a strange problem. This is my AJAX script:

        function deleteData2() {
            var artistIds = new Array();
            $(".p16 input:checked").each(function(){
                artistIds.push($(this).attr('id'));
            });
            $.post('/json/crewonly/deleteDataAjax2',
                { json: JSON.stringify({'artistIds': artistIds}) },
                function(response){
                    alert(response);
                });
        }

    I think this part works correctly, but on the PHP side I get a 500 Internal Server Error:

        public function deleteDataAjax2() {
            $json = $_POST['json'];
            $data = json_decode($json);
            $artistIds = $data['artistIds'];
            $this->sendJSONResponse($artistIds);
        }

    For example, when I send $data back to the AJAX side, my ids print out in JSON mode. However, when I try to send $artistIds, it gives the 500 error. Why?

    Read the article

  • Problem with outputting html via AJAX

    - by Marek
    Hello, I am new to JS and AJAX, but I have to do my homework. I chose jQuery, so it is a little easier now. I want to get HTML via an AJAX request, but the result looks like this, e.g.: <fieldset id=\"item4\" class=\"item\"><legend>Odno\u015bnik 4<\/legend> I set the response content-type to text/html. When I output the result on the server everything is OK. jQuery code:

        $.ajax({
            dataType : 'html',
            data : 'add_sz='+changeSize+'&next_id='+nextId,
            url : '/kohana/admin/menus/ajax_items_refresh',
            error : function(err, xhr, status) {
                msgOutput.text('error msg');
            },
            success : function(data, xhr, textStatus) {
                msgOutput.text('success msg');
                var tabs = $('#items-list');
                $('#items-wrap').html($('#items-wrap').html() + data);
            }
        });

    Could somebody help me? What am I doing wrong? Kind regards.

    Read the article

  • Money type field returns Dollar, but regional set to UK (Windows 7)

    - by Gareth
    Hi, On a Windows 7 machine with Advantage Data Architect version 9.10.0.11, money type data is returned as Dollars, instead of Pounds. Sometimes, it will suddenly switch to Pounds, without me changing any settings. Everything else returns Pounds correctly (regional settings is UK with £ as the currency symbol). Has anyone else had this problem and/or found a solution? If I run any reports using the money data type field I can't be sure that it's going to be accurate. And no, I can't change the field type and handle the currency symbol myself. Any help would be appreciated.

    Read the article

  • Python code to compare CSV row entries and count how many times one column's value crosses the other's

    - by Venomancer
    I have an Excel-based CSV file with two columns that I am working on. What I need to do is perform some operations so that I can compare the two data entries in each row. To be more precise, one column has a constant value all the way down, whereas the other column has varying values. I need to count the number of times the varying column's value crosses the constant value in the other column. For example, from the CSV file I have two columns:

        Varying Column    Constant Column
        24                25
        26                25    crossed
        27                25
        26                25
        25.5              25
        23                25    crossed
        26                25    crossed

    Thus, the varying column has crossed 25 three times. I need to write code that can count the number of crossings. Please help out, thanks.
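
    A minimal sketch of one way this count could be done with the csv module (the file name, the column order, and the presence of a header row are assumptions, and Python 3 is assumed):

        import csv

        crossings = 0
        previous_side = None  # whether the varying value was last above or below the constant

        with open('data.csv', newline='') as f:
            reader = csv.reader(f)
            next(reader)  # skip the header row (assumed present)
            for varying, constant in reader:
                varying, constant = float(varying), float(constant)
                side = 'above' if varying > constant else 'below'
                if previous_side is not None and side != previous_side:
                    crossings += 1  # the varying column moved to the other side of the constant
                previous_side = side

        print(crossings)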

    Read the article

  • operating server and client from the same program

    - by sksingh73
    I want to make a single program that operates as both server and client. I want my program to run in such a way that, when it is launched, the server starts listening for requests from other machines. But when I want to send data to other machines, the server should stop and the client should be launched so that I can start sending data. Once the client has transferred all of the data, it should stop and go back to server mode. Any suggestion on whether this is feasible, and if yes, how?
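
    One possible shape for this, sketched in Python purely as an illustration (the port number, the peer hostname, and the trigger for switching into send mode are placeholders, not part of the question): a single process can keep a listening socket and temporarily open an outgoing connection when it needs to send, without launching a separate program.

        import socket

        PORT = 5000  # assumed port for this sketch

        def serve_once(listener):
            """Accept one incoming connection and read everything it sends."""
            conn, addr = listener.accept()
            with conn:
                chunks = []
                while True:
                    chunk = conn.recv(4096)
                    if not chunk:
                        break
                    chunks.append(chunk)
            print('received', len(b''.join(chunks)), 'bytes from', addr)

        def send_data(host, payload):
            """Act as a client: connect out, send the data, then return to serving."""
            with socket.create_connection((host, PORT)) as s:
                s.sendall(payload)

        def main():
            with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as listener:
                listener.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
                listener.bind(('', PORT))
                listener.listen()
                while True:
                    serve_once(listener)
                    # After handling a request, decide here (e.g. from a flag or user
                    # input) whether to switch roles and push data to another machine:
                    # send_data('other-machine.example', b'hello')

        if __name__ == '__main__':
            main()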

    Read the article

  • How to improve the speed of a loop containing a sqlalchemy query statement as conditional

    - by LtPinback
    This loop checks whether a record is in the SQLite database and builds a list of dictionaries for the records that are missing, then executes a multiple-insert statement with the list. This works, but it is very slow (at least I think it is slow), as it takes 5 minutes to loop over 3500 queries. I am a complete newbie in Python, SQLite and SQLAlchemy, so I wonder if there is a faster way of doing this.

        list_dict = []
        session = Session()
        for data in data_list:
            if session.query(Class_object)\
                      .filter(Class_object.column_name_01 == data[2])\
                      .filter(Class_object.column_name_00 == an_id)\
                      .count() == 0:
                list_dict.append({'column_name_00': a_id, 'column_name_01': data[2]})
        conn = engine.connect()
        conn.execute(prices.insert(), list_dict)
        conn.close()
        session.close()

    Edit: I moved session = Session() outside the loop. It did not make a difference.
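
    One way the per-row queries could be avoided, as a rough sketch (reusing the names from the snippet above; the model and engine setup are assumed, and the sketch uses an_id throughout where the question mixes an_id and a_id): fetch the existing pairs for an_id in a single query, then test membership in a Python set before building the insert list.

        session = Session()

        # One round trip: all (column_name_00, column_name_01) pairs already stored for this id.
        existing = set()
        for row in (session.query(Class_object.column_name_00, Class_object.column_name_01)
                           .filter(Class_object.column_name_00 == an_id)):
            existing.add((row[0], row[1]))
        session.close()

        # Build the list of missing rows without touching the database again.
        list_dict = [
            {'column_name_00': an_id, 'column_name_01': data[2]}
            for data in data_list
            if (an_id, data[2]) not in existing
        ]

        if list_dict:
            conn = engine.connect()
            conn.execute(prices.insert(), list_dict)
            conn.close()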

    Read the article

  • MySQL Full-text Search Workaround for innoDB tables

    - by Rob
    I'm designing an internal web application that uses MySQL as its backend database. The integrity of the data is crucial, so I am using the innoDB engine for its foreign key constraint features. I want to do a full-text search of one type of records, and that is not supported natively with innoDB tables. I'm not willing to move to MyISAM tables due to their lack of foreign key support and due to the fact that their locking is per table, not per row. Would it be bad practice to create a mirrored table of the records I need to search using the MyISAM engine and use that for the full-text search? This way I'm just searching a copy of the data and if anything happens to that data it's not as big of a deal because it can always be re-created. Or is this an awkward way of doing this that should be avoided? Thanks.

    Read the article

  • WPF input datagrid?

    - by SAD
    I would like to create an input DataGrid: an empty set of rows with column headers that can be filled with data by a user and then saved to a database. The DataGrid will therefore not be used to display data; it acts like an input field. The behaviour would be similar to that of Excel. The problem is that if the DataGrid has no data provided for it, it is not visible in the application and the user cannot enter anything. Is the DataGrid a good choice? How can this sort of functionality be implemented with a different control? I would prefer to use the DataGrid, though. Thanks a lot for any help on this.

    Read the article

  • In jQuery, set a delay to fade out or fade out right away on click

    - by JClu
    I am trying to write a script for a notification popup. I want the popup to fade out after X seconds, or fade out when the user clicks on the message. I can get both effects to work individually, but when I try to combine them the fadeOut does not behave as expected. This is my code so far:

        function notify(data, type) {
            switch(type) {
                case "success":
                    $('#notify').html(data)
                        .removeAttr("style")
                        .addClass('notifySuccess')
                        .click(function() { $("#notify").fadeOut(); })
                        .delay(5000).fadeOut();
                    break;
                case "error":
                    $('#notify').html(data)
                        .removeAttr("style")
                        .addClass('notifyFailure')
                        .click(function() { $("#notify").fadeOut(); })
                        .delay(5000).fadeOut();
                    break;
            }
        }

    Read the article

  • Array of data returning issue: overwriting

    - by sijith
    Hi, please help me with this. Here I want to save the converted data into new pointers, but every time the data is overwritten with the most recent data. Please check my code:

        TCHAR nameBuffer[256]; // Globally declared

        void Caller()
        {
            TCHAR* ptszSecondInFile = QStringToTCharBuffer(userName);
            TCHAR* ptszOutFile = QStringToTCharBuffer(Destinationfilename);
        }

        TCHAR *dllmerge::QStringToTCharBuffer( QString buffer )
        {
            memset(nameBuffer, 0, sizeof(nameBuffer));
        #if UNICODE
            _tcscpy_s(nameBuffer, _countof(nameBuffer), buffer.toUtf8());
        #else
            _tcscpy_s(nameBuffer, _countof(nameBuffer), buffer.toLocal8Bit());
        #endif
            _tprintf( _T( "nameBuffer %s\n" ), nameBuffer );
            return nameBuffer;
        }

    I am getting the same value for both ptszSecondInFile and ptszOutFile. Is it possible to do this with TCHAR* nameBuffer[256]?

    Read the article

  • Python/Numpy - Save Array with Column AND Row Titles

    - by Scott B
    I want to save a 2D array to a CSV file with row and column "header" information (like a table). I know that I could use the header argument to numpy.savetxt to save the column names, but is there any easy way to also include some other array (or list) as the first column of data (like row titles)? Below is an example of how I currently do it. Is there a better way to include those row titles, perhaps some trick with savetxt I'm unaware of?

        import csv
        import numpy as np

        data = np.arange(12).reshape(3,4)

        # Add a '' for the first column because the row titles go there...
        cols = ['', 'col1', 'col2', 'col3', 'col4']
        rows = ['row1', 'row2', 'row3']

        with open('test.csv', 'wb') as f:
            writer = csv.writer(f)
            writer.writerow(cols)
            for row_title, data_row in zip(rows, data):
                writer.writerow([row_title] + data_row.tolist())
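
    If the goal is to stay entirely within savetxt, one possible sketch is to stack the row titles onto the data as an extra string column and write everything with fmt='%s'; the header/comments arguments assume NumPy 1.7 or later, which is an assumption about the setup rather than part of the question:

        import numpy as np

        data = np.arange(12).reshape(3, 4)
        cols = ['', 'col1', 'col2', 'col3', 'col4']
        rows = np.array(['row1', 'row2', 'row3'])

        # Row titles become the first column; everything is written as strings.
        table = np.column_stack([rows, data.astype(str)])

        np.savetxt('test.csv', table, fmt='%s', delimiter=',',
                   header=','.join(cols), comments='')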

    Read the article

  • Use parameters with CTL

    - by Hal
    Hi. I am using a CTL file to load data stored in a file into a specific table in my Oracle database. Currently, I launch the loader using the following command line:

        sqlldr user/pwd@db data=my_data_file control=my_loader.ctl

    I would like to know if it is possible to specify parameters to be retrieved in the CTL file. Also, is it possible to retrieve the name of the data file used by the CTL to fill the table? I would also like to insert it for each row; I currently have to call a procedure to update previously inserted records. Any help would be appreciated!

    Read the article

  • How to store matrix information in MySQL?

    - by dedalo
    Hi, I'm working on an application that analyzes music similarity. In order to do that I process audio data and store the results in txt files. For each audio file I create two files: one containing 16 values (each value can look like 2.7000023942731723), and the other containing 16 rows, each row containing 16 values like the one previously shown. I'd like to store the contents of these two files in a table of my MySQL database. My table looks like:

        Name varchar(100)
        Author varchar(100)

    In order to add the content of those two files I think I need to use the BLOB data type:

        file1 blob
        file2 blob

    My question is how I should store this info in the database. I'm working with Java, where I have a double array containing the 16 values (for file1) and a matrix containing the file2 info. Should I process the values as strings and add them to the columns in my database? Thanks

    Read the article

  • jQuery .get()/.post() doesn't send an updated cookie

    - by Markus
    Hello, I'm doing something like this:

        function setCookie(name, value, expire) {
            var exdate = new Date();
            exdate.setDate(exdate.getDate()+expire);
            document.cookie=name+ "=" +escape(value)+((expire==null) ? "" : ";expires="+exdate.toGMTString());
        }

        // When I click a button
        // Update the cookie
        setCookie("test", "1");

        function getContent() {
            // Get the data of our new URL
            $.get("index.php", loadContent);
        }

        function loadContent(data) {
            // alert the new title
            // the PHP script changes the title according to the cookie value you have
            alert($("title", data).text())
        }

    First of all, when I alert, it is still the previous title showing up, so I guess it doesn't send the changed cookies at all. And for some reason I don't see the request in my Live HTTP Headers window. But if I request the URL directly (without the AJAX) it works just fine. What am I doing wrong here? Regards, Markus

    Read the article

  • R graphics: plotting a line graph with date/time horizontally along the x-axis

    - by user2978586
    I want to get a line graph in R which has time along x and temperature along y. Originally I had the data in dd/mm/yyyy hh:mm format, with a time point every 30 minutes. https://www.dropbox.com/s/q35y1rfila0va1h/Data_logger_S65a_Ania.csv Since I couldn't find a way of reading this into R, I reformatted the data into dd/mm/yyyy and added a column 'time' with 1-48 for all the time points for each day: https://www.dropbox.com/s/65ogxzyvuzteqxv/temp.csv This is what I have so far:

        temp <- read.csv("temp.csv", as.is=T)
        temp$date <- as.Date(temp$date, format="%d/%m/%Y")  # inputting date in correct format
        plot(temperature ~ date, temp, type="n")            # drawing a blank plot with axes, but without data
        lines(temp$date, temp$temperature, type="o")        # type o is a line overlaid on top of points

    This stacks the points up vertically, which is not what I want, and stacks all the time points (1-48) for each day together on the same date. Any advice would be much appreciated on how to get this horizontal, and ordered by time as well as date.

    Read the article

  • Display continuous dates in Pivot Chart

    - by Douglas
    I have a set of data in a pivot table with date times and events. I've made a pivot chart with this data, and grouped the data by day and year, then display a count of events for each day. So, my horizontal axis goes from 19 March 2007 to 11 May 2010, and my vertical axis is numeric, going from zero to 140. For some days, I have zero events. These days don't seem to be shown on the horizontal axis, so 2008 is narrower than 2009. How do I display a count of zero for days with no events? I'd like my horizontal axis to be continuous, so that it does not miss any days, and every month ends up taking up the same amount of horizontal space. (This question is similar to the unanswered question here, but I'd rather not generate a table of all the days in the last x number of years just to get a smooth plot!)

    Read the article

  • iPhone Server Mirror Functionality

    - by hecta
    My app reads from a decentralized XML file (so I have the ability to change servers if I have to) with the TBXML parser. The XML file consists of only a few lines like this:

        <xml>
            <mirror url="http://www.someserverabc.com/data.xml" priority="1"/>
            <mirror url="http://www.someservermirror.com/data.xml" priority="2"/>
            <mirror url="http://www.anotherserver.com/data.xml/" priority="3"/>
        </xml>

    So I have the priority corresponding to each url. Now I want to check whether the server with priority 1 is reachable, and if not, try the 2nd one and so forth. If a server is reachable, I parse the XML from the url in the mirror list. How could I implement this approach? Is it even a good approach, and how could it be tweaked? (Is XML even desirable in the first place?)

    Read the article

  • Making LINQ avoid in-memory filtering where possible

    - by linqmonkey
    Consider these two LINQ to SQL data retrieval methods. The first creates a 'proper' SQL statement that filters the data, but requires passing the data context into the method. The second has a nicer syntax but loads the entire list of that account's projects, then does in-memory filtering. Is there any way to preserve the syntax of the second method but with the performance advantage of the first?

        public partial class Account
        {
            public IQueryable<Project> GetProjectsByYear(LinqDataContext context, int year)
            {
                return context.Projects
                    .Where(p => p.AccountID==this.AccountID && p.Year==year)
                    .OrderBy(p => p.ProjectNo);
            }

            public IQueryable<Project> GetProjectsByYear(int year)
            {
                return this.Projects
                    .Where(p => p.Year==year)
                    .OrderBy(p => p.ProjectNo)
                    .AsQueryable();
            }
        }

    Read the article

  • Using Multiple Databases

    - by sergiuoala
    A company is hired by another company to help in a certain field. So I created the following tables:

        Companies: id, company name, company address
        Administrators (in relation with companies): id, company_id, username, email, password, fullname

    Then, each company has some workers in it, and I store data about the workers. Each worker has a profession, an agreement type signed, and some other common things. Now, the parent tables and data for workers (agreement types, professions, other common things) are going to be the same for each company. Should I create one new database for each company, or store all the data in the same database? Thanks.

    Read the article

  • Database Backup

    - by Jungle_hacker
    Scenario: I want to take a backup from 7 client databases to 1 server database. I don't know the structure of the databases (either server or client), and both sides already contain old data. Now I have to make a tool that takes the backup, and it should also be possible to back up old data (if any updates were done on the old data). Please help me find a solution for this.

        1. How can I proceed with the problem?
        2. The database is not specified; it may be MS Access or SQL Server 2005.
        3. In which language can I implement this? (I am thinking of doing it in C#.)

    Please help me to find the solution.

    Read the article

  • How to use SynonymFilterFactory in Solr?

    - by AlxVallejo
    I'm trying to execute synonym filtering at query time so that if I search for X, results for Y also show up. I go to where Solr is being run, edit the synonyms .txt file and add "X, Y" on a new line. This does not work. I check the schema and I see:

        <analyzer type="query">
            <filter class="solr.SynonymFilterFactory" synonyms="synonyms.txt" ignoreCase="true" expand="true" />

    What am I missing?

    EDIT: Assessing the configuration files, tomcat6/Catalina/localhost seems to point to the correct location:

        <Context docBase="/data/solr/solr.war" debug="0" privileged="true" allowLinking="true" crossContext="true">
            <Environment name="solr/home" type="java.lang.String" value="/data/solr" override="true" />
        </Context>

    Also, in the Solr admin I see this. What does cwd mean?

        cwd=/usr/share/tomcat6  SolrHome=/data/solr/

    Read the article

  • Trying to use json to populate areas of my website using mysql, php, and jquery.

    - by RyanPitts
    OK, so this is my first attempt at doing anything with JSON. I have done a lot with PHP and MySQL as well as jQuery and JavaScript, but nothing with JSON. I have some data in a MySQL database. In the code below I am using a PHP file to retrieve the data from the MySQL database and using json_encode to format it as JSON. This file is called by the JavaScript that runs when the page loads (well, it runs on document.ready actually). I then use jQuery to access the JSON keys and values to fill in areas of the page "dynamically". Here are the snippets I am using (excuse my "noobness" in writing these, still learning all this). This is the script that is on my HTML page test.php:

        <script type="text/javascript">
        $(document).ready(function(){
            $.getJSON("json_events.php",function(data){
                $.each(data.events, function(i,events){
                    var tblRow = "<tr>"
                        +"<td>" + events.id + "</td>"
                        +"<td>" + events.customerId + "</td>"
                        +"<td>" + events.filingName + "</td>"
                        +"<td>" + events.title + "</td>"
                        +"<td>" + events.details + "</td>"
                        +"<td>" + events.dateEvent + "</td>"
                        +"<td><a href='assets/customers/testchurch/events/" + events.image + "'>" + events.image + "</a></td>"
                        +"<td>" + events.dateStart + "</td>"
                        +"<td>" + events.dateEnd + "</td>"
                        +"</tr>"
                    $(tblRow).appendTo("#eventsdata tbody");
                });
            });
            $.getJSON("json_events.php",function(data){
                $.each(data.events, function(i,events){
                    $("#title").html("First event title: " + events.title + " ...");
                });
            });
        });
        </script>

    This is the code for the PHP file being called by the script above, json_events.php:

        <?php
        require_once('includes/configread.php');

        $arrayEvents = array();
        $resultsEvents = mysql_query("SELECT * FROM events");

        while($objectEvents = mysql_fetch_object($resultsEvents)) {
            $arrayEvents[] = $objectEvents;
        }

        $json_object_events = json_encode($arrayEvents);
        $json_events = "{\"events\": " . $json_object_events . " }";

        echo $json_events;

        require_once('includes/closeconnread.php');
        ?>

    This is the JSON held in the variable $json_events from json_events.php:

        { "events": [
            { "id": "2", "customerId": "1004", "filingName": "testchurch",
              "title": "Kenya 2011 Training Meeting",
              "details": "This meeting will be taking place on Sunday, February 10th @ 6pm. Get ready for our annual Kenya trip in 2011. We have been blessed to be able to take this trip for the past 3 years. Now, it's your turn to bless others! Come to this training meeting to learn how to minister to the people in Kenya and also learn what we'll be doing there.",
              "dateEvent": "2011-02-10", "image": "kenya2011.jpg",
              "dateStart": "2010-09-04", "dateEnd": "2011-02-10" },
            { "id": "6", "customerId": "1004", "filingName": "testchurch",
              "title": "New Series: The Journey",
              "details": "We will be starting our new series titled "The Journey". Come worship with us as we walk with Paul on his 2nd missionary journey.",
              "dateEvent": "2011-01-02", "image": "",
              "dateStart": "2010-09-06", "dateEnd": "2011-01-02" }
        ] }

    This is my HTML on test.php:

        <table id="eventsdata" border="1">
            <thead>
                <tr>
                    <th>id</th>
                    <th>customerId</th>
                    <th>filingName</th>
                    <th>title</th>
                    <th>details</th>
                    <th>dateEvent</th>
                    <th>image</th>
                    <th>dateStart</th>
                    <th>dateEnd</th>
                </tr>
            </thead>
            <tbody></tbody>
        </table>
        <div id="title"></div>

    I have two questions really...

    Question 1: Does this code look like it is written correctly at first glance?

    Question 2: I want to be able to select only the title from the first event in the JSON array. The code I am using now seems to select the second event's title by default. How can I accomplish this?

    Read the article

  • MySQL - Conflicting WHERE and GROUP BY Statements

    - by TaylorPLM
    I have a query returning counts of some data, but I do NOT want rows that have a null value in them. As an example, the code rolls stats from a clicking system into a table:

        SELECT sh.dropid, ...
        FROM subscriberhistory sh
        INNER JOIN subscriberhistory sh2 ON sh.subid = sh2.subid
        WHERE sh.dropid IS NOT NULL
        AND sh.dropid != ""
        ...
        GROUP BY sh.dropid

    An example of the returned record set would look like this:

        400   2 3 4 5 6
        401   2 3 6 5 4
        NULL  2 3 4 5 1

    There are some other where clauses, and a few more selects (as I said, using the count aggregate) within the query. There is also an order by statement. Again, the goal is to keep the NULL data out of the preceding record set. Could someone explain why this behavior is occurring and what to do to solve it?

    Read the article

  • Getting my app to work on my phone

    - by user237259
    Hello, I wrote an app for Android SDK version 1.5 using the built-in emulator in Eclipse. I created a database outside of Eclipse, populated it with data and pushed it to the emulator folder data/data/project name/database, and could access it fine from my app; everything was working great. My final step was to test the apk file on an actual phone. When I did that, the program would not work. After looking into it I found that the database file was not being uploaded onto the phone. Is there a special way to install the apk file on the actual phone to get the embedded database to work? Or can I assume that since everything worked fine on the SDK 1.5 emulator, everything will work when I upload it to the market? Thanks in advance.

    Read the article

  • php - How do I get rid of this strange "empty delimiter" message

    - by Steven
    I have some code that uses the stristr function to extract data I need. It works, in that it gives me the results I'm looking for. BUT (you knew there was a but), it gives me this error message for every iteration of the loop:

        Warning: stristr() [function.stristr]: Empty delimiter in ... line 55

    Like I said, the code works apart from this error. Can anyone suggest how I could amend this code to get rid of the message? Thanks in advance.

        $data = stristr("$text", "$key");
        $result = string_limit_words($data,2);
        print "$result<BR>";

    Read the article
