Search Results

Search found 95201 results on 3809 pages for 'system data sqlite'.

Page 386/3809 | < Previous Page | 382 383 384 385 386 387 388 389 390 391 392 393  | Next Page >

  • jQuery: How to fire event when all asynchronous calls return?

    - by Jeremy
    I have a jQuery application that loads data from five asynchronous server calls. I do not want to display any data until all five calls return. (I plan on displaying a Loading message until that happens.) How can I detect when all five calls have returned? I considered having each callback method increment a variable (using jQuery's data() method, perhaps) and then waiting for the value to become 5. (I am not sure yet how I would listen for that event.) I do not think this is a very good solution, however. What would happen if two calls return at the same time? Is there a better way to do this?

    Read the article

  • asp.net disappearing data from Dropdownlist inside listview when loading from DataTable

    - by piotrmichal
    Hello, I have a problem with a DropDownList, ddl1, that is inside the InsertItem template of a ListView. I populate it from a table on page load. In Page_Load I have: if (!Page.IsPostBack) { ListView1.DataBind(); InsDropDownList3.DataSource = GetAllPlayersName(); InsDropDownList3.DataBind(); } DropDownList ddl1 = (DropDownList)ListView1.InsertItem.FindControl("InsDropDownList1"); if (ddl1 != null) { DataTable t = GetAllPlayersName(); ddl1.DataSource = t; ddl1.DataBind(); } When I select a row in the ListView or edit it, the data in the InsertItem row's dropdown disappears - the list stays empty until I close the page and open it again. There is another dropdown, InsDropDownList3, outside the ListView, and there is no problem with that one. Why does one lose its data while the other does not? As I understand it, ddl1 should get its data every time the page loads.

    Read the article

  • Google analytics-style custom report builder UI

    - by gregmac
    I'm looking for a reporting engine/UI that can be integrated into a product and that has a UI along the lines of Google Analytics' Custom Reports builder. Is anyone aware of such a thing? The data in our case is not page views/visitors/etc., but it is similar in nature, in that there are limited entities or types of data, but each entity has many attributes/columns and many different ways of aggregating data (or in GA-style speak, metrics and dimensions). The analytics-style UI is very intuitive and allows many reports to be created in powerful ways, without having to know SQL. I have a preference for a web-based tool (seeing that it is 2010 and this is a web app -- I mention it only because it seems the vast majority of reporting tools still have only a non-web-based creation tool).

    Read the article

  • Macro to copy data from one sheet to another based on the current date

    - by SgtSnafu
    Does anyone have a macro that copies data from one sheet to another based on the current date? I am working with a single workbook of three sheets. Sheet one holds the manual input of daily production figures for multiple plants; sheet two holds the ongoing daily data keyed in on sheet one. The macro will be attached to a button, so that once clicked it searches for every row with today's date and copies that row to the next available blank row on sheet two. Sample Data...

    Plant 1 Input
    Date - $ Produced - Labor Hour
    3-29-10 - 4538 - 8
    3-30-10 - 7862 - 12
    3-31-10
    4-1-10
    4-2-10

    Plant 2 Input
    Date - $ Produced - Labor Hour
    3-29-10 - 4545 - 9
    3-30-10 - 7645 - 12
    3-31-10
    4-1-10
    4-2-10

    Read the article

  • wxpython: Updating a dict or other appropriate data type from wx.lib.sheet.CSheet object

    - by bvmou
    If I have a notebook with three spreadsheet widgets, what is the best way to have changes to the spreadsheet update a dictionary (or maybe an sqlite file?). Do all wx grid objects come with a built in dictionary related to the SetNumberRows and SetNumberCols? Basically I am looking for guidance on how to work with the user-input data from a spreadsheet widget, as in this example adapted from the tutorial on python.org: class ExSheet(wx.lib.sheet.CSheet): def __init__(self, parent): sheet.CSheet.__init__(self, parent) self.SetLabelBackgroundColour('#CCFF66') self.SetNumberRows(50) self.SetNumberCols(50) class Notebook(wx.Frame): def __init__(self, parent, id, title): wx.Frame.__init__(self, parent, id, title) nb = wx.Notebook(self, -1, style=wx.NB_BOTTOM) self.sheet1 = ExSheet(nb) self.sheet2 = ExSheet(nb) self.sheet3 = ExSheet(nb) nb.AddPage(self.sheet1, "Sheet1") nb.AddPage(self.sheet2, "Sheet2") nb.AddPage(self.sheet3, "Sheet3") self.sheet1.SetFocus() self.StatusBar()
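
    One minimal way to read the user's input back out of a CSheet (which is a wx.grid.Grid subclass) is to walk its cells with GetCellValue and build a plain dictionary; the helper below is only a sketch under that assumption, and sheet_to_dict is a made-up name:

        def sheet_to_dict(sheet):
            # Collect the non-empty cells of a grid/CSheet into a
            # {(row, col): value} dictionary.
            data = {}
            for row in range(sheet.GetNumberRows()):
                for col in range(sheet.GetNumberCols()):
                    value = sheet.GetCellValue(row, col)
                    if value:
                        data[(row, col)] = value
            return data

    Calling sheet_to_dict(self.sheet1) after editing gives a dict that can then be pickled, stored with shelve, or written to an sqlite file; the grid itself keeps its cells in its own table object rather than exposing a Python dictionary.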

    Read the article

  • update data in jqgrid

    - by griZZZly8
    Hi! I use jqGrid in this scenario: the grid gets JSON data from a first URL. If the URL returns correct JSON, the grid displays that data. If the URL returns an invalid response, the grid's 'loadError' event fires. In this event I want to change the grid's url to a second URL and get the JSON data from that new URL. Here is my code: loadError: function(xhr, st, err) { $("#list").setGridParam({ url: '/new_url' }); $("#list").trigger("reloadGrid"); } But it doesn't work.

    Read the article

  • Comparing two Java objects on the fly (data type not known)

    - by Narendra
    Hi all, I need to compare different data objects. Can anyone tell me how I can do this? I don't know in advance what the data types will be. If I need to use a util from Apache Commons, please give a reference to it. At present I am using .equals() to compare objects for equality. It works fine when I am comparing two strings, but if I compare java.sql.Date values it reports them as unequal even though both contain the same value. Can anyone advise me on this? Thanks, Narendra

    Read the article

  • Failed to save data at the server from memcached program

    - by zahir hussain
    Hi, I want to know why I can't store a multi-dimensional array (array size is more than 1000): $memcache = new Memcache; $memcache->connect('localhost', 11211) or die ("Could not connect"); The above works correctly. The one below gives an error: $memcache->set('key', $sear, false, 60) or die ("Failed to save data at the server"); If $sear is a string or an object array there is no problem storing the data on the server, but when I store a multi-dimensional array in Memcache I get the error "Failed to save data at the server". Thanks in advance

    Read the article

  • python sending incomplete data over socket

    - by tipu
    I have this socket server script, import SocketServer import shelve import zlib class MyTCPHandler(SocketServer.BaseRequestHandler): def handle(self): self.words = shelve.open('/home/tipu/Dropbox/dev/workspace/search/words.db', 'r'); self.tweets = shelve.open('/home/tipu/Dropbox/dev/workspace/search/tweets.db', 'r'); param = self.request.recv(1024).strip() try: result = str(self.words[param]) except KeyError: result = "set()" self.request.send(str(result)) if __name__ == "__main__": HOST, PORT = "localhost", 50007 SocketServer.TCPServer.allow_reuse_address = True server = SocketServer.TCPServer((HOST, PORT), MyTCPHandler) server.serve_forever() And this receiver, from django.http import HttpResponse from django.template import Context, loader import shelve import zlib import socket def index(req, param = ''): HOST = 'localhost' PORT = 50007 s = socket.socket(socket.AF_INET, socket.SOCK_STREAM) s.connect((HOST, PORT)) s.send(param) data = zlib.decompress(s.recv(131072)) s.close() print 'Received', repr(data) t = loader.get_template('index.html') c = Context({ 'foo' : data }) return HttpResponse(t.render(c)) I am sending strings to the receiver that are in the hundreds of kilobytes. I end up only receiving a portion of it. Is there a way that I can fix that so that the whole string is sent?
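
    A minimal sketch of the usual fix on the receiving side, assuming the server closes the connection once it has sent its reply (which a SocketServer.TCPServer does after handle() returns): keep calling recv() in a loop and accumulate the chunks instead of relying on a single recv() call.

        def recv_all(sock, chunk_size=4096):
            # recv() may return only part of the payload; keep reading
            # until the peer closes the connection (recv returns '').
            chunks = []
            while True:
                chunk = sock.recv(chunk_size)
                if not chunk:
                    break
                chunks.append(chunk)
            return ''.join(chunks)

        s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        s.connect((HOST, PORT))
        s.send(param)
        data = recv_all(s)
        s.close()

    If the connection had to stay open for multiple requests, the alternative would be to prefix each message with its length and read until that many bytes have arrived.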

    Read the article

  • Performance - User defined query / filter to search data

    - by Cagatay Kalan
    What is the best way to design a system where users can create their own criteria to search data? By "design" I mean data storage, data access layer and search structure. We will actually refactor an existing application which is written in C# and ASP.NET, and we don't want to change the infrastructure. Our main issue is performance; we use MSSQL and DevExpress to build queries. Some queries run for 4-5 minutes even though all the columns included in the queries have indexes. When I check the queries, I see that DevExpress builds too many "exists" clauses, and I'm not happy with that because I suspect some of these queries skip some indexes. What might be the alternatives to DevExpress - NHibernate or Entity Framework? Can we build a dynamic criteria system with either of them and store the criteria in the database? And do we also need an alternative store such as a Lucene index or an OLAP database?

    Read the article

  • How to scale an image (in data URI format) in JavaScript (real scaling, not using styling)

    - by 103067513055141045393
    We are capturing a visible tab in the Chrome browser (by using the extensions API chrome.tabs.captureVisibleTab) and receiving a snapshot in the data URI scheme (a Base64-encoded string). Is there a JavaScript library that can be used to scale an image down to a certain size? Currently we are scaling it via CSS styling, but we pay a performance penalty because the pictures are mostly 100 times bigger than required. An additional concern is the load on the localStorage we use to save our snapshots. Does anyone know of a way to process these data-URI-formatted pictures and reduce their size by scaling them down? References: Data URI scheme on http://en.wikipedia.org/wiki/Data_URI_scheme; Chrome Extensions API on http://code.google.com/chrome/extensions/tabs.html; the "Recently Closed Tabs" Chrome Extension on http://code.google.com/p/recently-closed-tabs

    Read the article

  • Is SELECT INTO able to affect data from its original table during UPDATE

    - by driveby
    Whilst asking the question "asp.net scheduling timed events", user murph posted some insightful information: Point about this is that its very, very simple - you have an process for exchange that is performing a clearly defined task and you have a high frequency task that is not doing anything particularly complex, its a straightforward query (select from table where sent = false and send at < value) - probably into temporary table so that you can run a single query update after you've done the sends - that you can optimise the index for. You're not trying to queue up a huge pile of event triggers, just one that fires once a minute and processes things that are due. Is it possible to SELECT data from table X INTO table Y and have the UPDATEs that are performed on table Y pushed back into table X? I guess the alternative would be that the data gets updated in table Y, and then an UPDATE command can be run on table X based on the data in table Y. What would be the advantage of selecting into another table? Thank you,

    Read the article

  • What format do I use to store a relatively small amount of user data

    - by wcm
    I am writing a small program for our local high school (pro bono). The program has an interface that allows the user to enter school holidays. This is a simple stand-alone Windows app. What format should I use to store the data? A big relational database is obviously overkill. My initial plan was to store the data in an XML file. Co-workers have been suggesting that I use JSON files, Access databases, SQLite, or SQL Server Express. There was even a suggestion of old-school INI files.

    Read the article

  • getting web page data as json object?

    - by encryptor
    I have a URL, and I need the data of that page as a JSON object. I've tried both XMLHttpRequest and ajaxObject, but neither works; it doesn't even give a responseText when I show it in an alert. I'll post both code snippets here. url = http://mydomain.com:port/a/b/c AJAX : var ajaxRequest = new ajaxObject(URL); ajaxRequest.callback = function (responseText,responseStatus) { alert(responseStatus); JSONData = responseText.parseJSON(); processData(JSONData); } USING xmlhttprequest: var client = new XMLHttpRequest(); client.open('GET',URL,true ); data = JSON.parse(client.responseText); alert(data.links.length); Can someone please help me out with this? I understand cross-site scripting may be an issue, but how do I get around it? And shouldn't it at least show the alerts as zero or null?

    Read the article

  • Data loss between conversions

    - by Alex Brooks
    Why is it that I lose data in the conversions below even though both types take up the same amount of space? If the conversion were done bitwise, it should be true that x = z unless data is being stripped during the conversion, right? Is there a way to do the two conversions without losing data (i.e. so that x = z)? main.cpp: #include <stdio.h> #include <stdint.h> int main() { double x = 5.5; uint64_t y = static_cast<uint64_t>(x); double z = static_cast<double>(y); // Desire : z = 5.5; printf("Size of double: %lu\nSize of uint64_t: %lu\n", sizeof(double), sizeof(uint64_t)); printf("%f\n%lu\n%f\n", x, y, z); } Results: Size of double: 8 Size of uint64_t: 8 5.500000 5 5.000000
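
    For what it's worth, static_cast here is a value conversion, not a bit copy: 5.5 is truncated to the integer 5, so converting back can only give 5.0. A bitwise reinterpretation of the same 8 bytes (memcpy between the two objects in C++, or the purely illustrative Python sketch below using struct) round-trips the value intact:

        import struct

        x = 5.5
        value_cast = int(x)  # value conversion: the fraction is discarded -> 5

        # Reinterpret the double's 8 bytes as an unsigned 64-bit integer,
        # then reverse the trip; the original value survives.
        bits = struct.unpack('<Q', struct.pack('<d', x))[0]
        z = struct.unpack('<d', struct.pack('<Q', bits))[0]
        print(value_cast, bits, z)  # 5 4617878467915022336 5.5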

    Read the article

  • No value given for one or more required parameters in connection initialisation

    - by Jean-François Côté
    I have a C# forms application that uses an Access database. The application works perfectly in debug and release, and it works on all versions of Windows, but it crashes on one computer running Windows 7. The message I get is: System.Data.OleDb.OleDbException: No value given for one or more required parameters. EDIT: after some debugging with message boxes on the computer that has the problem, here is the code that fails. The error is caught on cmd.ExecuteReader(); the message box just before it is shown, and the next one shown is the one in the catch block, with the exception below. Any ideas?

    public List<CoeffItem> GetModeleCoeff() { List<CoeffItem> list = new List<CoeffItem>(); try { OleDbDataReader dr; OleDbCommand cmd = new OleDbCommand("SELECT nIDModelAquacad, nIDModeleBorne, fCoefficient FROM tbl_ModelBorne ORDER BY nIDModelAquacad", m_conn); MessageBox.Show("Commande SQL créée avec succès"); dr = cmd.ExecuteReader(); MessageBox.Show("Exécution du reader sans problème!"); while (dr.Read()) { list.Add(new CoeffItem(Convert.ToInt32(dr["nIDModelAquacad"].ToString()), Convert.ToInt32(dr["nIDModeleBorne"].ToString()), Convert.ToDouble(dr["fCoefficient"].ToString()))); } MessageBox.Show("Lecture du reader"); dr.Close(); MessageBox.Show("Fermeture du reader"); } catch (OleDbException err) { MessageBox.Show("Erreur dans la lecture des modèles/coefficient: " + err.ToString()); } return list; }

    I think it's something related to the connection string, but why only on that computer? Thanks for your help! EDIT: here is the complete error message:

    See the end of this message for details on invoking just-in-time (JIT) debugging instead of this dialog box. ***** Exception Text ******* System.Data.OleDb.OleDbException: No value given for one or more required parameters. at System.Data.OleDb.OleDbCommand.ExecuteCommandTextErrorHandling(OleDbHResult hr) at System.Data.OleDb.OleDbCommand.ExecuteCommandTextForSingleResult(tagDBPARAMS dbParams, Object& executeResult) at System.Data.OleDb.OleDbCommand.ExecuteCommandText(Object& executeResult) at System.Data.OleDb.OleDbCommand.ExecuteCommand(CommandBehavior behavior, Object& executeResult) at System.Data.OleDb.OleDbCommand.ExecuteReaderInternal(CommandBehavior behavior, String method) at System.Data.OleDb.OleDbCommand.ExecuteReader(CommandBehavior behavior) at System.Data.OleDb.OleDbCommand.ExecuteReader() at DatabaseLayer.DatabaseFacade.GetModeleCoeff() at DatabaseLayer.DatabaseFacade.InitConnection(String strFile) at CalculatriceCHW.ListeMesure.OuvrirFichier(String strFichier) at CalculatriceCHW.ListeMesure.nouveauFichierMenu_Click(Object sender, EventArgs e) at System.Windows.Forms.ToolStripItem.RaiseEvent(Object key, EventArgs e) at System.Windows.Forms.ToolStripMenuItem.OnClick(EventArgs e) at System.Windows.Forms.ToolStripItem.HandleClick(EventArgs e) at System.Windows.Forms.ToolStripItem.HandleMouseUp(MouseEventArgs e) at System.Windows.Forms.ToolStripItem.FireEventInteractive(EventArgs e, ToolStripItemEventType met) at System.Windows.Forms.ToolStripItem.FireEvent(EventArgs e, ToolStripItemEventType met) at System.Windows.Forms.ToolStrip.OnMouseUp(MouseEventArgs mea) at System.Windows.Forms.ToolStripDropDown.OnMouseUp(MouseEventArgs mea) at System.Windows.Forms.Control.WmMouseUp(Message& m, MouseButtons button, Int32 clicks) at System.Windows.Forms.Control.WndProc(Message& m) at System.Windows.Forms.ScrollableControl.WndProc(Message& m) at System.Windows.Forms.ToolStrip.WndProc(Message& m) at System.Windows.Forms.ToolStripDropDown.WndProc(Message& m) at System.Windows.Forms.Control.ControlNativeWindow.OnMessage(Message& m) at System.Windows.Forms.Control.ControlNativeWindow.WndProc(Message& m) at System.Windows.Forms.NativeWindow.Callback(IntPtr hWnd, Int32 msg, IntPtr wparam, IntPtr lparam)

    Read the article

  • Data Access from single table in sql server 2005 is too slow

    - by Muhammad Kashif Nadeem
    Following is the script of the table; accessing data from this table is too slow. SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TABLE [dbo].[Emails]( [id] [int] IDENTITY(1,1) NOT NULL, [datecreated] [datetime] NULL CONSTRAINT [DF_Emails_datecreated] DEFAULT (getdate()), [UID] [nvarchar](250) COLLATE Latin1_General_CI_AS NULL, [From] [nvarchar](100) COLLATE Latin1_General_CI_AS NULL, [To] [nvarchar](100) COLLATE Latin1_General_CI_AS NULL, [Subject] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL, [Body] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL, [HTML] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL, [AttachmentCount] [int] NULL, [Dated] [datetime] NULL ) ON [PRIMARY] The following query takes 50 seconds to fetch the data: select id, datecreated, UID, [From], [To], Subject, AttachmentCount, Dated from emails If I include Body and HTML in the select, the time is even worse. The indexes are: id - unique clustered; From - non-unique non-clustered; To - non-unique non-clustered. The table currently has 180,000+ records, and there might be 100,000 new records each month, so this will get slower as time passes. Will splitting the data into two tables solve the problem? What other indexes should there be?

    Read the article

  • querying a large text file containing JSON objects

    - by Maciek Sawicki
    Hi, I have a text file of a few gigabytes in the format: {"user_ip":"x.x.x.x", "action_type":"xxx", "action_data":{"some_key":"some_value"...},...} Each entry is one line. First, I would like to easily find entries for a given IP. This part is easy because I can use grep, for example, but even for this I would like a better solution, because I want the response as fast as possible. The next part is more complicated, because I would like to find entries from a selected IP, of a selected type, and with a particular value of some_key in action_data. Probably I would have to convert this file to a SQL db (probably SQLite, because it will be a desktop app), but I wanted to ask whether better solutions exist.
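
    One possible approach, sketched below with made-up file names and assuming every line is a standalone JSON object: load the log once into an SQLite table with an index on (user_ip, action_type), then query that instead of grepping the raw file.

        import json
        import sqlite3

        conn = sqlite3.connect('actions.db')
        conn.execute("CREATE TABLE IF NOT EXISTS actions "
                     "(user_ip TEXT, action_type TEXT, action_data TEXT)")
        conn.execute("CREATE INDEX IF NOT EXISTS idx_ip_type "
                     "ON actions (user_ip, action_type)")

        with open('actions.log') as f:
            entries = (json.loads(line) for line in f if line.strip())
            rows = ((e['user_ip'], e['action_type'], json.dumps(e.get('action_data', {})))
                    for e in entries)
            conn.executemany("INSERT INTO actions VALUES (?, ?, ?)", rows)
        conn.commit()

        # Indexed lookup by ip and type; the some_key filter is applied in
        # Python because action_data is stored here as a JSON string.
        cur = conn.execute("SELECT action_data FROM actions "
                           "WHERE user_ip = ? AND action_type = ?",
                           ('x.x.x.x', 'xxx'))
        for (payload,) in cur:
            if json.loads(payload).get('some_key') == 'some_value':
                print(payload)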

    Read the article

  • Returning user data for forms that have errors in when using ModelForms

    - by Sevenearths
    forms.py from django.forms import ModelForm from client.models import ClientDetails, ClientAddress, ClientPhone from snippets.UKPhoneNumberForm import UKPhoneNumberField class ClientDetailsForm(ModelForm): class Meta: model = ClientDetails class ClientAddressForm(ModelForm): class Meta: model = ClientAddress class ClientPhoneForm(ModelForm): number = UKPhoneNumberField() class Meta: model = ClientPhone views.py from django.shortcuts import render_to_response, redirect from django.template import RequestContext from client.forms import ClientDetailsForm, ClientAddressForm, ClientPhoneForm def new_client_view(request): formDetails = ClientDetailsForm(initial={'marital_status':'u'}) formAddress = ClientAddressForm() formHomePhone = ClientPhoneForm(initial={'phone_type':'home'}) formWorkPhone = ClientPhoneForm(initial={'phone_type':'work'}) formMobilePhone = ClientPhoneForm(initial={'phone_type':'mobi'}) return render_to_response('client/new_client.html', {'formDetails': formDetails, 'formAddress': formAddress, 'formHomePhone': formHomePhone, 'formWorkPhone': formWorkPhone, 'formMobilePhone': formMobilePhone}, context_instance=RequestContext(request)) (the new_client.html is nothing special) How should I write views.py so that if the user's data raises an error, instead of showing them the form again with the errors in but none of their original data, it shows them the form again with the errors AND their original data?
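
    The usual Django pattern is to bind the forms to request.POST on submit: a bound form that fails validation re-renders with both its error messages and the user's original input. A trimmed sketch of the view under that assumption (only two of the five forms shown, and '/clients/' is a made-up success URL; the three ClientPhoneForm instances would each need a distinct prefix= so their fields don't collide):

        def new_client_view(request):
            if request.method == 'POST':
                # Bound forms keep the submitted values; if is_valid() fails
                # they carry field errors back to the template.
                formDetails = ClientDetailsForm(request.POST)
                formAddress = ClientAddressForm(request.POST)
                if formDetails.is_valid() and formAddress.is_valid():
                    formDetails.save()
                    formAddress.save()
                    return redirect('/clients/')
            else:
                formDetails = ClientDetailsForm(initial={'marital_status': 'u'})
                formAddress = ClientAddressForm()
            return render_to_response('client/new_client.html',
                                      {'formDetails': formDetails,
                                       'formAddress': formAddress},
                                      context_instance=RequestContext(request))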

    Read the article

  • python fdb save huge data from database to file

    - by peter
    I have this script: SELECT = """ select coalesce (p.ID,'') as id, coalesce (p.name,'') as name, from TABLE as p """ self.cur.execute(SELECT) for row in self.cur.itermap(): xml +=" <item>\n" xml +=" <id>" + id + "</id>\n" xml +=" <name>" + name + "</name>\n" xml +=" </item>\n\n" #save xml to file here f = open... and I need to save data from a huge database to a file. There are tens of thousands of items (up to 40,000) in my database, and the script takes a very long time (an hour or more) to finish. How can I take the data I need from the database and save it to a file "at once", as quickly as possible? (I don't need XML output, because I can process the output data on my server later. I just need it to be as quick as possible. Any ideas?) Many thanks!
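
    A sketch of one way to speed this up, assuming the time is going into the repeated string concatenation rather than the query itself: stream each row straight to the file (inside the same method, so self.cur is available) instead of growing one large string with +=.

        # 'items.xml' is only an example output path.
        self.cur.execute(SELECT)
        with open('items.xml', 'w') as f:
            for row in self.cur.itermap():
                # Key case ('id' vs 'ID') depends on the driver and aliasing.
                f.write(" <item>\n")
                f.write("  <id>%s</id>\n" % row['id'])
                f.write("  <name>%s</name>\n" % row['name'])
                f.write(" </item>\n\n")

    Collecting the pieces in a list and writing "".join(pieces) once gives a similar improvement if the output really has to be built in memory first.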

    Read the article

  • cron job for updating user profile data imported via facebook connect

    - by Abidoon Nadeem
    I want to write a cron job to update the profile data on my website for users who register via Facebook Connect. The objective is to keep their profile data on my website in sync with their profile data on Facebook. So if a user updates their profile picture on Facebook, I want to update their profile picture on my website as well, via a cron job which will run every 24 hours. I wanted to know if this is possible, and secondly whether it is in violation of Facebook's privacy policy. Based on my research it seems doable, but I wanted to know if anyone has already done something like this before. It would really help.

    Read the article

  • grabbing data from a URL

    - by Syom
    I have a task: I must grab some data from a URL. The link is http://cba.am. The data I want is in a table on that page, and I have only one identifier to reach it: the word "USD", which appears in that (HTML) table. I've written the following script, and it works, but I have never heard how more experienced programmers do such things, so I want to hear your comments. Here is the script: <?php $str = file_get_contents("http://cba.am/"); $key_usd = "USD"; $sourse_usd_1 = explode($key_usd,$str); $usd1 = $sourse_usd_1[2]; $sourse_usd_2=explode(">",$usd1); $usd2 = $sourse_usd_2[4]; $sourse_usd_3=explode("<",$usd2); $usd = $sourse_usd_3[0]; ?> (Sorry for my poor English.)

    Read the article

  • Where should I store user config data? Specifically, the path to the data file?

    - by jamone
    I have an app using a SQLite db, and I need the ability for the user to move the data file and point the app to where it moved. I used the Entity Framework to create the model, and by default it puts the connection string in the App.config file. From what I've read, if I make changes to the connection string there, they won't take effect until the app is restarted. That seems a bit clunky for my use. I see how I can init my model and pass in a custom string, but I'm unsure what the best practice is for where to store basic user preferences such as this: an INI file, the registry, somewhere else? I don't want the user to have to "Open" the file each time, just when it relocates; from then on the app should try to open it automatically.

    Read the article

  • Embedded Record is not getting loaded in Ember.js

    - by Venky
    Following is the JSON data I am trying to load using ember-data: { "product" : [ { "id" : 1, "name" : "product1", "master" : { "id" : 1, "name" : "product1", "images" : [ { "id" : 1, "productUrl" : "/images/product1_1.jpg" }, { "id" : 2, "productUrl" : "/images/product1_2.jpg" } ] } }, { "id" : 2, "name" : "product2", "master" : { "id" : 2, "name" : "product2", "images" : [ { "id" : 3, "productUrl" : "/images/product2_1.jpg" }, { "id" : 4, "productUrl" : "/images/product2_2.jpg" } ] } } ] } The models are as follows: App.Product = DS.Model.extend name: DS.attr('string') description: DS.attr('string') master: DS.belongsTo('master') App.Master = DS.Model.extend images: DS.hasMany('image') App.Image = DS.Model.extend productUrl: DS.attr('string') The Application Serializer code is as follows: App.ApplicationSerializer = DS.ActiveModelSerializer.extend(DS.EmbeddedRecordsMixin, attrs: { images: { embedded : 'always' } master: { embedded : 'always' } } ) The problem is that the "master" model records are being returned empty. I am not sure, where I am going wrong. I am using the following platform configuration: ember-source (1.4.0) ember-data-source (1.0.0.beta.7) ember-rails (0.15.0) Rails (4.1.0) Thanks

    Read the article
