Search Results

Search found 5233 results on 210 pages for 'records'.

  • How to convert multiple sets of data from a left-to-right to a top-to-bottom layout, the Pythonic way?

    - by ThinkCode
    Following is a sample of the data: the sets of contacts for each company run from left to right in a single row.

        ID Company ContactFirst1 ContactLast1 Title1 Email1 ContactFirst2 ContactLast2 Title2 Email2
        1 ABC John Doe CEO [email protected] Steve Bern CIO [email protected]

    How do I get them to go top to bottom, as shown here?

        ID Company ContactFirst ContactLast Title Email
        1 ABC John Doe CEO [email protected]
        1 ABC Steve Bern CIO [email protected]

    I am hoping there is a Pythonic way of solving this task. Any pointers or samples are really appreciated! P.S.: in the actual file there are 10 sets of contacts going from left to right, and there are a few thousand such records. It is a CSV file, which I loaded into MySQL to manipulate the data.
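
    A minimal sketch of one way to unpivot the rows with Python's csv module, assuming the column layout in the sample above (file names are placeholders):

        import csv

        CONTACT_FIELDS = 4  # first name, last name, title, email per contact

        with open("contacts.csv") as src, open("contacts_long.csv", "w") as dst:
            reader = csv.reader(src)
            writer = csv.writer(dst)
            next(reader)  # skip the wide header row
            writer.writerow(["ID", "Company", "ContactFirst", "ContactLast", "Title", "Email"])
            for row in reader:
                rec_id, company, contacts = row[0], row[1], row[2:]
                # walk the repeating contact groups left to right
                for i in range(0, len(contacts), CONTACT_FIELDS):
                    group = contacts[i:i + CONTACT_FIELDS]
                    if any(field.strip() for field in group):  # ignore empty slots
                        writer.writerow([rec_id, company] + group)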

    Read the article

  • Video packet capture over multiple IP cameras

    - by nimals1986
    Hello. We are working on a C application, a simple RTSP/RTP client, to record video from a number of Axis cameras. We launch a pthread for each camera, which establishes the RTP session and begins to record the packets captured using the recvfrom() call. A single camera on a single pthread records fine for well over a day without issues, but when testing with more cameras, about 25 (so 25 pthreads), the recording to file goes fine for 15 to 20 minutes and then just stops; the application itself keeps running. We have been trying varied implementations for over a month and a half, but nothing seems to help. Please provide suggestions. We are using the CentOS 5 platform.
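
    One thing worth ruling out, sketched below as an assumption rather than a confirmed fix: with 25 streams, the default socket receive buffers can overflow silently, and an ignored recvfrom() error can look like a recording that simply stops.

        #include <errno.h>
        #include <stdio.h>
        #include <string.h>
        #include <sys/socket.h>

        /* Enlarge the kernel receive buffer so bursts from many
           cameras are not dropped before the thread drains them. */
        static int tune_rtp_socket(int sock)
        {
            int bufsize = 4 * 1024 * 1024;  /* capped by net.core.rmem_max */
            if (setsockopt(sock, SOL_SOCKET, SO_RCVBUF,
                           &bufsize, sizeof(bufsize)) < 0) {
                fprintf(stderr, "SO_RCVBUF: %s\n", strerror(errno));
                return -1;
            }
            return 0;
        }

        /* In each camera thread, never ignore recvfrom() failures. */
        static ssize_t read_packet(int sock, char *buf, size_t len)
        {
            ssize_t n = recvfrom(sock, buf, len, 0, NULL, NULL);
            if (n < 0 && errno != EINTR)
                fprintf(stderr, "recvfrom: %s\n", strerror(errno));
            return n;
        }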

    Read the article

  • How to compare two TXT files before sending the data to SQL

    - by adopilot
    I have to handle TXT .dat files coming from an embedded device. My problem is that the device always sends all of its captured data, but I want to take only the differences between two sends, run my calculation on those, and then send the result to SQL using a bulk insert. I want to extract the data that is new relative to the first file I got from the device. Let's say that the first time, the device sends data like this in a some.dat (ASCII) file:

        0000199991
        0000199321
        0000132913
        0000232318
        0000312898

    On the second call for data, the device returns everything again (the previously captured records plus the new ones), something like this:

        0000199991
        0000199321
        0000132913
        0000232318
        0000312898
        9992129990
        8782999022
        2323423456

    This time I only want to calculate and pass through the data added after the first insert. I am trying to build a WinForms app using C# and Visual Studio 2008.
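
    A minimal sketch of one way to isolate the new lines, assuming the previous file is kept around for comparison (file names are placeholders):

        using System.Collections.Generic;
        using System.IO;
        using System.Linq;

        class DatDiff
        {
            static void Main()
            {
                // Everything the device sent last time.
                var seen = new HashSet<string>(File.ReadAllLines("first.dat"));

                // Keep only the lines that were not in the previous file.
                string[] added = File.ReadAllLines("second.dat")
                                     .Where(line => !seen.Contains(line))
                                     .ToArray();

                File.WriteAllLines("delta.dat", added); // feed this to the bulk insert
            }
        }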

    Read the article

  • Is it possible to search through a JSON result with jQuery?

    - by mickyjtwin
    What I'm trying to achieve is to output a JSON list that contains a set of CSS classes and their corresponding URL records, i.e.

        var jsonList = [{"CSSClass":"testclass1","VideoUrl":"/Movies/movie.flv"},
                        {"CSSClass":"testclass2","VideoUrl":"/Movies/movie2.flx"}];

    For each item in the list, I am adding a click event to the class:

        $.each(jsonList, function() {
            $("." + this.CSSClass, "#pageContainer").live('click', function(e) {
                videoPlayer.playMovie(this);
                return false;
            });
        });

    What I'm wondering is if I can somehow get the corresponding URL from the jsonList without having to loop through it all again searching for CSSClass, or adding the URL to the link as an attribute.
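
    One way, sketched here, is to build a class-to-URL map once so the click handler can look the URL up directly (this assumes videoPlayer.playMovie can accept the URL instead of the element):

        // Build a lookup table once: CSS class -> video URL.
        var urlByClass = {};
        $.each(jsonList, function() {
            urlByClass[this.CSSClass] = this.VideoUrl;
        });

        $.each(urlByClass, function(cssClass) {
            $("." + cssClass, "#pageContainer").live('click', function(e) {
                videoPlayer.playMovie(urlByClass[cssClass]); // direct lookup, no second search
                return false;
            });
        });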

    Read the article

  • very large string in memory

    - by bushman
    Hi, I am writing a program that formats hundreds of MB of string data (nearing a gig) into XML, and I am required to return it as the response to an HTTP GET request. I am using a StringWriter/XmlWriter to build the XML from the records in a loop and returning stringWriter.ToString(). During testing I saw a few out-of-memory exceptions and am quite clueless about how to find a solution. Do you have any suggestions for a memory-optimized delivery of the response? Is there a memory-efficient way of encoding the data, or maybe a way of chunking it? I just cannot think of how to return it without building the whole thing into one HUGE string object. Thanks.
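
    A sketch of one way to avoid the giant string in ASP.NET, assuming an IHttpHandler is an option: point the XmlWriter straight at the response stream so each record is flushed to the client as it is written (LoadRecords is a stand-in for the real data source):

        using System.Collections.Generic;
        using System.Web;
        using System.Xml;

        public class ExportHandler : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                context.Response.ContentType = "text/xml";
                context.Response.BufferOutput = false; // stream chunks, don't accumulate

                // The writer targets the socket; no full string is ever built.
                using (XmlWriter xml = XmlWriter.Create(context.Response.OutputStream))
                {
                    xml.WriteStartElement("records");
                    foreach (string record in LoadRecords())
                        xml.WriteElementString("record", record);
                    xml.WriteEndElement();
                }
            }

            public bool IsReusable { get { return true; } }

            private static IEnumerable<string> LoadRecords()
            {
                yield return "sample"; // placeholder for the record loop
            }
        }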

    Read the article

  • Bad access when removing a record from the contacts

    - by Mohammed Sadiq
    Hi all, I am facing an issue removing an ABRecord. The code that I use is as follows:

        BOOL isRemoved = ABAddressBookRemoveRecord(addressbook, record, &error);

    But I am receiving the following call stack:

        #0 0x005355bd in moveToRoot
        #1 0x005bbeb6 in sqlite3VdbeExec
        #2 0x0058e7e7 in sqlite3_step
        #3 0x000800ad in CPSqliteStatementSendResults
        #4 0x00082dcd in CPRecordStoreProcessStatementWithPropertyIndices
        #5 0x00082e1d in CPRecordStoreProcessStatement
        #6 0x325b4b63 in ABCRemoveRecord
        #7 0x325c578a in ABAddressBookRemoveRecord

    I searched, and this is the only way to remove records from the contacts. I don't know the reason for this access error. Any help will be greatly appreciated. Best regards, Mohammed Sadiq.
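
    One pattern worth checking, sketched below as an assumption rather than a confirmed fix: the record passed to ABAddressBookRemoveRecord must belong to the same ABAddressBookRef instance it is removed from, and the change must be saved afterwards.

        CFErrorRef error = NULL;
        ABAddressBookRef book = ABAddressBookCreate();

        // Re-fetch the record from this same book instance by its ID.
        ABRecordRef person = ABAddressBookGetPersonWithRecordID(book, recordID);

        if (person != NULL && ABAddressBookRemoveRecord(book, person, &error)) {
            ABAddressBookSave(book, &error); // persist the removal
        }
        CFRelease(book);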

    Read the article

  • need thoughts on my interview question - .net, c#

    - by uno
    One of the questions I was asked was this: I have a database table with the following columns:

        pid - unique identifier
        orderid - varchar(20)
        documentid - int
        documentpath - varchar(250)
        currentLocation - varchar(250)
        newlocation - varchar(250)
        status - varchar(15)

    I have to write a C# app to move the files from currentlocation to newlocation and update the status column as either 'SUCCESS' or 'FAILURE'. This was my answer: create a list of all the records using LINQ; create a command object to perform the file moves; using foreach, invoke a delegate to move the files, use EndInvoke to capture any exception, and update the DB accordingly. I was told that the command pattern and delegates did not fit the bill here, and I was asked to think about and implement a more suitable GoF pattern. Not sure what they were looking for. In this day and age, do candidates really keep that much in their heads, when one always has Google to find an answer and come up with a solution?
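
    For reference, a minimal sketch of the task itself, independent of any pattern (table, column, and connection names are assumptions):

        using System;
        using System.Collections.Generic;
        using System.Data.SqlClient;
        using System.IO;

        class FileMover
        {
            static void Main()
            {
                const string connStr = "Data Source=.;Initial Catalog=Docs;Integrated Security=True";
                var rows = new List<object[]>();

                using (var conn = new SqlConnection(connStr))
                {
                    conn.Open();

                    // Read everything first so the reader is closed before updating.
                    using (var cmd = new SqlCommand(
                        "SELECT pid, currentlocation, newlocation FROM documents", conn))
                    using (var reader = cmd.ExecuteReader())
                        while (reader.Read())
                            rows.Add(new[] { reader.GetValue(0), reader.GetValue(1), reader.GetValue(2) });

                    foreach (object[] row in rows)
                    {
                        string status;
                        try { File.Move((string)row[1], (string)row[2]); status = "SUCCESS"; }
                        catch (Exception) { status = "FAILURE"; }

                        using (var update = new SqlCommand(
                            "UPDATE documents SET status = @s WHERE pid = @p", conn))
                        {
                            update.Parameters.AddWithValue("@s", status);
                            update.Parameters.AddWithValue("@p", row[0]);
                            update.ExecuteNonQuery();
                        }
                    }
                }
            }
        }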

    Read the article

  • working with arrays

    - by user295189
    I currently run a query that goes through the records and builds an array. print_r on the result gives me this:

        Array
        (
            [0] => Array ( [field1] => COMPLETE [field2] => UNKNOWN [field3] => Test comment )
            [1] => Array ( [field1] => COMPLETE [field2] => UNKNOWN [field3] => comment here )
            [2] => Array ( [field1] => COMPLETE [field2] => UNKNOWN [field3] => checking )
            [3] => Array ( [field1] => COMPLETE [field2] => UNKNOWN [field3] => testing )
            [4] => Array ( [field1] => COMPLETE [field2] => UNKNOWN [field3] => working )
        )

    Somehow I want to take this output and convert it back to PHP, so, for example, something like $myArray = array( ... ), where print_r($myArray) yields the same thing as print_r($query). Thanks.
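
    For reference, a sketch of the equivalent literal, plus var_export(), which generates such PHP source automatically (field values copied from the dump above):

        <?php
        // Hand-written equivalent of the print_r output:
        $myArray = array(
            array('field1' => 'COMPLETE', 'field2' => 'UNKNOWN', 'field3' => 'Test comment'),
            array('field1' => 'COMPLETE', 'field2' => 'UNKNOWN', 'field3' => 'comment here'),
            array('field1' => 'COMPLETE', 'field2' => 'UNKNOWN', 'field3' => 'checking'),
            array('field1' => 'COMPLETE', 'field2' => 'UNKNOWN', 'field3' => 'testing'),
            array('field1' => 'COMPLETE', 'field2' => 'UNKNOWN', 'field3' => 'working'),
        );

        // Unlike print_r, var_export emits valid PHP that can be pasted back in:
        var_export($myArray);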

    Read the article

  • Consolidate data from many different databases into one with minimum latency

    - by NTDLS
    I have 12 databases totaling roughly 1.0 TB, each on a different physical server running SQL 2005 Enterprise, all with the same exact schema. I need to offload this data into a separate single database that we can use for other purposes (reporting, web services, etc.) with a maximum of 1 hour of latency. It should also be noted that these servers are all in the same rack, connected by gigabit connections, and that the inserts to the databases are minimal (avg. 2,500 records/hour). The current method is very flaky: the data is currently being replicated (SQL Server transactional replication) from each of the 12 servers to a database on another server (yes, 12 different employee tables from 12 different servers into a single employee table on a different server). Every table has a primary key and the rows are unique across all tables (there is a FacilityID in each table). What are my options? There has to be a simple way to do this.
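
    One alternative, sketched under two assumptions (the source servers are registered as linked servers, and the primary key grows monotonically): a scheduled job that pulls only the rows added since the last sync.

        -- Run every few minutes on the consolidation server, once per source.
        INSERT INTO dbo.Employee_Consolidated
        SELECT src.*
        FROM [SERVER01].[FacilityDB].[dbo].[Employee] AS src
        WHERE src.EmployeeID > (SELECT ISNULL(MAX(EmployeeID), 0)
                                FROM dbo.Employee_Consolidated
                                WHERE FacilityID = 1);

    With around 2,500 inserts/hour across the rack, even a 5-minute schedule stays far inside the 1-hour latency budget.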

    Read the article

  • how to use found_rows in oracle package to avoid two queries

    - by Omnipresent
    I made a package which I can use like this:

        select * from table(my_package.my_function(99, 'something, something2', 1, 50))

    I make use of the package in a stored procedure. A sample stored procedure looks like:

        insert into something
          select ...
          from (select * from table(my_package.my_function(99, 'something, something2', 1, 50))) a,
               other_table b
          where b.something1 = a.something1;

        open cv_1 for
          select count(*) from table(my_package.my_function(99, 'something, something2', 1, 50));

    So I am calling the same package twice: the first time to match records with other tables and other stuff, and the second time to get the count. Is there a way to get the count the first time around and put it into a variable, so that the second time around I just pick up that variable rather than running the whole query again? Hope it makes sense.
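
    A sketch of one way to call the function only once, assuming its rows can be staged in a global temporary table first (gtt_results is a placeholder created to match the function's row type):

        -- One-time setup, outside the procedure:
        -- CREATE GLOBAL TEMPORARY TABLE gtt_results (...) ON COMMIT DELETE ROWS;

        DECLARE
          v_count PLS_INTEGER;
          cv_1    SYS_REFCURSOR;
        BEGIN
          -- Single call to the package; the rows are staged once.
          INSERT INTO gtt_results
            SELECT * FROM TABLE(my_package.my_function(99, 'something, something2', 1, 50));

          v_count := SQL%ROWCOUNT;  -- the count, captured for free

          INSERT INTO something
            SELECT a.* FROM gtt_results a, other_table b
            WHERE b.something1 = a.something1;

          OPEN cv_1 FOR SELECT v_count FROM dual;
        END;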

    Read the article

  • Entity Framework - Optimistic Concurrency Issue

    - by Cranialsurge
    I have a Windows service that runs every 10 seconds. Each time it runs, it takes some test data, modifies it, and persists it to the database using the Entity Framework. However, on every second run, when I try to persist the change, I get the following optimistic concurrency exception:

        Store update, insert, or delete statement affected an unexpected number of rows (0).
        Entities may have been modified or deleted since entities were loaded.
        Refresh ObjectStateManager entries.

    I know for a fact that nothing else is writing to that DB but my service, which updates records every 10 seconds. What could be causing the concurrency exception here?
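
    A sketch of the usual recovery with an ObjectContext-based model (the context and entity variables are placeholders); if this "fixes" it, the real cause is often an entity cached from the previous run whose original values no longer match the row:

        using System.Data;          // OptimisticConcurrencyException
        using System.Data.Objects;  // ObjectContext, RefreshMode

        try
        {
            context.SaveChanges();
        }
        catch (OptimisticConcurrencyException)
        {
            // Reload store values for the conflicting entity, keep the
            // client's pending changes, then try the save again.
            context.Refresh(RefreshMode.ClientWins, entity);
            context.SaveChanges();
        }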

    Read the article

  • Accessing and binding events to an ASP.NET Grid view using Jquery

    - by Sayem Ahmed
    I have an ASP.NET page which displays a text box when it loads. It takes an input number, sends it to the server through a postback, and the server then fetches some data from a database and adds records to a GridView. The GridView also contains a link column whose URL is set to "#", so that the page isn't redirected when it is clicked. Now I want to bind a jQuery "click" event to that link. How can I do that? I have tried to do it myself but failed, because the link is not available when the DOM is loaded (the grid only contains rows after a number is input through the box), and it is modified through ASP.NET AJAX postbacks.
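
    A sketch using jQuery's delegated binding, which survives UpdatePanel refreshes because the handler is attached at the document level rather than to the row links themselves (the gridLink class is an assumption):

        $(function() {
            // .live() matches elements created by later AJAX postbacks too
            // (in jQuery 1.7+ the equivalent is $(document).on("click", ...)).
            $("a.gridLink").live("click", function(e) {
                e.preventDefault();               // keep the "#" href inert
                var row = $(this).closest("tr");  // the GridView row clicked
                alert($.trim(row.find("td:first").text()));
            });
        });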

    Read the article

  • Where do I subclass SC.Record?

    - by Jake
    I'm using SproutCore and going through the Todos tutorial at http://wiki.sproutcore.com/Todos+06-Building+with+Rails. SproutCore and Rails use different names for the primary key on a model (SproutCore: 'guid', Rails: 'id'). The tutorial gives directions to adjust the Rails JSON output, but in the app I'm planning to build I cannot do this, because SproutCore is only one of a couple of interfaces used to interact with the Rails app. As an alternate option, the tutorial gives the following directions:

        Option 2: Set primaryKey in your SproutCore model class.
        You can subclass SC.Record and set the primaryKey of your subclass to "id".
        Your Todo object then needs to extend from, e.g., App.Rails.Record.

        App.Rails.Record = SC.Record.extend({
            primaryKey: "id"
        });
        // Extend your records from App.Rails.Record instead of SC.Record.

    Where should I subclass SC.Record (i.e., in what file) so that it is properly read and included?

    Read the article

  • 2 large databases - worth merging into 1?

    - by Ardman
    I have 2 large databases that were sharded before. I have now removed the sharding and created a new database with all of the data except for the tables that were originally sharded. Is it worth importing this data into the new database, or keeping the sharded tables as separate entities that I can just scan through? We are talking around 60 million records in each sharded table, of which there are 2 tables. Also, while I still have an empty table, should I be adding indexes which weren't thought of when the database was originally constructed, and which would now be too large to add?
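
    On the index question, a sketch of the usual bulk-load ordering (table and column names are placeholders): load the empty table first, then build the indexes in one pass, which is typically much faster than maintaining them row by row during a 60-million-row import.

        -- Bulk copy the two sharded tables into the unindexed table first...
        INSERT INTO customer_merged SELECT * FROM shard1.customer;
        INSERT INTO customer_merged SELECT * FROM shard2.customer;

        -- ...then build the indexes that were missed in the original design.
        CREATE INDEX idx_customer_email ON customer_merged (email);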

    Read the article

  • C# Sequence Diagram Reverse Engineering Tool?

    - by Wayne
    It's essential for me to find a tool that will reverse engineer sequence diagrams by integrating with the debugger. I suppose using the profiler could also work, but that is less desirable. It's a key requirement that the tool in question records all threads of execution, since the app, TickZoom, is heavily parallelized. We just evaluated a most awesome tool from Sparx called Enterprise Architect which integrates with the debugger. It allows you to set a breakpoint and start recording method traces from that breakpoint. It has a lovely design and GUI. It may work for you, but it only records a single thread of execution, so that makes it unusable for us. I will put in a feature request to Sparx, but I hope to find a similar tool that does this now, since that's the only feature we need, not all the other stuff that Sparx also does rather well. Sincerely, Wayne

    Read the article

  • How can I update multiple columns with a Replace in SQL server?

    - by Kettenbach
    How do I update different columns and rows across a table? I want to do something similar to replacing a string in SQL Server, but the value exists in multiple columns of the same type. The values are varchar foreign keys to an employee table. Each column represents a task, so the same employee may be assigned to several tasks in a record, and those tasks will vary between records. How can I do this effectively? Basically, something of a replace-all across varying columns throughout a table. Thanks for any help or advice. Cheers, ~ck in San Diego
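
    A sketch of the direct approach, with table and column names as assumptions: one UPDATE that applies REPLACE to each task column, so a reassigned employee key is swapped wherever it appears.

        -- Replace employee key 'EMP001' with 'EMP002' in every task column.
        UPDATE dbo.ProjectTasks
        SET Task1 = REPLACE(Task1, 'EMP001', 'EMP002'),
            Task2 = REPLACE(Task2, 'EMP001', 'EMP002'),
            Task3 = REPLACE(Task3, 'EMP001', 'EMP002')
        WHERE Task1 LIKE '%EMP001%'
           OR Task2 LIKE '%EMP001%'
           OR Task3 LIKE '%EMP001%';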

    Read the article

  • SQL Server: how to optimize "like" queries?

    - by duke84
    I have a query that searches for clients using "like" with wildcards. For example:

        SELECT TOP (10) [t0].[CLIENTNUMBER], [t0].[FIRSTNAME], [t0].[LASTNAME], [t0].[MI], [t0].[MDOCNUMBER]
        FROM [dbo].[CLIENT] AS [t0]
        WHERE (LTRIM(RTRIM([t0].[DOCREVNO])) = '0')
          AND ([t0].[FIRSTNAME] LIKE '%John%')
          AND ([t0].[LASTNAME] LIKE '%Smith%')
          AND ([t0].[SSN] LIKE '%123%')
          AND ([t0].[CLIENTNUMBER] LIKE '%123%')
          AND ([t0].[MDOCNUMBER] LIKE '%123%')
          AND ([t0].[CLIENTINDICATOR] = 'ON')

    It can also use fewer parameters in the "where" clause, for example:

        SELECT TOP (10) [t0].[CLIENTNUMBER], [t0].[FIRSTNAME], [t0].[LASTNAME], [t0].[MI], [t0].[MDOCNUMBER]
        FROM [dbo].[CLIENT] AS [t0]
        WHERE (LTRIM(RTRIM([t0].[DOCREVNO])) = '0')
          AND ([t0].[FIRSTNAME] LIKE '%John%')
          AND ([t0].[CLIENTINDICATOR] = 'ON')

    Can anybody tell me the best way to optimize the performance of such a query? Maybe I need to create an index? This table can have up to 1,000K records in production.
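
    Note that a LIKE pattern with a leading wildcard ('%John%') cannot seek on a regular B-tree index, so the query scans regardless. A sketch of the common alternative, a full-text index (catalog and key-index names are assumptions):

        -- One-time setup on the searched columns.
        CREATE FULLTEXT CATALOG ClientCatalog;
        CREATE FULLTEXT INDEX ON dbo.CLIENT (FIRSTNAME, LASTNAME)
            KEY INDEX PK_CLIENT ON ClientCatalog;

        -- Prefix searches can then use the full-text index instead of scanning:
        SELECT TOP (10) CLIENTNUMBER, FIRSTNAME, LASTNAME
        FROM dbo.CLIENT
        WHERE CONTAINS(FIRSTNAME, '"John*"')
          AND CONTAINS(LASTNAME, '"Smith*"');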

    Read the article

  • Recording user data for heatmap with javascript

    - by Hanpan
    Hi, I was wondering how sites such as crazyegg.com store user click data during a session. Obviously there is some underlying script which stores each click's data, but how is that data then populated into a database? It seems to me the simple solution would be to send the data via AJAX, but when you consider that it's almost impossible to get a cross-browser page-unload function set up, I wonder if there is some other, more advanced way of getting metric data. I even saw a site which records each mouse movement, and I am guessing they are definitely not sending that data to a database on each mousemove event. So, in a nutshell, what kind of technology would I need in order to monitor user activity on my site and then store this information in order to create metric data? I am not looking to recreate GA; I'm just very interested to know how this sort of thing is done. Thanks in advance
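
    A sketch of the classic technique: buffer events in the page and flush them in batches through an image beacon, so no XHR or unload handler is needed (the /track.gif endpoint is an assumption):

        var clickBuffer = [];

        document.onclick = function(e) {
            e = e || window.event;
            clickBuffer.push(e.clientX + "," + e.clientY + "," + new Date().getTime());
            if (clickBuffer.length >= 20) flush();  // batch, don't send per event
        };

        setInterval(flush, 5000);  // also flush on a timer

        function flush() {
            if (clickBuffer.length === 0) return;
            // A 1x1 image GET works in every browser and needs no response
            // handling; the server behind /track.gif writes the query string
            // to the database.
            new Image().src = "/track.gif?d=" + encodeURIComponent(clickBuffer.join(";"));
            clickBuffer = [];
        }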

    Read the article

  • getline at a certain line for file IO

    - by BSchlinker
    Is there any way to use getline to read a specific line within a file? For instance, to immediately read line #20? It seems inefficient to do any kind of loop that reads and discards the earlier lines. I know about fseek, but there is no guarantee that the records will be the same length on each line. I imagine this looping is simply what is required in order to find lines: after all, to know when the end of a line has been reached, the code needs to find the line-break character, so it makes sense that it has to read through each line. Just wondering if there is a quicker method.
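
    For reference, a sketch of the skip-then-read idiom: istream::ignore discards the earlier lines without copying them into a string, which is about as cheap as the scan can get when line lengths vary.

        #include <fstream>
        #include <iostream>
        #include <limits>
        #include <string>

        // Read line `target` (1-based); earlier lines are skipped, not stored.
        static bool read_line_at(const char* path, int target, std::string& out)
        {
            std::ifstream in(path);
            for (int i = 1; in && i < target; ++i)
                in.ignore(std::numeric_limits<std::streamsize>::max(), '\n');
            return static_cast<bool>(std::getline(in, out));
        }

        int main()
        {
            std::string line;
            if (read_line_at("records.txt", 20, line))
                std::cout << line << '\n';
        }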

    Read the article

  • Value Comparison with a multivalued column in SQL Database Table

    - by Rishabh Ohri
    Hi all, suppose there is a table A which has a column AccessRights that is multivalued, e.g. values in this format: STOLI,HELP,BRANCH (a comma-separated string). Now a stored procedure is written against this table to fetch records based on an AccessRights parameter sent to the SP. Let that parameter be @AccessRights; it is also a comma-separated string, which may have a value like STOLI,BRANCH,HELPLINE, etc. Now I want to compare the individual values from the parameter @AccessRights with the column AccessRights. The current approach: I split the comma-separated string @AccessRights using a user-defined function, Split, and get the individual values in a table variable (containing only one column, "accessGroup"). Then I use the following code in the SP for the comparison:

        WHERE AccessRights LIKE '%' + accessGroup + '%'

    Now if the user passes the parameter (HELP, OLI) instead of (HELP, STOLI), the SP will still return output. What should be done in the comparison so that the substring OLI does not match STOLI?
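
    A sketch of the standard fix: wrap both sides in commas so only whole tokens can match (@SplitRights stands in for the Split function's table variable):

        -- ',STOLI,HELP,BRANCH,' LIKE '%,STOLI,%'  -> match
        -- ',STOLI,HELP,BRANCH,' LIKE '%,OLI,%'    -> no match
        SELECT a.*
        FROM TableA AS a
        JOIN @SplitRights AS s
          ON ',' + a.AccessRights + ',' LIKE '%,' + s.accessGroup + ',%';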

    Read the article

  • mysql Command timeout error

    - by Rahul Malhotra
    I am converting my database from SQL Server 2005 to MySQL using ASP.NET MVC. I have bulk data in SQL Server (around 4 lakh, i.e. 400,000, records), but I am facing a command timeout / waiting-for-command timeout error. From what I found on Google, the timeout can be given 65535 as its highest value, or 0 if the command should wait indefinitely until it has executed. Neither of these is working for me. I have also set connectTimeout to 180; should I change that too? Anybody who has faced this problem or has firm knowledge of it, please share.
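
    For reference, a sketch of where the two timeouts live with Connector/NET (connection details and the SQL are placeholders): the connection-string value only covers the initial handshake, while the per-command timeout is a property on the command object.

        using MySql.Data.MySqlClient;

        // "Connection Timeout" governs only how long Open() may take.
        var conn = new MySqlConnection(
            "Server=localhost;Database=target;Uid=user;Pwd=secret;Connection Timeout=180");
        conn.Open();

        var cmd = new MySqlCommand("INSERT INTO target_table SELECT * FROM staging_table", conn);
        cmd.CommandTimeout = 0;   // 0 = wait indefinitely for this command
        cmd.ExecuteNonQuery();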

    Read the article

  • How do I make a "simple" throughput servlet filter?

    - by Tommy
    I'm looking to create a filter that can give me two things: the number of requests per minute, and the average response time per minute. I already have the individual readings; I'm just not sure how to add them up. My filter captures every request and records the time each request takes:

        public void doFilter(ServletRequest request, ...) {
            long start = System.currentTimeMillis();
            chain.doFilter(request, response);
            long stop = System.currentTimeMillis();
            String time = Util.getTimeDifferenceInSec(start, stop);
        }

    This information will be used to create some pretty Google Chart charts. I don't want to store the data in any database; I just want a way to get the current numbers out when requested. As this is a high-volume application, low overhead is essential. I'm assuming my application server doesn't provide this information.
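
    A sketch of one low-overhead way to aggregate the readings, assuming a helper class shared by the filter: two atomic counters accumulate the current minute and are swapped out when the window rolls over.

        import java.util.concurrent.atomic.AtomicLong;

        public final class ThroughputStats {
            private final AtomicLong requests = new AtomicLong();
            private final AtomicLong totalMillis = new AtomicLong();
            private volatile long windowStart = System.currentTimeMillis();
            private volatile long lastCount;
            private volatile long lastAvgMillis;

            // Called from doFilter with (stop - start).
            public void record(long elapsedMillis) {
                rollIfNeeded();
                requests.incrementAndGet();
                totalMillis.addAndGet(elapsedMillis);
            }

            // Close out the minute and expose its totals.
            private synchronized void rollIfNeeded() {
                long now = System.currentTimeMillis();
                if (now - windowStart >= 60000L) {
                    long count = requests.getAndSet(0);
                    long total = totalMillis.getAndSet(0);
                    lastCount = count;
                    lastAvgMillis = (count == 0) ? 0 : total / count;
                    windowStart = now;
                }
            }

            public long requestsLastMinute() { return lastCount; }
            public long avgResponseMillis()  { return lastAvgMillis; }
        }

    The Google Chart endpoint then just reads the two getters; nothing is persisted.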

    Read the article

  • mySQL experts - need help with 'intersect'

    - by MTCreations
    I know that mySQL 5.x does not support INTERSECT, but that seems to be what I need.

        Table A: products (p_id)
        Table B: prod_cats (cat_id) - category info (name, description, etc.)
        Table C: prods_2cats (p_id, cat_id) - many to many

    prods_2cats holds the one or more categories that have been assigned to products (A). I am building a query/filter lookup (user-interactive) and need to be able to select, across multiple categories, the products that meet ALL the criteria. For example: 80 products are assigned to category X, 50 products to category Y, but only 10 products (the intersect) are assigned to BOTH cat X AND cat Y. This SQL works for one category:

        SELECT * FROM products
        WHERE p_show = 'Y'
          AND p_id IN (SELECT p_id FROM prods_2cats AS PC
                       WHERE PC.cat_id = " . $cat_id . "

    ($cat_id is a sanitized var passed from the query form.) I can't seem to find a way to say 'give me the intersect of cat A and cat B' and get back the subset (10 records, from my example). Help!
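
    A sketch of the standard MySQL workaround: group the junction table by product and require a hit for every requested category (the category IDs are placeholders):

        -- Products assigned to BOTH category 3 AND category 7.
        SELECT p.*
        FROM products AS p
        JOIN prods_2cats AS pc ON pc.p_id = p.p_id
        WHERE p.p_show = 'Y'
          AND pc.cat_id IN (3, 7)
        GROUP BY p.p_id
        HAVING COUNT(DISTINCT pc.cat_id) = 2;  -- must match every listed category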

    Read the article

  • WPF Data virtualizing ListView

    - by Robert Jeppesen
    In our current WinForms app, we display millions of records in a ListView using virtualization: rows are loaded from the DB as they are requested. This works well, with good performance, and its absence is a showstopper for migrating to WPF for us. We need data virtualization in a WPF ListView, like WinForms 2.0 has. Do you know a decent third-party control, or a relatively easy way of doing it with the built-in controls? It doesn't need to be a DataGrid; a simple ListView will suffice. Note, I'm not talking about UI virtualization; it's data virtualization I need.

    Read the article
