Search Results

Search found 7311 results on 293 pages for 'rows'.

Page 138/293 | < Previous Page | 134 135 136 137 138 139 140 141 142 143 144 145  | Next Page >

  • After writing SQL statements in MySQL, how to measure the speed / performance of them?

    - by Jian Lin
    I saw something from an "execution plan" article: 10 rows fetched in 0.0003s (0.7344s). How come there are two durations shown? And what if I don't have a large data set yet? For example, if I have only 20, 50, or even just 100 records, I can't really measure how much faster two different SQL statements are in a real-life situation. In other words, does there need to be at least hundreds of thousands of records, or even a million, to accurately compare the performance of two different SQL statements?
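
    The two durations in such tools are usually the fetch time and the total round trip, though it varies by tool; MySQL's EXPLAIN can also show whether a statement would use an index without needing millions of rows. To compare two statements yourself on a small data set, one option is simply to time repeated runs from C#. A minimal sketch, assuming the MySQL Connector/NET provider (connection string and SQL are placeholders):

        using System;
        using System.Diagnostics;
        using MySql.Data.MySqlClient;

        class QueryTimer
        {
            // Runs the statement several times and returns the average elapsed time,
            // so connection setup and caching noise matter less.
            public static TimeSpan TimeQuery(string connectionString, string sql, int runs)
            {
                var total = TimeSpan.Zero;
                using (var conn = new MySqlConnection(connectionString))
                {
                    conn.Open();
                    for (int i = 0; i < runs; i++)
                    {
                        var sw = Stopwatch.StartNew();
                        using (var cmd = new MySqlCommand(sql, conn))
                        using (var reader = cmd.ExecuteReader())
                        {
                            while (reader.Read()) { /* drain the full result set */ }
                        }
                        sw.Stop();
                        total += sw.Elapsed;
                    }
                }
                return TimeSpan.FromTicks(total.Ticks / runs);
            }
        }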

    Read the article

  • GrailsUI (YUI) data table hover event

    - by bsreekanth
    Hello, how do I make the data table rows change color as I hover over them? The YUI example is here: link text. I tried something like <script> GRAILSUI.myDataTable.subscribe("rowMouseoverEvent", GRAILSUI.myDataTable.onEventHighlightRow); GRAILSUI.myDataTable.subscribe("rowMouseoutEvent", GRAILSUI.myDataTable.onEventUnhighlightRow); GRAILSUI.myDataTable.subscribe("rowClickEvent", GRAILSUI.myDataTable.onEventSelectRow); </script> Thanks,

    Read the article

  • Faster Matrix Multiplication in C#

    - by Kyle Lahnakoski
    I have a small C# project that involves matrices. I am processing large amounts of data by splitting it into n-length chunks, treating the chunks as vectors, and multiplying by a Vandermonde** matrix. The problem is, depending on the conditions, the size of the chunks and corresponding Vandermonde** matrix can vary. I have a general solution which is easy to read, but way too slow: public byte[] addBlockRedundancy(byte[] data) { if (data.Length!=numGood) D.error("Expecting data to be just "+numGood+" bytes long"); aMatrix d=aMatrix.newColumnMatrix(this.mod, data); var r=vandermonde.multiplyBy(d); return r.ToByteArray(); }//method This can process about 1/4 megabytes per second on my i5 U470 @ 1.33GHz. I can make this faster by manually inlining the matrix multiplication: int o=0; int d=0; for (d=0; d<data.Length-numGood; d+=numGood) { for (int r=0; r<numGood+numRedundant; r++) { Byte value=0; for (int c=0; c<numGood; c++) { value=mod.Add(value, mod.Multiply(vandermonde.get(r, c), data[d+c])); }//for output[r][o]=value; }//for o++; }//for This can process about 1 meg a second. (Please note the "mod" is performing operations over GF(2^8) modulo my favorite irreducible polynomial.) I know this can get a lot faster: After all, the Vandermonde** matrix is mostly zeros. I should be able to make a routine, or find a routine, that can take my matrix and return an optimized method which will effectively multiply vectors by the given matrix, but faster. Then, when I give this routine a 5x5 Vandermonde matrix (the identity matrix), there is simply no arithmetic to perform, and the original data is just copied. ** Please note: When I use the term "Vandermonde", I actually mean an Identity matrix with some number of rows from the Vandermonde matrix appended (see comments). This matrix is wonderful because of all the zeros, and because if you remove enough rows (of your choosing) to make it square, it is an invertible matrix. And, of course, I would like to use this same routine to convert any one of those inverted matrices into an optimized series of instructions. How can I make this matrix multiplication faster? Thanks! (edited to correct my mistake with Vandermonde matrix)
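
    One direction the question itself hints at is to exploit all those zeros: precompute, for each output row, only the non-zero (column, coefficient) pairs, so that identity rows collapse to a single copy and zero entries cost nothing. A rough C# sketch that reuses the question's own names (mod, vandermonde, numGood, numRedundant); SparseEntry and the method names are mine, and this would live inside the same class as addBlockRedundancy:

        using System.Collections.Generic;

        struct SparseEntry { public int Col; public byte Coef; }

        // Build, once per matrix, the list of non-zero coefficients for each output row.
        SparseEntry[][] BuildSparseRows()
        {
            var rows = new SparseEntry[numGood + numRedundant][];
            for (int r = 0; r < rows.Length; r++)
            {
                var entries = new List<SparseEntry>();
                for (int c = 0; c < numGood; c++)
                {
                    byte coef = vandermonde.get(r, c);
                    if (coef != 0)
                        entries.Add(new SparseEntry { Col = c, Coef = coef });
                }
                rows[r] = entries.ToArray();
            }
            return rows;
        }

        // Multiply one data block by a single output row, touching only non-zero entries.
        byte MultiplyRow(SparseEntry[] row, byte[] data, int offset)
        {
            byte value = 0;
            foreach (var e in row)
                value = mod.Add(value, mod.Multiply(e.Coef, data[offset + e.Col]));
            return value;
        }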

    Read the article

  • MySql BulkCopy/Insert from DataReader

    - by Sky Sanders
    I am loading a bunch of rows into MySql in C#. In MS Sql I can feed a DataReader to SqlBulkCopy, but the MySqlBulkCopy only presents itself as a bootstrap for a load from file. So, my current solution is using a prepared command in a transacted loop. Is there a faster way to accomplish bulk loading of MySql using a DataReader source?
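
    One workaround that is often suggested (a sketch, not necessarily the fastest option): spool the DataReader to a temporary tab-delimited file and hand it to MySqlBulkLoader, which wraps LOAD DATA INFILE. The table name and connection are placeholders, and values containing tabs or newlines would need escaping:

        using System;
        using System.Data;
        using System.IO;
        using MySql.Data.MySqlClient;

        static void BulkLoad(IDataReader reader, MySqlConnection conn, string tableName)
        {
            string tempFile = Path.GetTempFileName();
            try
            {
                // Write each row of the reader as one tab-delimited line; \N marks NULLs.
                using (var writer = new StreamWriter(tempFile))
                {
                    while (reader.Read())
                    {
                        var fields = new string[reader.FieldCount];
                        for (int i = 0; i < reader.FieldCount; i++)
                            fields[i] = reader.IsDBNull(i) ? "\\N" : reader.GetValue(i).ToString();
                        writer.WriteLine(string.Join("\t", fields));
                    }
                }

                var loader = new MySqlBulkLoader(conn)
                {
                    TableName = tableName,
                    FileName = tempFile,
                    FieldTerminator = "\t",
                    LineTerminator = "\n"
                };
                loader.Load(); // returns the number of rows inserted
            }
            finally
            {
                File.Delete(tempFile);
            }
        }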

    Read the article

  • populating flexgrid with a file vb6

    - by Andeeh
    So I need to put all the names in a file into column 1 of the flexgrid; each name should go on its own row. Here is what I have, but I just get "invalid row value": namefile = App.Path & "\names.dat" Open namefile For Input As #1 While Not EOF(1) Input #1, x With MSFlexGrid1 .Col = 1 .Rows = rowcount + 1 .Text = x End With Wend End Sub Any help would be fantastic, and thanks in advance.

    Read the article

  • Foreign keys vs partitioning

    - by Industrial
    Hi! Since foreign keys are not supported by partitioned MySQL tables at the moment, I would like to hear some pros and cons for a read-heavy application that will handle around 1-400,000 rows per table. Unfortunately, I don't have enough experience in this area yet to reach a conclusion by myself... Thanks a lot! References: http://stackoverflow.com/questions/1537219/how-to-handle-foreign-key-while-partitioning http://stackoverflow.com/questions/2496140/mysql-partitioning-with-foreign-keys

    Read the article

  • T-SQL - Left Outer Joins - Filters in the where clause versus the on clause.

    - by Greg Potter
    I am trying to compare two tables to find the rows in each table that are not in the other. Table 1 has a groupby column to create 2 sets of data within table one. groupby number ----------- ----------- 1 1 1 2 2 1 2 2 2 4 Table 2 has only one column. number ----------- 1 3 4 So Table 1 has the values 1,2,4 in group 2 and Table 2 has the values 1,3,4. I expect the following result when joining for Group 2: `Table 1 LEFT OUTER Join Table 2` T1_Groupby T1_Number T2_Number ----------- ----------- ----------- 2 2 NULL `Table 2 LEFT OUTER Join Table 1` T1_Groupby T1_Number T2_Number ----------- ----------- ----------- NULL NULL 3 The only way I can get this to work is if I put a where clause for the first join: PRINT 'Table 1 LEFT OUTER Join Table 2, with WHERE clause' select table1.groupby as [T1_Groupby], table1.number as [T1_Number], table2.number as [T2_Number] from table1 LEFT OUTER join table2 --****************************** on table1.number = table2.number --****************************** WHERE table1.groupby = 2 AND table2.number IS NULL and a filter in the ON for the second: PRINT 'Table 2 LEFT OUTER Join Table 1, with ON clause' select table1.groupby as [T1_Groupby], table1.number as [T1_Number], table2.number as [T2_Number] from table2 LEFT OUTER join table1 --****************************** on table2.number = table1.number AND table1.groupby = 2 --****************************** WHERE table1.number IS NULL Can anyone come up with a way of not using the filter in the ON clause but in the WHERE clause? The context of this is that I have a staging area in a database and I want to identify new records and records that have been deleted. The groupby field is the equivalent of a batchid for an extract, and I am comparing the latest extract in a temp table to the batch from yesterday stored in a partitioned table, which also has all the previously extracted batches as well. Code to create tables 1 and 2: create table table1 (number int, groupby int) create table table2 (number int) insert into table1 (number, groupby) values (1, 1) insert into table1 (number, groupby) values (2, 1) insert into table1 (number, groupby) values (1, 2) insert into table2 (number) values (1) insert into table1 (number, groupby) values (2, 2) insert into table2 (number) values (3) insert into table1 (number, groupby) values (4, 2) insert into table2 (number) values (4) EDIT: A bit more context - depending on where I put the filter I get different results. As stated above, the WHERE clause gives me the correct result in one case and the ON clause in the other. I am looking for a consistent way of doing this.
Where - select table1.groupby as [T1_Groupby], table1.number as [T1_Number], table2.number as [T2_Number] from table1 LEFT OUTER join table2 --****************************** on table1.number = table2.number --****************************** WHERE table1.groupby = 2 AND table2.number IS NULL Result: T1_Groupby T1_Number T2_Number ----------- ----------- ----------- 2 2 NULL On - select table1.groupby as [T1_Groupby], table1.number as [T1_Number], table2.number as [T2_Number] from table1 LEFT OUTER join table2 --****************************** on table1.number = table2.number AND table1.groupby = 2 --****************************** WHERE table2.number IS NULL Result: T1_Groupby T1_Number T2_Number ----------- ----------- ----------- 1 1 NULL 2 2 NULL 1 2 NULL Where (table 2 this time) - select table1.groupby as [T1_Groupby], table1.number as [T1_Number], table2.number as [T2_Number] from table2 LEFT OUTER join table1 --****************************** on table2.number = table1.number AND table1.groupby = 2 --****************************** WHERE table1.number IS NULL Result: T1_Groupby T1_Number T2_Number ----------- ----------- ----------- NULL NULL 3 On - select table1.groupby as [T1_Groupby], table1.number as [T1_Number], table2.number as [T2_Number] from table2 LEFT OUTER join table1 --****************************** on table2.number = table1.number --****************************** WHERE table1.number IS NULL AND table1.groupby = 2 Result: T1_Groupby T1_Number T2_Number ----------- ----------- ----------- (0) rows returned

    Read the article

  • dataset to List<T>using linq

    - by bharat
    I have a dataset and I want to convert it into a List<T>, where T is a typed object. How do I do that with my dataset? It has 10 columns, matching the 10 properties my object has, and it returns over 15,000 rows. I want to turn that dataset into a List<obj> and loop over it; how do I do that?
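
    A LINQ-to-DataSet sketch (needs a reference to System.Data.DataSetExtensions); MyObj and the two column names shown here stand in for the question's 10 properties:

        using System.Collections.Generic;
        using System.Data;
        using System.Linq;

        List<MyObj> list = ds.Tables[0].AsEnumerable()
            .Select(row => new MyObj
            {
                Name  = row.Field<string>("Name"),
                Email = row.Field<string>("Email")
                // ... map the remaining columns to the remaining properties
            })
            .ToList();

        foreach (MyObj item in list)
        {
            // loop over the typed objects here
        }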

    Read the article

  • How do I get entity for primary key using EntityDataSource in ASP.NET

    - by drasto
    I have a GridView in my ASP.NET application that takes the data it renders from an EntityDataSource. The GridView allows the user to select rows. I want to get the entity that corresponds to the row the user selected. From the GridView I can get the ID (primary key) of the entity that corresponds to the selected row. How can I get the entity that has that ID (primary key)?
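
    A possible sketch, assuming the EntityDataSource sits on top of an Entity Framework ObjectContext; the context, entity set, and key property names here are hypothetical:

        using System.Linq;

        // Primary key of the selected row, taken from the GridView's data keys.
        int selectedId = (int)GridView1.SelectedDataKey.Value;

        using (var context = new MyEntities())
        {
            // Look the entity up by its key; SingleOrDefault returns null if no match.
            var entity = context.Products.SingleOrDefault(p => p.ProductId == selectedId);
            // ... use entity here
        }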

    Read the article

  • List - Strings - Textfiles

    - by b3y4z1d
    I've got a few questions concerning text files, lists and strings. I wonder if it is possible to write code which reads the text in a text file, and then uses "string line;" or something similar to capture each row of the text and turn all of them into one list. Then I could sort the rows, remove a row or two or even all of them, or search through the text for a specific row.
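
    This is straightforward, for example in C#: a minimal sketch that reads every line into a List<string>, which can then be sorted, searched, or have rows removed (the file name is a placeholder):

        using System;
        using System.Collections.Generic;
        using System.IO;
        using System.Linq;

        List<string> rows = File.ReadAllLines("names.txt").ToList();

        rows.Sort();                                           // sort the rows
        rows.Remove("Bob");                                    // remove one specific row
        rows.RemoveAll(r => r.Length == 0);                    // or remove several at once
        int index = rows.FindIndex(r => r.Contains("Smith"));  // search for a specific row
        rows.Clear();                                          // or remove all of them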

    Read the article

  • Optimize this MySQL query?

    - by HipHop-opatamus
    The following query takes FOREVER to execute (30+ hrs on a MacBook w/ 4 GB RAM) - I'm looking for ways to make it run more efficiently. Any thoughts are appreciated! CREATE TABLE fc AS SELECT threadid, title, body, date, userlogin FROM f WHERE pid NOT IN (SELECT pid FROM ft) ORDER BY date; (table "f" is ~1 GB / 1,843,000 rows, table "ft" is 168 MB / 216,000 rows)

    Read the article

  • Speed boost to adjacency matrix

    - by samoz
    I currently have an algorithm that operates on an adjacency matrix of size n by m. In my algorithm, I need to zero out entire rows or columns at a time. My implementation is currently O(m) or O(n), depending on whether it's a column or a row. Is there any way to zero out a column or row in O(1) time?
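
    A true O(1) clear of a whole row or column is possible if the zeroing is done lazily: record when each row and column was last cleared, and compare that against when each cell was last written. A C# sketch of the idea (the class and member names are mine):

        using System;

        class LazyMatrix
        {
            private readonly int[,] values;
            private readonly long[,] writtenAt;   // timestamp of last write per cell
            private readonly long[] rowClearedAt; // timestamp of last clear per row
            private readonly long[] colClearedAt; // timestamp of last clear per column
            private long clock = 1;

            public LazyMatrix(int rows, int cols)
            {
                values = new int[rows, cols];
                writtenAt = new long[rows, cols];
                rowClearedAt = new long[rows];
                colClearedAt = new long[cols];
            }

            public void Set(int r, int c, int v)
            {
                values[r, c] = v;
                writtenAt[r, c] = ++clock;
            }

            public int Get(int r, int c)
            {
                // A cell counts as zero if its row or column was cleared after its last write.
                long cleared = Math.Max(rowClearedAt[r], colClearedAt[c]);
                return writtenAt[r, c] > cleared ? values[r, c] : 0;
            }

            public void ClearRow(int r) { rowClearedAt[r] = ++clock; }    // O(1)
            public void ClearColumn(int c) { colClearedAt[c] = ++clock; } // O(1)
        }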

    Read the article

  • SQL - Order against two columns at the same time (intersecting)

    - by Alex
    I have a table with the fields CommonName and FirstName. Only one of the fields ever has data, never both. Is there a way to order rows in an intersecting manner on SQL Server? Example: CommonName FirstName Bern Wade Ashley Boris Ayana I want records ordered like this: CommonName FirstName Ashley Ayana Bern Boris Wade Is this possible, and if so, how?
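
    In SQL the usual approach is to sort on the coalesced value, e.g. ORDER BY COALESCE(CommonName, FirstName); the same idea in a C#/LINQ sketch, with a hypothetical Person class:

        using System.Collections.Generic;
        using System.Linq;

        class Person
        {
            public string CommonName { get; set; }
            public string FirstName { get; set; }
        }

        static class NameSorter
        {
            // Order by whichever name is present, mirroring ORDER BY COALESCE(CommonName, FirstName).
            public static List<Person> OrderByEitherName(IEnumerable<Person> people)
            {
                return people.OrderBy(p => p.CommonName ?? p.FirstName).ToList();
            }
        }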

    Read the article

  • How can I solve an out-of-memory exception when adding to a generic list?

    - by Phsika
    How can I solve an out-of-memory exception in a generic list when adding new values? foreach(DataColumn dc in dTable.Columns) foreach (DataRow dr in dTable.Rows) myScriptCellsCount.MyCellsCharactersCount.Add(dr[dc].ToString().Length); My base class: public class MyExcelSheetsCells { public List<int> MyCellsCharactersCount { get; set; } public MyExcelSheetsCells() { MyCellsCharactersCount = new List<int>(); } }
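
    If the failure comes from the list repeatedly growing and copying its backing array, one thing to try (a sketch, not a guaranteed fix) is pre-sizing the list to the total number of cells so it allocates once:

        using System.Collections.Generic;
        using System.Data;

        // Reserve room for every cell up front to avoid repeated reallocation and copying.
        var counts = new List<int>(dTable.Rows.Count * dTable.Columns.Count);
        foreach (DataColumn dc in dTable.Columns)
            foreach (DataRow dr in dTable.Rows)
                counts.Add(dr[dc].ToString().Length);
        myScriptCellsCount.MyCellsCharactersCount = counts;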

    Read the article

  • Trigger Code on a table in my ERP Database

    - by David Stein
    My ERP Vendor has the following trigger on a table: SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO CREATE TRIGGER [dbo].[SOItem_DeleteCheck] ON [dbo].[soitem] FOR DELETE AS BEGIN DECLARE @RecCnt int, @LogInfo varchar(256) SET @RecCnt = (SELECT COUNT(*) FROM deleted) IF @RecCnt > 150 BEGIN RAISERROR (54010, 18, 1, 'SOItem') WITH LOG ROLLBACK TRANSACTION END SET @LogInfo = 'Deleting ' + LTRIM(STR(@RecCnt)) + ' Rows From SOItem' EXEC LogDeletes @LogInfo END GO This seems very inefficient to me. Doesn't select count(*) take longer than Count(specific field)?

    Read the article

  • What is C# equivalent of PHP's mysql_fetch_array function?

    - by Mike Biff
    I am learning C#/ASP.NET and I am wondering what the C# equivalent of the following PHP code is. I know the userid, and I want to fetch the rows from this table into the array variable "row", so I can then use it as "row['name']" and "row['email']". $result = mysql_query("SELECT email, name FROM mytable WHERE id=7"); while ($row = mysql_fetch_array($result, MYSQL_ASSOC)) { printf("Email: %s Name: %s", $row["email"], $row["name"]); } Thanks.
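
    A rough ADO.NET counterpart (a sketch assuming the MySQL Connector/NET provider; the connection string is a placeholder). The data reader plays the role of mysql_fetch_array, with reader["email"] in place of $row["email"]:

        using System;
        using MySql.Data.MySqlClient;

        using (var conn = new MySqlConnection("server=localhost;database=mydb;uid=me;pwd=secret"))
        using (var cmd = new MySqlCommand("SELECT email, name FROM mytable WHERE id = @id", conn))
        {
            cmd.Parameters.AddWithValue("@id", 7);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Each column is read by name, like the associative PHP array.
                    Console.WriteLine("Email: {0} Name: {1}", reader["email"], reader["name"]);
                }
            }
        }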

    Read the article

  • GWT Table that supports dynamic filtering

    - by Holograham
    This question is similar to http://stackoverflow.com/questions/161686/gwt-table-that-supports-sorting-scrolling-and-filtering However I would prefer open source and I am looking for snappy performance. I want a good way to perform dynamic filtering on rows. SmartGWT's adaptive filter looks interesting. http://www.smartclient.com/smartgwt/showcase/#grid_adaptive_filter_featured_category Anyone have any experience with this?

    Read the article

  • Searchlogic cannot sort search result

    - by jaycode
    Imagine code like this: search = Project.search( :title_or_description_or_child_name_or_child_age_or_inspiration_or_decorating_style_or_favorite_item_or_others_like_any => keys, :galleries_id_like_any => @g, :styles_id_like_any => @st, :tags_like_any => @t ) search.all returns the rows correctly, but search.descend_by_views returns nil. Is this gem buggy? What else should I use then?

    Read the article

  • How can I use generics here?

    - by Shantanu Gupta
    I am trying to use generics for the first time, and I am trying to typecast the result returned from the database to a programmer-defined data type. How can I do this? dsb.ExecuteQuery( "DELETE FROM CurrencyMaster WHERE CurrencyMasterId=" + returnValueFromGrid<int>(getSelectedRowIndex(), "CurrencyMasterId")); private T returnValueFromGrid<T>(int RowNo, string ColName) { return Convert.ChangeType(dgvCurrencyMaster.Rows[RowNo].Cells[ColName].Value,T); }
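
    Convert.ChangeType takes a System.Type, so the usual pattern is to pass typeof(T) and cast the result back to T. A sketch of a compiling version of the helper (and note that the concatenated SQL above would be safer as a parameterized query):

        private T ReturnValueFromGrid<T>(int rowNo, string colName)
        {
            // ChangeType returns object; typeof(T) supplies the runtime Type,
            // and the cast gives back a strongly typed T.
            object raw = dgvCurrencyMaster.Rows[rowNo].Cells[colName].Value;
            return (T)Convert.ChangeType(raw, typeof(T));
        }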

    Read the article

  • Using listactivity view to create table in android

    - by cppdev
    I'm pretty new to Android. I want to create a table in my application that will have three columns. One column would have a string, another would have an image, and the last column would have an integer. Also, I want the table rows to be selectable. Can this be achieved by extending ListActivity? What is the best way to create such a table in Android?

    Read the article

  • Execution time of ALTER COLUMN

    - by Tommy Jakobsen
    I have a table with 60 columns and 200 rows. Altering a BIT column from NULL to NOT NULL now has a running execution time of over 3 hours. Why is this taking so long? This is the query that I'm executing: ALTER TABLE tbl ALTER COLUMN col BIT NOT NULL Is there a faster way to do it, besides creating a new column, updating it with values from the old column, then dropping the old column and renaming the new one? This is on MS SQL Server 2005.

    Read the article

  • SSIS flat file insertion failure to rollback

    - by Pramodtech
    I have a simple SSIS package which reads data from a flat file and inserts it into a SQL database. The file has 90K rows, and sometimes, because of bad data, the package fails but inserts the partial records before it fails. What I need is that if insertion fails at any point, no records should be inserted into the DB; everything should be rolled back. How can I put it in a transaction?

    Read the article
