Search Results

Search found 15 results on 1 page for 'rowstate'.

Page 1/1

  • Gridview adding row dynamically on RowDataBound with the same RowState (Alternate or Normal)

    - by rob waminal
    I am adding rows dynamically in code-behind, depending on the row currently being bound in the RowDataBound event. I want the added row to have the same state (Alternate or Normal) as the currently bound row. Is this possible? I'm doing something like the following, but it doesn't do what I want: I expect the dynamically added row to have the same state as the current row, but it does not.

      protected void gv_RowDataBound(object sender, GridViewRowEventArgs e)
      {
          if (e.Row.RowType == DataControlRowType.DataRow)
          {
              GVData data = e.Row.DataItem as GVData; // not the original object, just for brevity
              if (data.AddHiddenRow)
              {
                  // the row state here should be the same as the current row's
                  GridViewRow tr = new GridViewRow(e.Row.RowIndex + 1, e.Row.RowIndex + 1, DataControlRowType.DataRow, e.Row.RowState);
                  TableCell newTableCell = new TableCell();
                  newTableCell.Style.Add("display", "none");
                  tr.Cells.Add(newTableCell);
                  ((Table)e.Row.Parent).Rows.Add(tr);
              }
          }
      }
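
    One possible approach, sketched in C# against the handler above (a sketch only: it assumes the alternating look is missing because rows added straight to the inner Table are not restyled by the GridView, and it masks out extra flags such as Edit or Selected that e.Row.RowState can carry):

      // Keep only the Alternate bit of the current row's state, then copy the
      // grid's RowStyle/AlternatingRowStyle onto the manually created row,
      // since the GridView only styles rows in its own Rows collection.
      DataControlRowState baseState =
          (e.Row.RowState & DataControlRowState.Alternate) == DataControlRowState.Alternate
              ? DataControlRowState.Alternate
              : DataControlRowState.Normal;

      GridViewRow tr = new GridViewRow(e.Row.RowIndex + 1, e.Row.RowIndex + 1,
                                       DataControlRowType.DataRow, baseState);

      GridView gv = (GridView)sender;
      tr.ApplyStyle(baseState == DataControlRowState.Alternate ? gv.AlternatingRowStyle : gv.RowStyle);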

    Read the article

  • Read DataTable by RowState

    - by RBrattas
    Hi, I am reading my DataTable as follows:

      foreach (DataRow o_DataRow in vco_DataTable.Rows)
      {
          // Insert more rows here
      }

    It crashes because I insert more records while iterating. How can I read my DataTable without picking up the new records? Can I read by RowState? Thank you for your excellent work, Rune
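
    A sketch of one way to do this (assuming vco_DataTable is a plain System.Data.DataTable): DataTable.Select returns a snapshot array, so rows added inside the loop are not visited, and the overload taking a DataViewRowState lets you filter by row state as well.

      // Snapshot of the rows that exist right now; additions during the loop are not iterated.
      DataRow[] snapshot = vco_DataTable.Select();
      foreach (DataRow o_DataRow in snapshot)
      {
          vco_DataTable.Rows.Add(vco_DataTable.NewRow());   // safe: the snapshot array does not grow
      }

      // Filtering by row state is also possible, e.g. only rows that were
      // already loaded and unchanged before the loop started:
      DataRow[] unchangedOnly = vco_DataTable.Select("", "", DataViewRowState.Unchanged);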

    Read the article

  • Read DataTable by RowState

    - by RBrattas
    I am reading my DataTable as follows:

      foreach (DataRow o_DataRow in vco_DataTable.Rows)
      {
          // Insert more rows here
      }

    It crashes because I insert more records while iterating. How can I read my DataTable without picking up the new records? Can I read by RowState? Thanks

    Read the article

  • I insert new parent row and child rowstate changes from Added to unchanged

    - by Joel
    rowsUpdated is an Int32 that counts how many rows are updated.

      rowsToUpdate = dataset.ParentTable.Select("", "", DataViewRowState.Added)
      If rowsToUpdate IsNot Nothing Then
          For Each row As DataRow In rowsToUpdate
              ' this call changes the rowstate:
              rowsUpdated = rowsUpdated + ParentTableAdapter.Update(row)
          Next row
      End If

    I'm sure it's something I'm overlooking, but I just can't see it. Thanks in advance, Joel
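
    If the rows whose state flips are the ones being passed to Update, the usual mechanism is that DataAdapter.Update calls AcceptChanges on every row it successfully writes, which resets RowState from Added to Unchanged. A sketch in C# with a plain SqlDataAdapter (the typed ParentTableAdapter wraps one internally; insertCommand is illustrative, and reaching the inner adapter of a typed TableAdapter normally takes a partial-class extension):

      SqlDataAdapter adapter = new SqlDataAdapter();
      adapter.InsertCommand = insertCommand;            // illustrative: whatever INSERT the table adapter uses
      adapter.AcceptChangesDuringUpdate = false;        // rows keep RowState = Added after Update

      DataRow[] rowsToUpdate = dataset.ParentTable.Select("", "", DataViewRowState.Added);
      int rowsUpdated = adapter.Update(rowsToUpdate);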

    Read the article

  • Preserving DataRowState when serializing DataSet using DataContractSerializer

    - by user349453
    For various reasons I am having to send a typed dataset to a WCF service endpoint. This works fine, except that upon deserializing, the RowState of each row in each DataTable is set to 'Added', regardless of what it was on the client. If I write the serialized stream out to a file, I can see that the RowState is not part of the serialized data. How can I add it so that the RowState is preserved across service boundaries? Not that I think it matters, but the client process is running .NET 3.5 while the service process is running .NET 4.0.
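
    One commonly suggested workaround, sketched below (it assumes the dataset can be shipped as an XML string instead of as the typed DataSet itself; MyTypedDataSet is a stand-in for the real typed dataset class): a DiffGram carries per-row state, so writing and reading the dataset in DiffGram mode round-trips Added/Modified/Deleted.

      using System.Data;
      using System.IO;

      string ToDiffGram(DataSet ds)
      {
          using (StringWriter writer = new StringWriter())
          {
              ds.WriteXml(writer, XmlWriteMode.DiffGram);   // includes row states and original values
              return writer.ToString();
          }
      }

      MyTypedDataSet FromDiffGram(string xml)               // MyTypedDataSet: hypothetical typed dataset
      {
          MyTypedDataSet ds = new MyTypedDataSet();
          using (StringReader reader = new StringReader(xml))
          {
              ds.ReadXml(reader, XmlReadMode.DiffGram);     // restores Added/Modified/Deleted states
          }
          return ds;
      }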

    Read the article

  • bug in my jquery code while trying replace html elements with own values

    - by loviji
    Today I asked a question about using jQuery to replace html elements with their own values (link text). I used the answer and now have a problem: the function replaceWithValues does not work the same in all cases. I call this function two times: 1. btnAddParam click, 2. btnCancelEdit click.

      $("#btnAddParam").click(function() {
          var lastRow = $('#mainTable tr:last');
          var rowState = $("#mainTable tr:last>td:first");
          replaceWithValues(lastRow, rowState);
          var htmlToAppend = "<tr bgcolor='#B0B0B0' ><td class='textField' er='editable'><input value='' type='text' /></td><td><textarea cols='40' rows='3' ></textarea></td><td>" + initSelectType(currentID) + "</td><td><input id='txt" + currentID + "3' type='text' class='measureUnit' /></td><td><input type='checkbox' /></td><td></td></tr>";
          $("#mainTable").append(htmlToAppend);
      });

      // buttonCancelEdit located at the end of the row
      $('#mainTable input:button').unbind().live('click', function() {
          var row = $(this).closest('tr');
          var rowState = $(this).closest('tr').find("td:first");
          replaceWithValues(row, rowState);
          $(this).remove();
      });

      // make the row editable -- replaceWithElements
      $('#mainTable tr').unbind().live('click', function() {
          if ($(this).find("td:first").attr("er") == "readable") {
              var rowState = $(this).closest('tr').find("td:first");
              replaceWithElements($(this), rowState);
          }
      });

      function replaceWithValues(row, er) {
          if (er.attr("er") == "editable") {
              var inputElements = $('td > input:text', row);
              inputElements.each(function() {
                  var value = $(this).val();
                  $(this).replaceWith(value);
              });
              er.attr("er", "readable");
          }
      }

      function replaceWithElements(row, er) {
          if (er.attr("er") == "readable") {
              var tdinit = $("<td>").attr("er", "editable").addClass("textField");
              $('.textField', row).each(function() {
                  var element = tdinit.append($("<input type='text' value=" + $.trim($(this).text()) + " />"));
                  $(this).empty().replaceWith(element);
              });
              row.find("td:last").append("<input type='button'/>");
              //$('.selectField') ...
              //$('.textAreaField') ...
          }
      }

    The $("#btnAddParam").click() handler works well; it calls replaceWithValues. I use $('#mainTable tr').unbind().live('click', ...) to make a row editable, and it creates a button at the end of the row. The user can then click this button, which fires $('#mainTable input:button').unbind().live('click', ...), and that handler also calls replaceWithValues, but in this case it doesn't work.

    Read the article

  • Binding Click event in KnockoutJS

    - by user1918553
    I have a div which has a css binding based on the value of 'rowState', as follows, which is working fine. Now, I need to bind the 'click' event so that it fires only if rowState is not 2. I tried the following, but with no success. I do not want to use the if statement, as the div has a lot of content that I would need to repeat. The only difference is that the div should not be clickable if rowState is 2. Could you please help me sort this out?

    Read the article

  • Finding Buried Controls

    - by Bunch
    This post is pretty specific to an issue I had but still has some ideas that could be applied in other scenarios. The problem I had was updating a few buttons so their Text values could be set in the code behind which had a method to grab the proper value from an external source. This was so that if the application needed to be installed by a customer using a language other than English or needed a different notation for the button's Text they could simply update the database. Most of the time this was no big deal. However I had one instance where the button was part of a control, the button had no set ID and that control was only found in a dll. So there was no markup to edit for the Button. Also updating the dll was not an option so I had to make the best of what I had to work with. In the cs file for the aspx file with the control on it I added the Page_LoadComplete. The problem button was within a GridView so I added a foreach to go through each GridViewRow and find the button I needed. Since I did not have an ID to work with besides a random ctl00$main$DllControl$gvStuff$ctl03$ctl05 using the GridView's FindControl was out. I ended up looping through each GridViewRow, then if a RowState equaled Edit loop through the Cells, each control in the Cell and check each control to see if it held a Panel that contained the button. If the control was a Panel I could then loop through the controls in the Panel, find the Button that had text of "Update" (that was the hard coded part) and change it using the method to return the proper value from the database.

      if (rowState.Contains("Edit"))
      {
          foreach (DataControlFieldCell rowCell in gvr.Cells)
          {
              foreach (Control ctrl in rowCell.Controls)
              {
                  if (ctrl.GetType() == typeof(Panel))
                  {
                      foreach (Control childCtrl in ctrl.Controls)
                      {
                          if (childCtrl.GetType() == typeof(Button))
                          {
                              Button update = (Button)childCtrl;
                              if (update.Text == "Update")
                              {
                                  update.Text = method to return the external value for the button's text;
                              }
                          }
                      }
                  }
              }
          }
      }

    Tags: ASP.Net, CSharp
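
    The nested foreach loops can be generalised; a sketch of a recursive helper that finds the first descendant control of a given type matching a predicate, so the Panel-then-Button nesting depth does not have to be hard coded (the helper name and the usage line are illustrative, not from the post):

      using System;
      using System.Web.UI;

      static T FindDescendant<T>(Control root, Func<T, bool> match) where T : Control
      {
          foreach (Control child in root.Controls)
          {
              T typed = child as T;
              if (typed != null && match(typed))
                  return typed;

              T found = FindDescendant(child, match);
              if (found != null)
                  return found;
          }
          return null;
      }

      // usage inside the RowState check from the post:
      // Button update = FindDescendant<Button>(gvr, b => b.Text == "Update");
      // if (update != null) { update.Text = ...; }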

    Read the article

  • How to save data from model without any association in cakephp [on hold]

    - by Abhishek
    I have base model in that i use dataManipulation method for updation in my code ,so i want to save data in receipt, receiptline model also in OpeningBankStatement.but i create association for receipt and receiptline not for OpeningBankStatement. So i want to save data this OpeningBankStatement model without any association.my demo code is. Array ( [Receipt] => Array ( [ID] => 566 [ObjectType] => 84 [TXNName] => bbnm [TXNDate] => 03-06-2014 [BranchID] => 1 [Narration1] => 267 [Narration] => Cheque Received [ExecutiveID] => 805 [AccountType] => 104 [Account] => 68 [ReferenceNo] => [TXNCurrencyID] => 3 [ExchangeRate] => 1.00000 [ManualAdiustment] => 0 [RevisionNumber] => 1 [CompanyID] => 1 [Status] => 633 ) [ReceiptLine] => Array ( [0] => Array ( [TXNID] => 566 [LineNo] => 0 [LineType_072] => 429 [BranchID] => 1 [AccountID] => 68 [ContactID] => [Amount] => 0 [CancelAmount] => 0 [OpenAmount] => 0 [Narration] => Cheque Received [CreatedBy] => 229 [ModifiedBy] => 229 [CreatedDate] => 2014-06-03 00:00:00 [ModifiedDate] => 2014-06-03 00:00:00 [Status] => 1 [RevisionNumber] => 1 [RowState] => [tmpInstrumentDate] => ) [1] => Array ( [LineNo] => 0 [RowState] => 436 [TXNID] => 0 [BranchID] => 1 [ContactID] => [AccountID] => 68 [Narration] => Cheque Received [Amount] => 0 [RevisionNumber] => 1 [LineType_072] => 460 [CancelAmount] => 0 [OpenAmount] => 0 [Status] => 1 ) ) [OpeningBankStatement] => Array ( [ObjectType] => 131 [TXNSeries] => 1 [TXNNo] => 12345 [TXNName] => bbnm [TXNDate] => 03-06-2014 [CompanyID] => 1 [AccountID] => 68 [ExecutiveID] => 805 [Narration] => Cheque Received [ReferenceNo] => [ParentObjectType] => 84 [ParentTXNID] => 1 [CancelledBy] => 1 [CancelledDate] => 2014-02-02 [CancellationRemarks] => hfg [Status] => 1 [RevisionNumber] => 1 ) ) By any dyanamic model association or callback method it solve? suggest solution.

    Read the article

  • Accessing deleted rows from a DataTable

    - by Ken
    Hello: I have a parent WinForm that has a MyDataTable _dt as a member. The MyDataTable type was created with the "typed dataset" designer tool in Visual Studio 2005 (MyDataTable inherits from DataTable). _dt gets populated from a db via ADO.NET. Based on changes from user interaction in the form, I delete a row from the table like so:

      _dt.FindBySomeKey(_someKey).Delete();

    Later on, _dt is passed by value to a dialog form. From there, I need to scan through all the rows to build a string:

      foreach (myDataTableRow row in _dt)
      {
          sbFilter.Append("'" + row.info + "',");
      }

    The problem is that when I do this after a delete, the following exception is thrown: DeletedRowInaccessibleException: Deleted row information cannot be accessed through the row. The workaround that I am currently using (which feels like a hack) is the following:

      foreach (myDataTableRow row in _dt)
      {
          if (row.RowState != DataRowState.Deleted && row.RowState != DataRowState.Detached)
          {
              sbFilter.Append("'" + row.info + "',");
          }
      }

    My question: Is this the proper way to do this? Why would the foreach loop visit rows that have been tagged via the Delete() method?
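
    Delete() only marks the row (its RowState becomes Deleted); the row stays in the Rows collection until AcceptChanges runs, which is why enumeration still visits it. Two tidier options, sketched with the same _dt and sbFilter fields as in the question:

      // Option 1: enumerate a snapshot of non-deleted rows via Select.
      foreach (myDataTableRow row in _dt.Select(null, null, DataViewRowState.CurrentRows))
      {
          sbFilter.Append("'" + row.info + "',");
      }

      // Option 2: if the deletion never needs to be synced back to the database,
      // remove the row outright instead of marking it deleted.
      _dt.Rows.Remove(_dt.FindBySomeKey(_someKey));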

    Read the article

  • GridView take a Row

    - by GIbboK
    Hi, I use ASP.NET 4 and C#. I have a GridView, and I would like to get the row that is in Edit Mode in my code and find a control in it. Here is my code, but it does not work; it only picks up the first row of the GridView. Any ideas?

      protected void uxManageSlotsDisplayer_RowDataBound(object sender, GridViewRowEventArgs e)
      {
          switch (e.Row.RowType)
          {
              case DataControlRowType.DataRow:
                  // Take Row in Edit Mode DOES NOT WORK PROPERLY
                  if (e.Row.RowState == DataControlRowState.Edit)
                  {
                      Label myTest = (Label)e.Row.FindControl("uxTest");
                  }
                  break;
          }
      }
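
    DataControlRowState is a flags enum, so an alternating row in edit mode reports Alternate | Edit and never compares equal to Edit alone; testing the Edit bit instead of using == is the usual fix (a sketch inferred from the code shown, using the same handler):

      if ((e.Row.RowState & DataControlRowState.Edit) == DataControlRowState.Edit)
      {
          Label myTest = (Label)e.Row.FindControl("uxTest");
      }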

    Read the article

  • Strange behaviour of DataTable with DataGridView

    - by Paul
    Please explain to me what is happening. I have created a WinForms .NET application which has a DataGridView on a form and should update the database when DataGridView inline editing is used. The form has a SqlDataAdapter _da with four SqlCommands bound to it. The DataGridView is bound directly to DataTable _names. This CellValueChanged handler:

      private void dataGridView1_CellValueChanged(object sender, DataGridViewCellEventArgs e)
      {
          _da.Update(_names);
      }

    does not update the database state, although the _names DataTable is updated. All the rows of _names have RowState == DataRowState.Unchanged. Ok, I modified the handler:

      private void dataGridView1_CellValueChanged(object sender, DataGridViewCellEventArgs e)
      {
          DataRow row = _names.Rows[e.RowIndex];
          row.BeginEdit();
          row.EndEdit();
          _da.Update(_names);
      }

    This variant really writes the modified cell to the database, but when I attempt to insert a new row into the grid, I get an error about the absence of a row with index e.RowIndex. So, I decided to improve the handler further:

      private void dataGridView1_CellValueChanged(object sender, DataGridViewCellEventArgs e)
      {
          if (_names.Rows.Count < e.RowIndex)
          {
              DataRow row = _names.Rows[e.RowIndex];
              row.BeginEdit();
              row.EndEdit();
          }
          else
          {
              DataRow row = _names.NewRow();
              row["NameText"] = dataGridView1["NameText", e.RowIndex].Value;
              _names.Rows.Add(row);
          }
          _da.Update(_names);
      }

    Now the really strange things happen when I insert a new row into the grid: the grid remains what it was until _names.Rows.Add(row); After this line THREE rows are inserted into the table - two rows with the same value and one with a Null value. The slightly modified code:

      DataRow row = _names.NewRow();
      row["NameText"] = "--------------";
      _names.Rows.Add(row);

    inserts three rows with three different values: one as entered into the grid, the second with the "--------------" value and the third with a Null value. I really got stuck guessing what is happening.
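
    A commonly suggested pattern for this situation, sketched below (not the poster's code; Form1_Load and _bs are illustrative names): bind the grid through a BindingSource and commit pending edits with EndEdit() before calling the adapter, and let the binding layer create rows for the grid's "new row" placeholder instead of adding DataRows by hand in CellValueChanged.

      private BindingSource _bs = new BindingSource();

      private void Form1_Load(object sender, EventArgs e)
      {
          _bs.DataSource = _names;
          dataGridView1.DataSource = _bs;
      }

      private void dataGridView1_CellValueChanged(object sender, DataGridViewCellEventArgs e)
      {
          _bs.EndEdit();        // push the in-progress row edit into _names
          _da.Update(_names);   // rows now carry Added/Modified RowState for the adapter
      }

    Handling the save in the grid's RowValidated event instead of CellValueChanged is another variant of the same idea, since by then the whole row has been committed to the data source.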

    Read the article

  • Bulk inserting best way to about it? + Helping me understand fully what I found so far

    - by chobo2
    Hi So I saw this post here and read it and it seems like bulk copy might be the way to go. http://stackoverflow.com/questions/682015/whats-the-best-way-to-bulk-database-inserts-from-c I still have some questions and want to know how things actually work. So I found 2 tutorials. http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx First way uses 2 ado.net 2.0 features. BulkInsert and BulkCopy. the second one uses linq to sql and OpenXML. This sort of appeals to me as I am using linq to sql already and prefer it over ado.net. However as one person pointed out in the posts what he just going around the issue at the cost of performance( nothing wrong with that in my opinion) First I will talk about the 2 ways in the first tutorial I am using VS2010 Express, .net 4.0, MVC 2.0, SQl Server 2005 Is ado.net 2.0 the most current version? Based on the technology I am using, is there some updates to what I am going to show that would improve it somehow? Is there any thing that these tutorial left out that I should know about? BulkInsert I am using this table for all the examples. CREATE TABLE [dbo].[TBL_TEST_TEST] ( ID INT IDENTITY(1,1) PRIMARY KEY, [NAME] [varchar](50) ) SP Code USE [Test] GO /****** Object: StoredProcedure [dbo].[sp_BatchInsert] Script Date: 05/19/2010 15:12:47 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[sp_BatchInsert] (@Name VARCHAR(50) ) AS BEGIN INSERT INTO TBL_TEST_TEST VALUES (@Name); END C# Code /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. adpt.UpdateBatchSize = 1000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } So first thing is the batch size. Why would you set a batch size to anything but the number of records you are sending? Like I am sending 500,000 records so I did a Batch size of 500,000. Next why does it crash when I do this? If I set it to 1000 for batch size it works just fine. System.Data.SqlClient.SqlException was unhandled Message="A transport-level error has occurred when sending the request to the server. 
(provider: Shared Memory Provider, error: 0 - No process is on the other end of the pipe.)" Source=".Net SqlClient Data Provider" ErrorCode=-2146232060 Class=20 LineNumber=0 Number=233 Server="" State=0 StackTrace: at System.Data.Common.DbDataAdapter.UpdatedRowStatusErrors(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.UpdatedRowStatus(RowUpdatedEventArgs rowUpdatedEvent, BatchCommandInfo[] batchCommands, Int32 commandCount) at System.Data.Common.DbDataAdapter.Update(DataRow[] dataRows, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.UpdateFromDataTable(DataTable dataTable, DataTableMapping tableMapping) at System.Data.Common.DbDataAdapter.Update(DataTable dataTable) at TestIQueryable.Program.BatchInsert() in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 124 at TestIQueryable.Program.Main(String[] args) in C:\Users\a\Downloads\TestIQueryable\TestIQueryable\TestIQueryable\Program.cs:line 16 InnerException: Time it took to insert 500,000 records with insert batch size of 1000 took "2 mins and 54 seconds" Of course this is no official time I sat there with a stop watch( I am sure there are better ways but was too lazy to look what they where) So I find that kinda slow compared to all my other ones(expect the linq to sql insert one) and I am not really sure why. Next I looked at bulkcopy /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } This one seemed to go really fast and did not even need a SP( can you use SP with bulk copy? If you can would it be better?) BatchCopy had no problem with a 500,000 batch size.So again why make it smaller then the number of records you want to send? I found that with BatchCopy and 500,000 batch size it took only 5 seconds to complete. I then tried with a batch size of 1,000 and it only took 8 seconds. So much faster then the bulkinsert one above. Now I tried the other tutorial. USE [Test] GO /****** Object: StoredProcedure [dbo].[spTEST_InsertXMLTEST_TEST] Script Date: 05/19/2010 15:39:03 ******/ SET ANSI_NULLS ON GO SET QUOTED_IDENTIFIER ON GO ALTER PROCEDURE [dbo].[spTEST_InsertXMLTEST_TEST](@UpdatedProdData nText) AS DECLARE @hDoc int exec sp_xml_preparedocument @hDoc OUTPUT,@UpdatedProdData INSERT INTO TBL_TEST_TEST(NAME) SELECT XMLProdTable.NAME FROM OPENXML(@hDoc, 'ArrayOfTBL_TEST_TEST/TBL_TEST_TEST', 2) WITH ( ID Int, NAME varchar(100) ) XMLProdTable EXEC sp_xml_removedocument @hDoc C# code. /// <summary> /// This is using linq to sql to make the table objects. 
/// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } So I like this because I get to use objects even though it is kinda redundant. I don't get how the SP works. Like I don't get the whole thing. I don't know if OPENXML has some batch insert under the hood but I do not even know how to take this example SP and change it to fit my tables since like I said I don't know what is going on. I also don't know what would happen if the object you have more tables in it. Like say I have a ProductName table what has a relationship to a Product table or something like that. In linq to sql you could get the product name object and make changes to the Product table in that same object. So I am not sure how to take that into account. I am not sure if I would have to do separate inserts or what. The time was pretty good for 500,000 records it took 52 seconds The last way of course was just using linq to do it all and it was pretty bad. /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. /// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } I did only 50,000 records and that took over a minute to do. So I really narrowed it done to the linq to sql bulk insert way or bulk copy. I am just not sure how to do it when you have relationship for either way. I am not sure how they both stand up when doing updates instead of inserts as I have not gotten around to try it yet. I don't think I will ever need to insert/update more than 50,000 records at one type but at the same time I know I will have to do validation on records before inserting so that will slow it down and that sort of makes linq to sql nicer as your got objects especially if your first parsing data from a xml file before you insert into the database. Full C# code using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Xml.Serialization; using System.Data; using System.Data.SqlClient; namespace TestIQueryable { class Program { private static string connectionString = ""; static void Main(string[] args) { BatchInsert(); Console.WriteLine("done"); } /// <summary> /// This is using linq to sql to to insert lots of records. /// This way is slow as it uses no mass insert. 
/// Only tried to insert 50,000 records as I did not want to sit around till it did 500,000 records. /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertAll() { using (TestDataContext db = new TestDataContext()) { db.CommandTimeout = 600; for (int count = 0; count < 50000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; db.TBL_TEST_TESTs.InsertOnSubmit(testRecord); } db.SubmitChanges(); } } /// <summary> /// This is using linq to sql to make the table objects. /// It is then serailzed to to an xml document and sent to a stored proedure /// that then does a bulk insert(I think with OpenXML) /// http://www.codeproject.com/KB/linq/BulkOperations_LinqToSQL.aspx /// </summary> private static void LinqInsertXMLBatch() { using (TestDataContext db = new TestDataContext()) { TBL_TEST_TEST[] testRecords = new TBL_TEST_TEST[500000]; for (int count = 0; count < 500000; count++) { TBL_TEST_TEST testRecord = new TBL_TEST_TEST(); testRecord.NAME = "Name : " + count; testRecords[count] = testRecord; } StringBuilder sBuilder = new StringBuilder(); System.IO.StringWriter sWriter = new System.IO.StringWriter(sBuilder); XmlSerializer serializer = new XmlSerializer(typeof(TBL_TEST_TEST[])); serializer.Serialize(sWriter, testRecords); db.insertTestData(sBuilder.ToString()); } } /// <summary> /// An ado.net 2.0 way to mass insert records. This seems to be the fastest. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchBulkCopy() { // Get the DataTable DataTable dtInsertRows = GetDataTable(); using (SqlBulkCopy sbc = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity)) { sbc.DestinationTableName = "TBL_TEST_TEST"; // Number of records to be processed in one go sbc.BatchSize = 500000; // Map the Source Column from DataTabel to the Destination Columns in SQL Server 2005 Person Table // sbc.ColumnMappings.Add("ID", "ID"); sbc.ColumnMappings.Add("NAME", "NAME"); // Number of records after which client has to be notified about its status sbc.NotifyAfter = dtInsertRows.Rows.Count; // Event that gets fired when NotifyAfter number of records are processed. sbc.SqlRowsCopied += new SqlRowsCopiedEventHandler(sbc_SqlRowsCopied); // Finally write to server sbc.WriteToServer(dtInsertRows); sbc.Close(); } } /// <summary> /// Another ado.net 2.0 way that uses a stored procedure to do a bulk insert. /// Seems slower then "BatchBulkCopy" way and it crashes when you try to insert 500,000 records in one go. /// http://www.codeproject.com/KB/cs/MultipleInsertsIn1dbTrip.aspx#_Toc196622241 /// </summary> private static void BatchInsert() { // Get the DataTable with Rows State as RowState.Added DataTable dtInsertRows = GetDataTable(); SqlConnection connection = new SqlConnection(connectionString); SqlCommand command = new SqlCommand("sp_BatchInsert", connection); command.CommandType = CommandType.StoredProcedure; command.UpdatedRowSource = UpdateRowSource.None; // Set the Parameter with appropriate Source Column Name command.Parameters.Add("@Name", SqlDbType.VarChar, 50, dtInsertRows.Columns[0].ColumnName); SqlDataAdapter adpt = new SqlDataAdapter(); adpt.InsertCommand = command; // Specify the number of records to be Inserted/Updated in one go. Default is 1. 
adpt.UpdateBatchSize = 500000; connection.Open(); int recordsInserted = adpt.Update(dtInsertRows); connection.Close(); } private static DataTable GetDataTable() { // You First need a DataTable and have all the insert values in it DataTable dtInsertRows = new DataTable(); dtInsertRows.Columns.Add("NAME"); for (int i = 0; i < 500000; i++) { DataRow drInsertRow = dtInsertRows.NewRow(); string name = "Name : " + i; drInsertRow["NAME"] = name; dtInsertRows.Rows.Add(drInsertRow); } return dtInsertRows; } static void sbc_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e) { Console.WriteLine("Number of records affected : " + e.RowsCopied.ToString()); } } }
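
    On the relationship question raised above: SqlBulkCopy does no change tracking, so the usual approach is to load the parent table first with KeepIdentity (so the IDs generated client-side survive) and then the child table whose rows already carry those parent IDs. A sketch only; the table and column names below are illustrative, not from the post:

      // Parents first: client-assigned IDs are preserved because of KeepIdentity.
      using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity))
      {
          bulk.DestinationTableName = "Product";
          bulk.ColumnMappings.Add("ID", "ID");
          bulk.ColumnMappings.Add("NAME", "NAME");
          bulk.WriteToServer(dtProducts);
      }

      // Children second: their ProductID values reference the IDs copied above.
      using (SqlBulkCopy bulk = new SqlBulkCopy(connectionString, SqlBulkCopyOptions.KeepIdentity))
      {
          bulk.DestinationTableName = "ProductName";
          bulk.ColumnMappings.Add("ID", "ID");
          bulk.ColumnMappings.Add("ProductID", "ProductID");   // FK to Product.ID
          bulk.ColumnMappings.Add("NAME", "NAME");
          bulk.WriteToServer(dtProductNames);
      }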

    Read the article

  • How can I bind events to strongly typed datasets of different types?

    My application contains several forms which consist of a strongly typed datagridview, a strongly typed bindingsource, and a strongly typed table adapter. I am using some code in each form to update the database whenever the user leaves the current row, shifts focus away from the datagrid or the form, or closes the form. This code is the same in each case, so I want to make a subclass of form, from which all of these forms can inherit. But the strongly typed data objects all inherit from component, which doesn't expose the events I want to bind to or the methods I want to invoke. The only way I can see of gaining access to the events is to use: Type(string Name).GetEvent(string EventName).AddEventHandler(object Target,Delegate Handler) Similarly, I want to call the Update method of the strongly typed table adapter, and am using Type(string Name).GetMethod(String name, Type[] params).Invoke(object target, object[] params). It works ok, but it seems very heavy handed. Is there a better way? Here is my code for the main class: using System; using System.Collections.Generic; using System.Linq; using System.Text; using System.Windows.Forms; using System.Data; using System.Data.SqlClient; using System.ComponentModel; namespace MyApplication { public class AutoSaveDataGridForm: Form { private DataRow PreviousRow; public Component Adapter { private get; set; } private Component dataGridView; public Component DataGridView { private get { return dataGridView; } set { dataGridView = value; Type t = dataGridView.GetType(); t.GetEvent("Leave").AddEventHandler(dataGridView, new EventHandler(DataGridView_Leave)); } } private Component bindingSource; public Component BindingSource { private get { return bindingSource; } set { bindingSource = value; Type t = bindingSource.GetType(); t.GetEvent("PositionChanged").AddEventHandler(bindingSource, new EventHandler(BindingSource_PositionChanged)); } } protected void Save() { if (PreviousRow != null && PreviousRow.RowState != DataRowState.Unchanged) { Type t = Adapter.GetType(); t.GetMethod("Update", new Type[] { typeof(DataRow[]) }).Invoke(Adapter, new object[] { new DataRow[] { PreviousRow } }); } } private void BindingSource_PositionChanged(object sender, EventArgs e) { BindingSource bindingSource = sender as BindingSource; DataRowView CurrentRowView = bindingSource.Current as DataRowView; DataRow CurrentRow = CurrentRowView.Row; if (PreviousRow != null && PreviousRow != CurrentRow) { Save(); } PreviousRow = CurrentRow; } private void InitializeComponent() { this.SuspendLayout(); // // AutoSaveDataGridForm // this.FormClosed += new System.Windows.Forms.FormClosedEventHandler(this.AutoSaveDataGridForm_FormClosed); this.Leave += new System.EventHandler(this.AutoSaveDataGridForm_Leave); this.ResumeLayout(false); } private void DataGridView_Leave(object sender, EventArgs e) { Save(); } private void AutoSaveDataGridForm_FormClosed(object sender, FormClosedEventArgs e) { Save(); } private void AutoSaveDataGridForm_Leave(object sender, EventArgs e) { Save(); } } } And here is a (partial) form which implements it: public partial class FileTypesInherited :AutoSaveDataGridForm { public FileTypesInherited() { InitializeComponent(); } private void FileTypesInherited_Load(object sender, EventArgs e) { // TODO: This line of code loads data into the 'sharedFoldersInformationV2DataSet.tblFileTypes' table. You can move, or remove it, as needed. 
this.tblFileTypesTableAdapter.Fill(this.sharedFoldersInformationV2DataSet.tblFileTypes); this.BindingSource = tblFileTypesBindingSource; this.Adapter = tblFileTypesTableAdapter; this.DataGridView = tblFileTypesDataGridView; } }
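
    One answer to the "is there a better way" question, sketched below (an alternative, not the poster's code; member names are illustrative): the designer-generated BindingSource and DataGridView are the ordinary Windows Forms types, so only the typed TableAdapter's Update call really needs abstracting, and the derived form can hand that in as a delegate instead of the base form using reflection.

      using System;
      using System.Data;
      using System.Windows.Forms;

      public class AutoSaveDataGridFormAlt : Form
      {
          private DataRow PreviousRow;

          // the derived form sets: UpdateRows = rows => tblFileTypesTableAdapter.Update(rows);
          public Func<DataRow[], int> UpdateRows { protected get; set; }

          public BindingSource GridBindingSource
          {
              set { value.PositionChanged += BindingSource_PositionChanged; }
          }

          protected void Save()
          {
              if (PreviousRow != null && PreviousRow.RowState != DataRowState.Unchanged && UpdateRows != null)
              {
                  UpdateRows(new DataRow[] { PreviousRow });
              }
          }

          private void BindingSource_PositionChanged(object sender, EventArgs e)
          {
              DataRowView view = ((BindingSource)sender).Current as DataRowView;
              DataRow currentRow = view == null ? null : view.Row;
              if (PreviousRow != null && PreviousRow != currentRow)
              {
                  Save();
              }
              PreviousRow = currentRow;
          }
      }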

    Read the article
