Search Results

Search found 27519 results on 1101 pages for 'sql learner'.


  • SQL Server, temporary tables with truncate vs table variable with delete

    - by Richard
    I have a stored procedure inside which I create a temporary table that typically contains between 1 and 10 rows. This table is truncated and refilled many times during the stored procedure; it is truncated because that is faster than delete. Would I get any performance increase by replacing this temporary table with a table variable, given that I would then pay a penalty for using delete (truncate does not work on table variables)? While table variables live mainly in memory and are generally faster than temp tables, do I lose that benefit by having to delete rather than truncate?

    Read the article

  • Problem with writing a query

    - by phenevo
    Hi, I've got a collection of geo objects in a database. There are four tables: Countries, Regions, Provinces and Cities. Cities has, among other columns, ProvinceCode; Provinces has RegionCode; and Regions has CountryCode. There is also a fifth table, Descriptions, with the columns ObjectCode, ObjectType (country, region, province, city) and Description. How do I get, from the Descriptions table, all descriptions of objects that lie in a given country?

    Read the article

  • Creating a new Index in SQL when current records don't meet that index

    - by Jonathan
    Hey all- I'd like to add a unique index to a table that already contains data. I know that there are a few records currently in the table that are not unique under this new index, and clearly MySQL won't let me add the index until they are. I need a query to identify the rows that currently collide on the index columns; I can then delete or modify these rows as necessary. The new index spans 6 fields. Thanks- Jonathan
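    One way to surface the colliding rows before adding the index is a GROUP BY over the intended key columns. A minimal C# sketch follows; the six column names, table name and connection string are placeholders (not from the question), and it assumes a MySQL ADO.NET provider such as MySql.Data:

        using System;
        using MySql.Data.MySqlClient;

        // List rows whose combination of the six intended index columns is duplicated.
        const string sql = @"SELECT colA, colB, colC, colD, colE, colF, COUNT(*) AS dupes
                             FROM   my_table
                             GROUP  BY colA, colB, colC, colD, colE, colF
                             HAVING COUNT(*) > 1";

        using (var conn = new MySqlConnection("server=localhost;database=mydb;uid=me;pwd=secret"))
        using (var cmd = new MySqlCommand(sql, conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine("{0} ... appears {1} times", reader["colA"], reader["dupes"]);
            }
        }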

    Read the article

  • How to get identities of inserted data records using SQL bulk copy

    - by Olga
    Hello, I have an ADO.NET DataTable with about 100,000 records. In this table there is a column "xyID" which has no values in it, because they are generated on insertion into my MSSQL database. Now I have the problem that I need these IDs for other processes. I am looking for a way to bulk copy this DataTable into the MSSQL database and, within the same step, to fill my DataTable with the generated IDs. Thank you for your answers!
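    SqlBulkCopy itself does not report the identity values it generates, so a common workaround is to bulk copy into a staging table and then move the rows with INSERT ... OUTPUT, which does return them. A sketch of such a helper method (table and column names are placeholders):

        using System;
        using System.Data;
        using System.Data.SqlClient;

        // Bulk copy into a staging table, then INSERT ... OUTPUT into the real table
        // so the generated xyID values come back to the client.
        static DataTable BulkInsertAndGetIds(DataTable source, string connectionString)
        {
            using (var conn = new SqlConnection(connectionString))
            {
                conn.Open();

                using (var bulk = new SqlBulkCopy(conn))
                {
                    bulk.DestinationTableName = "dbo.MyTable_Staging";   // staging table, no identity column
                    bulk.WriteToServer(source);
                }

                var cmd = new SqlCommand(
                    @"INSERT INTO dbo.MyTable (ColA, ColB)
                      OUTPUT INSERTED.xyID, INSERTED.ColA, INSERTED.ColB
                      SELECT ColA, ColB FROM dbo.MyTable_Staging;", conn);

                var result = new DataTable();
                using (var reader = cmd.ExecuteReader())
                {
                    result.Load(reader);   // xyID plus the copied columns, to join back onto the original DataTable
                }
                return result;
            }
        }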

    Read the article

  • SqlBulkCopy is slow, doesn't utilize full network speed

    - by Alex
    Hi, for the past couple of weeks I have been creating a generic script that is able to copy databases. The goal is to be able to specify any database on some server and copy it to some other location, and it should only copy the specified content. The exact content to be copied over is specified in a configuration file. This script is going to be used on some 10 different databases and run weekly. In the end we are copying only about 3%-20% of databases which are as large as 500GB. I have been using the SMO assemblies to achieve this. This is my first time working with SMO and it took a while to create a generic way to copy the schema objects, filegroups, etc. (it actually helped find some bad stored procs). Overall I have a working script which is lacking in performance (and at times times out) and I was hoping you guys would be able to help. When executing the WriteToServer command to copy a large amount of data (6GB) it reaches my timeout period of 1hr. Here is the core code for copying table data; the script is written in PowerShell:

        $query = ("SELECT * FROM $selectedTable " + $global:selectiveTables.Get_Item($selectedTable)).Trim()
        Write-LogOutput "Copying $selectedTable : '$query'"
        $cmd = New-Object Data.SqlClient.SqlCommand -argumentList $query, $source
        $cmd.CommandTimeout = 120;
        $bulkData = ([Data.SqlClient.SqlBulkCopy]$destination)
        $bulkData.DestinationTableName = $selectedTable;
        $bulkData.BulkCopyTimeout = $global:tableCopyDataTimeout # = 3600
        $reader = $cmd.ExecuteReader();
        $bulkData.WriteToServer($reader); # Takes forever here on large tables

    The source and target databases are located on different servers, so I kept track of the network speed as well. The network utilization never went over 1%, which was quite surprising to me. But when I just transfer some large files between the servers, the network utilization spikes up to 10%. I have tried setting $bulkData.BatchSize to 5000 but nothing really changed. Increasing the BulkCopyTimeout to an even greater amount would only avoid the timeout, not the slowness. I really would like to know why the network is not being used fully. Has anyone else had this problem? Any suggestions on networking or bulk copy will be appreciated. And please let me know if you need more information. Thanks.

    UPDATE: I have tweaked several options that increase the performance of SqlBulkCopy, such as setting the transaction logging to simple and providing a table lock to SqlBulkCopy instead of the default row lock. Also, some tables are better optimized for certain batch sizes. Overall, the duration of the copy was decreased by some 15%. And what we will do is execute the copy of each database simultaneously on different servers. But I am still having a timeout issue when copying one of the databases. When copying one of the larger databases, there is a table for which I consistently get the following exception: System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. It is thrown about 16 after it starts copying the table, which is nowhere near my BulkCopyTimeout. Even though I get the exception, that table is fully copied in the end. Also, if I truncate that table and restart my process for that table only, the table is copied over without any issues. But going through the process of copying that entire database always fails for that one table. I have tried executing the entire process and resetting the connection before copying that faulty table, but it still errored out.
    My SqlBulkCopy and Reader are closed after each table. Any suggestions as to what else could be causing the script to fail at that point each time?
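    For reference, a minimal C# sketch of the SqlBulkCopy settings that usually affect throughput in this scenario (the same properties can be set from PowerShell); the destination table name is a placeholder, and destinationConnectionString and reader are assumed to already exist:

        using (var bulk = new SqlBulkCopy(destinationConnectionString,
                                          SqlBulkCopyOptions.TableLock))   // table lock instead of row locks
        {
            bulk.DestinationTableName = "dbo.TargetTable";  // placeholder
            bulk.BatchSize = 10000;       // commit in chunks rather than one huge transaction
            bulk.BulkCopyTimeout = 0;     // 0 = no timeout for the copy itself
            bulk.NotifyAfter = 50000;     // progress callback to confirm rows are actually flowing
            bulk.SqlRowsCopied += (s, e) => Console.WriteLine(e.RowsCopied + " rows copied");
            bulk.WriteToServer(reader);   // reader produced by the source SELECT
        }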

    Read the article

  • SQL Server: namespaces preventing access to query data

    - by Brian
    Hi, a beginner's question, hopefully easily answered. I've got an XML file I want to load into SQL Server 2008 so I can extract the useful information. I'm starting simple and just trying to extract the name (\gpx\name). The code I have is:

        DECLARE @x xml;
        SELECT @x = xCol.BulkColumn
        FROM OPENROWSET (BULK 'C:\Data\EM.gpx', SINGLE_BLOB) AS xCol;

        -- confirm the xml data is in @x
        select @x as XML_Data

        -- try and get the name of the gpx section
        SELECT c.value('name[1]', 'varchar(200)') as Name
        from @x.nodes('gpx') x(c)

    Below is a heavily shortened version of the xml file:

        <?xml version="1.0" encoding="utf-8"?>
        <gpx xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xmlns:xsd="http://www.w3.org/2001/XMLSchema" version="1.0" creator="Groundspeak Pocket Query" xsi:schemaLocation="http://www.topografix.com/GPX/1/0 http://www.topografix.com/GPX/1/0/gpx.xsd http://www.groundspeak.com/cache/1/0 http://www.groundspeak.com/cache/1/0/cache.xsd" xmlns="http://www.topografix.com/GPX/1/0">
          <name>EM</name>
          <desc>Geocache file generated by Groundspeak</desc>
          <author>Groundspeak</author>
          <email>[email protected]</email>
          <time>2010-03-24T14:01:36.4931342Z</time>
          <keywords>cache, geocache, groundspeak</keywords>
          <wpt lat="51.2586" lon="-2.213067">
            <time>2008-03-30T07:00:00Z</time>
            <name>GC1APHM</name>
            <desc>Sandman's Noble Hoard by Sandman1973, Unknown Cache (2/3)</desc>
            <groundspeak:cache id="832000" available="True" archived="False" xmlns:groundspeak="http://www.groundspeak.com/cache/1/0">
              <groundspeak:name>Sandman's Noble Hoard</groundspeak:name>
              <groundspeak:placed_by>Sandman1973</groundspeak:placed_by>
            </groundspeak:cache>
          </wpt>
        </gpx>

    If the first two lines are replaced with just <gpx>, the above example works correctly; however, I then can't access groundspeak:name (/gpx/wpt/groundspeak:cache/groundspeak:name), so my guess is it's a problem with the namespace. Any help would be appreciated.

    Read the article

  • SSIS - Wizard vs manual vs programming

    - by alchemical
    I'd like to move 26 tables from one DB to another. I see I can do this in the SSIS Import and Export Wizard. I believe the other approach would be to select tools from the toolbar in Data Flow and then configure them all. When is it better to use the wizard and when is it best to create the package manually (with the visual tools) or programmatically? One thing I noticed with the Wizard is that it lets me select multiple tables at once, but I could not find a way to get back to that screen once the package is created, so that I could edit the various tables all in one place.

    Read the article

  • MissingMethodException (Can't find PInvoke DLL 'sqlceme30.dll') for Windows Mobile

    - by anyinfonet
    Hello. I have developed a Windows Mobile (v5.0) application and I use ONLY one database, SQLite, with these references: System.Data.SQLite.dll (assembly version & product version: 1.0.65.0) and SQLite.Interop.065.DLL (product version: 1.0; a C++ library for the first DLL). After 5 weeks of using this application, today I got a weird exception and I don't understand what it is. The exception is: MissingMethodException: Can't find PInvoke DLL 'sqlceme30.dll' at System.Data.SqlServerCe.SqlCeCommand.ReleaseNativeInterfaces() at System.Data.SqlServerCe.SqlCeCommand.Dispose(Boolean disposing) ...... What's wrong? Can anyone explain this to me, please? By the way: until now I have developed 3-4 applications (1 year ago) using these references and everything worked fine.

    Read the article

  • How to implement a Digg-like algorithm?

    - by Niklas
    Hi, how do you implement a website with a recommendation system similar to stackoverflow/digg/reddit? I.e., users submit content and the website needs to calculate some sort of "hotness" according to how popular the item is. The flow is as follows: users submit content; other users view and vote on the content (assume 90% of the users only view content and 10% actively vote up or down on content); new content is continuously submitted. How do I implement an algorithm that calculates the "hotness" of a submitted item, preferably in real-time? Are there any best practices or design patterns? I would assume that the algorithm takes the following into consideration: when an item was submitted, when each vote was cast, and when the item was viewed. E.g. an item that gets a constant trickle of votes would stay somewhat "hot" constantly, while an item that receives a burst of votes when it is first submitted will jump to the top of the "hotness" list but then fall down as the votes stop coming in. (I am using MySQL+PHP but I am interested in general design patterns.)
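    As a starting point, one commonly cited approach is to divide an item's net vote score by a power of its age, so hotness decays over time. A minimal C# sketch (the exponent 1.8 and the +2 offset are tunable assumptions, not from the question):

        // Hotness = net votes divided by a time-decay term (Hacker-News-style gravity).
        static double Hotness(int upVotes, int downVotes, DateTime submittedUtc)
        {
            double ageHours = (DateTime.UtcNow - submittedUtc).TotalHours;
            double points = upVotes - downVotes;
            return points / Math.Pow(ageHours + 2.0, 1.8);
        }

        // Recompute scores periodically (e.g. every few minutes) and sort by the stored value,
        // rather than recalculating on every page view.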

    Read the article

  • LINQ weak relation between tables

    - by cleric
    I have two tables with a weak relation. I need to get a text value from one table using a key from another. I am using the following C# LINQ code:

        City = rea.tRealEstateContact.tPostnumre != null ? rea.tRealEstateContact.tPostnumre.Bynavn : string.Empty

    But when the key cannot be found in table 1 (tPostnumre), an exception is thrown. How should I do this?
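    A sketch of one way around it, assuming the exception comes from one of the navigation properties in the chain being null when the related row is missing (names taken from the question):

        // Guard each navigation step before dereferencing it.
        var contact = rea.tRealEstateContact;
        string city = (contact != null && contact.tPostnumre != null)
            ? contact.tPostnumre.Bynavn
            : string.Empty;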

    Read the article

  • Synchronization between Client database and Central Database

    - by Indranil Mutsuddy
    Hello, I am trying to develop a UI in C# .NET to synchronize 7 instances of backup databases with the central database, one by one (all holding the same schema). The backup databases (all 7 client instances), which are brought to the central server on a removable device such as a pendrive, consist of the mdf and ldf files from each client and will be attached to the server where the central database resides. After all the client backup databases are attached, I need to synchronize each backup database one by one to the central database (updating existing data or inserting new data into the central database residing on the server). I want to know how I can synchronize between a backup database and the central database using C# .NET.

    Read the article

  • Referencing a newly inserted row's seeded PK in C# LINQ

    - by Laurence Burke
    I want to use the primary key that was just created by dc.SubmitChanges() to create a new EmployeeAddress row that links the employee to the address.

        protected void btnAdd_Click(object sender, EventArgs e)
        {
            if (txtZip.Text != "" && txtAdd1.Text != "" && txtCity.Text != "")
            {
                TestDataClassDataContext dc = new TestDataClassDataContext();
                Address addr = new Address()
                {
                    AddressLine1 = txtAdd1.Text,
                    AddressLine2 = txtAdd2.Text,
                    City = txtCity.Text,
                    PostalCode = txtZip.Text,
                    StateProvinceID = Convert.ToInt32(ddlState.SelectedValue)
                };
                dc.Addresses.InsertOnSubmit(addr);
                lblSuccess.Visible = true;
                lblErrMsg.Visible = false;
                dc.SubmitChanges();
                //
                // TODO: insert new row in EmployeeAddress to reference CurEmp to newly created address
                //
                SetAddrList();
            }
            else
            {
                lblErrMsg.Text = "Invalid Input";
                lblErrMsg.Visible = true;
            }
        }

        protected void SetAddrList()
        {
            TestDataClassDataContext dc = new TestDataClassDataContext();
            dc.ObjectTrackingEnabled = false;
            var addList = from addr in dc.Addresses
                          from eaddr in dc.EmployeeAddresses
                          where eaddr.EmployeeID == _curEmpID && addr.AddressID == eaddr.AddressID
                          select new
                          {
                              AddValue = addr.AddressID,
                              AddText = addr.AddressID,
                          };
            ddlAddList.DataSource = addList;
            ddlAddList.DataValueField = "AddValue";
            ddlAddList.DataTextField = "AddText";
            ddlAddList.DataBind();
            ddlAddList.Items.Add(new ListItem("<Add Address>", "-1"));
        }
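    A sketch of what the TODO could look like: after SubmitChanges(), LINQ to SQL writes the generated identity back into addr.AddressID, so it can be used straight away to create the link row (the EmployeeAddress property names are assumptions based on the query in SetAddrList):

        dc.SubmitChanges();                      // addr.AddressID is now populated by LINQ to SQL

        EmployeeAddress link = new EmployeeAddress
        {
            EmployeeID = _curEmpID,              // the current employee
            AddressID = addr.AddressID           // the identity value seeded by the insert above
        };
        dc.EmployeeAddresses.InsertOnSubmit(link);
        dc.SubmitChanges();                      // persist the association row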

    Read the article

  • NSPredicate that is the equivalent of SQL's LIKE

    - by randombits
    I'm looking for a way to use NSPredicate to set a LIKE condition to fetch objects. In addition to that, an OR would be useful as well. I'm trying to do something where, if a user searches for "James", I can write an NSPredicate that will do the equivalent of: select * from users where firstname LIKE '%James%' OR lastname LIKE '%James%';

    Read the article

  • NSIS takes ownership of IIS system files

    - by Lucas
    I recently encountered an issue with NSIS that I believe is related to an interaction with UAC, but I am at a loss to explain it and I do not know how to prevent it in the future. I have an installer that creates and removes IIS virtual directories using the NsisIIS plugin. The installer appeared to work correctly on my Windows 7 workstation. When the installer was run on a Windows 2008 R2 server it installed properly, but the uninstaller removed all of the virtual directories and put IIS in an unusable state; to the point that I had to remove the Default Web Site and re-add it. What I eventually found was that all of the IIS configuration files under C:\Windows\System32\inetsrv\config had a lock icon on them. Some investigation seems to indicate that this means a user account has taken ownership of the file; however, all the files listed SYSTEM as the file owner. I did check a different server that I have not run the installer on, and it does not have the lock icon applied to the IIS files. I have also seen the same lock icon appear on other files that the NSIS installer creates. For instance, I have a Web.Config.tpl file that is processed using the NSIS ReplaceInFile, which also appears with the lock icon after the installer finishes. After I explicitly grant another user account access to the file, the lock icon goes away. I run the installer under the local Administrator account on the 2008 R2 server, so I do not get the UAC prompt. Here is the relevant code from the install.nsi file:

        RequestExecutionLevel admin

        Section "Application" APP_SECTION
          SectionIn RO
          Call InstallApp
        SectionEnd

        Section "un.Uninstaller Section"
          Delete "$PROGRAMFILES\${PROGRAMFILESDIR}\Uninstall.exe"
          Call un.InstallApp
        SectionEnd

        Function InstallApp
          File /oname=Web.Config Web.Config.tpl
          !insertmacro ReplaceInFile Web.Config %CONNECTION_STRING% $CONNECTION_STRING
        FunctionEnd

        Function un.InstallApp
          ReadRegStr $0 HKLM "Software\${REGKEY}" "VirtualDir"
          NsisIIS::DeleteVDir "$0"
          Pop $0
        FunctionEnd

    I have three questions stemming from this incident: How did this happen? How can I fix my installer to prevent it from happening again? How can I repair the permissions on the IIS config files?

    Read the article

  • How to differentiate between two similar fields in a LINQ join

    - by Azhar
    How do I differentiate between two same-named fields in a 'select new', e.g. c.Description and lt.Description?

        DataTable lDt = new DataTable();
        try
        {
            lDt.Columns.Add(new DataColumn("AreaTypeID", typeof(Int32)));
            lDt.Columns.Add(new DataColumn("CategoryRef", typeof(Int32)));
            lDt.Columns.Add(new DataColumn("Description", typeof(String)));
            lDt.Columns.Add(new DataColumn("CatDescription", typeof(String)));
            EzEagleDBDataContext lDc = new EzEagleDBDataContext();
            var lAreaType = (from lt in lDc.tbl_AreaTypes
                             join c in lDc.tbl_AreaCategories on lt.CategoryRef equals c.CategoryID
                             where lt.AreaTypeID == pTypeId
                             select new
                             {
                                 lt.AreaTypeID,
                                 lt.Description,
                                 lt.CategoryRef,
                                 c.Description
                             }).ToArray();
            for (int j = 0; j < lAreaType.Count; j++)
            {
                DataRow dr = lDt.NewRow();
                dr["AreaTypeID"] = lAreaType[j].LandmarkTypeID;
                dr["CategoryRef"] = lAreaType[j].CategoryRef;
                dr["Description"] = lAreaType[j].Description;
                dr["CatDescription"] = lAreaType[j].;   // <-- how do I reference c.Description here?
                lDt.Rows.Add(dr);
            }
        }
        catch (Exception ex)
        {
        }
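    A sketch of the usual fix: alias at least one of the projected fields in the anonymous type so the two Description properties get distinct names (the alias names here are made up for illustration):

        var lAreaType = (from lt in lDc.tbl_AreaTypes
                         join c in lDc.tbl_AreaCategories on lt.CategoryRef equals c.CategoryID
                         where lt.AreaTypeID == pTypeId
                         select new
                         {
                             lt.AreaTypeID,
                             lt.CategoryRef,
                             AreaDescription = lt.Description,   // alias removes the name clash
                             CatDescription = c.Description
                         }).ToArray();

        // ...and in the loop:
        // dr["Description"]    = lAreaType[j].AreaDescription;
        // dr["CatDescription"] = lAreaType[j].CatDescription;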

    Read the article

  • Inserting Parameters, C#, T-Sql

    - by jpavlov
    I am trying to insert a parameter through an aspx page via a text box. I set my parameters up, but every time I call ExecuteNonQuery, the literal text @UserName shows up in the database instead of the actual value. Below is my code. Can anyone shed a little insight?

        SqlParameter @UserName = new SqlParameter("@UserName", System.Data.SqlDbType.VarChar);
        @UserName.Direction = ParameterDirection.Input;
        @UserName.Value = txtUserName.Text;
        cmd.Parameters.Add(@UserName);
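    The parameter setup shown above looks fine, so one likely suspect is the SQL text itself: if the command puts the parameter name inside quotes (VALUES ('@UserName')), SQL Server treats it as a string literal. A sketch of a working end-to-end insert (table, column and connection-string names are placeholders):

        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(
            "INSERT INTO Users (UserName) VALUES (@UserName)", conn))   // no quotes around @UserName
        {
            cmd.Parameters.Add("@UserName", SqlDbType.VarChar, 50).Value = txtUserName.Text;
            conn.Open();
            cmd.ExecuteNonQuery();
        }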

    Read the article

  • Sharing session state between 2 ASP.NET applications using SQL Server

    - by Dave
    Hi, I'm working on a site that has a requirement to share session between a CMS application and an online store application on the same domain, e.g. mydomain.com and store.mydomain.com. I've made some progress with it and it works on my local build between localhost/cms and localhost/store. Basically I have done what is suggested in this article http://blogs.msdn.com/toddca/archive/2007/01/25/sharing-asp-net-session-state-across-applications.aspx and hacked the TempGetAppID stored procedure to return the same application id (1). This appears to work, as it creates sessions with ids like 'abv5d2urx1asscfwuzw3wp4500000001', which is what I'd expect. My issue is that when I deploy it to our testing environment, it creates a new session when I navigate between the 2 sites. So when I start a session on the cms site, if I navigate to the store, it creates a new session. These are set up as 2 different websites in IIS7. In the web.config files for both sites, the relevant configuration elements are both the same (not reproduced here, minus sensitive information). Has anyone got any ideas why this might not be working? I am sharing Forms Authentication across the 2 sites and that works fine. Any help or ideas would be greatly appreciated! Many thanks Dave

    Read the article

  • Convert var to DataTable

    - by cre-johnny07
    I have a var named items which I want to convert into a DataTable. How can I do this?

        var items = (from myObj in this.Context.Orders
                     group myObj by myObj.OrderDate.ToString("yyyy-mm") into ymGroup
                     select new
                     {
                         Date = ymGroup.Key,
                         Start = ymGroup.Min(c => c.OrderId),
                         End = ymGroup.Max(c => c.OrderId)
                     });

    I need to convert items into a DataTable, and I don't want to write a foreach loop for it. How can I do this?
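    A sketch of one way to do it without a foreach at the call site: a small reflection-based extension method that builds the columns from the anonymous type's properties (the iteration still happens once, inside the helper):

        using System;
        using System.Collections.Generic;
        using System.Data;
        using System.Linq;
        using System.Reflection;

        public static class DataTableExtensions
        {
            public static DataTable ToDataTable<T>(this IEnumerable<T> source)
            {
                PropertyInfo[] props = typeof(T).GetProperties();

                var table = new DataTable();
                foreach (PropertyInfo p in props)
                    table.Columns.Add(p.Name, Nullable.GetUnderlyingType(p.PropertyType) ?? p.PropertyType);

                foreach (T item in source)
                    table.Rows.Add(props.Select(p => p.GetValue(item, null) ?? DBNull.Value).ToArray());

                return table;
            }
        }

        // Usage: DataTable dt = items.ToDataTable();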

    Read the article

  • WebSharingAppDemo-CEProviderEndToEnd queries peerProvider for NeedsScope before any files are batched

    - by Don
    I'm building an application based on the WebSharingAppDemo-CEProviderEndToEnd. When I deploy the server portion on a server, the code gives the error "The path is not valid. Check the directory for the database." during the call to NeedsScope() in the CeWebSyncService.cs file. Obviously the server can't access the client's sdf, but what is supposed to happen to make this work? The app uses batching to send the data, and the batches have to be marshalled across to the temp directory, but this problem is occurring before any files have been batched over. There is nothing for the server to look at to determine whether the peerProvider needs scope. What am I missing?

        public bool NeedsScope()
        {
            Log("NeedsSchema: {0}", this.peerProvider.Connection.ConnectionString);
            SqlCeSyncScopeProvisioning prov = new SqlCeSyncScopeProvisioning();
            return !prov.ScopeExists(this.peerProvider.ScopeName,
                                     (SqlCeConnection)this.peerProvider.Connection);
        }

    Read the article

  • Passing sql results to views hard-codes views to database column names

    - by Galen
    I just realized that I may not be following best practices in regards to the MVC pattern. My issue is that my views "know" information about my database. Here's my situation in pseudocode...

    My controller invokes a method from my model and passes it directly to the view:

        view.records = tableGateway.getRecords()
        view.display()

    In my view:

        each records as record
            print record.name
            print record.address
            ...

    In my view I have record.name and record.address, info that's hard-coded to my database. Is this bad? What other ways around it are there, other than iterating over everything in the controller and basically rewriting the records collection? And that just seems silly. Thanks

    Read the article

  • hibernate get unique field result

    - by cometta
    I use the code below to get a unique "departmentCode", but by using distinct, my list only returns 'departmentCode'; all other fields are not retrieved from the table. How do I retrieve other fields as well, like 'divisionCode', and make sure 'departmentCode' is always unique?

        DetachedCriteria crit = DetachedCriteria.forClass(Company.class);
        crit.setProjection(Projections.distinct(Projections.property("departmentCode")));

    Read the article

  • How do you cast a LinqToSql Table<TEntity> as a Table<IEntity> where TEntity : IEntity?

    - by DanM
    I'm trying to use DbLinq with a SQLite database, but I'm running into a problem when I try to cast an ITable as a Queryable<TEntity>. There is a known bug in DbLinq (Issue 211), which might be the source of my problem, but I wanted to make sure my code is sound and, if it is, find out if there might be something I can do to work around the bug. Here is the generic repository method that attempts to do the cast:

        public IQueryable<TEntity> GetAll()
        {
            return Table.Cast<TEntity>(); // Table is an ITable
        }

    This compiles, but if I pass in the interface IPerson for TEntity and the type of the entities in the table is Person (where Person : IPerson), I get this error from DbLinq: "S0133: Implement QueryMethod Queryable.Cast." Why am I trying to do this? I have a library project that doesn't know the type of the entity until runtime, but it does know the interface for the entity. So, I'm trying to cast to the interface type so that my library project can consume the data. Questions: Am I attempting an impossible cast, or is this definitely a bug in DbLinq? How else could I go about solving my problem?
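    A sketch of one possible workaround if the provider really cannot translate Queryable.Cast: route the cast through LINQ to Objects instead, so the provider never sees it. The trade-off is that anything composed on top of the result is evaluated client-side rather than in SQL, so filter before the cast where possible.

        public IQueryable<TEntity> GetAll()
        {
            // Go through the non-generic IEnumerable so Enumerable.Cast (LINQ to Objects) is used,
            // then re-wrap as IQueryable to keep the repository signature unchanged.
            return ((System.Collections.IEnumerable)Table)
                .Cast<TEntity>()
                .AsQueryable();
        }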

    Read the article

  • How to perform a Linq2Sql query on the following dataset

    - by Bas
    I have the following tables:

        Person(Id, FirstName, LastName)
        { (1, "John", "Doe"), (2, "Peter", "Svendson"), (3, "Ola", "Hansen"), (4, "Mary", "Pettersen") }

        Sports(Id, Name)
        { (1, "Tennis"), (2, "Soccer"), (3, "Hockey") }

        SportsPerPerson(Id, PersonId, SportsId)
        { (1, 1, 1), (2, 1, 3), (3, 2, 2), (4, 2, 3), (5, 3, 2), (6, 4, 1), (7, 4, 2), (8, 4, 3) }

    Looking at the tables, we can conclude the following facts: John plays Tennis and Hockey; Peter plays Soccer and Hockey; Ola plays Soccer; Mary plays Tennis, Soccer and Hockey. Now I would like to create a Linq2Sql query which retrieves the following: get all Persons who play Hockey and Soccer. Executing the query should return: Peter and Mary. Does anyone have any ideas on how to approach this in Linq2Sql?
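    A sketch of one way to phrase "plays both Hockey and Soccer" in LINQ to SQL, using the classic relational-division trick of counting the matching sports (the entity and property names, e.g. db.Persons, are assumptions based on the table definitions above):

        string[] wanted = { "Hockey", "Soccer" };

        var players =
            from p in db.Persons
            where (from spp in db.SportsPerPersons
                   join s in db.Sports on spp.SportsId equals s.Id
                   where spp.PersonId == p.Id && wanted.Contains(s.Name)
                   select s.Name).Distinct().Count() == wanted.Length
            select p.FirstName;

        // wanted.Contains(...) translates to an IN clause, and the Count() == wanted.Length test
        // keeps only people who play every sport in the list.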

    Read the article

  • Sync Framework, Local Database Cache, and my DAL

    - by Refracted Paladin
    I am creating a WPF app that needs to allow users to work in a temporarily disconnected state, and I plan to use a Local Database Cache. My questions are about my data access layer. Do you typically point the whole DAL at the cache, or at both, with a switching mechanism? Is the Entity Framework a good way to go for my DAL against the cache? I am used to L2S, but my understanding is that I can't use that against SQL CE, correct? Thanks! PS: Any good resources out there for using Sync, LINQ, and WPF? Tutorials, videos, etc.?

    Read the article
