Search Results

Search found 59643 results on 2386 pages for 'data migration'.

Page 111 of 2386

  • vb6 ADODB TSQL procedure call quit working after database migration

    - by phill
    This code was once working on SQL Server 2005. Now, isolated in a Visual Basic 6 subroutine that uses ADODB to connect to a SQL Server 2008 database, it throws the error "Login failed for user 'admin'". I have since verified that the connection string does work if I replace the body of this sub with the alternative code shown below it. When I run the small program with the button, it stops where marked below the asterisk line. Any ideas? Thanks in advance.

    Private Sub Command1_Click()
        Dim cSQLConn As New ADODB.Connection
        Dim cmdGetInvoices As New ADODB.Command
        Dim myRs As New ADODB.Recordset
        Dim dStartDateIn As Date
        dStartDateIn = "2010/05/01"

        cSQLConn.ConnectionString = "Provider=sqloledb;" _
            & "SERVER=NET-BRAIN;" _
            & "Database=DB_app;" _
            & "User Id=admin;" _
            & "Password=mudslinger;"
        cSQLConn.Open

        cmdGetInvoices.CommandTimeout = 0
        sProc = "GetUnconvertedInvoices"
        'On Error GoTo GetUnconvertedInvoices_Err

        With cmdGetInvoices
            .CommandType = adCmdStoredProc
            .CommandText = "_sp_cwm5_GetUnCvtdInv"
            .Name = "_sp_cwm5_GetUnCvtdInv"
            Set oParm1 = .CreateParameter("@StartDate", adDate, adParamInput)
            .Parameters.Append oParm1
            oParm1.Value = dStartDateIn
            .ActiveConnection = cSQLConn
        End With

        With myRs
            .CursorLocation = adUseClient
            .LockType = adLockBatchOptimistic
            .CursorType = adOpenKeyset
            '.CursorType = adOpenStatic
            .CacheSize = 5000
            '*************************** Debug stops here
            .Open cmdGetInvoices
        End With

        If myRs.State = adStateOpen Then
            Set GetUnconvertedInvoices = myRs
        Else
            Set GetUnconvertedInvoices = Nothing
        End If
    End Sub

    Here is the code which validates that the connection string works:

    Dim cSQLConn As New ADODB.Connection
    Dim cmdGetInvoices As New ADODB.Command
    Dim myRs As New ADODB.Recordset

    cSQLConn.ConnectionString = "Provider=sqloledb;" _
        & "SERVER=NET-BRAIN;" _
        & "Database=DB_app;" _
        & "User Id=admin;" _
        & "Password=mudslinger;"
    cSQLConn.Open

    cmdGetInvoices.CommandTimeout = 0
    sProc = "GetUnconvertedInvoices"

    With cmdGetInvoices
        .ActiveConnection = cSQLConn
        .CommandText = "SELECT top 5 * FROM tarInvoice;"
        .CommandType = adCmdText
    End With

    With myRs
        .CursorLocation = adUseClient
        .LockType = adLockBatchOptimistic
        '.CursorType = adOpenKeyset
        .CursorType = adOpenStatic
        '.CacheSize = 5000
        .Open cmdGetInvoices
    End With

    If myRs.EOF = False Then
        myRs.MoveFirst
        Do
            MsgBox "Record " & myRs.AbsolutePosition & " " & _
                myRs.Fields(0).Name & "=" & myRs.Fields(0) & " " & _
                myRs.Fields(1).Name & "=" & myRs.Fields(1)
            myRs.MoveNext
        Loop Until myRs.EOF = True
    End If
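    A common cause of a "Login failed" error right after a 2005-to-2008 migration is that the SQL login was never recreated on the new instance, or that the new instance allows Windows authentication only. The following T-SQL is a minimal diagnostic sketch to run against the new server, not part of the original question; the server, database and login names are simply the ones that appear in the connection string above.

        -- 1 = Windows-authentication only; a SQL login such as 'admin' would then always fail.
        SELECT SERVERPROPERTY('IsIntegratedSecurityOnly') AS windows_auth_only;

        -- Does the SQL login exist on the migrated instance, and is it disabled?
        SELECT name, is_disabled
        FROM sys.sql_logins
        WHERE name = 'admin';

        -- Is the login mapped to a user in the migrated database?
        USE DB_app;
        SELECT dp.name AS database_user
        FROM sys.database_principals AS dp
        JOIN sys.sql_logins AS sl ON dp.sid = sl.sid
        WHERE sl.name = 'admin';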

    Read the article

  • Yet Another SQL Strategy for Versioned Data

    There is a popular design for a database that requires a built-in audit trail of amendments and additions, where data is never deleted but merely superseded by a later version. Whilst this is conceptually simple, it has always made for complicated SQL when reporting the latest version of the data. Alex joins the debate on the best way of doing this, with an example using an indexed view and a filtered index.
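    As a rough illustration of the kind of design the article discusses (the table, column and index names below are invented for the sketch, not taken from Alex's example), a versioned table can flag its current rows and index only those rows with a filtered index, so that "latest version" reporting becomes a simple seek:

        -- Hypothetical versioned table: rows are never deleted, only superseded.
        CREATE TABLE dbo.CustomerVersion
        (
            CustomerID   int           NOT NULL,
            VersionNo    int           NOT NULL,
            CustomerName nvarchar(100) NOT NULL,
            ValidFrom    datetime2     NOT NULL,
            IsCurrent    bit           NOT NULL,
            CONSTRAINT PK_CustomerVersion PRIMARY KEY (CustomerID, VersionNo)
        );

        -- Filtered index (SQL Server 2008+) covering only the current rows.
        CREATE UNIQUE NONCLUSTERED INDEX IX_CustomerVersion_Current
            ON dbo.CustomerVersion (CustomerID)
            INCLUDE (CustomerName, ValidFrom)
            WHERE IsCurrent = 1;

        -- Reporting the latest version of each customer is now a plain filtered query.
        SELECT CustomerID, CustomerName, ValidFrom
        FROM dbo.CustomerVersion
        WHERE IsCurrent = 1;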

    Read the article

  • How to reinstall Ubuntu keeping my data intact?

    - by explorex
    Hi, I want to reinstall Ubuntu while keeping my data intact. I have a 160 GB hard drive (SATA or PATA, I don't know, but it's slim and made in China) with a 40 GB ext3 partition, a 4 GB swap partition and three other partitions with a FAT32 file system. I have around 4 GB of free space on the drive where Linux is installed. I'd like to keep the data intact, especially the Downloads folder, the desktop, and /var/www; and I no longer have access to any other machines or external storage devices.

    Read the article

  • Visualizing Data with the Google Maps API: A Journey of 245k Points

    What can you do with some awesome geospatial data, the Google Maps API, and a couple of days of hacking and analysis? Brendan and Paul walk through how they used the Maps API to visualize the CLIWOC database, and pass on tips and tricks for doing the same with other geospatial datasets. CLIWOC (Climatological Database for the World's Oceans, 1750-1850): www.ucm.es From: GoogleDevelopers

    Read the article

  • How can you replicate each row of an R data.frame and specify the number of replications for each row?

    - by wkmor1
    df <- data.frame(var1=c('a', 'b', 'c'), var2=c('d', 'e', 'f'), freq=1:3)

    What is the simplest way to expand the first two columns of the data.frame above, so that each row appears the number of times specified in the column 'freq'? In other words, go from this:

        >df
          var1 var2 freq
        1    a    d    1
        2    b    e    2
        3    c    f    3

    To this:

        >df.expanded
          var1 var2
        1    a    d
        2    b    e
        3    b    e
        4    c    f
        5    c    f
        6    c    f

    Read the article

  • Protect Your Data with Windows Vista

    Nowadays nothing is more important than backing up the data on your computer. But there are still many people who do not understand the importance of protecting data. Therefore when they proceed fo... [Author: Susan Brown - Computers and Internet - May 08, 2010]

    Read the article

  • Set and Verify the Retention Value for Change Data Capture

    - by AllenMWhite
    Last summer I set up Change Data Capture for a client to track changes to their application database and apply those changes to their data warehouse. The client had some issues a short while back and felt they needed to increase the retention period from the default 3 days to 5 days. I ran this command to make that change:

        sp_cdc_change_job @job_type='cleanup', @retention=7200

    The value 7200 represents the number of minutes in a period of 5 days. All was well, but they recently asked how they can verify...(read more)
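    The verification step the client asked about falls outside the excerpt; one way to check the configured cleanup retention is to query the standard CDC job metadata, as in this sketch (it is not the author's own follow-up):

        -- Returns one row per CDC job (capture and cleanup) with its current settings.
        EXEC sys.sp_cdc_help_jobs;

        -- Or read the cleanup job's retention (in minutes) directly from msdb.
        SELECT job_type, retention
        FROM msdb.dbo.cdc_jobs
        WHERE job_type = 'cleanup';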

    Read the article

  • BULK INSERT from one table to another all on the server

    - by steve_d
    I have to copy a bunch of data from one database table into another. I can't use SELECT ... INTO because one of the columns is an identity column. Also, I have some changes to make to the schema.

    I was able to use the export data wizard to create an SSIS package, which I then edited in Visual Studio 2005 to make the changes desired and whatnot. It's certainly faster than an INSERT INTO, but it seems silly to me to download the data to a different computer just to upload it back again (assuming that I am correct that that's what the SSIS package is doing).

    Is there an equivalent to BULK INSERT that runs directly on the server, allows keeping identity values, and pulls data from a table? (As far as I can tell, BULK INSERT can only pull data from a file.)

    Edit: I do know about IDENTITY_INSERT, but because there is a fair amount of data involved, INSERT INTO ... SELECT is kind of slow. SSIS/BULK INSERT dumps the data into the table without regard to indexes and logging and whatnot, so it's faster. (Of course creating the clustered index on the table once it's populated is not fast, but it's still faster than the INSERT INTO ... SELECT that I tried in my first attempt.)

    Edit 2: The schema changes include (but are not limited to) the following:
    1. Splitting one table into two new tables. In the future each will have its own IDENTITY column, but for the migration I think it will be simplest to use the identity from the original table as the identity for both new tables. Once the migration is over, one of the tables will have a one-to-many relationship to the other.
    2. Moving columns from one table to another.
    3. Deleting some cross-reference tables that only cross-referenced 1-to-1. Instead, the reference will be a foreign key in one of the two tables.
    4. Some new columns will be created with default values.
    5. Some tables aren't changing at all, but I have to copy them over due to the "put it all in a new DB" request.
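    For reference, the purely server-side pattern the question keeps circling back to is an INSERT ... SELECT with IDENTITY_INSERT and a table lock. The sketch below uses placeholder table and column names, and whether the insert is actually minimally logged depends on the target's recovery model and indexes:

        -- Placeholder names; run in the target database.
        SET IDENTITY_INSERT dbo.TargetTable ON;

        -- WITH (TABLOCK) allows a minimally logged insert when the target is a heap
        -- and the database is in the SIMPLE or BULK_LOGGED recovery model.
        INSERT INTO dbo.TargetTable WITH (TABLOCK)
            (Id, ColumnA, ColumnB)
        SELECT Id, ColumnA, ColumnB
        FROM SourceDb.dbo.SourceTable;

        SET IDENTITY_INSERT dbo.TargetTable OFF;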

    Read the article

  • Flex Builder AS3 Project migration

    - by Fahim Akhter
    Hi, I am developing a Flash game using AS3, and I chose Flex Builder with an ActionScript 3 project. Now I am thinking that if I had made it a Flex project instead of an AS3 project, I would have a lot of Flex functionality such as a SWF loader, preloaders, the popup manager, etc. The graphic components would obviously still have been made in Flash and used through the SWC (avoiding the heavy MXML components). I need to know what other developers think of this approach.

    Read the article

  • Best practice? - Array/Dictionary as a Core Data Entity Attribute

    - by Run Loop
    I am new to Core Data. I have noticed that collection types are not available as attribute types, and would like to know the most efficient way of storing array/dictionary-type data as an attribute (e.g. the elements that make up an address, like street, city, etc., do not require a separate entity and are more conveniently stored as a dictionary/array than as separate attributes/fields). Thank you.

    Read the article

  • Which has a faster data transfer rate? WIFI (tablet or cell phone, not LTE) or MicroSD (Class 10)?

    - by techaddict
    Which of the two methods of data transfer moves data at a faster rate for smartphones and tablets: standard Wi-Fi, or MicroSD cards? I wonder if it would actually be faster to access data on external storage over Wi-Fi than it would be to have the MicroSD card in my smartphone or tablet. Currently I have a Class 10 32 GB MicroSD card in my cell phone. I am looking to get the new Google Nexus tablet, but it does not offer expandable internal storage. I wonder if that's really a detriment, because if Wi-Fi is faster than MicroSD, then it would hardly matter at all that you couldn't expand the storage internally. If Wi-Fi is faster, and people caught on to this, then people could save a lot of money by buying lower-memory iPads/iPhones/iPods, tablets, and smartphones!

    Read the article

  • Rake migration aborted

    - by user2537714
    I'm running Ruby 2.0.0 and I installed it correctly. I just loaded up the gem 'devise', and when I tried to migrate my database changes, it wouldn't work:

        $ rake db:migrate
        rake aborted!
        attr_accessible is extracted out of Rails into a gem. Please use new recommended protection model for params(strong_parameters) or add protected_attributes to your Gemfile to use old one.

    Then another Stack Overflow post recommended installing Bundler. I did that successfully and got this:

        $ bundle exec rake db:migrate
        rake aborted!
        attr_accessible is extracted out of Rails into a gem. Please use new recommended protection model for params(strong_parameters) or add protected_attributes to your Gemfile to use old one.

    Is anyone up to the challenge to help?

    Read the article

  • Reading RSS data with Linq to Xml

    - by hakanbilge
    LINQ to XML is, I think, the best method for querying, constructing and writing XML data. In this article, I'll show how to read RSS data with this powerful XML technique, LINQ. Now, create a website in Visual Studio, add a TextBox and a [read more....]

    Read the article

  • Declarative Data Load for Object Properties & .NET UI Controls

    This article details a new practice for preparing .NET business objects from data retrieved from the database and binding them to .NET UI controls dynamically using Reflection, through a centralized mapping between a type's properties, data columns and UI controls.

    Read the article

  • SQL Server 2012 available in its final version: AlwaysOn, Big Data, Power View, Microsoft keeps its promises

    SQL Server 2012 available in its final version. AlwaysOn, Big Data, Power View: Microsoft's data management and analysis platform keeps its promises. Update of 03/04/2012. As Microsoft had promised, the final version of SQL Server 2012 has been available since April 1st, but was officially announced yesterday. Microsoft's data management and analysis platform has been designed to be the reference environment for business-critical applications, to offer a more complete business-intelligence solution integrating Big Data, and to allow a better connection with the Cloud. ...

    Read the article

  • Data Binding in ASP.NET 3.5 with a Basic Example

    Data binding is a method of binding ASP.NET 3.5 web controls to database column fields. This binding is necessary to produce a certain level of interactivity within the web control. This article will explain how and why to use data binding in your web applications.

    Read the article

  • Let’s keep informed with “Data Explorer”

    - by Luca Zavarella
    At PASS Summit 2011 a new project was announced. It's a Microsoft SQL Azure Lab and its codename is Microsoft "Data Explorer". According to the official blog (http://blogs.msdn.com/b/dataexplorer/), this new tool provides an innovative way to acquire new knowledge from the data that interests you. In a nutshell, Data Explorer allows you to combine data from multiple sources and to publish and share the result. In addition, you can generate data streams in the RESTful open format (Open Data Protocol), which can then be used by other applications. We can also still use Excel or PowerPivot to analyze the results. Sources can be varied: Excel spreadsheets, text files, databases, Windows Azure Marketplace, etc. For those who are not familiar with this resource, I strongly suggest you keep an eye on the data services available in the Marketplace: https://datamarket.azure.com/browse/Data

    To tell the truth, as I read the above blog post I was tempted to think of Data Explorer as an "SSIS on Azure" addressed to the power user. In fact, reading the response from Tim Mallalieu (Group Program Manager of Data Explorer) to a comment made on his post, I had a positive confirmation of my first impression: “…we originally thinking of ourselves as Self-Service ETL. As we talked to more folks and started partnering with other teams we realized that would be an area that we can add value but that there were more opportunities emerging.” The typical operations of the ETL phase (processing and organization of data in different formats) can be carried out with Data Explorer Mashup. (The original post includes a screenshot of the tool.) The flexibility in the manipulation of information is given by the Data Explorer Formula Language, a formula-based, Excel-style language. (The original post shows an example of it.) Anyone wishing to know more can check the project page in addition to the aforementioned blog: http://www.microsoft.com/en-us/sqlazurelabs/labs/dataexplorer.aspx

    In light of this new project, there is no doubt about Microsoft's intention to get closer and closer to the power user, providing them with flexible and very easy-to-use tools for data analysis. The prime example of this is PowerPivot. The question that remains is always the same: having more power users in a company will implicitly mean having different data models representing the same reality. But this would inevitably lead to anarchical data management... What do you think about that?

    Read the article
