Search Results

Search found 31483 results on 1260 pages for 'database migration'.

Page 616 of 1260

  • Linking Excel and Access

    - by Mel
    I run a sports program where I have a master roll in Excel of who is in which class. I want to link this to a database in Access that stores the other information about each athlete, e.g. address, parents' names, school, medical details. I want to be able to add names to a class in the Excel spreadsheet and have this automatically generate a record for that person in Access. There also needs to be some failsafe for athletes who are in multiple classes. I was also producing class rolls as pivot tables out of the Access database, so I need to code for classes and also allow for athletes in multiple classes/disciplines.
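
    One way to approach the sync step, sketched in Python with openpyxl and pyodbc; the workbook path, table and column names are all illustrative assumptions, and a junction (enrolment) table handles athletes who appear in several classes:

        # Sketch: read the class roll from Excel and create any missing
        # athlete/enrolment rows in Access. Names and paths are placeholders.
        import openpyxl
        import pyodbc

        wb = openpyxl.load_workbook(r"C:\rolls\master_roll.xlsx")
        sheet = wb["Roll"]                      # column A = athlete name, column B = class

        conn = pyodbc.connect(
            r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};"
            r"DBQ=C:\rolls\athletes.accdb")
        cur = conn.cursor()

        for name, class_name in sheet.iter_rows(min_row=2, values_only=True):
            # One athlete record, however many classes the athlete is in
            cur.execute("SELECT AthleteID FROM Athletes WHERE FullName = ?", name)
            row = cur.fetchone()
            if row is None:
                cur.execute("INSERT INTO Athletes (FullName) VALUES (?)", name)
                athlete_id = cur.execute("SELECT @@IDENTITY").fetchone()[0]
            else:
                athlete_id = row[0]
            # Junction table: one row per (athlete, class) pair
            cur.execute("SELECT 1 FROM Enrolments WHERE AthleteID = ? AND ClassName = ?",
                        athlete_id, class_name)
            if cur.fetchone() is None:
                cur.execute("INSERT INTO Enrolments (AthleteID, ClassName) VALUES (?, ?)",
                            athlete_id, class_name)

        conn.commit()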

    Read the article

  • Binary search of an inaccessible data field in LDAP from Python

    - by EricR
    I'm interested in reproducing a particular Python script. I have a friend who was accessing an LDAP database without authentication. There was a particular field of interest, which we'll call nin (an integer), and this field wasn't readable without proper authentication. However, my friend managed to recover it through some sort of binary search (rather than just looping through integers) on the data: he would probe the first digit, test whether the real value was greater or less than his guess, adjust the guess until the filter matched, then add digits and keep checking until he had the exact value of nin. Any ideas on how he went about this? I have access to a similarly set up database.
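
    A sketch of the general idea in Python, assuming the server still evaluates comparison filters on nin even though it won't return the attribute's value; the server address, base DN and attribute names are placeholders, and a plain binary search stands in for the digit-by-digit probing described above:

        # Sketch: recover a hidden integer attribute by probing >= filters.
        # The server never returns nin; it only reveals whether a filter matches.
        import ldap

        conn = ldap.initialize("ldap://ldap.example.com")
        conn.simple_bind_s()                          # anonymous bind

        BASE = "ou=people,dc=example,dc=com"

        def matches(target_uid, guess):
            """True if the target entry matches (nin >= guess)."""
            flt = "(&(uid=%s)(nin>=%d))" % (target_uid, guess)
            return bool(conn.search_s(BASE, ldap.SCOPE_SUBTREE, flt, ["uid"]))

        def recover_nin(target_uid, max_value=10**9):
            lo, hi = 0, max_value
            while lo < hi:                            # invariant: lo <= nin <= hi
                mid = (lo + hi + 1) // 2
                if matches(target_uid, mid):
                    lo = mid                          # nin >= mid
                else:
                    hi = mid - 1                      # nin < mid
            return lo

        print(recover_nin("jdoe"))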

    Read the article

  • How sophisticated should a DAL be?

    - by Andrew Florko
    Basically, a DAL (Data Access Layer) should provide simple CRUD (Create/Read/Update/Delete) methods, but I am always tempted to create more sophisticated methods in order to minimize database roundtrips from the Business Logic Layer. What do you think about the following extensions to CRUD (most of them are OK, I suppose)? Read: GetById, GetByName, GetPaged, GetByFilter etc.; Create: GetOrCreate methods (the entity is returned from the DB, or created if not found and then returned) and Create(lots-of-relations) instead of Create plus multiple AssignTo calls; Update: Merge methods (a list of entities is updated, created and deleted in one call); Delete: Delete(bool children) for optional child deletion, and Cleanup methods. Where do you usually implement entity cache capabilities, DAL or BLL? (My choice is BLL, but I have seen DAL implementations too.) Where is the boundary when you decide "this operation is too specific, so I should implement it in the Business Logic Layer as multiple DAL calls"? I have often found BLL operations implemented with dozens of database roundtrips because the developer was afraid to create a slightly more sophisticated DAL. Thank you in advance!
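
    As a rough illustration of the "slightly richer than CRUD" style, here is a Python sketch of a repository with a GetOrCreate-style method (sqlite3 is used only as a stand-in; the point is that the BLL makes one call instead of a read followed by a conditional create):

        # Sketch: a DAL exposing a convenience method beyond plain CRUD.
        import sqlite3

        class CustomerRepository:
            def __init__(self, conn):
                self.conn = conn

            def get_by_id(self, customer_id):
                return self.conn.execute(
                    "SELECT id, name FROM customers WHERE id = ?", (customer_id,)).fetchone()

            def get_or_create(self, name):
                # One DAL entry point: the BLL never issues "read, then maybe insert"
                row = self.conn.execute(
                    "SELECT id, name FROM customers WHERE name = ?", (name,)).fetchone()
                if row is not None:
                    return row
                cur = self.conn.execute("INSERT INTO customers (name) VALUES (?)", (name,))
                self.conn.commit()
                return (cur.lastrowid, name)

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT UNIQUE)")
        repo = CustomerRepository(conn)
        print(repo.get_or_create("Acme"))   # creates the row
        print(repo.get_or_create("Acme"))   # returns it, no duplicate insert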

    Read the article

  • Recommended tutorials for PHP, MySQL and graphs

    - by Vinit Joshi
    I need help finding a tutorial or anything else that will help me create a comparison chart. The user is able to search for device names. The device names are listed in a drop-down box populated dynamically from the database. I want the user to be able to select two separate devices and view their information plotted on a graph. I'd be grateful for keywords I can search for, or any tutorials that will help me carry on with this task. Currently on my screen I can see the data that has been inserted into the database; this data is placed inside a table. So far I have used XHTML, PHP and MySQL. I've tried to make the question as clear as possible, so sorry if it confuses anyone.

    Read the article

  • SQL Server 2008 - Script Data as Insert Statements from SSIS Package

    - by Brandon King
    SQL Server 2008 provides the ability to script data as INSERT statements using the Generate Scripts option in Management Studio. Is it possible to access the same functionality from within an SSIS package? Here's what I'm trying to accomplish: I have a scheduled job that nightly scripts out all the schema and data for a SQL Server 2008 database and then uses the script to create a "mirror copy" SQLCE 3.5 database. I've been using Narayana Vyas Kondreddi's sp_generate_inserts stored procedure to accomplish this, but it has problems with some data types and greater-than-4K columns (holdovers from its SQL Server 2000 days). The Script Data function looks like it could solve my problems, if only I could automate it. Any suggestions?
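
    For comparison, the core of "script data as INSERT statements" is small enough to hand-roll when the built-in tooling can't be automated; a rough Python/pyodbc sketch, where the connection string, table name and quoting rules are simplified assumptions (dates and binary columns would need extra handling):

        # Sketch: emit an INSERT statement for every row of a table.
        import pyodbc

        conn = pyodbc.connect(
            "DRIVER={SQL Server};SERVER=localhost;DATABASE=MyDb;Trusted_Connection=yes")
        cur = conn.cursor()

        def quote(value):
            if value is None:
                return "NULL"
            if isinstance(value, (int, float)):
                return str(value)
            return "'" + str(value).replace("'", "''") + "'"   # naive string escaping

        def script_table(table):
            cur.execute("SELECT * FROM [%s]" % table)
            columns = [d[0] for d in cur.description]
            for row in cur.fetchall():
                yield "INSERT INTO [%s] (%s) VALUES (%s);" % (
                    table, ", ".join(columns), ", ".join(quote(v) for v in row))

        with open("data_script.sql", "w") as out:
            for statement in script_table("Customers"):
                out.write(statement + "\n")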

    Read the article

  • Prevent SQLite injection attacks on your own iPhone?

    - by Bonnie
    I always take precautions against SQL injection attacks when data is saved between someone's iPhone and a remote database in the cloud. But is it also necessary to do the same when just saving data (using SQLite) from someone's phone to a database that lives only on their own phone? What's the worst they can do? Delete their own data (or tables) on their own phone? (If they really try hard enough.) Thanks.
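
    Even locally, parameter binding is worth keeping, less for security than for correctness, since perfectly ordinary input can break string-built SQL; a small sqlite3 sketch of the difference (the table and values are made up):

        # Sketch: string-built SQL breaks on an ordinary apostrophe; bound parameters do not.
        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE notes (body TEXT)")

        user_text = "it's Bonnie's phone"        # a perfectly normal apostrophe

        try:
            conn.execute("INSERT INTO notes (body) VALUES ('%s')" % user_text)
        except sqlite3.OperationalError as exc:
            print("string building failed:", exc)

        conn.execute("INSERT INTO notes (body) VALUES (?)", (user_text,))   # works
        print(conn.execute("SELECT body FROM notes").fetchall())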

    Read the article

  • Searching for patterns to create a TCP Connection Pool for high performance messaging

    - by JoeGeeky
    I'm creating a new client/server application in C# and expect to have a fairly high rate of connections. That made me think of database connection pools, which help mitigate the expense of creating and disposing connections between the client and the database. I would like to create a similar capability for my application and haven't been able to find any good examples of how to apply this pattern. Do I really need to spin up an instance of a TcpClient every time I want to send a message to the server and receive a receipt message? Each connection is expected to carry 1-5 KB of payload and receive a roughly 1 KB response message. I realize this question is somewhat vague, but I am starting from scratch, so I am open to suggestions, even if that means my suppositions are all wrong.
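
    The shape of the pattern, sketched in Python for brevity (host, port and pool size are placeholders); the same queue-of-idle-connections idea translates to C# with TcpClient and, say, a BlockingCollection or ConcurrentBag:

        # Sketch of a minimal connection pool: reuse idle sockets instead of
        # paying the connect/teardown cost for every message.
        import contextlib
        import queue
        import socket

        class ConnectionPool:
            def __init__(self, host, port, size=10):
                self.host, self.port = host, port
                self.idle = queue.Queue(maxsize=size)

            @contextlib.contextmanager
            def connection(self):
                try:
                    sock = self.idle.get_nowait()      # reuse an idle connection
                except queue.Empty:
                    sock = socket.create_connection((self.host, self.port))
                try:
                    yield sock
                except Exception:
                    sock.close()                       # discard broken connections
                    raise
                else:
                    try:
                        self.idle.put_nowait(sock)     # hand it back to the pool
                    except queue.Full:
                        sock.close()

        pool = ConnectionPool("server.example.com", 9000)
        # with pool.connection() as sock:
        #     sock.sendall(b"message")
        #     reply = sock.recv(1024)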

    Read the article

  • MySQL - how do I load data from a different configuration...

    - by Ronedog
    I'm not sure if this will make sense, but I'll give it a shot. My hard drive went down and I had to reinstall the OS along with all my web server configuration, etc. I kept a backup of the MySQL database, but it doesn't contain all the tables; I added a couple of tables after my last backup. I have access to the old hard drive and the directory where the MySQL data files are stored, but I don't know how to retrieve that data into my new MySQL database. Is it even possible to take the raw data files from MySQL and load them into a different instance? I'd even be happy if there were some way for phpMyAdmin to show the data files; then I could dump them out to a backup text file and reload them into my new configuration. Any help will be appreciated. Thanks.
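
    For MyISAM tables this can be as simple as copying each table's .frm/.MYD/.MYI files from the old data directory into the new one while the server is stopped; InnoDB tables are trickier because of the shared ibdata and log files. A hedged sketch of the copy step, with both paths as placeholders:

        # Sketch: copy MyISAM table files from the old datadir into the new one.
        # Stop mysqld first, and fix file ownership (e.g. mysql:mysql) afterwards.
        import glob
        import os
        import shutil

        OLD_DATADIR = "/mnt/old_disk/var/lib/mysql/mydatabase"   # from the failed drive
        NEW_DATADIR = "/var/lib/mysql/mydatabase"                # current server

        for pattern in ("*.frm", "*.MYD", "*.MYI"):
            for path in glob.glob(os.path.join(OLD_DATADIR, pattern)):
                dest = os.path.join(NEW_DATADIR, os.path.basename(path))
                if not os.path.exists(dest):                     # don't clobber newer tables
                    shutil.copy2(path, dest)
                    print("copied", os.path.basename(path))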

    Read the article

  • Passing parameters to a web service method in C#

    - by wafa.cs1
    Hello, I'm developing an Android application which will send location data to a web service to store in the server database. In Java I've used a REST-style call, so the request is: HttpPost request = new HttpPost("http://trafficmapsa.com/GService.asmx/GPSdata?lon="+Lon+"&Lat="+Lat+"&speed="+speed); In the ASP.NET (C#) web service the method is: [WebMethod] public CountryName GPSdata(Double Lon, Double Lat, Double speed) { ... } After sending the data from Android, nothing comes back and there is no data in the database! Is there some library missing in the C# file? I cannot figure out what the problem is.
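
    One thing worth checking (an assumption, since the service configuration isn't shown): the Java code issues a POST whose parameters sit only in the query string, while an ASMX [WebMethod] called over plain HTTP reads GET parameters from the query string and POST parameters from a form-encoded body. The two well-formed request shapes, sketched with Python's requests library for brevity:

        # Sketch: the two parameter placements an ASMX HTTP GET/POST binding expects.
        import requests

        url = "http://trafficmapsa.com/GService.asmx/GPSdata"
        payload = {"Lon": 46.7, "Lat": 24.6, "speed": 55}

        # GET: parameters in the query string
        r_get = requests.get(url, params=payload)

        # POST: parameters form-encoded in the request body, not appended to the URL
        r_post = requests.post(url, data=payload)

        print(r_get.status_code, r_post.status_code)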

    Read the article

  • What is the best way to implement an object cache with Entity Framework?

    - by Harshal
    Say I have a table of BlogPosts in a database and I want to cache the ones that have already been retrieved, in memory, for further reads. I can just use a standard hashtable-style memory cache like System.Web.Caching.Cache, but if I then need to update a property on one of these blog posts, e.g. blogPost.Title, and update the record in the DB, I cannot do this without fetching it again from the database, because the Entity Framework context that was used to fetch the record when it was loaded into my cache has already been disposed. How do I write code so that I can get an object from my cache, update one property and just call the SaveChanges method, without incurring an extra read?

    Read the article

  • ASP.NET web service user management across pages

    - by nakori
    I'm developing a site that will display confidential read-only information, with data fetched from a WCF service. My question: what is the best approach to user management across the different information pages? The service returns a collection with customer info after a secure login. My idea is to have a Customer object class that is stored in session. Is it possible to use things like HttpContext.Current.User.Identity.IsAuthenticated followed by HttpContext.Current.Session["UserId"] without using a database with role-based security? Would I be better off with a combination of a local database, LINQ to SQL or DataSets rather than just class objects for the data fetched from the service? Thanks, nakori

    Read the article

  • Problem with SQL syntax (probably simple)

    - by Bryan Folds
    I'm doing some custom database work for a module for Drupal and I get the following SQL error when trying to create my table: user warning: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'DEFAULT NULL, rooms INT DEFAULT NULL, adults INT DEFAULT NULL, children' at line 14 query:

        CREATE TABLE dr_enquiry (
          eId INT unsigned NOT NULL auto_increment,
          eKey VARCHAR(16) NOT NULL,
          dateSent INT NOT NULL DEFAULT 0,
          status VARCHAR(30) NOT NULL DEFAULT 'Unanswered',
          custName VARCHAR(50) NOT NULL,
          custEmail VARCHAR(200) NOT NULL,
          custPhone VARCHAR(25) NOT NULL,
          custCountry VARCHAR(40) NOT NULL,
          custIP VARCHAR(11) DEFAULT NULL,
          offerName VARCHAR(100) NOT NULL,
          offerURL VARCHAR(200) NOT NULL,
          arrival DATETIME DEFAULT NULL,
          departure DEFAULT NULL,
          rooms INT DEFAULT NULL,
          adults INT DEFAULT NULL,
          children INT DEFAULT NULL,
          childAges VARCHAR(32) DEFAULT NULL,
          toddlers INT DEFAULT NULL,
          toddlerAges VARCHAR(32) DEFAULT NULL,
          catering VARCHAR(255) DEFAULT NULL,
          comments VARCHAR(255) DEFAULT NULL,
          agent VARCHAR(100) DEFAULT NULL,
          voucher VARCHAR(100) DEFAULT NULL,
          PRIMARY KEY (eId)
        ) /*!40100 DEFAULT CHARACTER SET UTF8 */

    in /home/travelco/public_html/includes/database.inc on line 550.

    Read the article

  • MS Access (2010) Enable Design View

    - by Tim GONELLA
    I downloaded the Access template below for doing a home inventory: http://office.microsoft.com/en-us/templates/results.aspx?qu=home%20inventory&ex=1&queryid=0d245f2a%2Dacdc%2D4161%2D92c8%2D8ba16a52ab32&AxInstalled=1&c=0#ai:TC101918100| The design view is not visible, which is a bit of a nuisance. Things I've tried: 1) In Options > Current Database, the check boxes (Enable Layout View and Enable design changes for tables in Datasheet view) are both greyed out. 2) I've unblocked the file using right-click > Properties. 3) I've tried copying/exporting the objects to another database, but can only copy/export the tables. 4) I've tried holding Shift when opening the DB. 5) Enabling all trust permissions, etc. None of these work. Does anybody have any suggestions? (I'm using Office 2010.) Thanks

    Read the article

  • capistrano put() and upload() both failing

    - by Kyle
    With Capistrano, I am deploying a Rails application from Mac OS X 10.5 to CentOS 5.2. Note that deploy.rb and the server environment have not changed in over a year. There is a task within our deploy.rb file called upload: put(File.read(file), "#{shared_path}/#{filename}", :via => :scp) This fails each and every time with the following exception: No such file or directory - /srv/ourapp/releases/20100104194410/config/database.yml My local copy of config/database.yml is failing to upload properly. I've verified it's not our internet connection, as this happens on three different connections and two different systems. I've also tried to swap out put() for upload() but get the same result; dropping :via => :scp, and/or trying to force :sftp instead, similarly fails. Relevant info: $ cap -V Capistrano v2.5.10 $ ruby -v ruby 1.8.7 (2008-08-11 patchlevel 72) [i686-darwin9.6.0]

    Read the article

  • Transforming binary data using SSIS and SQL Server 2008

    - by Rick
    Hello all - I have a task to import, transform and extract zipped binary files that contain both text data and embedded binary data. Some of the data is relational in nature and needs to be processed into a defined database structure. Currently I have a single-threaded C# app that essentially grabs all the files from the directory (currently there are 13K files of varying sizes) and extracts the data line by line, inserting into the database on a single thread. As you can imagine, this is a very slow process and unacceptable. There are several different parsing routines used depending on the header record in the file. There are potentially up to a million rows per file when all the data is extracted to the row level of detail. A follow-on task is to parse those rows into their appropriate tables based on content, i.e. the textual content has to be parsed further into "buckets" of like data in the database. That about sums up the big picture. Now for the problem task list. How do I iterate through a packet of data using SSIS? In the app the file is decompressed, parsed using streams and byte arrays, and routed to the required parsing routine based on the header data of each packet. There is bit swapping involved as well. Should I wrap the app code up into a script task (or tasks) and let it do the custom processing? The data is separated by year and the SQL Server tables are partitioned by year as well. I need to be able to "catch" bad file data as well, and most likely process it by hand. Should I simply load the zipped file into SQL Server as a blob and parse the file with T-SQL? Would that be multi-threaded if done that way? I'm not sure how to do the parsing in T-SQL that is involved here. Which do you think would be faster? Potentially the data that is currently processed via files could come to us via a socket. Can SSIS collect that data in real time? How would I go about setting that up? Processing these new files from the directories will become a daily task. I can manage the data once I get it into SQL Server; getting it there in a timely fashion seems to be the long pole in the tent for me. I would appreciate any comments or suggestions from the group. Rick

    Read the article

  • VB working with SQL DB - end of row count, keeps looping

    - by Tramd
    I'm adding an ID and a name to a combo box, pulled from a database. My problem is that for some reason my loop doesn't end once it reaches the end of the records in the database table. Here's my code:

        For intcount = 0 To dtOrders.Rows.Count - 1
            cmbSearch.Items.Add(dtOrders.Rows(intcount)("EmployeeID").ToString & " " & _
                dtOrders.Rows(intcount)("EmployeeLastName").ToString & ", " & _
                dtOrders.Rows(intcount)("EmployeeFirstName").ToString)
        Next

    Shouldn't the .Rows.Count - 1 stop it once it reaches the last record? It loops through 4 times.

    Read the article

  • Catching constraint violations in JPA 2.0.

    - by Dennetik
    Consider the following entity class, used with, for example, EclipseLink 2.0.2, where the link attribute is not the primary key but is unique nonetheless.

        @Entity
        public class Profile {
            @Id
            private Long id;

            @Column(unique = true)
            private String link;

            // Some more attributes and getter and setter methods
        }

    When I insert records with a duplicate value for the link attribute, EclipseLink does not throw an EntityExistsException but a DatabaseException, with a message explaining that the unique constraint was violated. This doesn't seem very useful, as there is no simple, database-independent way to catch this exception. What would be the advised way to deal with this? A few things I have considered: checking the error code on the DatabaseException (I fear that this error code is the native error code for the database), or checking beforehand for the existence of a Profile with the specific value for link (which would obviously result in an enormous number of superfluous queries).

    Read the article

  • Flashing and closing PyS60 SIS

    - by Matrich
    I created a Python script for S60 2nd Edition FP3 with a background image, a database and sound files, which are stored in a folder on the C: drive. After several hours I managed to convert it into a SIS, which I sent to my mobile phone. First I installed the SIS on the external memory and it failed to work. I read online that it should be on the same drive as the Python runtime, so I installed it on the phone memory, but it just flashes and then closes. I made another SIS with just a database and it worked, so I suspect the problem is that it can't find the background image and sound files in the folder. How can I find out the cause of this? Also, how can I include the background image and sound files in the SIS so that they are automatically installed when the SIS file is installed?
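
    To see why it closes, a common trick on PyS60 is to wrap the script's entry point so the traceback gets written to a file you can read afterwards; a minimal sketch, with the log path and resource names as placeholders:

        # Sketch: log the real exception before the console window disappears,
        # and check the expected resource files up front.
        import os
        import traceback

        RESOURCE_DIR = "c:\\data\\myapp"        # hypothetical folder with image/sound/db files

        def main():
            for name in ("background.jpg", "click.wav", "scores.db"):
                path = os.path.join(RESOURCE_DIR, name)
                if not os.path.exists(path):
                    raise IOError("missing resource: " + path)
            # ... the rest of the application ...

        try:
            main()
        except:
            log = open("c:\\myapp_error.log", "w")
            traceback.print_exc(file=log)       # the traceback survives the crash
            log.close()
            raise

    For bundling the image and sound files into the SIS so they install alongside the script, the usual route is the packaging tool's option for extra data files (for example ensymble's --extrasdir), though the exact flag depends on the tool and SDK being used.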

    Read the article

  • SQL Access INSERT INTO Autonumber Field

    - by KrazyKash
    I'm trying to make a Visual Basic application which is connected to a Microsoft Access database using OLEDB. Inside my database I have a Users table with the following layout:

        ID - AutoNumber
        Username - Text
        Password - Text
        Email - Text

    To insert data into the table I use the following query: INSERT INTO Users (Username, Password, Email) VALUES ('004606', 'Password', '[email protected]') However, I seem to get an error with this statement, and according to VB it's a syntax error. I then tried the following query: INSERT INTO Users (Username) VALUES ('004606') This query seemed to work absolutely fine. So the problem is that I can insert into just one field but not all three (excluding the ID field, because it's an AutoNumber). Any help would be appreciated. Thanks
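
    One guess worth testing, since the statement only fails when the extra columns are included: Password is a reserved word in Access SQL, so it needs to be bracketed (and parameters avoid quoting surprises entirely). A sketch in Python/pyodbc purely to show the statement shape; the same bracketed SQL applies from VB/OLEDB, and the database path is a placeholder:

        # Sketch: bracket the reserved column name and pass the values as parameters.
        import pyodbc

        conn = pyodbc.connect(
            r"Driver={Microsoft Access Driver (*.mdb, *.accdb)};DBQ=C:\data\users.accdb")
        cur = conn.cursor()

        cur.execute(
            "INSERT INTO Users (Username, [Password], Email) VALUES (?, ?, ?)",
            "004606", "Password", "[email protected]")
        conn.commit()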

    Read the article

  • Quicken-like Windows Forms application

    - by WinFXGuy
    Hi all, I need to build a Quicken-like application where the data needs to be secure. I don't see any database being used by Quicken. I could use an XML file, an MDF or an Access database, but the data in the tables is not secure. What is the best option? How does Quicken handle it? My application may also have document attachments. The application is similar to Quicken in style, but it is not accounting/financial in functionality. Thanks a bunch!

    Read the article

  • Where should I put the EF entities and data annotations in an ASP.NET MVC + Entity Framework project?

    - by giddy
    So I have a data entity class generated by Entity Framework 4 for my SQL Server 2008 Express database. This data context is exposed via a WCF Data Service/OData to Silverlight and WinForms clients. Should the data entities plus the .edmx file (generated by EF4) go in a separate class library? The problem then is that I would specify data annotations for a few entities, and some of them would require MVC-specific attributes (like CompareAttribute), so the class library would also have to reference the MVC DLLs. There also happens to be a users entity, which will be encapsulated or wrapped in an IIdentity in the website, so it's pretty tied to the MVC website. Or should it maybe go in a Base folder in the MVC project itself? Mostly the website is data-driven work around the database, like approving users and changing global settings; the real business happens in the Silverlight and WinForms apps. I'm using MVC 3 RC2 with Razor. Thanks

    Read the article

  • Can Hibernate default a Null String to Empty String

    - by sliver
    In our application we pull data from a DB2 mainframe database. If the database has "low values" in a field, Hibernate returns a null value in the object. This occurs even if the column is defined as "not null". As we are doing XML parsing on this, Castor is having trouble with it. I would like to fix this in Hibernate. Also, all of the Hibernate hbm files are generated, so we can't mess with them (they are regenerated from time to time). Is there any way to intercept all Strings and replace nulls with ""?

    Read the article

  • Does anyone have a backup strategy for SQL Azure databases?

    - by Pete
    I'm using SQL Azure on a project and it works great. The problem is that the usual backup features do not exist. I have exported the database a couple of times using SQLAzureMW ( http://sqlazuremw.codeplex.com/ ) but this tool is now choking trying to download the database data with bcp. In any case, it's not as nice a solution as SQL Server backups. Is anyone aware of a commercial or open source tool, or other technique, for making reliable backups of SQL Azure databases? This is really a showstopper.

    Read the article

  • .Remove(object) on a List<T> returned from a LINQ to SQL compiled query won't delete the object, right?

    - by soldieraman
    I am returning two lists from the database using a LINQ to SQL compiled query. While looping over the first list I remove duplicates from the second list, as I don't want to process already existing objects again. For example:

        // oldCustomers is a List returned by my compiled LINQ to SQL statement
        // with .ToList() added at the end; same goes for newCustomers
        foreach (Customer oC in oldCustomers)
        {
            // Do some processing
            newCustomers.Remove(newCustomers.Find(nC => nC.CustomerID == oC.CustomerID));
        }
        foreach (Customer nC in newCustomers)
        {
            // Do some processing
        }
        DataContext.SubmitChanges();

    I expect this to only save the changes made to the customers during my processing, and not remove or delete any of my customers from the database. Correct? I have tried it and it works fine, but I want to know whether there is any rare case where an object might actually get removed.

    Read the article

  • What is the best PHP framework for building a website around a heavily relational PostgreSQL database?

    - by Kenaniah
    First of all, the framework of choice needs to have excellent support for PostgreSQL. I don't care about MySQL, because it doesn't have half of the features the application I will be porting requires. (And when I say excellent support, I mean that the approach to database drivers must not have been designed solely around MySQL.) The ideal framework: should take full advantage of PHP 5.3 and PostgreSQL 8.4 features; should support new technologies such as OpenID and social networking; should support complex relationships between database relations; should have an intelligent validation system; should have a basic library of helpful views (such as pagination, navigation, etc.); should probably be MVC-based; should have excellent documentation and an active development community; and should namespace classes intelligently. What I'm looking for might be more of a library of utilities, as I really don't want to be restricted by the framework in what I can and cannot do. I have my own small library of core classes that take care of business logic, and I will most likely want to integrate those with the new framework as well. Thanks!

    Read the article
