Search Results

Search found 10101 results on 405 pages for 'temporary tables'.

Page 199/405 | < Previous Page | 195 196 197 198 199 200 201 202 203 204 205 206  | Next Page >

  • Copying a field to another table in Access

    - by Jacques Tardie
    I'm a bit embarrassed asking this here, but here goes: I've got two tables, which you can see here: http://img411.imageshack.us/img411/4562/query.jpg I need to copy the effortid from one table into the other, making sure that the values still maintain the correct relationships. The primary key for each is a combination of loggerid & datetime. What's the best way to do this? Thanks in advance, and don't make fun :)
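
    A minimal Access SQL sketch of such a copy; the table names tblSource and tblDest are placeholders, since the real names are only visible in the screenshot, and the join assumes both tables share the composite key (loggerid, datetime):

        UPDATE tblDest INNER JOIN tblSource
            ON (tblDest.loggerid = tblSource.loggerid)
           AND (tblDest.[datetime] = tblSource.[datetime])
        SET tblDest.effortid = tblSource.effortid;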

    Read the article

  • How to select records as columns in SQL

    - by Leigh
    Hi, I have two tables: tblSizes and tblColors. tblColors has columns called ColorName, ColorPrice and SizeID. There is one size to multiple colors. I need to write a query that selects the size and all the colors for that size, with the price of each colour returned in its own column, for instance:

        SizeID | Width | Height | Red | Green | Blue
        1      | 220   | 220    | £15 | £20   | £29

    Hope this makes sense. Thank you.
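
    A hedged sketch using conditional aggregation, assuming the colour names are known up front and that Width and Height live on tblSizes (an open-ended colour list would need PIVOT or dynamic SQL instead):

        SELECT s.SizeID, s.Width, s.Height,
               MAX(CASE WHEN c.ColorName = 'Red'   THEN c.ColorPrice END) AS Red,
               MAX(CASE WHEN c.ColorName = 'Green' THEN c.ColorPrice END) AS Green,
               MAX(CASE WHEN c.ColorName = 'Blue'  THEN c.ColorPrice END) AS Blue
        FROM tblSizes s
        LEFT JOIN tblColors c ON c.SizeID = s.SizeID
        GROUP BY s.SizeID, s.Width, s.Height;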

    Read the article

  • Method hiding with interfaces

    - by fearofawhackplanet
        interface IFoo
        {
            int MyReadOnlyVar { get; }
        }

        class Foo : IFoo
        {
            // Note: the property must be public for this implicit interface implementation to compile.
            public int MyReadOnlyVar { get; set; }
        }

        public IFoo GetFoo()
        {
            return new Foo { MyReadOnlyVar = 1 };
        }

    Is the above an acceptable way of implementing a read-only/immutable object? The immutability of IFoo can be broken with a temporary cast to Foo. In general (non-critical) cases, is hiding functionality through interfaces a common pattern? Or is it considered lazy coding, or even an anti-pattern?

    Read the article

  • Fast way to find nearby users using PostGIS

    - by opedge
    I have 5 tables:
    - users: information about a user, with a current location_id (FK to geo_location_data)
    - geo_location_data: information about a location, with a PostGIS geography(POINT, 4326) column
    - user_friends: relationships between users
    I want to find nearby friends for the current user, but it takes a lot of time to execute one select query to find out whether a user is a friend and then another select using ST_DWithin. Maybe something is wrong in the domain model or in the queries?
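
    One hedged way to speed this up is to fold the friendship check and the distance filter into a single query instead of two; the column names below (user_id, friend_id, location) and the 5 km radius are assumptions about the schema, not taken from the question:

        SELECT f.friend_id
        FROM user_friends f
        JOIN users u              ON u.id = f.friend_id
        JOIN geo_location_data gf ON gf.id = u.location_id
        JOIN users me             ON me.id = :current_user_id
        JOIN geo_location_data gm ON gm.id = me.location_id
        WHERE f.user_id = :current_user_id
          AND ST_DWithin(gf.location, gm.location, 5000);  -- geography distance in metres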

    Read the article

  • Transactions in Entity Framework: how can I refactor for best performance?

    - by programmerist
    I'm trying to use a transaction in Entity Framework. I have three tables: Personel, Prim and Finans. In the Prim table, SatisTutari is an int; if the value typed into SatisTutari.Text is a float instead of an int, the transaction must roll back. Everything works, but how can I refactor this, or write the transaction code in the most readable and best-performing way? I have three tables, so I have three entities:

        CREATE TABLE Personel (
            PersonelID integer PRIMARY KEY identity NOT NULL,
            Ad varchar(30),
            Soyad varchar(30),
            Meslek varchar(100),
            DogumTarihi datetime,
            DogumYeri nvarchar(100),
            PirimToplami float);
        GO
        CREATE TABLE Prim (
            PrimID integer PRIMARY KEY identity NOT NULL,
            PersonelID integer FOREIGN KEY REFERENCES Personel(PersonelID),
            SatisTutari int,
            Prim float,
            SatisTarihi datetime);
        GO
        CREATE TABLE Finans (
            ID integer PRIMARY KEY identity NOT NULL,
            Tutar float);

    Personel, Prim and Finans are my tables. If you look at the Prim table you can see Prim is a float; if what I type into the textbox is not a valid value, the transaction must kick in (roll back).

        protected void btnSave_Click(object sender, EventArgs e)
        {
            using (TestEntities testCtx = new TestEntities())
            using (TransactionScope scope = new TransactionScope())
            {
                Personel personel = new Personel();
                Prim prim = new Prim();
                Finans finans = new Finans();

                // Step 1: save the Personel row.
                personel.Ad = txtName.Text;
                personel.Soyad = txtSurName.Text;
                personel.Meslek = txtMeslek.Text;
                personel.DogumTarihi = DateTime.Parse(txtSatisTarihi.Text);
                personel.DogumYeri = txtDogumYeri.Text;
                personel.PirimToplami = float.Parse(txtPrimToplami.Text);
                testCtx.AddToPersonel(personel);
                testCtx.SaveChanges();

                // Step 2: save the Prim row and compute the Finans amount.
                prim.PersonelID = personel.PersonelID;
                prim.SatisTutari = int.Parse(txtSatisTutari.Text);
                prim.SatisTarihi = DateTime.Parse(txtSatisTarihi.Text);
                prim.Prim1 = double.Parse(txtPrim.Text);
                finans.Tutar = prim.SatisTutari * prim.Prim1;
                testCtx.AddToPrim(prim);
                testCtx.SaveChanges();

                // Step 3: save the Finans row and show the total.
                lblTutar.Text = finans.Tutar.Value.ToString();
                testCtx.AddToFinans(finans);
                testCtx.SaveChanges();

                scope.Complete();
            }
        }

    How can I rearrange this code? I'm looking for best-practice refactoring and the best solution for readability and performance.

    Read the article

  • Subsonic 2.2 and SQL CE

    - by Giuseppe
    Hi, has anybody used SubSonic with SQL Server CE 3.5? I tried, but I get an error with SubStage 2.2 that talks about PK_TABLE. My tables have primary keys and relations. Can someone help me? Bye, Giuseppe.

    Read the article

  • MS-ACCESS query to match first few characters in string comparison

    - by neobee
    What query is suitable to compare the two tables specified below, given that only part of the string in Location (table1) will match Location (table2)?

        Location (table1)    Location (table2)
        india- north         USxcs
        India-west           Indiaasd
        India- east          Indiaavvds
        India- south         Africassdcasv
        US- north            Africavasvdsa
        us-west              UKsacvavsdv
        uk- east             Indiacascsa
        uk- south            UScssca
        Africa-middle        Indiacsasca
        Africa-south         Africaccc
        Africa-east          UKcac

    Only the first two characters of Location (table1) and the first two characters of Location (table2) should match. Please help: only any four characters of Location (table1) and any two characters of Location (table2) should match.
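
    A minimal Access SQL sketch that matches rows on the first two characters, case-insensitively; table1 and table2 are the names used in the question, and the column is assumed to be called Location in both:

        SELECT t1.Location AS Loc1, t2.Location AS Loc2
        FROM table1 AS t1, table2 AS t2
        WHERE LCase(Left(t1.Location, 2)) = LCase(Left(t2.Location, 2));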

    Read the article

  • Protect files from svn commit.

    - by chrsk
    Hey, imagine a plain webapp with a log4j.properties file which is under version control. I can't add it to svn:ignore because it's a mandatory file. If I make custom changes for development that I don't want to commit, I have to watch out for accidental commits. For one file it's easy to handle; with 3 or more files it becomes creepy. Is there a way to temporarily exclude these files from svn commit, so that committing is easier? I'm working with svn and Subclipse.

    Read the article

  • SQL Server Update with left join and group by having

    - by Marty Trenouth
    I'm making an update to our database and would like to update rows that do not have existing items in another table. I can join the tables together, but I'm having trouble grouping the table to get a count of the number of rows:

        UPDATE dpt
        SET dpt.active = 0
        FROM DEPARTMENT dpt
        LEFT JOIN DOCUMENTS doc ON dpt.ID = doc.DepartmentID
        GROUP BY dpt.ID
        HAVING COUNT(doc.ID) = 0

    What should I be doing?
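
    A hedged rewrite, since T-SQL's UPDATE statement does not accept GROUP BY or HAVING; a NOT EXISTS test expresses "departments with zero documents" directly:

        UPDATE dpt
        SET dpt.active = 0
        FROM DEPARTMENT dpt
        WHERE NOT EXISTS (
            SELECT 1
            FROM DOCUMENTS doc
            WHERE doc.DepartmentID = dpt.ID
        );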

    Read the article

  • Storing JSON in an msSQL database?

    - by JKirchartz
    I'm developing a form generator, and wondering if it would be bad mojo to store JSON in an SQL database? I want to keep my database & tables simple, so I was going to have `pKey, formTitle, formJSON` on a table, and then store

        {"firstName":{"required":"true","type":"text"},"lastName":{"required":"true","type":"text"}}

    in formJSON. Would this slow down the DB server too much to set live? Any input is appreciated.
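
    A minimal sketch of the table the question describes, assuming SQL Server and an NVARCHAR(MAX) column for the JSON payload; the exact column sizes are assumptions:

        CREATE TABLE Forms (
            pKey      INT IDENTITY(1,1) PRIMARY KEY,
            formTitle NVARCHAR(200) NOT NULL,
            formJSON  NVARCHAR(MAX) NOT NULL  -- raw JSON document, parsed by the application
        );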

    Read the article

  • Python: Script works, but seems to deadlock after some time

    - by sberry2A
    I have the following script, which is working for the most part: Link to PasteBin. The script's job is to start a number of threads, each of which in turn starts a subprocess with Popen. The output from each subprocess is as follows:

        1
        2
        3
        ...
        n
        Done

    Basically the subprocess is transferring 10M records from tables in one database to different tables in another DB, with a lot of data massaging/manipulation in between because of the different schemas. If the subprocess fails at any point in its execution (bad records, duplicate primary keys, etc.), or if it completes successfully, it will output "Done\n". If there are no more records to select for transfer, it will output "NO DATA\n". My intent was to create my script "tableTransfer.py", which would spawn a number of these processes, read their output, and in turn output information such as the number of updates completed, time remaining, time elapsed, and number of transfers per second. I started running the process last night and checked in this morning to see it had deadlocked. There were no subprocesses running, there are still records to be updated, and the script had not exited. It was simply sitting there, no longer outputting the current information, because no subprocesses were running to update the total number complete, which is what controls updates to the output. This is running on OS X. I am looking for three things:
    1. I would like to get rid of the possibility of this deadlock occurring so I don't need to check in on it as frequently. Is there some issue with locking? Am I doing this in a bad way (a gThreading variable to control the looping that spawns additional threads, etc.)?
    2. I would appreciate some suggestions for improving my overall methodology.
    3. How should I handle a ctrl-c exit? Right now I need to kill the process, but I assume I should be able to use the signal module (or something similar) to catch the signal and kill the threads. Is that right?
    I am not sure whether I should be pasting my entire script here, since I usually just paste snippets. Let me know if I should paste it here as well.

    Read the article

  • Table names, and loop to describe

    - by Greg
    Working in Oracle 10g. There is an easy way to list all table names (select table_name from dba_tables where owner = 'me'), but now that I have the table names, is there an easy way to loop through them and do a 'describe' on each one in sequence?
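
    DESCRIBE is a SQL*Plus command rather than SQL, so one hedged alternative is a small PL/SQL block that loops over the data dictionary and prints each table's columns; the owner 'ME' is a placeholder, and the block assumes SET SERVEROUTPUT ON in SQL*Plus:

        BEGIN
          FOR t IN (SELECT table_name FROM dba_tables
                     WHERE owner = 'ME' ORDER BY table_name) LOOP
            DBMS_OUTPUT.PUT_LINE('=== ' || t.table_name || ' ===');
            FOR c IN (SELECT column_name, data_type, data_length
                        FROM dba_tab_columns
                       WHERE owner = 'ME' AND table_name = t.table_name
                       ORDER BY column_id) LOOP
              -- crude substitute for DESCRIBE: name, type and length of each column
              DBMS_OUTPUT.PUT_LINE('  ' || c.column_name || '  ' ||
                                   c.data_type || '(' || c.data_length || ')');
            END LOOP;
          END LOOP;
        END;
        /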

    Read the article

  • Reset ID auto-increment? (phpMyAdmin)

    - by Marcelo
    Hi, I was testing some data in the tables of my database to see if there were any errors. I have now cleaned out all the testing data, but my id (auto increment) does not start from 1 anymore. Can I reset it, and how? Sorry for any mistakes in my English, and thanks for your attention.
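
    A minimal MySQL sketch, assuming the table is called mytable and has already been emptied; either statement can be run from phpMyAdmin's SQL tab:

        ALTER TABLE mytable AUTO_INCREMENT = 1;
        -- or, to delete all rows and reset the counter in one step:
        TRUNCATE TABLE mytable;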

    Read the article

  • UNIQUE CONSTRAINT on a column from foreign table in MSSQL2008

    - by bodziec
    Hi, I have two tables:

        create table [dbo].[Main] (
            [ID] [int] identity(1,1) primary key not null,
            [Sign] [char](1) not null
        )

        create table [dbo].[Names] (
            [ID_Main] [int] primary key not null,
            [Name] [nvarchar](128) not null,
            constraint [FK_Main_Users] foreign key ([ID_Main]) references [dbo].[Main]([ID]),
            constraint [CK_Name] unique ([Name], [Sign])
        )

    The problem is with the second constraint, CK_Name: [Sign] is a column of [Main], not of [Names]. Is there a way to make a constraint target a column from a foreign table?
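
    A constraint cannot reference another table's columns directly. One hedged workaround (a sketch, not the only option; an indexed view or a trigger could also enforce this) is to carry [Sign] into [Names] and tie it back to [Main] with a composite foreign key, which in turn needs a unique key on ([ID], [Sign]):

        alter table [dbo].[Main]
            add constraint [UQ_Main_ID_Sign] unique ([ID], [Sign]);

        -- assumes [Names] is still empty; otherwise supply a default for the new column
        alter table [dbo].[Names] add [Sign] [char](1) not null;

        alter table [dbo].[Names]
            add constraint [FK_Names_Main_Sign]
            foreign key ([ID_Main], [Sign]) references [dbo].[Main]([ID], [Sign]);

        alter table [dbo].[Names]
            add constraint [CK_Name] unique ([Name], [Sign]);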

    Read the article

  • [Haskell] Curious about the hash table problem

    - by astamatto
    I read that hash tables in Haskell are crippled (citation: http://flyingfrogblog.blogspot.com/2009/04/more-on-haskells-hash-table-problems.html), and since I like Haskell this worried me. A year has passed since that blog post, and I'm curious: has the hash table problem in Haskell been "fixed" in the mainstream compilers (like GHC)? PS: I love Stack Overflow; I'm a long-time visitor, but only today did I decide to try posting a question.

    Read the article

  • MySQL - query to return CSV in a field?

    - by StackOverflowNewbie
    Assume I have the following tables:

        TABLE: foo
        - foo_id (PK)

        TABLE: tag
        - tag_id (PK)
        - name

        TABLE: foo_tag
        - foo_tag_id (PK)
        - foo_id (FK)
        - tag_id (FK)

    How do I query this so that I get a result like this:

        ==========================
        | foo_id | tags          |
        ==========================
        | 1      | foo, bar      |
        | 2      | foo           |
        | 3      | bar           |
        --------------------------

    Basically, I need all of foo's tags in one column, comma separated. Possible in MySQL?
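
    A hedged sketch using MySQL's GROUP_CONCAT, which builds exactly this kind of comma-separated list per group:

        SELECT f.foo_id,
               GROUP_CONCAT(t.name ORDER BY t.name SEPARATOR ', ') AS tags
        FROM foo f
        LEFT JOIN foo_tag ft ON ft.foo_id = f.foo_id
        LEFT JOIN tag t      ON t.tag_id  = ft.tag_id
        GROUP BY f.foo_id;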

    Read the article

  • Transforming binary data using SSIS and SQL Server 2008

    - by Rick
    Hello all. I have a task to import/transform and extract zipped binary files that contain both text data and embedded binary data. Within the files is data that is relational in nature and needs to be processed into a defined database structure. Currently I have a single-threaded C# app that essentially grabs all the files from the directory (currently there are 13K files of varying sizes), extracts the data on a single thread, and inserts it into the database line by line. As you can imagine, this is a very slow process and unacceptable. There are several different parsing routines used depending on the header record in the file. There are potentially up to a million rows per file when all the data is extracted to the row level of detail. A follow-on task is to parse those rows into their appropriate tables based on their content, i.e. the textual content has to be parsed further into "buckets" of like data in the database. That about sums up the big picture. Now for the problem task list:
    1. How do I iterate through a packet of data using SSIS? In the app the file is decompressed and then parsed using streams and byte arrays, and is routed to the required parsing routine based on the header data of each packet. There is bit swapping involved as well. Should I wrap the app code into script task(s) and let them do the custom processing? The data is separated by year, and the SQL Server tables are partitioned by year as well. I also need to be able to "catch" bad file data and most likely process it by hand.
    2. Should I simply load the zipped file into SQL Server as a blob and parse the file with T-SQL? Would that be multi-threaded if done that way? I'm not sure how to do the parsing involved here in T-SQL. Which do you think would be faster?
    3. Potentially the data that is currently processed via files could come to us via a socket. Can SSIS collect that data in real time? How would I go about setting that up?
    Processing these new files from the directories will become a daily task. I can manage the data once I get it to SQL Server; getting it there in a timely fashion seems to be the long pole in the tent for me. I would appreciate any comments or suggestions from the group. Rick
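
    For the "load the zipped file as a blob" option in point 2, a minimal T-SQL sketch; the table and file path are assumptions, and the decompression and packet parsing would still have to happen in SSIS or application code, since SQL Server 2008 has no built-in decompression function:

        CREATE TABLE dbo.RawFiles (
            FileID   int IDENTITY(1,1) PRIMARY KEY,
            FileName nvarchar(260) NOT NULL,
            Contents varbinary(max) NOT NULL
        );

        INSERT INTO dbo.RawFiles (FileName, Contents)
        SELECT N'sample_2009.bin.gz', BulkColumn
        FROM OPENROWSET(BULK N'C:\incoming\sample_2009.bin.gz', SINGLE_BLOB) AS src;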

    Read the article

  • In Javascript, is there a technique where I can execute code after a return?

    - by Christopher Altman
    Is there a technique where I can execute code after a return? I want to return a value and then reset that value without introducing a temporary variable. My current code is:

        function (a) {
            var b;
            if (b) {
                var temp = b; // I want to avoid this step
                b = false;
                return temp;
            } else {
                b = a;
                return false;
            }
        }

    I want to avoid the temp var. Is that possible? (var b holds a value between function calls because it is a memoization-styled function.)

    Read the article
