Search Results

Search found 41025 results on 1641 pages for 'in memory database'.

Page 78 of 1641

  • Optimising Database Calls

    - by Dwaine Bailey
    I have a database that is filled with information for films, which is (in turn) read into the database from an XML file on a webserver. What happens is the following:

    1. Gather/parse the XML and store the film info as objects
    2. Begin statement
    3. For every film object we found: check to see if a record for the film exists in the database; if there is no film record, write the data for the film
    4. Commit statement

    Currently I just test for the existence of a film using (the very basic):

        SELECT film_title FROM film WHERE film_id = ?

    If that returns a row, then the film exists; if not, then I need to add it... The only problem is that there are many hundreds of records in the database (lots of films!), and because it has to check for the existence of a film in the database before it can write it, the whole process ends up taking quite a while (about 27 seconds for 210 films). Is there a more efficient method of doing this, or just any suggestions in general? The programming language is Objective-C and the database is sqlite3. Thanks, Dwaine

    Read the article
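
    For a check-then-insert loop like the one above, two things usually help: wrap the whole batch in a single transaction, and let the database enforce uniqueness (a primary key or unique index on film_id plus INSERT OR IGNORE) instead of issuing a SELECT per film. A minimal sketch of that idea using Python's sqlite3 module; the sample data and file name are illustrative only:

        # Sketch: one transaction plus INSERT OR IGNORE instead of a SELECT per film.
        import sqlite3

        films = [(1, "Alien"), (2, "Brazil"), (3, "Casablanca")]  # parsed from the XML feed

        con = sqlite3.connect("films.db")
        con.execute("CREATE TABLE IF NOT EXISTS film (film_id INTEGER PRIMARY KEY, film_title TEXT)")

        with con:  # a single transaction around the whole batch
            # Rows whose film_id already exists are skipped by the primary-key constraint.
            con.executemany("INSERT OR IGNORE INTO film (film_id, film_title) VALUES (?, ?)", films)

        con.close()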

  • Why does Git.pm on cygwin complain about 'Out of memory during "large" request'?

    - by Charles Ma
    Hi, I'm getting this error while doing a git svn rebase in cygwin:

        Out of memory during "large" request for 268439552 bytes, total sbrk() is 140652544 bytes
        at /usr/lib/perl5/site_perl/Git.pm line 898, <GEN1> line 3.

    268439552 is 256MB. Cygwin's maximum memory size is set to 1024MB, so I'm guessing that it has a different maximum memory size for perl? How can I increase the maximum memory size that perl programs can use?

    Update: this is where the error occurs (in Git.pm):

        while (1) {
            my $bytesLeft = $size - $bytesRead;
            last unless $bytesLeft;
            my $bytesToRead = $bytesLeft < 1024 ? $bytesLeft : 1024;
            my $read = read($in, $blob, $bytesToRead, $bytesRead);   # line 898
            unless (defined($read)) {
                $self->_close_cat_blob();
                throw Error::Simple("in pipe went bad");
            }
            $bytesRead += $read;
        }

    I've added a print before line 898 to print out $bytesToRead and $bytesRead; the result was 1024 for $bytesToRead and 134220800 for $bytesRead, so it's reading 1024 bytes at a time and has already read 128MB. Perl's 'read' function must be out of memory and is trying to request double its memory size... Is there a way to specify how much memory to request, or is that implementation dependent?

    UPDATE 2: while testing memory allocation in cygwin, this C program's output was 1536MB:

        #include <stdio.h>
        #include <stdlib.h>

        int main() {
            unsigned int bit = 0x40000000, sum = 0;
            char *x;
            while (bit > 4096) {
                x = malloc(bit);
                if (x) sum += bit;
                bit >>= 1;
            }
            printf("%08x bytes (%.1fMb)\n", sum, sum / 1024.0 / 1024.0);
            return 0;
        }

    But this perl program crashed if the file size was greater than 384MB (and succeeded if the file size was less):

        open(F, "<400") or die("can't read\n");
        $size = -s "400";
        $read = read(F, $s, $size);

    The error is similar:

        Out of memory during "large" request for 536875008 bytes, total sbrk() is 217088 bytes at mem.pl line 6.

    Read the article

  • How to rename database without first stopping SQL instance to flush connections

    - by John Galt
    Is there a way to force a database into single-user mode so a script can be run to rename databases? I find I have to restart the instance of SQL Server (to force off any connections from a web app, etc.) and then I can run this script:

        USE master
        go
        sp_dboption MDS, "single user", true
        go
        sp_dboption StagingMDS, "single user", true
        go
        sp_renamedb MDS, LastMonthMDS
        go
        sp_renamedb StagingMDS, MDS
        go
        sp_dboption LastMonthMDS, "single user", false
        go
        sp_dboption MDS, "single user", false
        go

    After this script runs, I can restart IIS for my web app and it can connect to the new production database. All of the above works well and we've been doing this for years, but now we've upgraded to SQL 2008 and the SQL 2008 instance also hosts other databases that support other web apps. So, rather than restarting the whole SQL instance to enable single-user mode on the two databases, is there a less intrusive way of accomplishing this? Thanks.

    Read the article
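
    A less intrusive route than restarting the instance is to force only the two affected databases into single-user mode with ALTER DATABASE ... SET SINGLE_USER WITH ROLLBACK IMMEDIATE, which drops their open connections without touching the other databases on the server. A sketch of that sequence, driven here from Python with pyodbc purely for illustration (the connection string is an assumption; the same statements can be run directly in a query window):

        # Sketch: rename two databases without restarting the SQL Server instance.
        import pyodbc

        # autocommit=True because ALTER DATABASE cannot run inside a user transaction.
        con = pyodbc.connect("DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes",
                             autocommit=True)
        cur = con.cursor()

        statements = [
            # Kick existing connections off each database and take exclusive access.
            "ALTER DATABASE [MDS] SET SINGLE_USER WITH ROLLBACK IMMEDIATE",
            "ALTER DATABASE [StagingMDS] SET SINGLE_USER WITH ROLLBACK IMMEDIATE",
            # Rename; ALTER DATABASE ... MODIFY NAME replaces the deprecated sp_renamedb.
            "ALTER DATABASE [MDS] MODIFY NAME = [LastMonthMDS]",
            "ALTER DATABASE [StagingMDS] MODIFY NAME = [MDS]",
            # Reopen the renamed databases to all users.
            "ALTER DATABASE [LastMonthMDS] SET MULTI_USER",
            "ALTER DATABASE [MDS] SET MULTI_USER",
        ]

        for sql in statements:
            cur.execute(sql)

        con.close()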

  • Postgres: clear entire database before re-creating / re-populating from bash script

    - by Hoff
    Hi folks, I'm writing a shell script (which will become a cron job) that will:

    1. Dump my production database
    2. Import the dump into my development database

    Between steps 1 and 2, I need to clear the development database (drop all tables?). How is this best accomplished from a shell script? So far, it looks like this:

        #!/bin/bash
        time=`date '+%Y'-'%m'-'%d'`

        # 1. export (dump) the current production database
        pg_dump -U production_db_name > /backup/dir/backup-${time}.sql

        # missing step: drop all tables from the development database so it can be re-populated

        # 2. load the backup into the development database
        psql -U development_db_name < /backup/dir/backup-${time}.sql

    Many thanks in advance! Martin

    Read the article
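
    For the missing step, one common option is to drop and recreate the development database around the restore (dropdb/createdb) rather than dropping tables one by one; DROP SCHEMA public CASCADE is another frequently used approach. Purely for illustration, here is the same pipeline sketched in Python using the standard PostgreSQL command-line tools, assuming the names in the script above are the actual database names:

        # Sketch: dump production, recreate the development database, restore the dump.
        import datetime
        import subprocess

        prod_db = "production_db_name"   # names taken from the script above
        dev_db = "development_db_name"
        backup = f"/backup/dir/backup-{datetime.date.today():%Y-%m-%d}.sql"

        # 1. Export (dump) the current production database.
        with open(backup, "w") as out:
            subprocess.run(["pg_dump", prod_db], stdout=out, check=True)

        # Missing step: clear the development database by dropping and recreating it.
        subprocess.run(["dropdb", dev_db], check=True)
        subprocess.run(["createdb", dev_db], check=True)

        # 2. Load the backup into the freshly created development database.
        with open(backup) as dump:
            subprocess.run(["psql", dev_db], stdin=dump, check=True)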

  • Cannot start Oracle XE 11gR2 Net Listener and Database on Ubuntu 13.04

    - by hydrology
    I have been following the setup steps in this article for installing Oracle XE 11g R2 on Ubuntu 13.04. The environment variables PATH, ORACLE_HOME, ORACLE_SID, NLS_LANG and ORACLE_BASE have all been set up correctly:

        simongao:~ 06:16:38$ echo $PATH
        /usr/lib/lightdm/lightdm:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/home/simongao/adt-bundle-linux-x86_64-20130219/sdk/platform-tools:/u01/app/oracle/product/11.2.0/xe/bin
        simongao:~ 06:18:36$ echo $ORACLE_HOME
        /u01/app/oracle/product/11.2.0/xe
        simongao:~ 06:23:29$ echo $ORACLE_SID
        XE
        simongao:~ 06:23:35$ echo $ORACLE_BASE
        /u01/app/oracle
        simongao:~ 06:23:37$ sudo echo $LD_LIBRARY_PATH
        /u01/app/oracle/product/11.2.0/xe/lib
        simongao:~ 06:23:48$ echo $NLS_LANG
        /u01/app/oracle/product/11.2.0/xe/bin/nls_lang.sh

    However, when I try to start the service, I receive the following error:

        simongao:~ 06:18:40$ sudo service oracle-xe start
        Starting Oracle Net Listener.
        Starting Oracle Database 11g Express Edition instance.
        Failed to start Oracle Net Listener using /u01/app/oracle/product/11.2.0/xe/bin/tnslsnr and Oracle Express Database using /u01/app/oracle/product/11.2.0/xe/bin/sqlplus.

    Read the article

  • Timeout Error in SQL Server Database Mail Feature

    - by RedLEON
    I configured a Database Mail profile to use the Gmail SMTP server with SSL and port 465. I didn't restart the server, and this is the first time the Database Mail feature has been used on that server. When I send a test mail it gives me this error message:

        The mail could not be sent to the recipients because of the mail server failure. (The operation has timed out)

    I tried this mail configuration with Thunderbird and I could send messages through this SMTP server. Why is SQL Server giving this error message? I searched here but didn't find any solution.

    Read the article

  • upgrade on graphic card board memory

    - by farzad
    Is it possible for the amount of memory available on a graphics card board to be updated/altered? I'm not talking about the shared memory that the graphics controller might use from system memory (RAM). My question is: "Is it possible to replace/alter the dedicated memory attached to the graphics board with similar memory of a higher capacity?"

    Read the article

  • Connect android to database

    - by danny
    I am doing a school project where we need to create an Android application which needs to connect to a database. The application needs to retrieve and store information for people's profiles in the database. Unfortunately we are a little bit stuck at this point, because there are numerous ways to link the application, such as HTTP requests through Apache or the SOAP/REST protocols, but it's really hard to find good instructions or tutorials on the problem. Maybe that's because I'm using the wrong search terms on Google. Unfortunately I have little relevant information, so if anyone can help me find relevant links to good online tutorials or how-tos, those are very welcome.

    Read the article
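
    A sketch of the REST option mentioned in the question: put a small HTTP/JSON service in front of the database and have the Android app talk to it with plain HTTP requests. The example below uses Python with Flask and SQLite purely for illustration; the endpoint paths, table and column names are hypothetical:

        # Hypothetical profile API the Android app could call over HTTP.
        import sqlite3
        from flask import Flask, jsonify, request

        app = Flask(__name__)
        DB = "profiles.db"  # assumed SQLite file with a "profiles(id, name, email)" table

        @app.route("/profiles/<int:profile_id>", methods=["GET"])
        def get_profile(profile_id):
            con = sqlite3.connect(DB)
            row = con.execute("SELECT id, name, email FROM profiles WHERE id = ?",
                              (profile_id,)).fetchone()
            con.close()
            if row is None:
                return jsonify({"error": "not found"}), 404
            return jsonify({"id": row[0], "name": row[1], "email": row[2]})

        @app.route("/profiles", methods=["POST"])
        def create_profile():
            data = request.get_json()
            con = sqlite3.connect(DB)
            con.execute("INSERT INTO profiles (name, email) VALUES (?, ?)",
                        (data["name"], data["email"]))
            con.commit()
            con.close()
            return jsonify({"status": "ok"}), 201

        if __name__ == "__main__":
            app.run()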

  • Postgres pg_dump dumps database in a different order every time

    - by behrk2
    Hello, I am writing a PHP script (which also uses Linux bash commands) which will run through test cases by doing the following (I am using a PostgreSQL database, version 8.4.2):

    1. Create a DB
    2. Modify the DB
    3. Store a database dump of the DB (pg_dump)
    4. Do regression testing by repeating steps 1 and 2, then taking another database dump and comparing it (diff) with the original database dump from step 3

    However, I am finding that pg_dump will not always dump the database in the same way: it will dump things in a different order every time. Therefore, when I do a diff on the two database dumps, the comparison will report the two files as different when they are actually the same, just in a different order. Is there a different way I can go about doing the pg_dump? Thanks!

    Read the article
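
    One hedged alternative to diffing raw dump files is to compare the databases (or restored copies of the two dumps) table by table in an order-insensitive way. A minimal sketch using psycopg2; the connection strings and table list are assumptions:

        # Sketch: order-insensitive comparison of two Postgres databases.
        import psycopg2

        def table_rows_sorted(dsn, table):
            """Fetch all rows of a table and return them in a canonical (sorted) order."""
            con = psycopg2.connect(dsn)
            cur = con.cursor()
            cur.execute('SELECT * FROM "{}"'.format(table))  # table names assumed trusted
            rows = sorted(repr(r) for r in cur.fetchall())
            con.close()
            return rows

        def databases_match(dsn_a, dsn_b, tables):
            """True if every listed table holds the same rows in both databases."""
            return all(table_rows_sorted(dsn_a, t) == table_rows_sorted(dsn_b, t)
                       for t in tables)

        if __name__ == "__main__":
            tables = ["films", "users"]  # hypothetical table names
            same = databases_match("dbname=test_run1", "dbname=test_run2", tables)
            print("MATCH" if same else "DIFFERENT")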

  • hosting site on one web server and accessing a database on another

    - by tombull89
    I'm fairly sure this is the right place for this question, but if not please move it to the right site. I have a number of sites on a 1and1 package (yeah, I know...) and I also have a subdomain that belongs to a college tutor. I have been playing around with PHP and MySQL databases on the subdomain site and would like to know if it is possible to run something database-driven (i.e. a blog) on one of my 1and1 sites. I could upgrade my package, but if I'm only going to gain database functionality I'm not sure I want to do it. Also, as 1and1 don't use cPanel, I'm wondering how the databases would be managed... but I'll worry about that when (if) the time comes. Cheers!

    Read the article

  • What are performance limits of a database?

    - by Tommy
    What are some rough performance limits (reads/s, writes/s) for a single database server (no master-slave architecture), assuming storage on disk? How many reads/s and writes/s, depending on the kind of disk (SSD vs non-SSD), assuming simple operations (select one row by primary key, update one row, correctly indexed)? I assume this limit is dependent on disk seek/write performance. EDIT: My question is more about getting rough metrics for the number of operations a database supports: to be able to know, for example, whether a new feature triggering 300 inserts/s can be supported without scaling out with additional servers.

    Read the article
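
    A rough back-of-envelope version of the seek-bound reasoning in the question; the latency figures below are common ballpark values, not measurements, and real throughput also depends on caching, write-ahead logging and concurrency:

        # Back-of-envelope: random I/O operations per second from average access latency.
        avg_latency_s = {
            "7200rpm HDD": 10e-3,   # roughly 10 ms seek + rotational delay
            "SATA SSD":    100e-6,  # roughly 0.1 ms random read
        }

        for device, latency in avg_latency_s.items():
            iops = 1.0 / latency
            print(f"{device}: ~{iops:,.0f} random IOPS, i.e. roughly that many "
                  f"uncached single-row reads or writes per second")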

  • Writing datatable to database file, one record at a time in C#

    - by Kevin
    Hi! I want to write a C# program that will read a row from a datatable (named loadDT) and update a database file (named Forecasts.mdb). My datatable looks like this (each day's value is a number representing kilowatts usage forecast):

        Hour  Day1  Day2  Day3  Day4  Day5  Day6  Day7
        1     519   520   524   498   501   476   451

    My database file looks like this:

        Day  Hour  KWForecast
        1    1     519
        2    1     520
        3    1     524

    ... and so on. Basically, I want to be able to read one row from the datatable, and then extrapolate that out to my database file, one record at a time. Each row from the datatable will result in seven records written to the database file. Any ideas on how to go about this? I can connect to my database, the connection string works, and I can update and delete from the database. I just can't wrap my head around how to do this one record at a time. Thanks in advance for any help and advice.

    Read the article
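
    The core of the problem is pivoting one wide hour-row into seven (Day, Hour, KWForecast) records and inserting them one at a time with a parameterized statement. A minimal sketch of that logic, written in Python with sqlite3 standing in for the Access/OleDb connection (column names follow the question; the real program would use command parameters the same way):

        # Sketch: pivot one wide "hour row" into seven (Day, Hour, KWForecast) records.
        import sqlite3

        con = sqlite3.connect("forecasts.db")  # stand-in for the Forecasts.mdb connection
        con.execute("CREATE TABLE IF NOT EXISTS Forecasts (Day INTEGER, Hour INTEGER, KWForecast INTEGER)")

        # One row from the datatable: Hour followed by the Day1..Day7 values.
        load_row = {"Hour": 1, "Day1": 519, "Day2": 520, "Day3": 524,
                    "Day4": 498, "Day5": 501, "Day6": 476, "Day7": 451}

        hour = load_row["Hour"]
        for day in range(1, 8):                # Day1 .. Day7 -> seven records
            kw = load_row[f"Day{day}"]
            con.execute("INSERT INTO Forecasts (Day, Hour, KWForecast) VALUES (?, ?, ?)",
                        (day, hour, kw))

        con.commit()
        con.close()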

  • Storing n-grams in database in < n number of tables.

    - by kurige
    If I were writing a piece of software that attempted to predict what word a user was going to type next, using the two previous words the user had typed, I would create two tables. Like so:

        == 1-gram table ==
        Token | NextWord | Frequency
        ------+----------+-----------
        "I"   | "like"   | 15
        "I"   | "hate"   | 20

        == 2-gram table ==
        Token    | NextWord   | Frequency
        ---------+------------+-----------
        "I like" | "apples"   | 8
        "I like" | "tomatoes" | 12
        "I hate" | "tomatoes" | 20
        "I hate" | "apples"   | 2

    Following this example implementation, the user types "I" and the software, using the above database, predicts that the next word the user is going to type is "hate". If the user does type "hate", then the software will predict that the next word the user is going to type is "tomatoes". However, this implementation would require a table for each additional n-gram that I choose to take into account. If I decided that I wanted to take the 5 or 6 preceding words into account when predicting the next word, then I would need 5-6 tables, and an exponential increase in space per n-gram. What would be the best way to represent this in only one or two tables, with no upper limit on the number of n-grams I can support?

    Read the article
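
    One common way to collapse this into a single table is to store the context length n (or just the variable-length context string) as a column, so every order of n-gram shares the same schema. A minimal sqlite3 sketch with hypothetical table and column names, using the sample data from the question:

        # Sketch: one table for all n-gram orders; "n" is just another column.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.execute("""
            CREATE TABLE ngram (
                n         INTEGER NOT NULL,   -- number of words in the context token
                token     TEXT    NOT NULL,   -- e.g. "I" or "I like"
                next_word TEXT    NOT NULL,
                frequency INTEGER NOT NULL,
                PRIMARY KEY (n, token, next_word)
            )
        """)

        rows = [
            (1, "I",      "like",     15),
            (1, "I",      "hate",     20),
            (2, "I like", "apples",    8),
            (2, "I like", "tomatoes", 12),
            (2, "I hate", "tomatoes", 20),
            (2, "I hate", "apples",    2),
        ]
        con.executemany("INSERT INTO ngram VALUES (?, ?, ?, ?)", rows)

        def predict(context_words):
            """Try the longest available context first, backing off to shorter ones."""
            for n in range(len(context_words), 0, -1):
                token = " ".join(context_words[-n:])
                row = con.execute(
                    "SELECT next_word FROM ngram WHERE n = ? AND token = ? "
                    "ORDER BY frequency DESC LIMIT 1", (n, token)).fetchone()
                if row:
                    return row[0]
            return None

        print(predict(["I", "hate"]))  # -> tomatoes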

  • Writing datatable to database file, one record at a time

    - by Kevin
    I want to write a C# program that will read a row from a datatable (named loadDT) and update a database file (named Forecasts.mdb). My datatable looks like this (each day's value is a number representing kilowatts usage forecast):

        Hour  Day1  Day2  Day3  Day4  Day5  Day6  Day7
        1     519   520   524   498   501   476   451

    My database file looks like this:

        Day  Hour  KWForecast
        1    1     519
        2    1     520
        3    1     524

    ... and so on. Basically, I want to be able to read one row from the datatable, and then extrapolate that out to my database file, one record at a time. Each row from the datatable will result in seven records written to the database file. Any ideas on how to go about this? I can connect to my database, the connection string works, and I can update and delete from the database. I just can't wrap my head around how to do this one record at a time.

    Read the article

  • How to Script a backup for each database on an MSSQL Engine?

    - by Geo
    We need to back up 40 databases inside an MS SQL Server engine. We back up each database with the following script:

        BACKUP DATABASE [dbname1]
        TO DISK = N'J:\SQLBACKUPS\dbname1.bak'
        WITH NOFORMAT, INIT, NAME = N'dbname1-Full Database Backup',
             SKIP, NOREWIND, NOUNLOAD, STATS = 10
        GO

        declare @backupSetId as int
        select @backupSetId = position
        from msdb..backupset
        where database_name = N'dbname1'
          and backup_set_id = (select max(backup_set_id)
                               from msdb..backupset
                               where database_name = N'dbname1')
        if @backupSetId is null
        begin
            raiserror(N'Verify failed. Backup information for database ''dbname1'' not found.', 16, 1)
        end
        RESTORE VERIFYONLY FROM DISK = N'J:\SQLBACKUPS\dbname1.bak'
        WITH FILE = @backupSetId, NOUNLOAD, NOREWIND
        GO

    We would like to add to the script the functionality of taking each database and substituting it into the script above. Basically, a script that will create and verify a backup of each database on the engine. I am looking for something like this:

        For each database in database-list
            sp_backup(database)   -- this is the call to the script above
        End For

    Any ideas?

    Read the article
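
    One common pattern is to enumerate sys.databases and issue a BACKUP DATABASE statement per database, which is what a T-SQL cursor (or sp_MSforeachdb) would do. The sketch below drives the same loop from Python with pyodbc purely for illustration; the connection string is an assumption and the backup path is taken from the script above:

        # Sketch: back up every non-system database, one BACKUP DATABASE statement at a time.
        import pyodbc

        conn_str = "DRIVER={SQL Server};SERVER=localhost;Trusted_Connection=yes"  # assumed
        backup_dir = r"J:\SQLBACKUPS"

        # autocommit=True: BACKUP DATABASE cannot run inside a user transaction.
        con = pyodbc.connect(conn_str, autocommit=True)
        cur = con.cursor()

        cur.execute("SELECT name FROM sys.databases "
                    "WHERE name NOT IN ('master', 'model', 'msdb', 'tempdb')")
        databases = [row.name for row in cur.fetchall()]

        for db in databases:
            path = backup_dir + "\\" + db + ".bak"
            # Database and file names cannot be passed as ? parameters to BACKUP,
            # so they are interpolated; bracket-quote the database name.
            cur.execute("BACKUP DATABASE [" + db + "] TO DISK = N'" + path + "' "
                        "WITH NOFORMAT, INIT, NAME = N'" + db + "-Full Database Backup', "
                        "SKIP, NOREWIND, NOUNLOAD, STATS = 10")
            while cur.nextset():   # drain progress messages so the backup completes
                pass
            print("backed up", db, "to", path)

        con.close()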

  • Memory Leak with Swing Drag and Drop

    - by tom
    I have a JFrame that accepts top-level drops of files. However, after a drop has occurred, references to the frame are held indefinitely inside some Swing internal classes. I believe that disposing of the frame should release all of its resources, so what am I doing wrong? Example:

        import java.awt.datatransfer.DataFlavor;
        import java.io.File;
        import java.util.List;
        import javax.swing.JFrame;
        import javax.swing.JLabel;
        import javax.swing.TransferHandler;

        public class DnDLeakTester extends JFrame {
            public static void main(String[] args) {
                new DnDLeakTester();
                // Prevent main from returning or the jvm will exit
                while (true) {
                    try {
                        Thread.sleep(10000);
                    } catch (InterruptedException e) {
                    }
                }
            }

            public DnDLeakTester() {
                super("I'm leaky");
                add(new JLabel("Drop stuff here"));
                setTransferHandler(new TransferHandler() {
                    @Override
                    public boolean canImport(final TransferSupport support) {
                        return (support.isDrop() && support
                                .isDataFlavorSupported(DataFlavor.javaFileListFlavor));
                    }

                    @Override
                    public boolean importData(final TransferSupport support) {
                        if (!canImport(support)) {
                            return false;
                        }
                        try {
                            final List<File> files = (List<File>) support.getTransferable()
                                    .getTransferData(DataFlavor.javaFileListFlavor);
                            for (final File f : files) {
                                System.out.println(f.getName());
                            }
                        } catch (Exception e) {
                            e.printStackTrace();
                        }
                        return true;
                    }
                });
                setDefaultCloseOperation(DISPOSE_ON_CLOSE);
                pack();
                setVisible(true);
            }
        }

    To reproduce, run the code and drop some files on the frame. Close the frame so it's disposed of. To verify the leak I take a heap dump using JConsole and analyse it with the Eclipse Memory Analysis tool. It shows that sun.awt.AppContext is holding a reference to the frame through its hashmap. It looks like TransferSupport is at fault. What am I doing wrong? Should I be asking the DnD support code to clean itself up somehow? I'm running JDK 1.6 update 19.

    Read the article

  • Loading images to UIScrollview crashes

    - by Icky
    Hello all. I have a navigation controller pushing a UIViewController with a scroll view inside. Within the scroll view I download a certain number of images, around 20 (sometimes more), each sized around 150 KB. All these images are added to the scroll view so that their origin is x + imageSize, and each one is placed to the right of the one before it. All in all I think it's a lot of data (3-4 MB). On an iPod Touch this sometimes crashes; the iPhone can handle it once, but if it has to load the data again (some other images), it crashes too. I guess it's a memory issue, but within my code I download the image, save it to a file on the phone as NSData, read it again from the file and add it to a UIImageView which I release. So I have freed the memory I allocated, yet it still crashes. Can anyone help me out? Since I'm new to this, I don't know the best way to handle the images in a scroll view. Besides, I create the controller at start from a nib, which means I don't have to release it, since I don't use alloc - right?

    Code: in my root view controller I do:

        -(void) showImages {
            [[self naviController] pushViewController:imagesViewController animated:YES];
            [imagesViewController viewWillAppear:YES];
        }

    Then in my controller handling the scroll view, this is the method that loads the images:

        - (void) loadOldImageData {
            for (int i = 0; i < 40; i++) {
                NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
                NSString *documentsDirectory = [paths objectAtIndex:0];
                NSString *filePath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"img%d.jpg", i]];
                NSData *myImg = [NSData dataWithContentsOfFile:filePath];
                UIImage *im = [UIImage imageWithData:myImg];
                if ([im isKindOfClass:[UIImage class]]) {
                    NSLog(@"IM EXISTS");
                    UIImageView *imgView = [[UIImageView alloc] initWithImage:im];
                    CGRect frame = CGRectMake(i*320, 0, 320, 416);
                    imgView.frame = frame;
                    [myScrollView addSubview:imgView];
                    [imgView release];
                    //NSLog(@"Adding img %d", i);
                    numberImages = i;
                    NSLog(@"setting numberofimages to %d", numberImages);
                    //NSLog(@"scroll subviews %d", [myScrollView.subviews count]);
                }
            }
            myScrollView.contentSize = CGSizeMake(320 * (numberImages + 1), 416);
        }

    Read the article

  • GLKit Memory Leak copywithZone

    - by TommyT39
    Running the Instruments utility against the game I'm writing shows a bunch of memory leaks related to copyWithZone when I cycle through an array and draw some simple cube objects. I'm not sure of the best way to track this down, as I'm new to OpenGL programming. My program is using ARC and is set to build for iOS 5. I am initializing GLKit to use OpenGL 2.0 and using the BaseEffect so I don't have to write my own shaders, etc. This shouldn't be rocket science. I'm guessing that I must not be releasing something within the draw function. Could you take a look and see if anything stands out as the problem?

    One other thing to note is that I'm using 15 different textures; each cube can be 1 of 15 different ones. I have a property set on the cube class for the texture and I set it as I create the cubes in their array, but I do load all 15 when my program's viewDidLoad starts. They are small .jps files of less than 75 KB each, and each cube uses the same texture all the way around, so it shouldn't be too big of an issue.

    Here is the code for my draw function:

        - (void)draw {
            GLKMatrix4 xRotationMatrix = GLKMatrix4MakeXRotation(rotation.x);
            GLKMatrix4 yRotationMatrix = GLKMatrix4MakeYRotation(rotation.y);
            GLKMatrix4 zRotationMatrix = GLKMatrix4MakeZRotation(rotation.z);
            GLKMatrix4 scaleMatrix = GLKMatrix4MakeScale(scale.x, scale.y, scale.z);
            GLKMatrix4 translateMatrix = GLKMatrix4MakeTranslation(position.x, position.y, position.z);
            GLKMatrix4 modelMatrix = GLKMatrix4Multiply(translateMatrix,
                                     GLKMatrix4Multiply(scaleMatrix,
                                     GLKMatrix4Multiply(zRotationMatrix,
                                     GLKMatrix4Multiply(yRotationMatrix, xRotationMatrix))));

            GLKMatrix4 viewMatrix = GLKMatrix4MakeLookAt(0, 0, 1, 0, 0, -5, 0, 1, 0);

            effect.transform.modelviewMatrix = GLKMatrix4Multiply(viewMatrix, modelMatrix);
            effect.transform.projectionMatrix = GLKMatrix4MakePerspective(0.125*M_TAU, 1.0, 2, 0);
            effect.texture2d0.name = wallTexture.name;

            [effect prepareToDraw];

            glEnable(GL_DEPTH_TEST);
            glEnable(GL_CULL_FACE);

            glEnableVertexAttribArray(GLKVertexAttribPosition);
            glVertexAttribPointer(GLKVertexAttribPosition, 3, GL_FLOAT, GL_FALSE, 0, triangleVertices);
            glEnableVertexAttribArray(GLKVertexAttribTexCoord0);
            glVertexAttribPointer(GLKVertexAttribTexCoord0, 2, GL_FLOAT, GL_FALSE, 0, textureCoordinates);

            glDrawArrays(GL_TRIANGLES, 0, 18);

            glDisableVertexAttribArray(GLKVertexAttribPosition);
            glDisableVertexAttribArray(GLKVertexAttribTexCoord0);
        }

    Read the article

  • How to structure a database with questions and answers?

    - by Andreas Johannessen
    Hi, I am going to make a simple application that uses a database, and I could use some guidance on how to structure it. I am going to make a question program. What I have in mind is:

    - One table with the questions
    - One table with the difficulty of the question
    - One table with the category of the question

    However, what do I do with the answers? Have them as separate columns in the question table? That sounds like bad practice. (Also, where do I store the correct answer?) Each question will have 5 answers, where only one of them is correct.

    Read the article
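
    A common layout is to keep the answers in their own table with a foreign key back to the question and a flag marking the correct one, rather than as columns on the question table. A minimal sqlite3 sketch with hypothetical table and column names:

        # Sketch: questions/answers schema with a separate answers table.
        import sqlite3

        con = sqlite3.connect(":memory:")
        con.executescript("""
            CREATE TABLE category   (id INTEGER PRIMARY KEY, name TEXT NOT NULL);
            CREATE TABLE difficulty (id INTEGER PRIMARY KEY, name TEXT NOT NULL);

            CREATE TABLE question (
                id            INTEGER PRIMARY KEY,
                text          TEXT NOT NULL,
                category_id   INTEGER REFERENCES category(id),
                difficulty_id INTEGER REFERENCES difficulty(id)
            );

            -- Five rows per question; exactly one of them has is_correct = 1.
            CREATE TABLE answer (
                id          INTEGER PRIMARY KEY,
                question_id INTEGER NOT NULL REFERENCES question(id),
                text        TEXT NOT NULL,
                is_correct  INTEGER NOT NULL DEFAULT 0
            );
        """)

        # Fetch one question together with its answers, correct one flagged.
        q = con.execute("SELECT id, text FROM question LIMIT 1").fetchone()
        if q:
            answers = con.execute(
                "SELECT text, is_correct FROM answer WHERE question_id = ?", (q[0],)).fetchall()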

  • Connect to Oracle database on a different server from PHP

    - by macha
    Hello, I have a database engine sitting on a remote server, while my webserver is local. I have worked pretty much only with a client-server architecture where the server hosts both the webserver and the database engine. Now I need to connect to an Oracle database which is situated on a different server. Can anybody give me any suggestions? I believe odbc_connect might not work. Do I use the OCI8 drivers? How would I connect to my database server? Also, I will have a very high number of database calls going back and forth, so is it better to go with persistent connections, or do I still use individual database calls?

    Read the article

  • Get Rails to save a record to the database in a non-UTC time

    - by Shaun
    Is there a way to get Rails to save records to the database without it automagically converting the timestamp into UTC before saving? The problem is that I have a few models that pull data from a legacy database that saves everything in Mountain Time, and occasionally I have to have my Rails app write to that database. Every time it does, it converts the time I give it from Mountain Time to UTC, which is 6-7 hours ahead (depending on DST)! Needless to say, this really messes with reporting on that database. If I could get around doing this, I would. Unfortunately, I can't do anything about the fact that this other database uses a different timezone, nor can I really get away from the need for this app to save to that database occasionally. If I could just get Rails to stop trying to help me, it'd be great.

    Read the article

  • iPhone memory management (with specific examples/questions)

    - by donkim
    Hey all. I know this question's been asked, but I still don't have a clear picture of memory management in Objective-C. I feel like I have a pretty good grasp of it, but I'd still like some correct answers for the following code. I have a series of examples that I'd love for someone to clarify.

    1. Setting a value for an instance variable. Say I have an NSMutableArray variable. In my class, when I initialize it, do I need to call a retain on it? Do I do

        fooArray = [[[NSMutableArray alloc] init] retain];

    or

        fooArray = [[NSMutableArray alloc] init];

    Does [[NSMutableArray alloc] init] already set the retain count to 1, so I wouldn't need to call retain on it? On the other hand, if I called a method that I know returns an autoreleased object, I would for sure have to call retain on it, right? Like so:

        fooString = [[NSString stringWithFormat:@"%d items", someInt] retain];

    2. Properties. I ask about the retain because I'm a bit confused about how @property's automatic setter works. If I had set fooArray to be a @property with retain set, Objective-C will automatically create the following setter, right?

        - (void)setFooArray:(NSMutableArray *)anArray {
            [fooArray release];
            fooArray = [anArray retain];
        }

    So, if I had code like this (which I believe is valid code):

        self.fooArray = [[NSMutableArray alloc] init];

    Objective-C creates a setter method that calls retain on the value assigned to fooArray. In this case, will the retain count actually be 2?

    3. Correct way of setting the value of a property. I know there are questions on this and (possibly) debates, but which is the right way to set a @property? This?

        self.fooArray = [[NSMutableArray alloc] init];

    Or this?

        NSMutableArray *anArray = [[NSMutableArray alloc] init];
        self.fooArray = anArray;
        [anArray release];

    I'd love to get some clarification on these examples. Thanks!

    Read the article
