Search Results

Search found 10189 results on 408 pages for 'db 11gr2'.

Page 362/408 | < Previous Page | 358 359 360 361 362 363 364 365 366 367 368 369  | Next Page >

  • What is a good approach to preloading data?

    - by Bob Horn
    Are there best practices out there for loading data into a database, to be used with a new installation of an application? For example, for application foo to run, it needs some basic data before it can even be started. I've used a couple options in the past: TSQL for every row that needs to be preloaded: IF NOT EXISTS (SELECT * FROM Master.Site WHERE Name = @SiteName) INSERT INTO [Master].[Site] ([EnterpriseID], [Name], [LastModifiedTime], [LastModifiedUser]) VALUES (@EnterpriseId, @SiteName, GETDATE(), @LastModifiedUser) Another option is a spreadsheet. Each tab represents a table, and data is entered into the spreadsheet as we realize we need it. Then, a program can read this spreadsheet and populate the DB. There are complicating factors, including the relationships between tables. So, it's not as simple as loading tables by themselves. For example, if we create Security.Member rows, then we want to add those members to Security.Role, we need a way of maintaining that relationship. Another factor is that not all databases will be missing this data. Some locations will already have most of the data, and others (that may be new locations around the world), will start from scratch. Any ideas are appreciated.
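
    As a variation on the per-row IF NOT EXISTS block above, the same insert-if-missing check can be collapsed into one idempotent MERGE statement (SQL Server 2008 and later). This is only a sketch reusing the table, columns and parameters from the question:

        -- Seed Master.Site with a row only if no site with that name exists yet.
        -- @EnterpriseId, @SiteName, @LastModifiedUser are the same parameters as above.
        MERGE Master.Site AS target
        USING (SELECT @SiteName AS Name) AS source
            ON target.Name = source.Name
        WHEN NOT MATCHED THEN
            INSERT (EnterpriseID, Name, LastModifiedTime, LastModifiedUser)
            VALUES (@EnterpriseId, @SiteName, GETDATE(), @LastModifiedUser);

    Re-running such a script is harmless, which also covers the locations that already have most of the data.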

    Read the article

  • [MFC] What is the reciprocal of CComboBox.GetItemData?

    - by Hamish Grubijan
    Instead of associating objects with Combo Box items, I associate long ids representing choices. They come from a database, so it seems natural to do so anyway. Now, I persist the id and not the index of the user's selection, so that the choice is remembered across sessions. If the id no longer exists in the database, no big deal; the choice will be messed up once. If the db does not change, however, then it would be a great success ;) Here is how I get the id: chosenSomethingIndex = cmbSomething.GetCurSel(); lastSomethingId = cmbSomething.GetItemData(chosenSomethingIndex); How do I reverse this? When I load the stored value for the user's last choice, I need to convert that id into an index. I can do: cmbSomething.SetCurSel(chosenSomethingIndex); However, how can I attempt (it might not exist) to get an index once I have an id? I am looking for a reciprocal function to GetItemData. I am using VS2008, probably the latest version of MFC, whatever that is. Thank you.

    Read the article

  • Laravel4: Call to a member function on a non-object

    - by s0hno
    The following code will throw an error Call to a member function `links()` on a non-object routes.php: Route::get('videos', function(){ $data = DB::table('video_data_r')->paginate(5); return View::make('video',$data); }); Corresponding video view: <?php foreach($data as $item): ?> <div class="video_entry"> <a href="<?php echo $item -> url; ?>" target="_blank"><img src="<?php echo $item -> thumb; ?>" /></a> <a href="<?php echo $item -> url; ?>" target="_blank"><?php echo $item -> title; ?>"</a> </div> <?php endforeach; ?> <?php echo $data->links();?> Could you give me a good hint on what looks like a trivial error?

    Read the article

  • Is it possible to filter data used by pivot table based on filtering the rows in a source table in Excel?

    - by Geoffrey Stoel
    I have developed a dashboard in Excel 2007 that uses one source table in a sheet (filled by a query on our data warehouse) and multiple pivot tables making different cross sections of this data. I use GETPIVOTDATA in almost a hundred formulas to give me the right value for a specific indicator in my dashboard. This all works fine. However, I have now been asked to make the dashboard for 5 different segments. As you can imagine, I don't want to create 5 different workbooks for this and have to maintain the dashboard logic in all of them. So my question is the following: is it possible to automatically (through VBA or any other means) filter the results in my source table, which is the source for my pivot tables and thus for my dashboard values? So schematically: DATABASE_VIEW -- SOURCE_TABLE -- 12 pivot tables -- 100 GETPIVOTDATA functions Preferably I would like to load all the segments in the source_table (one view on my database) and then filter the data in the source table, which results in filtered source data for my pivots. This way I can (without requerying the db) quickly change between segments in the dashboards (refreshing pivots only). Data in the source table has the column CUSTOMER_SEGMENT available to filter upon. Any help is appreciated. Geoffrey

    Read the article

  • Does a Postgresql dump create sequences that start with - or after - the last key?

    - by bennylope
    I recently created a SQL dump of a database behind a Django project, and after cleaning the SQL up a little bit was able to restore the DB and all of the data. The problem was the sequences were all mucked up. I tried adding a new user and generated the Python error IntegrityError: duplicate key violates unique constraint. Naturally I figured my SQL dump didn't restart the sequence. But it did: DROP SEQUENCE "auth_user_id_seq" CASCADE; CREATE SEQUENCE "auth_user_id_seq" INCREMENT 1 START 446 MAXVALUE 9223372036854775807 MINVALUE 1 CACHE 1; ALTER TABLE "auth_user_id_seq" OWNER TO "db_user"; I figured out that a repeated attempt at creating a user (or any new row in any table with existing data and such a sequence) allowed for successful object/row creation. That solved the pressing problem. But given that the last user ID in that table was 446 - the same start value in the sequence creation above - it looks like Postgresql was simply trying to start creating rows with that key. Does the SQL dump provide the wrong start key by 1? Or should I invoke some other command to start sequences after the given start ID? Keenly curious.
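
    For what it is worth, if the dump only recreated the sequence with START 446 and never marked it as called, the first nextval() hands back 446 itself and collides with the existing row, which matches the behaviour described. A minimal sketch of the usual repair, assuming the auth_user table and sequence names from the dump, is to re-sync the sequence to the table's current maximum:

        -- After this, the next nextval('auth_user_id_seq') returns MAX(id) + 1.
        SELECT setval('auth_user_id_seq', (SELECT MAX(id) FROM auth_user));

    pg_dump normally emits a setval(...) line for exactly this purpose, so it may simply have been lost while the SQL was being cleaned up.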

    Read the article

  • Excessive use of Inner Join for more than 3 tables

    - by Archangel08
    Good day, I have 4 tables in my DB (not the actual names but almost similar) which are the following: employee, education, employment_history, referrence. employee_id is the name of the foreign key from the employee table. Here's the example (not actual) data: **Employee** ID Name Birthday Gender Email 1 John Smith 08-15-2014 Male [email protected] 2 Jane Doe 00-00-0000 Female [email protected] 3 John Doe 00-00-0000 Male [email protected] **Education** Employee_ID Primary Secondary Vocation 1 Westside School Westshore H.S SouthernBay College 2 Eastside School Eastshore H.S NorthernBay College 3 Northern School SouthernShore H.S WesternBay College **Employment_History** Employee_ID WorkOne StartDate Enddate 1 StarBean Cafe 12-31-2012 01-01-2013 2 Coffebucks Cafe 11-01-2012 11-02-2012 3 Latte Cafe 01-02-2013 04-05-2013 **Referrence** Employee_ID ReferrenceOne Address Contact 1 Abraham Lincoln Lincoln Memorial 0000000000 2 Frankie N. Stein Thunder St. 0000000000 3 Peter D. Pan Neverland Ave. 0000000000 NOTE: I've only included a few columns, though the rest are part of the query. And below is the code I've been working on for 3 consecutive days: $sql=mysql_query("SELECT emp.id,emp.name,emp.birthday,emp.pob,emp.gender,emp.civil,emp.email,emp.contact,emp.address,emp.paddress,emp.citizenship,educ.employee_id,educ.elementary,educ.egrad,educ.highschool,educ.hgrad,educ.vocational,educ.vgrad,ems.employee_id,ems.workOne,ems.estartDate,ems.eendDate,ems.workTwo,ems.wstartDate,ems.wendDate,ems.workThree,ems.hstartDate,ems.hendDate FROM employee AS emp INNER JOIN education AS educ ON educ.employee_id='emp.id' INNER JOIN employment_history AS ems ON ems.employee_id='emp.id' INNER JOIN referrence AS ref ON ref.employee_id='emp.id' WHERE emp.id='$id'"); Is it okay to use INNER JOIN this way? Or should I modify my query to get the results that I want? I've also tried a LEFT JOIN, but it still doesn't return anything. I don't know where I went wrong. You see, as far as I can tell, I've been using the INNER JOIN in the correct manner (since it was placed before the while clause), so I can't think of what could have possibly gone wrong. Do you guys have a suggestion? Thanks in advance.
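
    The joins themselves are a normal way to pull one row from each related table; the likely culprit is that emp.id is wrapped in quotes in every ON clause, so the join compares employee_id to the literal string 'emp.id' rather than to the column. A trimmed sketch of the same query without the quotes (column list shortened, $id replaced by a placeholder value):

        SELECT emp.id, emp.name, educ.elementary, ems.workOne, ref.ReferrenceOne
        FROM employee AS emp
        INNER JOIN education          AS educ ON educ.employee_id = emp.id
        INNER JOIN employment_history AS ems  ON ems.employee_id  = emp.id
        INNER JOIN referrence         AS ref  ON ref.employee_id  = emp.id
        WHERE emp.id = 1;   -- substitute the employee id being queried for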

    Read the article

  • EditText items in a scrolling list lose their changes when scrolled off the screen

    - by ianww
    I have a long scrolling list of EditText items created by a SimpleCursorAdapter and prepopulated with values from an SQLite database. I make this by: cursor = db.rawQuery("SELECT _id, criterion, localweight, globalweight FROM " + dbTableName + " ORDER BY criterion", null); startManagingCursor(cursor); mAdapter = new SimpleCursorAdapter(this, R.layout.weight_edit_items, cursor, new String[]{"criterion","localweight","globalweight"}, new int[]{R.id.criterion_edit, R.id.localweight_edit, R.id.globalweight_edit}); this.setListAdapter(mAdapter); The scrolling list is several emulator screens long. The items display OK - scrolling through them shows that each has the correct value from the database. I can make an edit change to any of the EditTexts and the new text is accepted and displayed in the box. But...if I then scroll the list far enough to take the edited item off the screen, when I scroll back to look at it again its value has returned to what it was before I made the changes, ie. my edits have been lost. In trying to sort this out, I've done a getText to look at what's in the EditText after I've done my edits (and before a scroll) and getText returns the original text, even though the EditText is displaying my new text. It seems that the EditText has only accepted my edits superficially and they haven't been bound to the EditText, meaning they get dropped when scrolled off the screen. Can anyone please tell me what's going on here and what I need to do to force the EditText to retain its edits? Thanks Ian

    Read the article

  • Connect Rails model to non-rails database

    - by the_snitch
    I'm creating a new web application (Rails 3 beta), pieces of which will access data from a legacy MySQL database that a current PHP application is using. I do not wish to modify the legacy db schema; I just want to be able to read/write to it, as well as have the Rails application keep its own database using ActiveRecord for the newer stuff. I'm using MySQL for the Rails app, so I have the adapter installed. What is the best way to do this? For example, I want contacts to come from the old database. Should I create a contacts controller and manually call SQL to get the variables for the views? Or should I create a Contact model and define attributes that match the fields in the database, and am I able to use it like Contact.mail_address to have it call "SELECT mailaddr FROM contacts WHERE id=Contact.id"? Sorry, I've never done much in Rails outside of the standard stuff that is documented well, so I'm not sure what the best approach would be. Ideally, I want the contacts to be presented to my Rails application as natively as possible, so that I can expose them RESTfully for API access. Any suggestions and code examples would be much appreciated.

    Read the article

  • Seeding many to many tables with Entity Framework

    - by Doozer1979
    I have a meeting entity and a users entity which have a many-to-many relationship. I'm using AutoPoco to create seed data for the Users and Meetings. How do I seed the UserMeetings linking table that is created by Entity Framework with seed data? The linking table has two fields in it: User_Id and Meeting_ID. I'm looping through the list of users that AutoPoco creates and attaching a random number of meetings. Here's what I've got so far. foreach (var user in userList) { var rand = new Random(); var amountOfMeetingsToAdd = rand.Next(1, 300); for (var i = 0; i <= amountOfMeetingsToAdd; i++) { var randomMeeting = rand.Next(1, MeetingRecords); //Error occurs on This line user.Meetings.Add(_meetings[randomMeeting]); } } I get an 'Object reference not set to an instance of an object.' error even though the meeting record that I'm trying to attach does exist. For info, all this is happening prior to me saving the context to the DB.

    Read the article

  • SQL query for an Access database needed

    - by masfenix
    Hey guys, first of all sorry, I can't log in using my Yahoo provider. Anyway, I have this problem. Let me explain it to you, and then I'll show you a picture. I have an Access db table. It has 'report id', 'recipient id', 'recipient name' and 'report req'. What the table "means" is whether the user using that report still requires it or whether we can decommission it. Here is how the data looks (company user ids and usernames blocked out): *check the link below, I can't post pictures because the Yahoo OpenID provider isn't working. So basically I need to have 3 select queries: 1) Select all the reports where, for each report, ALL the users have said no to 'reportreq'. In plain English, I want a listing of all the reports that we have to decommission because no user wants them. 2) Select all the reports where the report is required and the batchprintcopy is more than 0. This way we can see which reports need to be printed and save paper instead of printing all the reports. 3) A listing of all the reports where the reportreq field is empty. I think I can figure this one out myself. This is using Access/VBA and the data will be exported to an Excel spreadsheet. I just need a simple query if one exists, or an algorithm to do it quickly. I just tried making a "matrix" and it took about 2 hours to populate. https://docs.google.com/uc?id=0B2EMqbpeBpQkMTIyMzA5ZjMtMGQ3Zi00NzRmLWEyMDAtODcxYWM0ZTFmMDFk&hl=en_US
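
    For listing 1 (reports that every user has said no to), a grouped query with a conditional count is usually enough. The sketch below is Access SQL; the table name report_usage is invented, and it assumes 'report req' holds the text values 'Yes'/'No' (adjust the comparison if it is a Yes/No field):

        -- Reports where not a single recipient still requires them.
        SELECT [report id]
        FROM report_usage
        GROUP BY [report id]
        HAVING SUM(IIf([report req] = 'Yes', 1, 0)) = 0;

    Listing 2 is the same idea, but as a plain WHERE [report req] = 'Yes' AND [batchprintcopy] > 0 filter instead of the HAVING clause.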

    Read the article

  • How to control table width in CodeIgniter?

    - by riad
    Dear Experts, In the CodeIgniter framework the code below works properly, but I cannot control the table width. When a long value comes into the table, the table grows extra wide, but I need to wrap the output data. So how can I fix the table width? Please see my code below. ///controller code/// $config['base_url'] = base_url().'Search_Controller/songSearchPage/'; $config['total_rows'] = $this->db->count_all('tbl_rbt'); $config['per_page'] = '5'; $config['full_tag_open'] = '<p>'; $config['full_tag_close'] = '</p>'; $this->pagination->initialize($config); //load the model and get results $data[]=array(); $data['extraHeadContent'] = '<script type="text/javascript" src="' . base_url() . 'js/song_search.js"></script>'; $data['results'] = $this->search_model->getSongResult($config['per_page'],$this->uri->segment(3)); // load the HTML Table Class $this->table->set_heading('Song Name','Album Name','Artist Name'); // load the view $this->load->view('song_search_page',$data); /////view code///// <div class="song_element_output"> <?php echo $this->table->generate($results); ?> <?php echo $this->pagination->create_links(); ?> </div> Could anybody help me control the table? Thanks, Riad

    Read the article

  • Using entity framework to connect to multiple similar tables in .net MVC.

    - by Dite
    A relative newcomer to .NET MVC2 and the Entity Framework, I am working on a project which requires a single web application (C# .NET 4) to connect to multiple different databases depending on the route of access (i.e. subdomain). No problem with this in principle, and all the logic is written to transform the subdomain into an entity connection and pass this through to the Entity Model. The problem comes with the fact that the different databases, whilst being largely similar in structure, contain 3 or 4 unique tables bespoke to each instance. To my mind there are two ways to solve this issue, neither of which I am sure will be possible. 1/ Use a separate entity model for each database. -Attempts down this route have thrown up conflicts where table/sp names are the same across different db's, or implicit conversion errors when I try to put the different models in different namespaces. or 2/ Overwrite the classes which refer to the changeable database objects based on the value of a base controller property. -I have found nothing to suggest I can even do this. My question is whether either of these routes can ever work in principle, or whether I should just give up on the EF and connect to the databases directly using ADO. Perhaps there is another way to solve this problem I haven't thought of? Thanks for any help...

    Read the article

  • (JBoss) Problem with .war project on production, while test works

    - by ikky
    Hello. I have a Java project (using Spring MVC) which I have built and deployed on my local computer. It runs on a JBoss application server, and works fine on the local machine. The next step I take is to copy the deployed project.war from the local machine to the server, which has the same development environment as the local machine. I stop the JBoss server, delete the cache, and run the JBoss server again. When I now try to run one of the pages (xx.xxx.xxx:8080/webservice/test.htm), I get this exception: exception org.springframework.web.util.NestedServletException: Request processing failed; nested exception is java.lang.NullPointerException org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:583) root cause java.lang.NullPointerException com.project.db.DBCustomer.isCredentialsCorrect(DBCustomer.java:44) com.project.CreateHController.handleRequest(CreateHController.java:60) org.springframework.web.servlet.mvc.SimpleControllerHandlerAdapter.handle(SimpleControllerHandlerAdapter.java:48) org.springframework.web.servlet.DispatcherServlet.doDispatch(DispatcherServlet.java:875) org.springframework.web.servlet.DispatcherServlet.doService(DispatcherServlet.java:807) org.springframework.web.servlet.FrameworkServlet.processRequest(FrameworkServlet.java:571) org.springframework.web.servlet.FrameworkServlet.doGet(FrameworkServlet.java:501) javax.servlet.http.HttpServlet.service(HttpServlet.java:697) javax.servlet.http.HttpServlet.service(HttpServlet.java:810) org.jboss.web.tomcat.filters.ReplyHeaderFilter.doFilter(ReplyHeaderFilter.java:75) It seems like none of my classes are reachable. Does anyone have any idea of what is wrong? By the way, as I said, the project works fine on the local machine.

    Read the article

  • Authorise user from MySQL database

    - by Jacksta
    I suck at PHP and can't find the error here. The script gets two variables, "username" and "password", from an HTML form, then checks them against a MySQL database. When I run this I get the following error: "Query was empty" <? if ((!$_POST[username]) || (!$_POST[password])) { header("Location: show_login.html"); exit; } $db_name = "testDB"; $table_name = "auth_users"; $connection = @mysql_connect("localhost", "admin", "pass") or die(mysql_error()); $db = @mysql_select_db($db_name, $connection) or die(mysql_error()); $slq = "SELECT * FROM $table_name WHERE username ='$_POST[username]' AND password = password('$_POST[password]')"; $result = @mysql_query($sql, $connection) or die(mysql_error()); $num = mysql_num_rows($result); if ($num != 0) { $msg = "<p>Congratulations, you're authorised!</p>"; } else { header("Location: show_login.html"); exit; } ?> <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd"> <html xmlns="http://www.w3.org/1999/xhtml"> <head> <meta http-equiv="Content-Type" content="text/html; charset=utf-8" /> <title>Secret Area</title> </head> <body> <? echo "$msg"; ?> </body> </html>

    Read the article

  • Mapping relationships from multiple databases in NHibernate

    - by mannish
    I have a multi-database application configured with NHibernate. The entities that correspond to tables from each database are in their own separate assemblies (an assembly per database if you will). I have a need/desire to relate an entity from one database to an entity of another database. Everything up to this point works as I want it to (the application handles multiple session factories, etc.). The relationship I want is many-to-one, but in reality my application only cares about one side of the relationship (for reasons that aren't relevant). The relevant entities are Project and PMProject, where a Project HAS A PMProject. When I map the many-to-one, I get the following error: NHibernate.MappingException: An association from the table PROJECTS refers to an unmapped class: SDMS.PPRM.PMProject The Project mapping itself reads (ignore the funky column naming; it's an Oracle db): <many-to-one name="PMProject" class="SDMS.PPRM.PMProject" column="PM_PROJECT_ID" cascade="none" /> In the class attribute, I'm referencing the appropriate assembly, but I get that error which seems to tell me it simply can't find the mapping file for PMProject. But that file exists (it's set as embedded resource), the session factory instantiation works without fail; so I'm at a loss on how to tell the Project mapping how/where to look for the appropriate mapping. Is there something I'm missing? A better way to go about this? Thanks in advance.

    Read the article

  • Why would paperclip not assign an ID to my uploaded photos?

    - by Trip
    I just deployed to a cluster server, and my delayed_jobs recipe was overwritten in the process. I solved that, delayed_jobs is up and running but can't find the ID of images that are uploaded. The images are saved correctly : Processing PhotosController#create (for 173.161.167.41 at 2010-06-01 05:09:14) [POST] Parameters: {"Filename"="1.jpg", "gallery_id"="1298", "action"="create", "amp"=nil, "authenticity_token"="qmbnpwFY8a5E3YtS/4fMWF/Z8evCE4hMxqKVJw0I7Ek=", "Upload"="Submit Query", "controller"="photos", "organization_id"="470", "_hq_channel_session"="BAh7CSIYdXNlcl9jcmVkZW50aWFsc19pZGkHIhV1c2VyX2NyZWRlbnRpYWxzIgGAOGRlZDc0NGJlOWU3NTNlNDFlYmVlMDdjMzIzYjA1ZjQxNGE5ZDY4YjNmYjFmNjNkMDQ2OWY2ZDQyOTljZDhiMDFlNmRkMDljNThmMzBmOWJhMTIwNDhkMDI5MTMxYmU5MDczYjIxZmI4YmQxMDVlMTBmNjZmOWFhODE1ZTBjMGM6EF9jc3JmX3Rva2VuIjFxbWJucHdGWThhNUUzWXRTLzRmTVdGL1o4ZXZDRTRoTXhxS1ZKdzBJN0VrPToPc2Vzc2lvbl9pZCIlMjAwMDQ3ZDQ3ZWUyZTgzODIxYzdjOGI3OTdmZGJiMDM=--ac6aa580262938bf5a4d6b9a740722b680eb5d48", "Filedata"=#} [paperclip] Saving attachments. [paperclip] saving /data/HQ_Channel/releases/20100530153454/public/system/photos/9253/original/1.jpg [paperclip] Saving attachments. [paperclip] Saving attachments. Completed in 127ms (View: 2, DB: 91) | 200 OK [http://invent.hqchannel.com/organizations/470/media/galleries/1298/photos?_hq_channel_session=BAh7CSIYdXNlcl9jcmVkZW50aWFsc19pZGkHIhV1c2VyX2NyZWRlbnRpYWxzIgGAOGRlZDc0NGJlOWU3NTNlNDFlYmVlMDdjMzIzYjA1ZjQxNGE5ZDY4YjNmYjFmNjNkMDQ2OWY2ZDQyOTljZDhiMDFlNmRkMDljNThmMzBmOWJhMTIwNDhkMDI5MTMxYmU5MDczYjIxZmI4YmQxMDVlMTBmNjZmOWFhODE1ZTBjMGM6EF9jc3JmX3Rva2VuIjFxbWJucHdGWThhNUUzWXRTLzRmTVdGL1o4ZXZDRTRoTXhxS1ZKdzBJN0VrPToPc2Vzc2lvbl9pZCIlMjAwMDQ3ZDQ3ZWUyZTgzODIxYzdjOGI3OTdmZGJiMDM%3D--ac6aa580262938bf5a4d6b9a740722b680eb5d48&authenticity_token=qmbnpwFY8a5E3YtS%2F4fMWF%2FZ8evCE4hMxqKVJw0I7Ek%3D] And then delayed_jobs keeps spinning around in circles on this one : 2010-06-01T05:09:02-0700: * [Worker(delayed_job host:ip-10-251-197-159 pid:19994)] acquired lock on PhotoJob 2010-06-01T05:09:02-0700: * [JOB] delayed_job host:ip-10-251-197-159 pid:19994 failed with ActiveRecord::RecordNotFound: Couldn't find Photo with ID=9247 - 0 failed attempts 2010-06-01T05:09:02-0700: * [Worker(delayed_job host:ip-10-251-197-159 pid:19994)] acquired lock on PhotoJob 2010-06-01T05:09:02-0700: * [JOB] delayed_job host:ip-10-251-197-159 pid:19994 failed with ActiveRecord::RecordNotFound: Couldn't find Photo with ID=9245 - 0 failed attempts 2010-06-01T05:09:02-0700: * [Worker(delayed_job host:ip-10-251-197-159 pid:19994)] acquired lock on PhotoJob So what I get is that the photos are not being assigned ID's by paperclip. Anyone know where I could poke and pry from here? UPDATE: I created a clone application on a single server. And there are no problems. The images on the cluster do show up (occassionally). If I keep clicking on the folders that lead to photos, it will 50% of the time return a 404 with it not being able to find the photo, and the other half it will present the photo. So the problem has got to be with the server interaction between the ActiveRecord through multiple servers.

    Read the article

  • File created using exec could not be accessed immediately after creation?

    - by Holicreature
    Hi, I'm using exec in PHP to execute a command that creates a .png file in a temp folder. After creating it, I'm trying to open that file, read its contents and process them, but I end up with a "file could not be read" error. I think the time taken by exec to execute and create the file is the cause of the issue, but I don't know how to fix it. I tried sleep(), but it makes my script run slowly. <?php error_reporting(E_ALL); extension_loaded('ffmpeg') or die('Error in loading ffmpeg'); //db connection codes $max_width = 120; $max_height = 72; $path ="/path/"; $qry="select id, input_file, output_file from videos where thumbnail='' or thumbnail is null;"; $res=mysql_query($qry); $cnt = 1; while($row = mysql_fetch_array($res,MYSQL_ASSOC)) { $outfile = $row[output_file]; $imgname = $cnt.".png"; $srcfile = "/path/".$outfile; echo "####$srcfile####"; exec("ffmpeg -i ".$srcfile." -r 1 -ss 00:00:05 -f image2 -s 120x72 ".$path.$imgname); $nname = "./temp/".$imgname; echo "nname===== $nname"; $fileo = fopen($nname,"rb"); if($fileo) { $imgData = addslashes(file_get_contents($nname)); .. ... .... } else echo "Could not open<br><br>"; $cnt = $cnt + 1: } ?>

    Read the article

  • Performing an SVD on tweets. Memory problem

    - by plotti
    I have generated a huge csv file as output from my POS tagging and stemming. It looks like this: word1, word2, word3, ..., word14400 person1 1 2 0 1 person2 0 0 1 0 ... person650 It contains the word counts for each person. This way I am getting characteristic vectors for each person. I want to run an SVD on this beast, but it seems the matrix is too big to be held in memory to perform the operation. My question is: should I reduce the column size by removing words which have a column sum of, for example, 1, which means that they have been used only once? Do I bias the data too much with this approach? I tried the RapidMiner approach of loading the csv into the db and then sequentially reading it in batches for processing, as RapidMiner proposes. But MySQL can't store that many columns in a table. If I transpose the data and then retranspose it on import, it also takes ages. So in general I am asking for advice on how to perform an SVD on such a corpus.

    Read the article

  • ASP.NET server data persistence

    - by Wayne Werner
    Hi, I'm not really sure exactly how the question should be phrased, so please be patient if I ask the wrong thing. I'm writing an ASP.NET application using VB as the code-behind language. I have a data access class that connects to the DB to run the query (parameterized, of course), and another class to perform the validation tasks; I access this class from my aspx page. What I would like is to be able to store the data server-side and wait for the user to choose from a few options based on the validity of the data. But unless my understanding is completely off, having persistent data objects on the server will give problems when multiple users connect? My ultimate goal is that once the data has been validated, the end user can't modify it. Currently I'm validating the data, but I still have to retrieve it from the web form AFTER the user says OK, which obviously leaves open the possibility of injecting bad data either accidentally (unlikely) or on purpose (also unlikely for the user, but I'd prefer not to take the chance). So am I completely off in my understanding? If so, can someone point me to a resource that provides some instructions on keeping persistent data on the server, or provide instruction? Thanks!

    Read the article

  • Is there a reason why SSIS significantly slows down after a few minutes?

    - by Mark
    I'm running a fairly substantial SSIS package against SQL 2008 - and I'm getting the same results both in my dev environment (Win7-x64 + SQL-x64-Developer) and the production environment (Server 2008 x64 + SQL Std x64). The symptom is that initial data loading screams at between 50K - 500K records per second, but after a few minutes the speed drops off dramatically and eventually crawls embarrassingly slowly. The database is in Simple recovery model, the target tables are empty, and all of the prerequisites for minimally logged bulk inserts are being met. The data flow is a simple load from a RAW input file to a schema-matched table (i.e. no complex transforms of data, no sorting, no lookups, no SCDs, etc.) The problem has the following qualities and resiliences: Problem persists no matter what the target table is. RAM usage is lowish (45%) - there's plenty of spare RAM available for SSIS buffers or SQL Server to use. Perfmon shows buffers are not spooling, disk response times are normal, disk availability is high. CPU usage is low (hovers around 25% shared between sqlserver.exe and DtsDebugHost.exe) Disk activity primarily on TempDB.mdf, but I/O is very low (< 600 Kb/s) OLE DB destination and SQL Server Destination both exhibit this problem. To sum it up, I expect either disk, CPU or RAM to be exhausted before the package slows down, but instead it's as if the SSIS package is taking an afternoon nap. SQL Server remains responsive to other queries, and I can't find any performance counters or logged events that betray the cause of the problem. I'll gratefully reward any reasonable answers / suggestions.

    Read the article

  • Usage of autorelease pools for fetch method

    - by Matthias
    Hi, I'm a little bit confused regarding autorelease pools when programming for the iPhone. I've read a lot, and the opinions seem to range from "Do NOT use" to "No problem to use". My specific problem is that I would like to have a class which encapsulates the SQLite3 access, so I have, for example, the following method: -(User*)fetchUserWithId:(NSInteger)userId Now, within this method a SQL query is done and a new user object is created with the data from the database and then returned. Within this DB access class I don't need this object anymore, so I could do a release, but since the calling method needs it, I would do an autorelease, wouldn't I? So, is it okay to use autorelease here, or would it use too much memory if this method is called quite frequently? Some websites say that the autorelease pool is not released until the end of the application, some say at every event (e.g. the user touches something). If I should not use autorelease, how can I make sure that the object is released correctly? Can I do a release in the fetch method and hope that the object is still there until the calling method can do a retain? Thanks for your help! Regards, Matthias

    Read the article

  • Ideas for storing e-mail messages in a Delphi client server application

    - by user193655
    There are many suggestions here and there for storing e-mail messages. Anyhow, what I am doing is writing an Outlook add-in to send emails from the inbox/sent folders directly to my application. So only what is really interesting is saved, and I decide where to save it. Imagine this case: I receive an email from a customer. It's up to me to decide whether I should save it on the customer or on order 24 that that customer placed. So this is why I am doing the add-in, and not some automatic storing of emails that becomes noise after some time. This said, how to store the emails? For the emails that I receive or send through Outlook, the idea could be to save the whole file (the eml file) in a blob field; maybe I can also save other info (like the subject) in another text field. But the problem comes when I write an email from my application. In this case I am not generating an eml file; I send data through MAPI to Outlook to compose an email that I will send with Outlook (so in this case I cannot save the eml), or I send it directly with Indy. Also in this case I don't have the eml file... One idea could be that all the emails that I auto-compose have a special flag that the add-in recognises, so that when I send the mail it is stored back to the DB. In this case I can also save the eml of the mails I send from my application. Could you comment?

    Read the article

  • Social Media Java Design Problem

    - by jboyd
    I need to put something together quickly that will take blog posts and place them on social media sites. The requirements are as follows: Blog Entries are independent records that already exist; they have a published date and a modified date. The blog entry application cannot be changed, at least not substantially. A new blog entry, or an update, needs to be sent to the social media sites. I currently do not need to update or delete social media communications if the blog entry is edited or deleted, though I may need to later. My design problems here are as follows: how do I know the status of each update? How can I figure out which blog entry updates and postings have already been sent out? How can I quickly poll the blog entry table for postings that haven't yet been sent out, while avoiding looking at each Entry record from the DB as an object and asking if it's been sent already (that would be too slow)? I cannot hook into any Blog Entry update code; my only option would be to create a trigger so that an update queues something to be processed. I'm looking for general guiding principles here; the biggest problem I'm having is coming up with any reasonable way to figure out whether a blog entry should be sent to our social media sites in the first place.
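
    One common shape for this, sketched below with invented table and column names, is a small tracking table that the trigger (or the sending job) writes to, plus a polling query that joins it against the blog entry table. Only entries that were never attempted, failed, or were modified after the last successful send come back, so the poller never has to inspect every entry as an object:

        -- One row per (entry, network) publish attempt; written by the trigger/sender.
        CREATE TABLE social_post (
            entry_id     INT         NOT NULL,
            network      VARCHAR(32) NOT NULL,
            status       VARCHAR(16) NOT NULL,    -- 'PENDING', 'SENT', 'FAILED'
            published_at TIMESTAMP   NULL,
            PRIMARY KEY (entry_id, network)
        );

        -- Entries that still need to go out to one network.
        SELECT e.id
        FROM blog_entry AS e
        LEFT JOIN social_post AS s
               ON s.entry_id = e.id AND s.network = 'twitter'
        WHERE s.entry_id IS NULL                                        -- never attempted
           OR s.status = 'FAILED'                                       -- retry failures
           OR (s.status = 'SENT' AND e.modified_date > s.published_at); -- edited since send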

    Read the article

  • How to use MySQL geospatial extensions with spherical geometries

    - by Joshua
    Hi everyone, I would like to store thousands of latitude/longitude points in a MySQL db. I was successful at setting up the tables and adding the data using the geospatial extensions, where the column 'coord' is a Point(lat, lng). Problem: I want to quickly find the 'N' closest entries to latitude 'X' degrees and longitude 'Y' degrees. Since the Distance() function has not yet been implemented, I used the GLength() function to calculate the distance between (X,Y) and each of the entries, sorting by ascending distance and limiting to 'N' results. The problem is that this does not calculate the shortest distance with spherical geometry, which means that if Y = 179.9 degrees, the list of closest entries will only include longitudes starting at 179.9 and decreasing, even though closer entries exist with longitudes increasing from -179.9. How does one typically handle the discontinuity in longitude when working with spherical geometries in databases? There has to be an easy solution to this, but I must just be searching for the wrong thing because I have not found anything helpful. Should I just forget the GLength() function and create my own function for calculating angular separation? If I do this, will it still be fast and take advantage of the geospatial extensions? Thanks! josh UPDATE: This is exactly what I am describing above. However, it is only for SQL Server. Apparently SQL Server has Geometry and Geography data types, and the Geography type does exactly what I need. Is there something similar in MySQL?
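
    Until a true spherical Distance() is available, a common workaround is to compute the great-circle (haversine) distance directly in the query; the trigonometry handles the discontinuity at 180 degrees longitude naturally, though it cannot use the spatial index. The sketch below assumes the question's Point(lat, lng) column order, so X(coord) is latitude and Y(coord) is longitude, plus an invented table name points and @lat/@lng holding the search position in degrees:

        SET @lat = 40.7128, @lng = -74.0060;         -- search position in degrees

        SELECT id,
               6371 * 2 * ASIN(SQRT(
                   POW(SIN(RADIANS(X(coord) - @lat) / 2), 2) +
                   COS(RADIANS(@lat)) * COS(RADIANS(X(coord))) *
                   POW(SIN(RADIANS(Y(coord) - @lng) / 2), 2)
               )) AS distance_km                      -- 6371 km = mean Earth radius
        FROM points
        ORDER BY distance_km
        LIMIT 10;                                     -- the N closest entries

    Once the table grows, the usual trick is to pre-filter with a crude bounding box on indexed lat/lng columns and only apply the haversine expression to the survivors.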

    Read the article

  • Help! Getting an error copying data from one column to the same column in a similar recordset

    - by Mike D
    I have a routine which reads one recordset and adds/updates rows in a similar recordset. The routine starts off by copying the columns to a new recordset. Here's the code for creating the new recordset: For X = 1 To aRS.Fields.Count mRS.Fields.Append aRS.Fields(X - 1).Name, aRS.Fields(X - 1).Type, aRS.Fields(X - 1).DefinedSize, aRS.Fields(X - 1).Attributes Next X Pretty straightforward. Notice the copying of the Name, Type, DefinedSize & Attributes... Further down in the code (and there's nothing that modifies any of the columns in between) I'm copying the values of a row to a row in the new recordset as such: For C = 1 To aRS.Fields.Count mRS.Fields(C - 1) = aRS.Fields(C - 1) Next C When it gets to the last column, which is a numeric, it craps out with the "Multiple-Step Operation Generated an error" message. I know that MS says this is an error generated by the provider, which in this case is ADO 2.8. There is no open connection to the DB at this point in time either. I'm pulling what little hair I have left over this one... (and I don't really care at this point that the column index is 'X' in one loop & 'C' in the other... I'll change it later when I get the real problem fixed...)

    Read the article
