Search Results

Search found 38336 results on 1534 pages for 'sql wait types'.


  • How can I provide values for non-grouped columns in NHibernate?

    - by ddc0660
    I have a criteria query: Session.CreateCriteria<Sell043Report>() .SetProjection(.ProjectionList() .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.location)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.agent)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.cusip)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.SettlementDate)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.salePrice)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.foreignFx)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.batchNumber)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.origSaleDate)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.planName)) .Add(LambdaProjection.GroupProperty<Sell043Report>(r => r.dateTimeAdded)) .Add(LambdaProjection.Sum<Sell043Report>(r => r.shares)) .Add(LambdaProjection.Sum<Sell043Report>(r => r.netMoney)) .Add(LambdaProjection.Sum<Sell043Report>(r => r.grossMoney)) .Add(LambdaProjection.Sum<Sell043Report>(r => r.taxWithheld)) .Add(LambdaProjection.Sum<Sell043Report>(r => r.fees))) .List<Sell043Report>(); that generates the following SQL: SELECT this_.location as y0_, this_.agent as y1_, this_.cusip as y2_, this_.SettlementDate as y3_, this_.salePrice as y4_, this_.foreignFx as y5_, this_.batchNumber as y6_, this_.origSaleDate as y7_, this_.planName as y8_, this_.dateTimeAdded as y9_, sum(this_.shares) as y10_, sum(this_.netMoney) as y11_, sum(this_.grossMoney) as y12_, sum(this_.taxWithheld) as y13_, sum(this_.fees) as y14_ FROM MIS_IPS_Sell043Report this_ GROUP BY this_.location, this_.agent, this_.cusip, this_.SettlementDate, this_.salePrice, this_.foreignFx, this_.batchNumber, this_.origSaleDate, this_.planName, this_.dateTimeAdded however the Sell043Report table has additional columns than those listed in the SELECT statement so I'm receiving this error when attempting to get a list of Sell043Reports: System.ArgumentException: The value "System.Object[]" is not of type "xyz.Sell043Report" and cannot be used in this generic collection. I suspect the problem is that I'm not selecting all of the columns for a Sell043Report and so it doesn't know how to map the dataset to the object. I'm trying to achieve something like this: SELECT this_.location as y0_, this_.agent as y1_, this_.cusip as y2_, this_.SettlementDate as y3_, this_.salePrice as y4_, this_.foreignFx as y5_, this_.batchNumber as y6_, this_.origSaleDate as y7_, this_.planName as y8_, this_.dateTimeAdded as y9_, sum(this_.shares) as y10_, sum(this_.netMoney) as y11_, sum(this_.grossMoney) as y12_, sum(this_.taxWithheld) as y13_, sum(this_.fees) as y14_, '' as Address1, '' as Address2 // etc FROM MIS_IPS_Sell043Report this_ GROUP BY this_.location, this_.agent, this_.cusip, this_.SettlementDate, this_.salePrice, this_.foreignFx, this_.batchNumber, this_.origSaleDate, this_.planName, this_.dateTimeAdded How can I do this using NHibernate?
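    One hedged workaround, assuming a read-only summary object is acceptable, is to move the GROUP BY into a database view and map a small summary class to that view instead of trying to hydrate Sell043Report itself; then every selected column has a matching property. The view name below is invented for the sketch:

      -- Sell043ReportSummary is an invented name; columns mirror the projection above
      CREATE VIEW Sell043ReportSummary AS
      SELECT location, agent, cusip, SettlementDate, salePrice, foreignFx,
             batchNumber, origSaleDate, planName, dateTimeAdded,
             SUM(shares)      AS shares,
             SUM(netMoney)    AS netMoney,
             SUM(grossMoney)  AS grossMoney,
             SUM(taxWithheld) AS taxWithheld,
             SUM(fees)        AS fees
      FROM MIS_IPS_Sell043Report
      GROUP BY location, agent, cusip, SettlementDate, salePrice, foreignFx,
               batchNumber, origSaleDate, planName, dateTimeAdded;

    Alternatively, keeping the criteria query as posted, NHibernate can project into a plain DTO via a result transformer (Transformers.AliasToBean) rather than List<Sell043Report>(), which sidesteps the unmapped columns entirely.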

    Read the article

  • Multiple column subselect in MySQL 5 (5.1.42)

    - by rubber boots
    This one seems to be a simple problem, but I can't make it work in a single select or nested select. Retrieve the authors and (if any) advisers of a paper (article) into one row. I order to explain the problem, here are the two data tables (pseudo) papers (id, title, c_year) persons (id, firstname, lastname) plus a link table w/one extra attribute (pseudo): paper_person_roles( paper_id person_id act_role ENUM ('AUTHOR', 'ADVISER') ) This is basically a list of written papers (table: papers) and a list of staff and/or students (table: persons) An article my have (1,N) authors. An article may have (0,N) advisers. A person can be in 'AUTHOR' or 'ADVISER' role (but not at the same time). The application eventually puts out table rows containing the following entries: TH: || Paper_ID | Author(s) | Title | Adviser(s) | TD: || 21334 |John Doe, Jeff Tucker|Why the moon looks yellow|Brown, Rayleigh| ... My first approach was like: select/extract a full list of articles into the application, eg.SELECT q.id, q.title FROM papers AS q ORDER BY q.c_year and save the results of the query into an array (in the application). After this step, loop over the array of the returned information and retrieve authors and advisers (if any), via prepared statement (? is the paper's id) from the link table like:APPLICATION_LOOP(paper_ids in array) SELECT p.lastname, p.firstname, r.act_role FROM persons AS p, paper_person_roles AS r WHERE p.id=r.person_id AND r.paper_id = ? # The application does further processing from here (pseudo): foreach record from resulting records if record.act_role eq 'AUTHOR' then join to author_column if record.act_role eq 'ADVISER' then join to avdiser_column end print id, author_column, title, adviser_column APPLICATION_LOOP This works so far and gives the desired output. Would it make sense to put the computation back into the DB? I'm not very proficient in nontrivial SQL and can't find a solution with a single (combined or nested) select call. I tried sth. like SELECT q.title (CONCAT_WS(' ', (SELECT p.firstname, p.lastname AS aunames FROM persons AS p, paper_person_roles AS r WHERE q.id=r.paper_id AND r.act_role='AUTHOR') ) ) AS aulist FROM papers AS q, persons AS p, paper_person_roles AS r in several variations, but no luck ... Maybe there is some chance? Thanks in advance r.b.
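    For reference, the one-row-per-paper shape this question is reaching for usually comes from GROUP_CONCAT rather than a scalar subselect. A rough, untested sketch against the pseudo-schema above (the aliases are assumptions):

      SELECT q.id AS paper_id,
             GROUP_CONCAT(DISTINCT CONCAT(au.firstname, ' ', au.lastname) SEPARATOR ', ') AS authors,
             q.title,
             GROUP_CONCAT(DISTINCT CONCAT(ad.firstname, ' ', ad.lastname) SEPARATOR ', ') AS advisers
      FROM papers q
      JOIN paper_person_roles ra ON ra.paper_id = q.id AND ra.act_role = 'AUTHOR'
      JOIN persons au            ON au.id = ra.person_id
      LEFT JOIN paper_person_roles rv ON rv.paper_id = q.id AND rv.act_role = 'ADVISER'
      LEFT JOIN persons ad            ON ad.id = rv.person_id
      GROUP BY q.id, q.title, q.c_year
      ORDER BY q.c_year;

    The inner joins keep the "at least one author" rule, the left joins leave the advisers column NULL when a paper has none, and the application-side loop collapses into the two GROUP_CONCAT columns.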

    Read the article

  • Database Schema Versioning Strategies

    - by Jack Ryan
    I work on a project that uses a reasonably large database, the live version weighing in at somewhere around 60-80 GB. The live database is the only real definitive source of our schema, and because of its size, duplicating it is too slow to be done often. This means we have ended up developing our database schema in a pretty ad hoc way, using SQL Compare to migrate changes from dev DBs to the live system, and only wiping our dev DBs every month or two. I am hoping to get some pointers on how to improve our database development workflow so that we have a little more control. Some things to think about:
    - Currently nobody is really in charge of the database schema; all developers can change it if they need to, though generally these decisions are discussed before they are made.
    - There are stored procedures, functions, and views in the database. These should probably be dumped to files so they can be reloaded on every build.
    - Schema changes should probably be checked in as scripts. We have started to do this recently. However, all our scripts must then be numbered (because there may be dependencies between them) and must be re-runnable (because our build script currently runs them all in order). This makes them hard to read because they are full of conditionals that check whether tables or columns already exist (a sketch of that guard pattern follows this excerpt). This is a step that is often forgotten by developers.
    - Getting a new database should be quick and easy. This is currently a big problem: it takes several hours to get a copy of last night's backup and restore it onto a dev machine.
    - Some mechanism needs to be in place to allow developers to update static data. We have tables that contain data that is never updated through the application but does potentially need to be changed when we do a new release (often this drives dropdowns).
    - The whole thing needs to be runnable as part of a build script.
    Are there any tools that can be used to help with this? Eventually I would like to be at a point where a new DB can be built from scratch without copying any data from the live system. I don't mind writing some scripts to glue all the steps together, but each part should be easily editable so that we continue to use it rather than make changes directly on DBs.
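    As a concrete illustration of the "re-runnable script" point above, this is the guard pattern those numbered migration scripts typically rely on. The table and column here are invented purely for the example (T-SQL syntax):

      -- 0042_add_dispatched_on.sql : safe to run against any copy of the schema
      IF NOT EXISTS (SELECT 1 FROM INFORMATION_SCHEMA.COLUMNS
                     WHERE TABLE_NAME = 'Orders' AND COLUMN_NAME = 'DispatchedOn')
      BEGIN
          ALTER TABLE Orders ADD DispatchedOn datetime NULL;
      END

    Because the check makes the script a no-op the second time round, the build can always run every script in numeric order against whatever state the target database is in.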

    Read the article

  • Large transactions causing "System.Data.SqlClient.SqlException: Timeout expired" error?

    - by Michael
    My application requires a user to log in and allows them to edit a list of things. However, it seems that if the same user always logs in and out and edits the list, this user will run into a "System.Data.SqlClient.SqlException: Timeout expired." error. I've read comments about increasing the timeout period but I've also read a comment about it possibly caused by uncommitted transactions. And I do have one going in the application. I'll provide the code I'm working with and there is an IF statement in there that I was a little iffy about but it seemed like a reasonable thing to do. I'll just go over what's going on here, there is a list of objects to update or add into the database. New objects created in the application are given an ID of 0 while existing objects have their own ID's generated from the DB. If the user chooses to delete some objects, their IDs are stored in a separate list of Integers. Once the user is ready to save their changes, the two lists are passed into this method. By use of the IF statement, objects with ID of 0 are added (using the Add stored procedure) and those objects with non-zero IDs are updated (using the Update stored procedure). After all this, a FOR loop goes through all the integers in the "removal" list and uses the Delete stored procedure to remove them. A transaction is used for all this. Public Shared Sub UpdateSomethings(ByVal SomethingList As List(Of Something), ByVal RemovalList As List(Of Integer)) Using DBConnection As New SqlConnection(conn) DBConnection.Open() Dim MyTransaction As SqlTransaction MyTransaction = DBConnection.BeginTransaction() Try For Each SomethingItem As Something In SomethingList Using MyCommand As New SqlCommand() MyCommand.Connection = DBConnection If SomethingItem.ID > 0 Then MyCommand.CommandText = "UpdateSomething" Else MyCommand.CommandText = "AddSomething" End If MyCommand.Transaction = MyTransaction MyCommand.CommandType = CommandType.StoredProcedure With MyCommand.Parameters If MyCommand.CommandText = "UpdateSomething" Then .Add("@id", SqlDbType.Int).Value = SomethingItem.ID End If .Add("@stuff", SqlDbType.Varchar).Value = SomethingItem.Stuff End With MyCommand.ExecuteNonQuery() End Using Next For Each ID As Integer In RemovalList Using MyCommand As New SqlCommand("DeleteSomething", DBConnection) MyCommand.Transaction = MyTransaction MyCommand.CommandType = CommandType.StoredProcedure With MyCommand.Parameters .Add("@id", SqlDbType.Int).Value = ID End With MyCommand.ExecuteNonQuery() End Using Next MyTransaction.Commit() Catch ex As Exception MyTransaction.Rollback() 'Exception handling goes here End Try End Using End Sub There are three stored procedures used here as well as some looping so I can see how something can be holding everything up if the list is large enough. Other users can log in to the system at the same time just fine though. I'm using Visual Studio 2008 to debug and am using SQL Server 2000 for the DB.
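    Since the timeout keeps hitting one user's long save while other sessions work fine, it is worth checking for blocking on the server before simply raising the command timeout. A few quick diagnostics that still work on SQL Server 2000, run while the save is hanging (1234 below is only an example SPID):

      EXEC sp_who2;               -- the BlkBy column shows which SPID is blocking which

      SELECT spid, blocked, open_tran, waittype, lastwaittype, waittime
      FROM   master..sysprocesses
      WHERE  blocked <> 0 OR open_tran > 0;

      DBCC INPUTBUFFER (1234);    -- last statement sent by the blocking SPID

    If the blocker turns out to be an earlier connection from the same application that never committed or rolled back, that points back at the transaction handling rather than at the timeout setting.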

    Read the article

  • Complex MySQL table select/join with pre-condition

    - by Howard
    Hello, I have the schema below CREATE TABLE `vocabulary` ( `vid` int(10) unsigned NOT NULL auto_increment, `name` varchar(255), PRIMARY KEY vid (`vid`) ); CREATE TABLE `term` ( `tid` int(10) unsigned NOT NULL auto_increment, `vid` int(10) unsigned NOT NULL default '0', `name` varchar(255), PRIMARY KEY tid (`tid`) ); CREATE TABLE `article` ( `aid` int(10) unsigned NOT NULL auto_increment, `body` text, PRIMARY KEY aid (`aid`) ); CREATE TABLE `article_index` ( `nid` int(10) unsigned NOT NULL default '0', `tid` int(10) unsigned NOT NULL default '0' ) INSERT INTO `vocabulary` values (1, 'vocabulary 1'); INSERT INTO `vocabulary` values (2, 'vocabulary 2'); INSERT INTO `term` values (1, 1, 'term v1 t1'); INSERT INTO `term` values (2, 1, 'term v1 t2 '); INSERT INTO `term` values (3, 2, 'term v2 t3'); INSERT INTO `term` values (4, 2, 'term v2 t4'); INSERT INTO `term` values (5, 2, 'term v2 t5'); INSERT INTO `article` values (1, ""); INSERT INTO `article` values (2, ""); INSERT INTO `article` values (3, ""); INSERT INTO `article` values (4, ""); INSERT INTO `article` values (5, ""); INSERT INTO `article_index` values (1, 1); INSERT INTO `article_index` values (1, 3); INSERT INTO `article_index` values (2, 2); INSERT INTO `article_index` values (3, 1); INSERT INTO `article_index` values (3, 3); INSERT INTO `article_index` values (4, 3); INSERT INTO `article_index` values (5, 3); INSERT INTO `article_index` values (5, 4); Example. Select term of a defiend vocabulary (with non-zero article index), e.g. vid=2 select a.tid, count(*) as article_count from term t JOIN article_index a ON t.tid = a.tid where t.vid = 2 group by t.tid; +-----+---------------+ | tid | article_count | +-----+---------------+ | 3 | 4 | | 4 | 1 | +-----+------------ Question: Select terms a. of a defiend vocabulary (with non-zero article index, e.g. vid=1 = term {1,2}) b. given that those terms are linked with articles which are linked with terms under vid=2, e.g. = {1}, term with tid=2 is excluded since no linkage to terms under vid=2 SQL: Any idea? Expected result: +-----+---------------+ | tid | article_count | +-----+---------------+ | 1 | 2 | +-----+---------------+
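    One way to express requirement (b) is a correlated EXISTS against a second pass over article_index. This is an untested sketch, but against the sample data above it should return exactly the expected row (tid 1 with a count of 2):

      SELECT t1.tid, COUNT(DISTINCT a1.nid) AS article_count
      FROM term t1
      JOIN article_index a1 ON a1.tid = t1.tid
      WHERE t1.vid = 1
        AND EXISTS (SELECT 1
                    FROM article_index a2
                    JOIN term t2 ON t2.tid = a2.tid
                    WHERE a2.nid = a1.nid     -- same article as the outer row
                      AND t2.vid = 2)         -- ...also tagged with a vid=2 term
      GROUP BY t1.tid;

    Term 2 drops out because its only article (2) carries no vid=2 term, which matches the expected result table.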

    Read the article

  • How to write PHP code to receive a JSON string and insert it into SQL Server

    - by Romi
    i am trying to OUTPUT a Json String from the phone and to get it uploaded to the sql server i have. I Do not know how to get the output Json and write the php code... i tried many methods but couldnt find a solution. public void post(String string) { HttpClient httpclient = new DefaultHttpClient(); HttpPost httppost = new HttpPost( "http://www.hopscriber.com/xoxoxox/testphp.php"); try { List<NameValuePair> nameValuePairs = new ArrayList<NameValuePair>(); nameValuePairs.add(new BasicNameValuePair("myJson", string)); httppost.setEntity(new UrlEncodedFormEntity(nameValuePairs)); HttpResponse response = httpclient.execute(httppost); String str = inputStreamToString(response.getEntity().getContent()) .toString(); Log.w("SENCIDE", str); } catch (Exception e) { Toast.makeText(getBaseContext(), "notwork", Toast.LENGTH_LONG) .show(); } } private Object inputStreamToString(InputStream is) { // TODO Auto-generated method stub String line = ""; StringBuilder total = new StringBuilder(); // Wrap a BufferedReader around the InputStream BufferedReader rd = new BufferedReader(new InputStreamReader(is)); // Read response until the end try { while ((line = rd.readLine()) != null) { total.append(line); } } catch (IOException e) { e.printStackTrace(); } // Return full string return total; } it outputs a json string as [myJson=[{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"org.siislab.tutorial.permission.READ_FRIENDS","level":"Normal"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"org.siislab.tutorial.permission.WRITE_FRIENDS","level":"Normal"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"org.siislab.tutorial.permission.FRIEND_SERVICE","level":"Normal"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"org.siislab.tutorial.permission.FRIEND_NEAR","level":"Dangerous"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"org.siislab.tutorial.permission.BROADCAST_FRIEND_NEAR","level":"Normal"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"android.permission.RECEIVE_BOOT_COMPLETED","level":"Normal"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"android.permission.READ_CONTACTS","level":"Dangerous"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"android.permission.ACCESS_FINE_LOCATION","level":"Dangerous"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"android.permission.WRITE_EXTERNAL_STORAGE","level":"Dangerous"},{"name":"FriendTracker","user":"amjgp000000000000000","pack":"org.siislab.tutorial.friendtracker","perm":"android.permission.READ_PHONE_STATE","level":"Dangerous"},{"name":"Tesing","user":"amjgp000000000000000","pack":"com.example.tesing","perm":"null","level":"null"},{"name":"Action Bar","user":"amjgp000000000000000","pack":"name.brucephillips.actionbarexample","perm":"null","level":"null"},.......
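    Whatever the PHP receiver ends up looking like, each element of that myJson array maps naturally onto one row, so it may help to pin down the target table first. Everything below is invented for illustration (the question gives no schema); the literal values are copied from the first JSON element:

      CREATE TABLE app_permission (
          id         INT IDENTITY(1,1) PRIMARY KEY,
          app_name   VARCHAR(100) NOT NULL,
          app_user   VARCHAR(50)  NOT NULL,
          pack       VARCHAR(200) NOT NULL,
          perm       VARCHAR(200) NULL,
          perm_level VARCHAR(20)  NULL
      );

      INSERT INTO app_permission (app_name, app_user, pack, perm, perm_level)
      VALUES ('FriendTracker', 'amjgp000000000000000',
              'org.siislab.tutorial.friendtracker',
              'org.siislab.tutorial.permission.READ_FRIENDS', 'Normal');

    The PHP side then only has to json_decode the posted myJson value, loop over the array, and issue one parameterised insert of this shape per element.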

    Read the article

  • Database advantages? Access, MySQL, MSSQL, or any others?

    - by JimZ
    Dear all Stackoverflowers, I just started to learn programming and I'm putting this question online based on a quote: no question is silly. My work needs to develop a web-based order system, which needs a database. Having used Excel for years as a general office user, I naturally turned to Access. However, most people say Access is very limited compared to MySQL or MSSQL, or any other more professional database system. But after developing some functions for my company's order system, I really find Access can fulfil my requirements. I also tried developing with MSSQL, which I did not find quite as convenient to use. I have searched Stack Overflow and found no general answer to my doubt, so I am sincerely hoping some experienced and professional developers can clear it up. Below are some Access advantages which I don't think other database systems have; I hope you can help me find these advantages in the others too.
    1. Access is portable: I can just copy an xxx.accdb file to my company and continue with development.
    2. Access easily generates helpful table features; for example, it will automatically create an auto-counting field that can be used as a primary key value.
    3. It is more compatible with Excel for displaying and filtering data.
    4. Importantly, it needs almost no environment setup, just MS Office installed.
    ... and others.
    However, I also find some points where MSSQL has the advantage:
    1. Security.
    2. Easy to back up (just use a BACKUP ... SQL statement).
    3. Stored procedures can be edited to keep some functions in the database.
    ... and others.
    Specifically, I wish someone could tell me how to make the other databases portable, since I usually work both at home and in the office. It's a headache to move my MSSQL work to the office, since the MSSQL versions are not the same. Thank you all and best regards :)
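    On the portability question at the end: a SQL Server database can be moved between home and office much like an .accdb file, most simply by backup and restore. A hedged sketch follows; the database name, paths, and logical file names are examples only, and a backup restores onto an equal or newer SQL Server version, which is exactly why mismatched versions between the two machines hurt:

      BACKUP DATABASE OrderSystem TO DISK = 'C:\backups\OrderSystem.bak' WITH INIT;

      -- on the other machine:
      RESTORE DATABASE OrderSystem FROM DISK = 'C:\backups\OrderSystem.bak'
      WITH MOVE 'OrderSystem'     TO 'C:\data\OrderSystem.mdf',
           MOVE 'OrderSystem_log' TO 'C:\data\OrderSystem_log.ldf';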

    Read the article

  • Different data for different dates

    - by nsw1475
    I am making a web page which will allow users to input and view data for different dates, and to change the date, there are two buttons, one of which will display the previous day, and one which will show the next day. I know that I can do this by submitting forms and reloading the page every time they press one of these buttons, but I would rather use javascript and not have to submit a form, but I am having troubles getting it to work. Currently, I have the two buttons, and the date stored in a PHP variable, as shown in my code below: <script> function init() { <? $nutrDate = $this->parseDate(date('m/d/Y')); ?> } function nutrPrevDay() { <? $nutrDate = mktime(0, 0, 0, date('m',$nutrDate), date('d',$nutrDate)-1, date('Y',$nutrDate)); ?> alert("<? echo(date("m/d/y", $nutrDate)) ?>"); } function nutrNextDay() { <? $nutrDate = mktime(0, 0, 0, date('m',$nutrDate), date('d',$nutrDate)+1, date('Y',$nutrDate)); ?> } window.onload = init; </script> <p style="text-align:center; font-size:14px; color:#03F"> <button onclick="nutrPrevDay()" style="width:200px" >< Show Previous Day</button> <? echo(date('m/d/Y', $nutrDate)) ?> <button onclick="nutrNextDay()" style="width:200px">Show Next Day ></button> </p> I have the alert in the nutrPrevDay() only as debugging. What happens is when I click on the button, the alert shows that the day is correctly decreased (for example from May 17 to May 16), but only decreases one day, and not one day for every click. Also, I do not know how to make the text on the page (created by the line ) change to display the new date after a button is clicked. So here are my questions: 1) Is it possible to dynamically change data (such as text, and in the future, SQL queries) on a page using javascript without having to reload a page when clicking on a button? 2) If possible, how can I make those changes? 3) How can I fix this so that it will increment and decrement through dates every time a button is clicked?

    Read the article

  • Better to wait for Drupal 7 to start learning Drupal, or is there no big difference from Drupal 6?

    - by artmania
    Hi friends, I'm going to start some projects in the next few weeks. I currently have my own custom CMS, but I want to go with an open-source CMS solution from now on, and from what I've researched Drupal is great for any kind of project, from small scope to the most complex. I have never used Drupal before and know nothing about it yet. Would it be worth working with Drupal 6 for now, or is it better to wait for Drupal 7? Is there any big difference in the way it works? I would feel silly spending days or weeks learning Drupal 6 now if, in a few months when the Drupal 7 beta is out, I face a new learning curve and have trouble upgrading my current Drupal 6 projects to Drupal 7. Thanks a lot!! Advice appreciated!

    Read the article

  • How do I wait for "animated scroll to id" to complete before focusing on first form field?

    - by codemonkey613
    So, I have a link at the bottom of the website. When it is clicked, it scrolls to a form at the top of the page with an animated style, and as it arrives at the top of the page, it focuses on the first field in the form. Currently, this is my code:

      $(document).ready(function() {
          $('#goto-show-form').click(function() {
              $('html, body').animate({scrollTop: $("#show-form").offset().top}, '500');
              $('#first-field').focus();
              return false;
          });
      });

    What happens is it begins scrolling, then focuses on the form field while still scrolling, then returns to its last position in the scrolling process and continues scrolling up. Instead of being smooth, you can see it cut back and forth. How can I tell jQuery to wait until scrolling is complete before focusing on the form field? Here is the website: http://bit.ly/dfjvmT (the link that starts the scroll is "Send us your resume" at the bottom). Thanks.

    Read the article

  • How do I wait until a console application is idle?

    - by Anthony Mastrean
    I have a console application that starts up, hosts a bunch of services (long-running startup), and then waits for clients to call into it. I have integration tests that start this console application and make "client" calls. How do I wait for the console application to complete its startup before making the client calls? I want to avoid doing Thread.Sleep(int) because that's dependent on the startup time (which may change) and I waste time if the startup is faster. Process.WaitForInputIdle works only on applications with a UI (and I confirmed that it does throw an exception in this case). I'm open to awkward solutions like, have the console application write a temp file when it's ready.

    Read the article

  • Load page, wait, log any JavaScript errors to file?

    - by David
    I would like to check a large number of HTML files with inline JavaScript for JavaScript errors. What I'm envisioning doing is this: Script a browser to load a given page, wait a few seconds, and finally check the browser logs. I'm unsure though both on how to script a browser to load a given page and on how to access the JavaScript error log. I think the type of errors I'm worried about should show up in any modern browser so I would just go with whatever makes it most convenient. I'd be working either under Mac OS X or Linux. Anybody already tackle a similar problem? I've thought a bit about hacking something together based on a unit testing framework -- generate a trivial (assertTrue(true)) test for each page and rely on the errors making it fail -- but I'm hoping for something more elegant. Thank you.

    Read the article

  • Fixing "Lock wait timeout exceeded; try restarting transaction" for a 'stuck" Mysql table?

    - by Tom
    From a script I sent a query like this thousands of times to my local database: update some_table set some_column = some_value. I forgot to add the WHERE part, so the same column was set to the same value for all the rows in the table, this was done thousands of times, and the column was indexed, so the corresponding index was probably updated lots of times too. I noticed something was wrong because it took too long, so I killed the script. I have even rebooted my computer since then, but something is stuck in the table: simple queries take a very long time to run, and when I try dropping the relevant index it fails with this message: Lock wait timeout exceeded; try restarting transaction. It's an InnoDB table, so the stuck transaction is probably implicit. How can I fix this table and remove the stuck transaction from it?
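    A sketch of the usual way to hunt down the connection that is still holding locks on the table (1234 is an example thread id, not a real one):

      SHOW FULL PROCESSLIST;      -- look for an old connection left over from the killed script
      SHOW ENGINE INNODB STATUS;  -- the TRANSACTIONS section shows which transaction holds the locks
      KILL 1234;                  -- kill that connection; InnoDB then rolls its changes back

    If no connection is holding locks, InnoDB may still be rolling back the huge interrupted update in the background (that rollback survives a restart), and the table will stay slow until it finishes; once it does, OPTIMIZE TABLE on the affected table is worth a try.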

    Read the article

  • How do you make Python wait so that you can read the output?

    - by anonnoir
    I've always been a heavy user of Notepad2, as it is fast, feature-rich, and supports syntax highlighting. Recently I've been using it for Python. My problem: when I finish editing a certain Python source code, and try to launch it, the screen disappears before I can see the output pop up. Is there any way for me to make the results wait so that I can read it, short of using an input() or time-delay function? Otherwise I'd have to use IDLE, because of the output that stops for you to read. (My apologies if this question is a silly one, but I'm very new at Python and programming in general.)

    Read the article

  • Android: how do I wait until a service is actually connected?

    - by Ryan
    I have an Activity calling a Service defined in IDownloaderService.aidl: public class Downloader extends Activity { IDownloaderService downloader = null; // ... In Downloader.onCreate(Bundle) I tried to bindService Intent serviceIntent = new Intent(this, DownloaderService.class); if (bindService(serviceIntent, sc, BIND_AUTO_CREATE)) { // ... and within the ServiceConnection object sc I did this public void onServiceConnected(ComponentName name, IBinder service) { Log.w("XXX", "onServiceConnected"); downloader = IDownloaderService.Stub.asInterface(service); // ... By adding all kinds of Log.xx I found that the code after if(bindService(...)) actually goes BEFORE ServiceConnection.onServiceConnected is being called - that is, when downloader is still null - which gets me into trouble. All the samples in ApiDemos avoid this timing problem by only calling services when triggered by user actions. But what should I do to right use this service after bindService succeeds? How can I wait for ServiceConnection.onServiceConnected being called reliably?

    Read the article

  • How can I wait for the image to load in my new window before printing via child.print()?

    - by libertas
    $("#print").on('click', function () { var child = window.open("image.jpg", "_blank", "location=0,toolbar=0,scrollbars=1,fullscreen=1,menubar=0"); //any way to know/wait for image to load? child.print(); }); Any way for my parent window to know that the child window has completed loading the image prior to calling .print()? If they were trigger happy they would end up printing a blank page. I've tried both: child.attachEvent("onload", function () { child.print(); }); and child.attachEvent("DOMContentLoaded", function () { child.print(); }); //(found this online, apparently it's Firefox only, still didn't work)

    Read the article

  • Wait for inline thread to complete before moving to next method...

    - by Tyler
    Hello, I have an Android app where I am doing the following:

      private void onCreate() {
          final ProgressDialog dialog = ProgressDialog.show(this, "Please wait..", "Doing stuff..", true);
          new Thread() {
              public void run() {
                  // do some serious stuff...
                  dialog.dismiss();
              }
          }.start();
          stepTwo();
      }

    And I would like to ensure that my thread is complete before stepTwo() is called. How can I do this? Thanks!

    Read the article

  • Setting up Rails to work with SQL Server

    - by FortunateDuke
    Ok I followed the steps for setting up ruby and rails on my Vista machine and I am having a problem connecting to the database. Contents of database.yml development: adapter: sqlserver database: APPS_SETUP Host: WindowsVT06\SQLEXPRESS Username: se Password: paswd Run rake db:migrate from myapp directory ---------- rake aborted! no such file to load -- deprecated ADO I have dbi 0.4.0 installed and have created the ADO folder in C:\Ruby\lib\ruby\site_ruby\1.8\DBD\ADO I got the ado.rb from the dbi 0.2.2 What else should I be looking at to fix the issue connecting to the database? Please don't tell me to use MySql or Sqlite or Postgres. *UPDATE* I have installed the activerecord-sqlserver-adapter gem from --source=http://gems.rubyonrails.org Still not working. I have verified that I can connect to the database by logging into SQL Management Studio with the credentials. rake db:migrate --trace PS C:\Inetpub\wwwroot\myapp> rake db:migrate --trace (in C:/Inetpub/wwwroot/myapp) ** Invoke db:migrate (first_time) ** Invoke environment (first_time) ** Execute environment ** Execute db:migrate rake aborted! no such file to load -- deprecated C:/Ruby/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:27:in `gem_original_require' C:/Ruby/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:27:in `require' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/dependencies.rb:510:in `require' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/dependencies.rb:355:in `new_constants_in' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/dependencies.rb:510:in `require' C:/Ruby/lib/ruby/site_ruby/1.8/dbi.rb:48 C:/Ruby/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:27:in `gem_original_require' C:/Ruby/lib/ruby/site_ruby/1.8/rubygems/custom_require.rb:27:in `require' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/dependencies.rb:510:in `require' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/dependencies.rb:355:in `new_constants_in' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/dependencies.rb:510:in `require' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/core_ext/kernel/requires.rb:7:in `require_library_ or_gem' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/core_ext/kernel/reporting.rb:11:in `silence_warnin gs' C:/Ruby/lib/ruby/gems/1.8/gems/activesupport-2.1.1/lib/active_support/core_ext/kernel/requires.rb:5:in `require_library_ or_gem' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-sqlserver-adapter-1.0.0.9250/lib/active_record/connection_adapters/sqlserver _adapter.rb:29:in `sqlserver_connection' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/connection_adapters/abstract/connection_specificatio n.rb:292:in `send' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/connection_adapters/abstract/connection_specificatio n.rb:292:in `connection=' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/connection_adapters/abstract/connection_specificatio n.rb:260:in `retrieve_connection' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/connection_adapters/abstract/connection_specificatio n.rb:78:in `connection' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/migration.rb:408:in `initialize' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/migration.rb:373:in `new' 
C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/migration.rb:373:in `up' C:/Ruby/lib/ruby/gems/1.8/gems/activerecord-2.1.1/lib/active_record/migration.rb:356:in `migrate' C:/Ruby/lib/ruby/gems/1.8/gems/rails-2.1.1/lib/tasks/databases.rake:99 C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:621:in `call' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:621:in `execute' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:616:in `each' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:616:in `execute' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:582:in `invoke_with_call_chain' C:/Ruby/lib/ruby/1.8/monitor.rb:242:in `synchronize' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:575:in `invoke_with_call_chain' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:568:in `invoke' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:2031:in `invoke_task' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:2009:in `top_level' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:2009:in `each' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:2009:in `top_level' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:2048:in `standard_exception_handling' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:2003:in `top_level' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:1982:in `run' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:2048:in `standard_exception_handling' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/lib/rake.rb:1979:in `run' C:/Ruby/lib/ruby/gems/1.8/gems/rake-0.8.2/bin/rake:31 C:/Ruby/bin/rake:19:in `load' C:/Ruby/bin/rake:19 PS C:\Inetpub\wwwroot\myapp>

    Read the article

  • SMO ConnectionContext.StatementTimeout setting is ignored

    - by Woody
    I am successfully using Powershell with SMO to backup most databases. However, I have several large databases in which I receive a "timeout" error "System.Data.SqlClient.SqlException: Timeout expired". The timout consistently occurs at 10 minutes. I have tried setting ConnectionContext.StatementTimeout to 0, 6000, and to [System.Int32]::MaxValue. The setting made no difference. I have found a number of Google references which indicate setting it to 0 makes it unlimited. No matter what I try, the timeouts consistently occur at 10 minutes. I even set Remote Query Timeout on the server to 0 (via Studio Manager) to no avail. Below is my SMO connection where I set the time out and the actual backup function. Further below is the output from my script. UPDATE Interestingly enough, I wrote the backup function in C# using VS 2008 and the timeout override does work within that environment. I am in the process of incorporating that C# process into my Powershell Script until I can find out why the timeout override does not work with just Powershell. This is extremely annoying! function New-SMOconnection { Param ($server, $ApplicationName= "PowerShell SMO", [int]$StatementTimeout = 0 ) # Write-Debug "Function: New-SMOconnection $server $connectionname $commandtimeout" if (test-path variable:\conn) { $conn.connectioncontext.disconnect() } else { $conn = New-Object('Microsoft.SqlServer.Management.Smo.Server') $server } $conn.connectioncontext.applicationName = $applicationName $conn.ConnectionContext.StatementTimeout = $StatementTimeout $conn.connectioncontext.Connect() $conn } $smo = New-SMOConnection -server $server if ($smo.connectioncontext.isopen -eq $false) { Throw "Could not connect to server $($server)." } Function Backup-Database { Param([string]$dbname) $db = $smo.Databases.get_Item($dbname) if (!$db) {"Database $dbname was not found"; Return} $sqldir = $smo.Settings.BackupDirectory + "\$($smo.name -replace ("\\", "$"))" $s = ($server.Split('\'))[0] $basedir = "\\$s\" + $($sqldir -replace (":", "$")) $dt = get-date -format yyyyMMdd-HHmmss $dbbk = new-object ('Microsoft.SqlServer.Management.Smo.Backup') $dbbk.Action = 'Database' $dbbk.BackupSetDescription = "Full backup of " + $dbname $dbbk.BackupSetName = $dbname + " Backup" $dbbk.Database = $dbname $dbbk.MediaDescription = "Disk" $target = "$basedir\$dbname\FULL" if (-not(Test-Path $target)) { New-Item $target -ItemType directory | Out-Null} $device = "$sqldir\$dbname\FULL\" + $($server -replace("\\", "$")) + "_" + $dbname + "_FULL_" + $dt + ".bak" $dbbk.Devices.AddDevice($device, 'File') $dbbk.Initialize = $True $dbbk.Incremental = $false $dbbk.LogTruncation = [Microsoft.SqlServer.Management.Smo.BackupTruncateLogType]::Truncate If (!$copyonly) { If ($kill) {$smo.KillAllProcesses($dbname)} $dbbk.SqlBackupAsync($server) } $dbbk } Started SQL backups for server LCFSQLxxx\SQLxxx at 05/06/2010 15:33:16 Statement TimeOut value set to 0. DatabaseName : OperationsManagerDW StartBackupTime : 5/6/2010 3:33:16 PM EndBackupTime : 5/6/2010 3:43:17 PM StartCopyTime : 1/1/0001 12:00:00 AM EndCopyTime : 1/1/0001 12:00:00 AM CopiedFiles : Status : Failed ErrorMessage : System.Data.SqlClient.SqlException: Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. The backup or restore was aborted. 10 percent processed. 20 percent processed. 30 percent processed. 40 percent processed. 50 percent processed. 60 percent processed. 70 percent processed. 
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection) at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj) at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj) at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async) at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe) at System.Data.SqlClient.SqlCommand.ExecuteNonQuery() at Microsoft.SqlServer.Management.Common.ServerConnection.ExecuteNonQuery(String sqlCommand, ExecutionTypes executionType) Ended backups at 05/06/2010 15:43:23
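    One way to separate a client-side SMO timeout from a genuine backup problem is to issue the equivalent T-SQL directly (Query Analyzer / Management Studio imposes no ten-minute limit of its own). The path below is only an example:

      BACKUP DATABASE OperationsManagerDW
      TO DISK = N'D:\Backups\OperationsManagerDW_FULL.bak'
      WITH INIT, STATS = 10;

    If that completes past the 70 percent mark where the SMO run dies, the limit is being enforced on the client connection, which matches the observation that the C# version honouring StatementTimeout works while the PowerShell one does not.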

    Read the article

  • LINQ Generic Query with inherited base class?

    - by sah302
    I am trying to write some generic LINQ queries for my entities, but am having issue doing the more complex things. Right now I am using an EntityDao class that has all my generics and each of my object class Daos (such as Accomplishments Dao) inherit it, am example: using LCFVB.ObjectsNS; using LCFVB.EntityNS; namespace AccomplishmentNS { public class AccomplishmentDao : EntityDao<Accomplishment>{} } Now my entityDao has the following code: using LCFVB.ObjectsNS; using LCFVB.LinqDataContextNS; namespace EntityNS { public abstract class EntityDao<ImplementationType> where ImplementationType : Entity { public ImplementationType getOneByValueOfProperty(string getProperty, object getValue) { ImplementationType entity = null; if (getProperty != null && getValue != null) { //Nhibernate Example: //ImplementationType entity = default(ImplementationType); //entity = Me.session.CreateCriteria(Of ImplementationType)().Add(Expression.Eq(getProperty, getValue)).UniqueResult(Of InterfaceType)() LCFDataContext lcfdatacontext = new LCFDataContext(); //Generic LINQ Query Here lcfdatacontext.GetTable<ImplementationType>(); lcfdatacontext.SubmitChanges(); lcfdatacontext.Dispose(); } return entity; } public bool insertRow(ImplementationType entity) { if (entity != null) { //Nhibernate Example: //Me.session.Save(entity, entity.Id) //Me.session.Flush() LCFDataContext lcfdatacontext = new LCFDataContext(); //Generic LINQ Query Here lcfdatacontext.GetTable<ImplementationType>().InsertOnSubmit(entity); lcfdatacontext.SubmitChanges(); lcfdatacontext.Dispose(); return true; } else { return false; } } } }             I have gotten the insertRow function working, however I am not even sure how to go about doing getOnebyValueOfProperty, the closest thing I could find on this site was: http://stackoverflow.com/questions/2157560/generic-linq-to-sql-query How can I pass in the column name and the value I am checking against generically using my current set-up? It seems like from that link it's impossible since using a where predicate because entity class doesn't know what any of the properties are until I pass them in. Lastly, I need some way of setting a new object as the return type set to the implementation type, in nhibernate (what I am trying to convert from) it was simply this line that did it: ImplentationType entity = default(ImplentationType); However default is an nhibernate command, how would I do this for LINQ? EDIT: getOne doesn't seem to work even when just going off the base class (this is a partial class of the auto generated LINQ classes). I even removed the generics. 
I tried: namespace ObjectsNS { public partial class Accomplishment { public Accomplishment getOneByWhereClause(Expression<Action<Accomplishment, bool>> singleOrDefaultClause) { Accomplishment entity = new Accomplishment(); if (singleOrDefaultClause != null) { LCFDataContext lcfdatacontext = new LCFDataContext(); //Generic LINQ Query Here entity = lcfdatacontext.Accomplishments.SingleOrDefault(singleOrDefaultClause); lcfdatacontext.Dispose(); } return entity; } } } Get the following error: Error 1 Overload resolution failed because no accessible 'SingleOrDefault' can be called with these arguments: Extension method 'Public Function SingleOrDefault(predicate As System.Linq.Expressions.Expression(Of System.Func(Of Accomplishment, Boolean))) As Accomplishment' defined in 'System.Linq.Queryable': Value of type 'System.Action(Of System.Func(Of LCFVB.ObjectsNS.Accomplishment, Boolean))' cannot be converted to 'System.Linq.Expressions.Expression(Of System.Func(Of LCFVB.ObjectsNS.Accomplishment, Boolean))'. Extension method 'Public Function SingleOrDefault(predicate As System.Func(Of Accomplishment, Boolean)) As Accomplishment' defined in 'System.Linq.Enumerable': Value of type 'System.Action(Of System.Func(Of LCFVB.ObjectsNS.Accomplishment, Boolean))' cannot be converted to 'System.Func(Of LCFVB.ObjectsNS.Accomplishment, Boolean)'. 14 LCF Okay no problem I changed: public Accomplishment getOneByWhereClause(Expression<Action<Accomplishment, bool>> singleOrDefaultClause) to: public Accomplishment getOneByWhereClause(Expression<Func<Accomplishment, bool>> singleOrDefaultClause) Error goes away. Alright, but now when I try to call the method via: Accomplishment accomplishment = new Accomplishment(); var result = accomplishment.getOneByWhereClause(x=>x.Id = 4) It doesn't work it says x is not declared. I also tried getOne, and various other Expression =(

    Read the article

  • Project help needed: great confusion about some basic concepts because of a lack of proper material. Please help.

    - by user287745
    Task ATTENDENCE RECORDER AND MANAGEMENT SYSTEM IN DISTRIBUTED ENVIRONMENT ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++ Example implementation needed. a main server in each lab where the operator punches in the attendence of the student. =========================================================== scenerio:- a college, 10 departments, all departments have a computer lab with 60-100 computers, the computers within each lab are interconnected and all computers in any department have to dail to a particular number (THE NUMBER GIVEN BY THE COLLEGE INTERNET DEPARTMENT) to get connected to the internet. therefore safe to assume that there is a central location to which all the computers in the college are connected to. there is a 'students attendence portal' which can be accessed using internet explorer, students enter there id and get the particular attendence record regarding to the labs only. a description of the working is like:- 1) the user will select which department, which year has arrived to the lab 2) the selection will give the user a return of all the students name and there roll numbers belonging to that department; 'with a check box to "TICK MARK IF THE STUDENT IS PRESENT" ' 3) A SUBMIT BUTTON when pressed reads the 'id' of the checkbox to determine the "particular count number of the student" from that an id of the student is constructed and that id is inserted with a present. (there is also date and time and much more to normalize the db and to avoid conflicts and keep historic records etc but that you will have to assume) steps taken till this date:- ( please note we are not computer students, we are to select something of some other line as a project!, as you will read in my many post 'i" have designed small websites just out of liking. have never ever done any thing official to implement like this.) * have made the database fully normalized. * have made the website which does the functions required on the database. Testing :- deployed the db and site on a free aspspider server and it worked. tested from several computers. Now the problem please help thank youuuuuuuu a practical demonstration has to be done within the college network. no internet! we have been assigned a lab - 60 computers- to demonstrate. (please dont give replies as 60 computers only! is not a big deal one CPU can manage it. i know that; IT IS A HYPOTHETICAL SITUATION WHERE WE ASSUME THAT 60 IS NOT 60 BUT ITS LIKE 60,000 COMPUTERS) 1a) make a web server, yes iis and put files in www folder and configure server to run aspx files- although a link to a step by step guide will be appreciated)\ ? which version of windows should i ask for xp or win server 2000 something? 2a) make a database server. ( well yes install sql server 2005, okay but then what? just put the database file on a pc share it and append the connection string to the share? ) 3a) make the site accessible from the remaining computers ? http://localhost/sitename ? all users "being operators of the particular lab" have the right to edit, write or delete(in dispute), thereby any "users" who hate our program can make the database inconsistent by accessing te same record and doing different edits and then complaining? so how to prevent this? you know something like when the db table is being written to others can only read but not write.. one big confusion:- IN DISTRIBUTED ENVIRONMENT "how to implement this" where does "distributed environment" come in! 
meaning :- alright the labs are in different departments but the "database server will be one" the "web server will be one" so whats distributed!?
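    On the concurrent-edit worry near the end: rather than trying to lock other operators out manually, the usual approach is to let the database serialise the writes inside a transaction. A minimal T-SQL sketch of the idea, with every table and column name invented for illustration:

      BEGIN TRANSACTION;

      IF EXISTS (SELECT 1 FROM attendance WITH (UPDLOCK, HOLDLOCK)
                 WHERE student_id = 42 AND lecture_id = 7)
          UPDATE attendance SET is_present = 1
          WHERE  student_id = 42 AND lecture_id = 7;
      ELSE
          INSERT INTO attendance (student_id, lecture_id, is_present)
          VALUES (42, 7, 1);

      COMMIT TRANSACTION;

    Two operators submitting the same student at the same time then queue on the lock instead of producing duplicate or conflicting rows, so the "others can only read while one is writing" behaviour comes from the engine rather than the application.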

    Read the article

  • Slow queries in Rails: not sure if my indexes are being used.

    - by Max Williams
    I'm doing a quite complicated find with lots of includes, which rails is splitting into a sequence of discrete queries rather than do a single big join. The queries are really slow - my dataset isn't massive, with none of the tables having more than a few thousand records. I have indexed all of the fields which are examined in the queries but i'm worried that the indexes aren't helping for some reason: i installed a plugin called "query_reviewer" which looks at the queries used to build a page, and lists problems with them. This states that indexes AREN'T being used, and it features the results of calling 'explain' on the query, which lists various problems. Here's an example find call: Question.paginate(:all, {:page=>1, :include=>[:answers, :quizzes, :subject, {:taggings=>:tag}, {:gradings=>[:age_group, :difficulty]}], :conditions=>["((questions.subject_id = ?) or (questions.subject_id = ? and tags.name = ?))", "1", 19, "English"], :order=>"subjects.name, (gradings.difficulty_id is null), gradings.age_group_id, gradings.difficulty_id", :per_page=>30}) And here are the generated sql queries: SELECT DISTINCT `questions`.id FROM `questions` LEFT OUTER JOIN `taggings` ON `taggings`.taggable_id = `questions`.id AND `taggings`.taggable_type = 'Question' LEFT OUTER JOIN `tags` ON `tags`.id = `taggings`.tag_id LEFT OUTER JOIN `subjects` ON `subjects`.id = `questions`.subject_id LEFT OUTER JOIN `gradings` ON gradings.question_id = questions.id WHERE (((questions.subject_id = '1') or (questions.subject_id = 19 and tags.name = 'English'))) ORDER BY subjects.name, (gradings.difficulty_id is null), gradings.age_group_id, gradings.difficulty_id LIMIT 0, 30 SELECT `questions`.`id` AS t0_r0 <..etc...> FROM `questions` LEFT OUTER JOIN `answers` ON answers.question_id = questions.id LEFT OUTER JOIN `quiz_questions` ON (`questions`.`id` = `quiz_questions`.`question_id`) LEFT OUTER JOIN `quizzes` ON (`quizzes`.`id` = `quiz_questions`.`quiz_id`) LEFT OUTER JOIN `subjects` ON `subjects`.id = `questions`.subject_id LEFT OUTER JOIN `taggings` ON `taggings`.taggable_id = `questions`.id AND `taggings`.taggable_type = 'Question' LEFT OUTER JOIN `tags` ON `tags`.id = `taggings`.tag_id LEFT OUTER JOIN `gradings` ON gradings.question_id = questions.id LEFT OUTER JOIN `age_groups` ON `age_groups`.id = `gradings`.age_group_id LEFT OUTER JOIN `difficulties` ON `difficulties`.id = `gradings`.difficulty_id WHERE (((questions.subject_id = '1') or (questions.subject_id = 19 and tags.name = 'English'))) AND `questions`.id IN (602, 634, 666, 698, 730, 762, 613, 645, 677, 709, 741, 592, 624, 656, 688, 720, 752, 603, 635, 667, 699, 731, 763, 614, 646, 678, 710, 742, 593, 625) ORDER BY subjects.name, (gradings.difficulty_id is null), gradings.age_group_id, gradings.difficulty_id SELECT count(DISTINCT `questions`.id) AS count_all FROM `questions` LEFT OUTER JOIN `answers` ON answers.question_id = questions.id LEFT OUTER JOIN `quiz_questions` ON (`questions`.`id` = `quiz_questions`.`question_id`) LEFT OUTER JOIN `quizzes` ON (`quizzes`.`id` = `quiz_questions`.`quiz_id`) LEFT OUTER JOIN `subjects` ON `subjects`.id = `questions`.subject_id LEFT OUTER JOIN `taggings` ON `taggings`.taggable_id = `questions`.id AND `taggings`.taggable_type = 'Question' LEFT OUTER JOIN `tags` ON `tags`.id = `taggings`.tag_id LEFT OUTER JOIN `gradings` ON gradings.question_id = questions.id LEFT OUTER JOIN `age_groups` ON `age_groups`.id = `gradings`.age_group_id LEFT OUTER JOIN `difficulties` ON `difficulties`.id = `gradings`.difficulty_id WHERE 
(((questions.subject_id = '1') or (questions.subject_id = 19 and tags.name = 'English'))) Actually, looking at these all nicely formatted here, there's a crazy amount of joining going on here. This can't be optimal surely. Anyway, it looks like i have two questions. 1) I have an index on each of the ids and foreign key fields referred to here. The second of the above queries is the slowest, and calling explain on it (doing it directly in mysql) gives me the following: +----+-------------+----------------+--------+---------------------------------------------------------------------------------+-------------------------------------------------+---------+------------------------------------------------+------+----------------------------------------------+ | id | select_type | table | type | possible_keys | key | key_len | ref | rows | Extra | +----+-------------+----------------+--------+---------------------------------------------------------------------------------+-------------------------------------------------+---------+------------------------------------------------+------+----------------------------------------------+ | 1 | SIMPLE | questions | range | PRIMARY,index_questions_on_subject_id | PRIMARY | 4 | NULL | 30 | Using where; Using temporary; Using filesort | | 1 | SIMPLE | answers | ref | index_answers_on_question_id | index_answers_on_question_id | 5 | millionaire_development.questions.id | 2 | | | 1 | SIMPLE | quiz_questions | ref | index_quiz_questions_on_question_id | index_quiz_questions_on_question_id | 5 | millionaire_development.questions.id | 1 | | | 1 | SIMPLE | quizzes | eq_ref | PRIMARY | PRIMARY | 4 | millionaire_development.quiz_questions.quiz_id | 1 | | | 1 | SIMPLE | subjects | eq_ref | PRIMARY | PRIMARY | 4 | millionaire_development.questions.subject_id | 1 | | | 1 | SIMPLE | taggings | ref | index_taggings_on_taggable_id_and_taggable_type,index_taggings_on_taggable_type | index_taggings_on_taggable_id_and_taggable_type | 263 | millionaire_development.questions.id,const | 1 | | | 1 | SIMPLE | tags | eq_ref | PRIMARY | PRIMARY | 4 | millionaire_development.taggings.tag_id | 1 | Using where | | 1 | SIMPLE | gradings | ref | index_gradings_on_question_id | index_gradings_on_question_id | 5 | millionaire_development.questions.id | 2 | | | 1 | SIMPLE | age_groups | eq_ref | PRIMARY | PRIMARY | 4 | millionaire_development.gradings.age_group_id | 1 | | | 1 | SIMPLE | difficulties | eq_ref | PRIMARY | PRIMARY | 4 | millionaire_development.gradings.difficulty_id | 1 | | +----+-------------+----------------+--------+---------------------------------------------------------------------------------+-------------------------------------------------+---------+------------------------------------------------+------+----------------------------------------------+ The query_reviewer plugin has this to say about it - it lists several problems: Table questions: Using temporary table, Long key length (263), Using filesort MySQL must do an extra pass to find out how to retrieve the rows in sorted order. To resolve the query, MySQL needs to create a temporary table to hold the result. The key used for the index was rather long, potentially affecting indices in memory 2) It looks like rails isn't splitting this find up in a very optimal way. Is it, do you think? Am i better off doing several find queries manually rather than one big combined one? Grateful for any advice, max
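    As a starting point for the indexing question, the EXPLAIN output suggests two composite indexes that line up with how the tables are actually probed: taggings is filtered on taggable_id plus taggable_type and then joined to tags via tag_id, while gradings is looked up by question_id and then sorted on age_group_id/difficulty_id. The column choices below are a guess from the WHERE/ORDER BY shown, so measure before and after:

      CREATE INDEX index_taggings_on_taggable_and_tag
          ON taggings (taggable_id, taggable_type, tag_id);

      CREATE INDEX index_gradings_on_question_age_difficulty
          ON gradings (question_id, age_group_id, difficulty_id);

    The "Using temporary; Using filesort" warning on questions, though, comes from ordering on columns spread across four joined tables, and no single index removes that; trimming the eager-loaded associations is the more likely win there.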

    Read the article

  • Accessing Oracle 6i and 9i/10g Databases using C#

    - by Mike M
    Hi all, I am making two build files using NAnt. The first aims to automatically compile Oracle 6i forms and reports and the second aims to compile Oracle 9i/10g forms and reports. Within the NAnt task is a C# script which prompts the developer for database credentials (username, password, database) in order to compile the forms and reports. I want to then run these credentials against the relevant database to ensure the credentials entered are correct and, if they are not, prompt the user to re-enter their credentials. My script currently looks as follows: class GetInput { public static void ScriptMain(Project project) { Console.Clear(); Console.WriteLine("==================================================================="); Console.WriteLine("Welcome to the Compile and Deploy Oracle Forms and Reports Facility"); Console.WriteLine("==================================================================="); Console.WriteLine(); Console.WriteLine("Please enter the acronym of the project to work on from the following list:"); Console.WriteLine(); Console.WriteLine("--------"); Console.WriteLine("- BCS"); Console.WriteLine("- COPEN"); Console.WriteLine("- FCDD"); Console.WriteLine("--------"); Console.WriteLine(); Console.Write("Selection: "); project.Properties["project.type"] = Console.ReadLine(); Console.WriteLine(); Console.Write("Please enter username: "); string username = Console.ReadLine(); project.Properties["username"] = username; string password = ReturnPassword(); project.Properties["password"] = password Console.WriteLine(); Console.Write("Please enter database: "); string database = Console.ReadLine(); project.Properties["database"] = database Console.WriteLine(); //Call method to verify user credentials Console.WriteLine(); Console.WriteLine("Compiling files..."; } public static string ReturnPassword() { Console.Write("Please enter password: "); string password = ""; ConsoleKeyInfo nextKey = Console.ReadKey(true); while (nextKey.Key != ConsoleKey.Enter) { if (nextKey.Key == ConsoleKey.Backspace) { if (password.Length > 0) { password = password.Substring(0, password.Length - 1); Console.Write(nextKey.KeyChar); Console.Write(" "); Console.Write(nextKey.KeyChar); } } else { password += nextKey.KeyChar; Console.Write("*"); } nextKey = Console.ReadKey(true); } return password; } } Having done a bit of research, I find that you can connect to Oracle databases using the System.Data.OracleClient namespace clicky. However, as mentioned in the link, Microsoft is discontinuing support for this so it is not a desireable solution. I have also fonud that Oracle provides its own classes for connecting to Oracle databases clicky. However, this only seems to support connecting to Oracle 9 or newer databases (clicky) so it is not feasible solution as I also need to connect to Oracle 6i databases. I could achieve this by calling a bat script from within the C# script, but I would much prefer to have a single build file for simplicity. Ideally, I would like to run a series of commands such as is contained in the following .bat script: rem -- Set Database SID -- set ORACLE_SID=%DBSID% sqlplus -s %nameofuser%/%password%@%dbsid% set cmdsep on set cmdsep '"'; --" set term on set echo off set heading off select '========================================' || CHR(10) || 'Have checked and found both Password and ' || chr(10) || 'Database Identifier are valid, continuing ...' 
|| CHR(10) || '========================================' from dual; exit; This requires me to set the environment variable of ORACLE_SID and then run sqlplus in silent mode (-s) followed by a series of sql set commands (set x), the actual select statement and an exit command. Can I achieve this within a c# script without calling a bat script, or am I forced to call a bat script? Thanks in advance!
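    Whichever client library ends up issuing it, the credential check itself can stay a tiny piece of SQL*Plus-friendly SQL, so the same probe works for both the 6i and 9i/10g homes. A minimal sketch:

      WHENEVER SQLERROR EXIT FAILURE
      SET HEADING OFF
      SELECT 'CREDENTIALS_OK' FROM dual;
      EXIT SUCCESS

    Launching sqlplus with the -L option makes it give up after one failed logon instead of re-prompting, so the calling C# code only has to check the process exit code and look for the marker line to decide whether to ask the developer for credentials again.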

    Read the article

  • Data won't save to SQL database, getting error "close() was never explicitly called on database"

    - by SnowLeppard
    I have a save button in the activity where the user enters data which does this: String subjectName = etName.getText().toString(); String subjectColour = etColour.getText().toString(); SQLDatabase entrySubject = new SQLDatabase(AddSubject.this); entrySubject.open(); entrySubject.createSubjectEntry(subjectName, subjectColour); entrySubject.close(); Which refers to this SQL database class: package com.***.schooltimetable; import android.content.ContentValues; import android.content.Context; import android.database.Cursor; import android.database.SQLException; import android.database.sqlite.SQLiteDatabase; import android.database.sqlite.SQLiteOpenHelper; public class SQLDatabase { public static final String KEY_SUBJECTS_ROWID = "_id"; public static final String KEY_SUBJECTNAME = "name"; public static final String KEY_COLOUR = "colour"; private static final String DATABASE_NAME = "Database"; private static final String DATABASE_TABLE_SUBJECTS = "tSubjects"; private static final int DATABASE_VERSION = 1; private DbHelper ourHelper; private final Context ourContext; private SQLiteDatabase ourDatabase; private static class DbHelper extends SQLiteOpenHelper { public DbHelper(Context context) { super(context, DATABASE_NAME, null, DATABASE_VERSION); // TODO Auto-generated constructor stub } @Override public void onCreate(SQLiteDatabase db) { // TODO Auto-generated method stub db.execSQL("CREATE TABLE " + DATABASE_TABLE_SUBJECTS + " (" + KEY_SUBJECTS_ROWID + " INTEGER PRIMARY KEY AUTOINCREMENT, " + KEY_SUBJECTNAME + " TEXT NOT NULL, " + KEY_COLOUR + " TEXT NOT NULL);"); } @Override public void onUpgrade(SQLiteDatabase db, int oldVersion, int newVersion) { // TODO Auto-generated method stub db.execSQL("DROP TABLE IF EXISTS " + DATABASE_TABLE_SUBJECTS); onCreate(db); } } public SQLDatabase(Context c) { ourContext = c; } public SQLDatabase open() throws SQLException { ourHelper = new DbHelper(ourContext); ourDatabase = ourHelper.getWritableDatabase(); return this; } public void close() { ourHelper.close(); } public long createSubjectEntry(String subjectName, String subjectColour) { // TODO Auto-generated method stub ContentValues cv = new ContentValues(); cv.put(KEY_SUBJECTNAME, subjectName); cv.put(KEY_COLOUR, subjectColour); return ourDatabase.insert(DATABASE_TABLE_SUBJECTS, null, cv); } public String[][] getSubjects() { // TODO Auto-generated method stub String[] Columns = new String[] { KEY_SUBJECTNAME, KEY_COLOUR }; Cursor c = ourDatabase.query(DATABASE_TABLE_SUBJECTS, Columns, null, null, null, null, null); String[][] Result = new String[1][]; // int iRow = c.getColumnIndex(KEY_LESSONS_ROWID); int iName = c.getColumnIndex(KEY_SUBJECTNAME); int iColour = c.getColumnIndex(KEY_COLOUR); for (c.moveToFirst(); !c.isAfterLast(); c.moveToNext()) { Result[0][c.getPosition()] = c.getString(iName); Result[1][c.getPosition()] = c.getString(iColour); Settings.subjectCount = c.getPosition(); TimetableEntry.subjectCount = c.getPosition(); } return Result; } This class has other variables and other variations of the same methods for multiple tables in the database, i've cut out the irrelevant ones. I'm not sure what I need to close and where, I've got the entrySubject.close() in my activity. I used the methods for the database from the NewBoston tutorials. Can anyone see what I've done wrong, or where my problem is? Thanks.

    Read the article

  • "Row not found or changed" Problem

    - by winston schröder
    Hi there, I'm working on a SQL CE Database and get into the "Row nor found or changed" exception. The exception only occurs when I try to update. On the first Run after the insert it shows up a MemberChangeConflict which says, that my Column Created_at has in all three values (current, original, database) the same. But in a second attempt it doesn't appear anymore. The DataContext is instanciated on Startup and freed on Exit of my Local(!) Application. I use a sqlmetal generated mapping and code file. In the map I added some Associations and set the timemstamp columns UpdateCheck property to Always while all other have the setting never. The Timestamp Column is marked as isVersion="true", the Id Column as Primary Key. Since I don't dispose the datacontext I expected to be using implicit transaction. When I run the SubmitChanges Method within a TransactionScope. Can anyone tell me how I can update the timestamp within the code ? I know about the Problems one has to deal with if you dispose the datacontext. So I decided not to do this since I use a Single User Local DB Cache File. (I did already use a version where I disposed the datacontext after every usage, but this version had a real bad performance and error rate, so I decided to choose the other variant.) LibDB.Client.Vehicles tmp = null; try { tmp = e.Parameter as LibDB.Client.Vehicles; if (tmp == null) return; if (!this._dc.Vehicles.Contains(tmp)) { this._dc.Vehicles.Attach(tmp); } this.ShowChangesReport(this._dc.GetChangeSet()); using (TransactionScope ts = new TransactionScope()) { try { this._dc.SubmitChanges(); ts.Complete(); } catch (ChangeConflictException cce) { Console.WriteLine("Optimistic concurrency error."); Console.WriteLine(cce.Message); Console.ReadLine(); foreach (ObjectChangeConflict occ in this._dc.ChangeConflicts) { MetaTable metatable = this._dc.Mapping.GetTable(occ.Object.GetType()); LibDB.Client.Vehicles entityInConflict = (LibDB.Client.Vehicles)occ.Object; Console.WriteLine("Table name: {0}", metatable.TableName); Console.Write("Vin: "); Console.WriteLine(entityInConflict.Vin); foreach (MemberChangeConflict mcc in occ.MemberConflicts) { object currVal = mcc.CurrentValue; object origVal = mcc.OriginalValue; object databaseVal = mcc.DatabaseValue; MemberInfo mi = mcc.Member; Console.WriteLine("Member: {0}", mi.Name); Console.WriteLine("current value: {0}", currVal); Console.WriteLine("original value: {0}", origVal); Console.WriteLine("database value: {0}", databaseVal); } throw cce; } } catch (Exception ex) { this.ShowChangeConflicts(this._dc.ChangeConflicts); Console.WriteLine(ex.Message); } } this.ShowChangesReport(this._dc.GetChangeSet());
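    On the closing question about updating the timestamp from code: a rowversion/timestamp column cannot be assigned by the application at all; the engine bumps it automatically on every update, which is what makes it usable as the isVersion column. A small sketch of that behaviour (SQL Server syntax, with the table name and values guessed from the entity purely for illustration):

      ALTER TABLE Vehicles ADD RowTimestamp rowversion;

      -- any update changes RowTimestamp; the new value can simply be read back
      UPDATE Vehicles SET Vin = 'TESTVIN0000000001' WHERE Id = 1;
      SELECT Id, RowTimestamp FROM Vehicles WHERE Id = 1;

    So the practical fix is usually not to write the version column but to make sure the attached entity carries the current database values (or is re-fetched on the same DataContext) before SubmitChanges, so the concurrency check compares like with like.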

    Read the article
