Search Results

Search found 20931 results on 838 pages for 'mysql insert'.


  • Rails + RSpec problem

    - by FancyDancy
    I have just installed RSpec and rspec-rails. When I try to run the test, it says:

        rake aborted!
        Command /opt/local/bin/ruby -I"lib" "/opt/local/lib/ruby/gems/1.8/gems/rspec-1.3.0/bin/spec" "spec/controllers/free_controller_spec.rb" --options "/Volumes/Trash/dev/app/trunk/spec/spec.opts" failed

    Full log here: http://pastie.org/939211 However, my second "test" application with SQLite works with it, so I think the problem is in my DB. My Ruby version is 1.8.7; I use MySQL as the database. My files: specs/spec_helper.rb, config/environment.rb, config/environments/test.rb, and the list of my gems. My test is just:

        require 'spec_helper'

        describe FreeController do
          it "should respond with success" do
            get 'index'
            response.should be_success
          end
        end

    I really can't understand the error, so I don't know how to fix it. Additional question: should I use fixtures and ActiveRecord if I am going to use Machinist for creating test data? What should I do to disable them?

    Read the article

  • SQL CREATE TABLE Error

    - by Adam M-W
    Hi, I've been stuck on this one simple(ish) thing for the last half hour, so I thought I might try to get a quick answer here. What exactly is incorrect about my SQL syntax, assuming I'm using MySQL 5.1?

        CREATE TABLE 'users' (
          'id' MEDIUMINT(8) UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          'username' VARCHAR(20) NOT NULL,
          'password' VARCHAR(40) NOT NULL,
          'salt' VARCHAR(40) DEFAULT NULL,
          'email' VARCHAR(80) NOT NULL,
          'created_on' INT(11) UNSIGNED NOT NULL,
          'last_login' INT(11) UNSIGNED DEFAULT NULL,
          'active' TINYINT(1) UNSIGNED DEFAULT NULL,
        ) ENGINE InnoDB;

    Also, does anyone have any good tutorials about how to use Zend_Auth for complete noobs? Thanks.
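    (For reference, a likely fix: MySQL quotes identifiers with backticks rather than single quotes, and the trailing comma after the last column definition is also a syntax error. A corrected sketch of the same statement:)

        CREATE TABLE `users` (
          `id` MEDIUMINT(8) UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          `username` VARCHAR(20) NOT NULL,
          `password` VARCHAR(40) NOT NULL,
          `salt` VARCHAR(40) DEFAULT NULL,
          `email` VARCHAR(80) NOT NULL,
          `created_on` INT(11) UNSIGNED NOT NULL,
          `last_login` INT(11) UNSIGNED DEFAULT NULL,
          `active` TINYINT(1) UNSIGNED DEFAULT NULL
        ) ENGINE=InnoDB;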

    Read the article

  • Suggestions for performance improvement surrounding sending email notifications?

    - by jcmoney
    It takes around a couple of seconds for my app to execute the code that sends an email right now, on a test server with nothing much else running. I'm not sure if this is typical/expected. I'm also using the Kohana PHP framework's email helper rather than PHP's mail() directly, out of convenience, if that matters. Is it always just better to schedule a cron job to send emails every 5 minutes or so? Or should I be able to send emails immediately, and I'm just not doing something right? What the script does is insert a row into the DB and notify the relevant group that the row was created. The groups are usually < 20 people, so I just loop over the members of the group, calling Kohana's email helper for each one.
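    (One common way to take the sending off the request path is the cron approach the question mentions: the request only queues the notification, and a scheduled job sends whatever is pending. A minimal sketch of such a queue table in MySQL — table and column names are hypothetical:)

        CREATE TABLE email_queue (
          id         INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
          recipient  VARCHAR(255) NOT NULL,
          subject    VARCHAR(255) NOT NULL,
          body       TEXT NOT NULL,
          created_at DATETIME NOT NULL,
          sent_at    DATETIME DEFAULT NULL,
          KEY idx_unsent (sent_at)
        ) ENGINE=InnoDB;

        -- the web request only inserts rows (fast);
        -- the cron job periodically picks up unsent rows, sends them, and marks them:
        SELECT id, recipient, subject, body FROM email_queue WHERE sent_at IS NULL LIMIT 100;
        UPDATE email_queue SET sent_at = NOW() WHERE id = 123;   -- after a successful send (123 is a placeholder id)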

    Read the article

  • Javascript XMLHttpRequests in Loop?

    - by usurper
    Hi, I am trying to save an array of records into a MySQL database, but I always get the abort message in Firebug, except for the last save. How do I save the records using a loop for XMLHttpRequest? Here is my code:

        function savingContent() {
          if (window.XMLHttpRequest) { // code for IE7+, Firefox, Chrome, Opera, Safari
            xmlhttp = new XMLHttpRequest();
          } else { // code for IE6, IE5
            xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
          }
          var rowindex = 0;
          for (x in globalObj.AddedRows) {
            var rowData = "?q=" + globalObj.AddedRows[rowindex];
            xmlhttp.open("POST", "insertRowData.php" + rowData, true);
            xmlhttp.setRequestHeader("Content-Type", "application/x-www-form-urlencoded");
            xmlhttp.setRequestHeader("Content-Length", rowData.length);
            xmlhttp.send(null);
            rowindex += 1;
          }

    Read the article

  • What to do with Twitter OAuth token once retrieved?

    - by mcintyre321
    I'm writing a web app that will use Twitter as its primary log-on method. I've written code which gets the OAuth token back from Twitter. My plan now is to:

    1. Find the entry in my Users table for the Twitter username retrieved using the token, or create the entry if necessary.
    2. Update the Users.TwitterOAuthToken column with the new OAuth token.
    3. Create a permanent cookie with a random GUID on the site and insert a record into my UserCookies table matching cookie to user.
    4. When a request comes in, look for the browser cookie ID in the UserCookies table, then use that to figure out the user and make Twitter requests on their behalf.
    5. Write the OAuth token into some pages as a JS variable so that JavaScript can make requests on behalf of the user.

    If the user clears his/her cookies, the user will have to log in to Twitter again. Is this the correct process? Have I created any massive security holes? Thanks!

    Read the article

  • Violation of primary key constraint, multiple users

    - by MC.
    Let's say UserA and UserB both have an application open and are working with the same type of data. UserA inserts a record into the table with value 10 (PrimaryKey='A'). UserB does not currently see the value UserA entered and attempts to insert a new value of 20 (PrimaryKey='A'). What I wanted in this situation was a DBConcurrencyException, but what I get instead is a primary key violation. I understand why, but I have no idea how to resolve this. What is a good practice for dealing with such a circumstance? I do not want to merge before updating the database, because I want an error to inform the user that multiple users updated this data.

    Read the article

  • format.js response doesn't execute

    - by Denis
    Hi, I use Rails 3.0.0 beta. Following an action, my RJS view returns JavaScript, but the code is not executed. In Firebug I see this in my response:

        $('polaroids').insert("<li>\n <a title=\"4204497503_a0c43c561d.jpg\" href=\"#\">\n <img alt=\"4204497503_a0c43c561d.jpg\" src=\"/system/photos/279/original/4204497503_a0c43c561d.jpg?1268857318\" />\n <\/a>\n<\/li>")

    Here are the response headers:

        Etag: "7343b21b2f062fb74b7d5f32e3a83c2c"
        Connection: Keep-Alive
        Content-Type: text/javascript; charset=utf-8
        Date: Wed, 17 Mar 2010 20:21:58 GMT
        Server: WEBrick/1.3.1 (Ruby/1.8.7/2008-08-11)
        X-Runtime: 0.060497
        Content-Length: 220
        Cache-Control: max-age=0, private, must-revalidate
        Set-Cookie: _photos_session=BAh7ByIQX2NzcmZfdG9rZW4iMS9OWnpOZUR6UGQ2UDhvbGt5YWpTWXhJcFR2YjRHOEhzZHlIbmdMblRlMWs9Ig9zZXNzaW9uX2lkIiUxNjlhOWYzNjQxODE2N2NjN2FiNmYzY2VkYmU3OTgwYQ%3D%3D--022d7202178b2cc7bf968e558c2ae67ecef1fb74; path=/; HttpOnly

    Read the article

  • Vista Basic theme ribbon issue

    - by Alain Rist
    Under Vista, when in the Basic theme, after calling IUIFramework::Destroy() the Vista theme is lost, and enlarging the window does not display outside of the initial area. You can reproduce it easily with the SimpleRibbon SDK sample. In simpleribbon.cpp, insert this in the WndProc switch block:

        case WM_KEYUP:
            DestroyFramework();
            InvalidateRect(hWnd, NULL, TRUE);
            break;

    Compile, run, hit a key, and try to enlarge the window in the Vista Basic theme (no problem in Win7, Vista Aero, or Windows Classic). How can I work around this? Cheers, AR

    Read the article

  • Are frameworks really necessary for beginners/intermediates? (PHP)

    - by ggfan
    I have been programming for around 6 months and am currently learning PHP/MySQL. I can create basic functional sites starting from a plain sheet of paper. Is it necessary that I use frameworks to create sites? Currently, everything I do is from scratch; I'll borrow code from old projects, ask people for help, etc. Are frameworks going to help me much more? Is it all right to put a site up publicly without using a framework? (I have not looked a lot into frameworks, so my knowledge is limited, but I'm just curious.)

    Read the article

  • LINQ to SQL auto-generated type for stored procedure

    - by StuffHappens
    Hello. I have the following stored procedure:

        ALTER PROCEDURE [dbo].Test
        AS
        BEGIN
            CREATE TABLE ##table
            (
                ID1 int,
                ID2 int
            )
            DECLARE @query varchar(MAX);
            INSERT INTO ##table VALUES(1, 1);
            SELECT * FROM ##table;
        END

    And I try to use it from C# code. I use LINQ to SQL as an O/RM. When I add the procedure to the DataBaseContext it says that it can't figure out the return value of this procedure. How do I modify the stored procedure so that I can use it with LINQ to SQL? Note: I need to have a global temporary table!

    Read the article

  • Can I use Linq to project a new typed datarow?

    - by itchi
    I currently have a CSV file that I'm parsing with an example from here: http://alexreg.wordpress.com/2009/05/03/strongly-typed-csv-reader-in-c/ I then want to loop over the records and insert them into an Oracle database using a typed DataSet (XSD). It's not that difficult, something like:

        foreach (var csvItem in csvfile)
        {
            DataSet.MYTABLEDataTable DT = new DataSet.MYTABLEDataTable();
            DataSet.MYTABLERow row = DT.NewMYTABLERow();
            row.FIELD1 = csvItem.FIELD1;
            row.FIELD2 = csvItem.FIELD2;
        }

    I was wondering how I would do something with a LINQ projection:

        var test = from csvItem in csvfile
                   select new MYTABLERow
                   {
                       FIELD1 = csvItem.FIELD1,
                       FIELD2 = csvItem.FIELD2
                   }

    But I don't think I can create DataRows like this without the use of a row builder, or maybe a better constructor for the DataRow?

    Read the article

  • Seeking FOSS user admin code

    - by Mawg
    It must be a fairly standard wheel, so I'd rather not reinvent it: create/modify/delete users, ditto their passwords, and maybe enforce a password change every X days. Also, create groups, like "sales", "support", etc., and add/remove users. The only unique part should be what they have permission to do (visit certain parts of the site after login, etc.). And I'd like to store the admin data in an ODBC-compliant database (MySQL to start with, but I may move on). Is this a new wheel? There doesn't seem to be much of anything on SourceForge, but if I could find something established and trusted I wouldn't even mind paying a few hundred dollars as a trade-off for the time needed to develop and test it.

    Read the article

  • How can I test my DB speed? (Learning)

    - by acidzombie24
    I have designed a database. There are no indexed columns, nor any code for optimizing. I am positive I should index certain columns, since I search them a lot. My question is: HOW do I test whether any part of my database will be slow? At the moment I am using SQLite, and I will be switching to either MS SQL or MySQL based on my host provider. Will creating 100,000 records in each table be enough? Or will that always be fast in SQLite, and I need to do 1 million? Do I need 10 million before a database will become slow? Also, how do I time it? I am using C#, so should I use Stopwatch, or is there an ADO.NET/SQLite function I should use?
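    (If the database ends up on SQL Server, one way to time queries on the server side, independent of the C# code, is to generate a pile of test rows and compare a search before and after adding an index. A rough sketch, assuming a hypothetical Posts table with a Title column being searched:)

        -- generate one million hypothetical test rows (SQL Server syntax)
        ;WITH n AS (
            SELECT TOP (1000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) AS i
            FROM sys.all_objects a CROSS JOIN sys.all_objects b
        )
        INSERT INTO Posts (Title)
        SELECT 'title ' + CAST(i AS varchar(10)) FROM n;

        SET STATISTICS TIME ON;   -- report elapsed/CPU time for each statement
        SET STATISTICS IO ON;     -- report logical reads (page accesses)

        SELECT * FROM Posts WHERE Title = 'title 500000';   -- before indexing

        CREATE INDEX IX_Posts_Title ON Posts (Title);

        SELECT * FROM Posts WHERE Title = 'title 500000';   -- after indexing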

    Read the article

  • Processing an XML file with huge data

    - by Manish Dhanotiya
    Hi, I am working on an application which has the requirements below:

    1. Download a ZIP file from a server.
    2. Uncompress the ZIP file and get the content (which is in XML format) from this file into a String.
    3. Pass this content into another method for parsing and further processing.

    Now, my concern here is that the XML file may be huge, say 100 MB, and my JVM has only 512 MB of memory, so how can I get this content in chunks, pass it for parsing, and then insert the data into PL/SQL tables? Since there can be multiple requests running at the same time, and considering the 512 MB of memory, what will be the best possible way to process this? How can I get the data in chunks and pass it as a stream for XML parsing? I googled this but didn't find any implementation. Thanks,

    Read the article

  • Move data from one table to another, PostgreSQL edition

    - by IggShaman
    Hi all, I'd like to move some data from one table to another (with a possibly different schema). The straightforward solution that comes to mind is to start a transaction with serializable isolation level and run:

        INSERT INTO dest_table SELECT data FROM orig_table, other-tables WHERE <condition>;
        DELETE FROM orig_table USING other-tables WHERE <condition>;
        COMMIT;

    Now what if the amount of data is rather big, and the <condition> is expensive to compute? In PostgreSQL, a RULE or a stored procedure can be used to delete data on the fly, evaluating the condition only once. Which solution is better? Are there other options?
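    (In newer PostgreSQL versions, 9.1 and later, a data-modifying CTE can also do the move in a single statement, so the condition is evaluated only once. A sketch, keeping the question's placeholders and assuming the column lists of the two tables line up:)

        WITH moved AS (
            DELETE FROM orig_table
            USING other-tables
            WHERE <condition>
            RETURNING orig_table.*
        )
        INSERT INTO dest_table
        SELECT * FROM moved;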

    Read the article

  • Database system that is not relational.

    - by paan
    What are the other types of database systems out there? I recently came across CouchDB, which handles data in a non-relational way. It got me thinking about what other models people are using. So, I want to know what other types of data model are out there. (I'm not looking for any specifics; I just want to look at how other people are handling data storage. My interest is purely academic.) The ones I already know are:

    - RDBMS (MySQL, Postgres, etc.)
    - Document-based approach (CouchDB, Lotus Notes)
    - Key/value pair (BerkeleyDB)

    Read the article

  • How should I build a gaming community

    - by Przystojny
    I've been wanting to build my own gaming community site (like http://fragbite.com) for a long time. I have started many times but just quit after a couple of days because it gets very messy. I've been playing around with PHP and MySQL off and on for 3 years, but I've never gotten into OOP; I have tried, but I usually end up with the "old PHP". I usually build my pages so that I include a file on top of all pages with the necessary functions, HTML head, etc., and I mix PHP and HTML together, which I don't mind, but if I eventually got a designer I think he would not like it. I have tried both CakePHP and CodeIgniter and all those popular MVCs, but it's just too much, like they do all the work; I want to do it myself, but I don't know where to start. What would you do if you were me? Is there maybe some non-OOP MVC? (Sorry for my English.)

    Read the article

  • If I take a large datatype, will it affect performance in SQL Server?

    - by Shantanu Gupta
    If I use a larger datatype where I know a smaller one would have been sufficient for the possible values I will insert into a table, will it affect performance in SQL Server in terms of speed or in any other way? E.g. IsActive (0, 1, 2, 3), not more than 3 in any case. I know I should use tinyint, but due to some reasons (consider it a compulsion) I am making every numeric field bigint and every character field nvarchar(max). Please give statistics if possible, to let me try to overcome that compulsion. I need some solid analysis that can really make someone rethink before choosing any datatype.
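    (Some of the fixed storage sizes involved in SQL Server: tinyint is 1 byte, int is 4 bytes, bigint is 8 bytes, and nvarchar stores 2 bytes per character, with extra overhead for the (max) variants, which also cannot be used as index key columns. Wider rows mean fewer rows fit on each 8 KB page, so the same scan reads more pages. A minimal sketch of the two extremes, with hypothetical table names, that can be compared with sp_spaceused once both are loaded with the same data:)

        -- narrow: the type matches the domain of the data
        CREATE TABLE StatusNarrow (
            Id       int     NOT NULL,
            IsActive tinyint NOT NULL      -- 1 byte; values 0-3 fit easily
        );

        -- wide: every numeric field bigint, every character field nvarchar(max)
        CREATE TABLE StatusWide (
            Id       bigint        NOT NULL,   -- 8 bytes
            IsActive bigint        NOT NULL,   -- 8 bytes for a value that never exceeds 3
            Note     nvarchar(max) NULL        -- LOB overhead; cannot be an index key column
        );

        -- after loading identical data into both:
        EXEC sp_spaceused 'StatusNarrow';
        EXEC sp_spaceused 'StatusWide';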

    Read the article

  • How to build a Firefox extension to intercept HTTP requests and responses?

    - by didizingo
    Hi, how do I attach a listener to Firefox HTTP requests and responses, so that I can pop up a window with the requested address and the response body? Note: I have to do this by building a Firefox extension. I need a button to activate or disable the feature. For every request, I need to pop up a window with an "Ok" button to allow the request to be made. Likewise, I need to pop up a window with the response body from the web server, with an "Ok" button to allow the content to be displayed by the browser. I know that I have to use nsIHttpChannel, as shown here, but I don't know where to put such code in the extension's architecture. I have very little knowledge of JavaScript. Could anyone help me?

    Read the article

  • Linq To Sql Entity Updated from Trigger

    - by James Helms
    I have a table called Address. I have a trigger for insert on that table that does some spatial calculations on the address and determines what neighborhood boundaries it is in.

        address = new Address
        {
            Street = this.Street,
            City = this.City,
            State = this.State,
            ZipCode = this.ZipCode,
            latitude = this.Latitude,
            longitude = this.Longitude,
            YearBuilt = this.YearBuilt,
            LotSize = this.LotSize,
            FinishedSize = this.FinishedSize,
            Bedrooms = this.Bedrooms,
            Bathrooms = this.Bathrooms,
            UseCode = this.UseCode,
            HOA = this.HOA,
            UpdateDate = DateTime.Now
        };
        db.AddToAddresses(address);
        db.SaveChanges();

    In the database I can clearly see that the trigger ran and updated the neighborhoodID in the Address table for the row. I tried to just reload that record to get the assigned ID, like this:

        address = (from a in db.Addresses
                   where a.AddressID == address.AddressID
                   select a).First();

    In the debugger I can clearly see that address.AddressID is correct, but the entity doesn't update in memory. Is there any workaround for this?

    Read the article

  • Is AngularJS capable of filtering based on keywords?

    - by Alex90
    I have 30 categories in MySQL. I have 450 subcategories which are related to the 30 categories in another table.

    Categories table:

        id   title     keywords
        1    Animals   animal, animals, pet, parrot
        2    Books     books, book, educational
        n    xxx       xxx

    Subcategories table:

        id   ref   title     keywords
        1    1     cats      cats, persian cat, bengal cat
        2    1     dogs      dogs, labrador, golden retriever
        3    2     Classic   The davinci code, books, book, classical books

    I need to implement the filter on a text field. If the user enters "labrador" in the text field, then show the categories and/or subcategories which contain "labrador" in their keywords. In this case the "dogs" subcategory would appear! I know that this has been done using jQuery, but is there any way to implement this with AngularJS? If you have a jsfiddle, that would be awesome! Thank you.

    Read the article

  • Sorting 1000-2000 elements with many cache misses

    - by Soylent Graham
    I have an array of 1000-2000 elements which are pointers to objects. I want to keep my array sorted, and obviously I want to do this as quickly as possible. They are sorted by a member and not allocated contiguously, so assume a cache miss whenever I access the sort-by member. Currently I'm sorting on-demand rather than on-add, but because of the cache misses and [presumably] non-inlining of the member access, the inner loop of my quicksort is slow. I'm doing tests and trying things now (to see what the actual bottleneck is), but can anyone recommend a good alternative to speed this up? Should I do an insertion sort instead of quicksorting on-demand, or should I try to change my model to make the elements contiguous and reduce cache misses? Or is there a sort algorithm I've not come across which is good for data that is going to cache-miss?

    Read the article

  • How can I move a table to another filegroup?

    - by denisioru
    Hello, I have MS SQL 2008 Enterprise and an OLTP database with two big tables. How can I move these tables to another filegroup without interrupting service? Right now, about 100-130 records are inserted and 30-50 records updated each second in these tables. Each table has about 100M records and six fields (including one geography field). I looked for a solution via Google, but all the solutions amount to "create a second table, insert the rows from the first table, drop the first table", and so on. Can I use partitioning functions to solve this problem? Thank you.
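    (For what it's worth, the usual way to relocate a table in SQL Server is to rebuild its clustered index on the target filegroup, which moves the data pages with it. A sketch, assuming a hypothetical table BigTable with clustered primary key column Id and a target filegroup named FG_NEW; note that an ONLINE rebuild, an Enterprise feature, may not be available here, since SQL Server 2008 does not allow online index operations when the table contains LOB-type columns such as geography:)

        -- rebuild the clustered index on the new filegroup; the table's data moves with it
        CREATE UNIQUE CLUSTERED INDEX PK_BigTable
            ON dbo.BigTable (Id)
            WITH (DROP_EXISTING = ON, ONLINE = OFF)   -- ONLINE = ON only if the table has no LOB columns
            ON [FG_NEW];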

    Read the article

  • jQuery post to PHP

    - by RussP
    Why is it that I can never get jQuery serialize to work properly? I guess I must be missing something. I can serialize the form data and it shows in an alert:

        var forminfo = $j('#frmuserinfo').serialize();
        alert(forminfo);

    I then post to my PHP page thus:

        $j.ajax({
            type: "POST",
            url: "cv-user-process.php",
            data: "forminfo="+forminfo,
            cache: false,
            complete: function(data) {
            }
        });

    But WHENEVER (not the first time) I try to insert/update the data in the DB, I only ever get 1 variable passed. Here is my PHP script:

        $testit = mysql_query("UPDATE cv_usersmeta SET inputtest='".$_POST['forminfo']."' WHERE user='X'");

    The data passed only ever gets the first variable. Why? I think it is more the way I deal with the PHP, but it drives me nuts and always takes me far too long to find where I am going wrong.

    Read the article

  • How to tell when a 'runas' execution has finished?

    - by Radek
    I use ruby 1.9.3p194 (2012-04-20) [i386-mingw32] on Windows 7. To do a MySQL backup I run:

        runas /savecred /user:yogurt\administrator "cmd.exe /k mysqldump --user=#{dbuser} --password=#{dbpassword} #{dbname} > #{dump}"

    mysqldump must be executed as administrator, and I do not run my Ruby scripts under the administrator account. runas starts a new cmd.exe and Ruby doesn't wait for it to finish. The dump process takes about a minute. After that I zip the dump file and delete it, but I have to make sure that the dump process has already finished before I do any other action on that file. Right now I use sleep(60), which works, but I wonder if there is a better, more systematic solution.

    Read the article
