Search Results

Search found 63884 results on 2556 pages for 'mysql error 1064'.

Page 567 of 2556

  • php search database for row

    - by Brenden Morley
    Okay, I've got code that pulls data based on a user's account number. Here is what I'm using (and yes, I know it isn't safe; that is the reason for my post):

        <?php
        include('config.php');
        $user_info = fetch_user_info($_GET['AccountNumber']);
        ?>
        <html>
        <body>
        <div>
        <?php
        if ($user_info === false) {
            $Output = 'http://www.MyDomain.Com/';
            echo '<META HTTP-EQUIV=Refresh CONTENT="0; URL=' . $Output . '">';
        } else {
        ?>
        <center>
        <title><?php echo $user_info['FirstName'], ' ', $user_info['LastName'], ' - ', $user_info['City'], ', ', $user_info['State']; ?> - Name of site</title>

    Basically, this code lets me have a file called Profile.php, and when a user visits that page it returns the data, like this: http://MyDomain.com/Profile.php?AccountNumber=50b9c965b7c3b How can I do this securely? Right now it uses a GET method, which feels really unsafe, to retrieve the account number from the URL bar.
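
    A minimal sketch of a safer lookup is below. It keeps the GET parameter but validates it and uses a mysqli prepared statement; the users table and its column names are assumptions, since the original fetch_user_info() isn't shown.

        <?php
        // Sketch only: assumes a mysqli connection in $db and a users table
        // with FirstName, LastName, City, State and AccountNumber columns.
        function fetch_user_info(mysqli $db, $accountNumber)
        {
            // Reject anything that doesn't look like the expected token before touching the DB.
            if (!preg_match('/^[a-f0-9]{13}$/i', $accountNumber)) {
                return false;
            }

            // A prepared statement keeps the value out of the SQL text entirely.
            $stmt = $db->prepare(
                'SELECT FirstName, LastName, City, State FROM users WHERE AccountNumber = ? LIMIT 1'
            );
            $stmt->bind_param('s', $accountNumber);
            $stmt->execute();
            $row = $stmt->get_result()->fetch_assoc();

            return $row ?: false;
        }

    Reading the account number from the query string is not the problem in itself; the risk is letting that value reach the SQL text unescaped, which the prepared statement avoids.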

    Read the article

  • Application logic for invoicing and subscriptions?

    - by Industrial
    Hi everyone, We're just in the planning stage of a web app that offers subscriptions to our customers. The subscription periods vary and can be prolonged indefinitely by our customers, but are always at least one month (30 days). When a customer signs up, the customer information (billing address, phone number and so on) is stored in a customers table and a subscription is created in the subscriptions table:

        id | start_date | end_date   | customer_id
        --------------------------------------------
        1  | 2010-12-31 | 2011-01-31 | 1

    Every month we'll loop through the subscriptions table (preferably from a cronjob) and create invoices for the past subscription period, which are housed in their own table, invoices. Depending on the customer, invoices are manually printed out and sent by mail, or just emailed to the customer. Due to the nature of our customers and the product, we need to offer a variety of different payment alternatives including wire transfer and card payments, hence some invoices may need to be manually handled and registered as paid by our staff. On the 15th of every month, the invoices table is looped through, and if no payment has been marked for a given invoice, the corresponding subscription will be removed. If a payment is registered, the end_date in the subscriptions table is incremented by another 30 days (or whatever period the customer has chosen). Are we looking at headaches by incrementing dates forwards and backwards to handle non-paying customers and extending subscriptions? Would it be a better idea to add new subscriptions as customers extend their subscriptions?
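
    For illustration, a rough sketch of the renewal step is below. It assumes the invoices table carries a subscription_id and a paid flag, neither of which is specified in the question; a real version would also need to check which period each invoice covers.

        <?php
        // Rough sketch with assumed invoice columns (subscription_id, paid).
        // Runs from the monthly cronjob on the 15th; $db is a mysqli connection.
        $db->begin_transaction();

        // Extend expired subscriptions that have a paid invoice on record.
        $db->query(
            "UPDATE subscriptions s
               JOIN invoices i ON i.subscription_id = s.id AND i.paid = 1
                SET s.end_date = DATE_ADD(s.end_date, INTERVAL 30 DAY)
              WHERE s.end_date < CURDATE()"
        );

        // Remove expired subscriptions with no paid invoice at all.
        $db->query(
            "DELETE s FROM subscriptions s
               LEFT JOIN invoices i ON i.subscription_id = s.id AND i.paid = 1
              WHERE s.end_date < CURDATE() AND i.id IS NULL"
        );

        $db->commit();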

    Read the article

  • Find the closest locations to a given address

    - by xtine
    I have built an application in CakePHP that lists businesses. There are about 2000 entries, and the latitude and longitude coordinates for each business are in the DB. I am now trying to tackle the search function. There will be an input box where the user can put a street address, city, or zipcode, and then I would like it to return the 11 closest businesses found in the database. How would I go about doing this?
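
    A common approach, sketched below, is to geocode the search string to a latitude/longitude pair first (with whatever geocoding service you prefer) and then order by great-circle distance in SQL. The businesses table and its lat/lng column names here are assumptions.

        <?php
        // Sketch only: $lat and $lng come from geocoding the user's input,
        // $db is a mysqli connection, and the businesses table has lat/lng columns.
        $sql = "SELECT id, name,
                       (6371 * ACOS(
                           COS(RADIANS(?)) * COS(RADIANS(lat)) *
                           COS(RADIANS(lng) - RADIANS(?)) +
                           SIN(RADIANS(?)) * SIN(RADIANS(lat))
                       )) AS distance_km
                  FROM businesses
                 ORDER BY distance_km
                 LIMIT 11";

        $stmt = $db->prepare($sql);
        $stmt->bind_param('ddd', $lat, $lng, $lat);
        $stmt->execute();
        $closest = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);

    In CakePHP this would normally live in the model as a raw query or a virtual field; the SQL itself is the important part.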

    Read the article

  • Stored Queries?

    - by phpeffedup
    Is it considered crazy to store my web app's common SQL queries in a database and execute them from there? Or is that common practice? Or is it impossible? My thinking is that this way I avoid hard-coding SQL into my application files and add another level of abstraction. Is this crazy? Is this what a stored procedure is? Or is that something else?
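
    For comparison, a stored procedure is the standard way to keep SQL inside the database server itself. Below is a minimal sketch of creating and calling one through mysqli; the procedure, table and column names are made up for illustration.

        <?php
        // Sketch only: $db is a mysqli connection; names are illustrative.
        $db->query("
            CREATE PROCEDURE get_active_users(IN min_logins INT)
            BEGIN
                SELECT id, username
                FROM users
                WHERE login_count >= min_logins;
            END
        ");

        // Later, the application just calls it by name:
        $result = $db->query('CALL get_active_users(5)');

    This is different from storing raw query strings in a table, which would still need to be fetched and executed by the application and kept in sync with the schema by hand.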

    Read the article

  • Combining the value of GetLastError and a custom error message

    - by Jessica
    I have a function that returns a different DWORD value for each error case. So I have the following defines:

        #define ERR_NO_DB_CONNECTION 0x90000
        #define ERR_DB_NOT_OPEN      0x90001
        #define ERR_DB_LOCKED        0x90002
        #define ERR_DB_CONN_LOST     0x90003

    Now, I return those values when an error occurs. I need to also return the value of GetLastError in the same return value (no, I can't read it later). I tried combining them in different ways, e.g.:

        return ERR_DB_NOT_OPEN + GetLastError();

    and then extracting the error by subtracting the value of ERR_DB_NOT_OPEN, but since I need to use this in functions where there can be several return values it can get quite complex. Is there any way to achieve this? I mean, combine the value with GetLastError and extract them both later? Code is appreciated. Thanks, Jess.

    Read the article

  • SQL query, select from 2 tables random

    - by klaus
    Hello all, I have a problem that I just can't get to work the way I want. I want to show news and reviews (2 tables) with random output, not the same output every time. Here is my query; I really hope someone can explain what I am doing wrong:

        SELECT anmeldelser.billed_sti, anmeldelser.overskrift, anmeldelser.indhold, anmeldelser.id, anmeldelser.godkendt
        FROM anmeldelser
        LIMIT 0,6
        UNION ALL
        SELECT nyheder.id, nyheder.billed_sti, nyheder.overskrift, nyheder.indhold, nyheder.godkendt
        FROM nyheder
        ORDER BY rand()
        LIMIT 0,6
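
    As a hedged sketch of one fix: put each SELECT in parentheses so it gets its own ORDER BY RAND() and LIMIT (MySQL rejects a bare LIMIT before UNION), and keep the column order identical in both halves so the columns line up. Table and column names are the question's own; $db is an assumed mysqli connection.

        <?php
        $sql = "SELECT * FROM (
                    (SELECT id, billed_sti, overskrift, indhold, godkendt
                       FROM anmeldelser
                      ORDER BY RAND()
                      LIMIT 6)
                    UNION ALL
                    (SELECT id, billed_sti, overskrift, indhold, godkendt
                       FROM nyheder
                      ORDER BY RAND()
                      LIMIT 6)
                ) AS combined
                ORDER BY RAND()";   // shuffle the combined 12 rows as well

        $result = $db->query($sql);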

    Read the article

  • Store LAST_INSERT_ID() in a transaction

    - by Oden
    Hi, I use CodeIgniter's database abstraction, and I'm doing a transaction with it. My problem is that I have several inserts into several tables, but I need the insert id from the first insert query. Is there any way to store the last insert id for more than one following insert?
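
    A minimal sketch using CodeIgniter's transaction helpers is below: read the insert id into a plain PHP variable right after the first insert and reuse it for the later ones. The table and column names are placeholders, not the real schema.

        <?php
        // Sketch only: placeholder table/column names.
        $this->db->trans_start();

        $this->db->insert('orders', array('customer' => $customer));
        $order_id = $this->db->insert_id();   // capture it once, right after the first insert

        $this->db->insert('order_lines', array('order_id' => $order_id, 'price' => 10.00));
        $this->db->insert('order_notes', array('order_id' => $order_id, 'note' => 'first'));

        $this->db->trans_complete();

    Later inserts don't overwrite $order_id, so it can be reused for as many following inserts as needed within the same transaction.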

    Read the article

  • How to pass an array of objects through a jQuery $.post?

    - by majc
    Hi, I want to pass the result of a query through a $.post.

        function GetAllTasks() {
            $sql = "select t.id as task_id, description, createdat, createdby, max_requests, max_duration, j.name as job_name
                    from darkfuture.tasks t, darkfuture.jobs j
                    where t.job_id = j.id";
            $sqlresult = mysql_query($sql) or die("The list of works failed: ".mysql_error($this->con));
            $result = array();

            while($row = mysql_fetch_assoc($sqlresult)) {
                $task = new TasksResult();
                $task->id = $row["task_id"];
                $task->description = $row["description"];
                $task->createdat = $row["createdat"];
                $task->createdby = $row["createdby"];
                $task->max_requests = $row["max_requests"];
                $task->max_duration = $row["max_duration"];
                $task->job_id = $row["job_name"];
                array_push($result, $task);
            }
            mysql_free_result($sqlresult);
            return $result;
        }

    Here is how I call it:

        $tasksDB = new TasksDB();
        $tasks = $tasksDB->GetAllTasks();

    Now I want to pass $tasks through here:

        $.post("views/insert_tasks.php", {'tasks[]': $tasks}, function(data) {
        });

    I know the {'tasks[]': $tasks} part is wrong, but I don't know how to do it right. Some help would be appreciated. Thanks in advance!
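
    One hedged sketch: $tasks only exists in PHP, so it has to be serialized into the page's JavaScript first, for example with json_encode, and decoded again in insert_tasks.php. Names follow the question's code.

        <?php
        // Sketch only: embed the PHP result into the page as JSON.
        $tasksDB = new TasksDB();
        $tasks   = $tasksDB->GetAllTasks();
        ?>
        <script type="text/javascript">
            // json_encode turns the TasksResult objects into plain JS objects.
            var tasks = <?php echo json_encode($tasks); ?>;

            $.post("views/insert_tasks.php", { tasks: JSON.stringify(tasks) }, function (data) {
                // handle the response
            });
        </script>

    On the receiving end, json_decode($_POST['tasks'], true) in insert_tasks.php gives back an array of associative arrays, one per task.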

    Read the article

  • Hibernate/Spring: getHibernateTemplate().save(...) Freezes/Hangs

    - by ashes999
    I'm using Hibernate and Spring with the DAO pattern (all Hibernate dependencies in a *DAO.java class). I have nine unit tests (JUnit) which create some business objects, save them, and perform operations on them; the objects are in a hash (so I'm reusing the same objects all the time). My JUnit setup method calls my DAO.deleteAllObjects() method, which calls getSession().createSQLQuery("DELETE FROM <tablename>").executeUpdate() for my business object table (just one). One of my unit tests (#8/9) freezes. I presumed it was a database deadlock, because the Hibernate log file shows my delete statement last. However, debugging showed that it's simply HibernateTemplate.save(someObject) that's freezing. (Eclipse shows that it's freezing on HibernateTemplate.save(Object), line 694.) Also interesting to note is that running this test by itself (not in the suite of 9 tests) doesn't cause any problems. How on earth do I troubleshoot and fix this? Also, I'm using @Entity annotations, if that matters.

    Edit: I removed the reuse of my business objects (I now use unique objects in every method), but it didn't make a difference (still freezes).

    Edit: This started trickling into other tests, too (I can't run more than one test class without something freezing).

    Transaction configuration:

        <bean id="txManager" class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
            <property name="dataSource" ref="dataSource" />
        </bean>

        <tx:advice id="txAdvice" transaction-manager="txManager">
            <!-- the transactional semantics... -->
            <tx:attributes>
                <!-- all methods starting with 'get' are read-only -->
                <tx:method name="get*" read-only="true" />
                <tx:method name="find*" read-only="true" />
                <!-- other methods use the default transaction settings (see below) -->
                <tx:method name="*" />
            </tx:attributes>
        </tx:advice>

        <!-- my bean which is exhibiting the hanging behavior -->
        <aop:config>
            <aop:pointcut id="beanNameHere" expression="execution(* com.blah.blah.IMyDAO.*(..))" />
            <aop:advisor advice-ref="txAdvice" pointcut-ref="beanNameHere" />
        </aop:config>

    Read the article

  • Is a clear and replace more efficient than a loop checking all records?

    - by Matt
    I have a C# List that is filled from a database. So far it's only 1400 records, but I expect it to grow a lot. Routinely I check the entire list for new data. What I'm trying to figure out is this: is it faster to simply clear the List and reload all the data from the table, or would checking each record be faster? Intuition tells me that the dump-and-reload method would be faster, but I thought I should check first...

    Read the article

  • Scalably processing large amounts of complicated database data in PHP, many times a day

    - by Eph
    I'm soon to be working on a project that poses a problem for me. It's going to require, at regular intervals throughout the day, processing tens of thousands of records, potentially over a million. Processing is going to involve several (potentially complicated) formulas and the generation of several random factors, writing some new data to a separate table, and updating the original records with some results. This needs to occur for all records, ideally, every three hours. Each new user to the site will be adding between 50 and 500 records that need to be processed in such a fashion, so the number will not be steady. The code hasn't been written, yet, as I'm still in the design process, mostly because of this issue. I know I'm going to need to use cron jobs, but I'm concerned that processing records of this size may cause the site to freeze up, perform slowly, or just piss off my hosting company every three hours. I'd like to know if anyone has any experience or tips on similar subjects? I've never worked at this magnitude before, and for all I know, this will be trivial to the server and not pose much of an issue. As long as ALL records are processed before the next three hour period occurs, I don't care if they aren't processed simultaneously (though, ideally, all records belonging to a specific user should be processed in the same batch), so I've been wondering if I should process in batches every 5 minutes, 15 minutes, hour, whatever works, and how best to approach this (and make it scalable in a way that is fair to all users)?
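
    As a rough sketch of one batching pattern (with made-up table and column names, since the schema isn't written yet): walk the table in id-ordered chunks from the cron job, so no single query or PHP loop holds the database or memory for very long, and tune the batch size until the three-hour window is comfortably met.

        <?php
        // Sketch only: assumes a mysqli connection in $db, a records table with an
        // auto-increment id, and a processed_at timestamp updated by the processing step.
        $batchSize = 1000;
        $lastId    = 0;

        do {
            $stmt = $db->prepare(
                "SELECT id, payload FROM records
                  WHERE id > ? AND (processed_at IS NULL OR processed_at < NOW() - INTERVAL 3 HOUR)
                  ORDER BY id
                  LIMIT ?"
            );
            $stmt->bind_param('ii', $lastId, $batchSize);
            $stmt->execute();
            $rows = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);

            foreach ($rows as $row) {
                // ... apply the formulas, write the new data, mark the record processed ...
                $lastId = $row['id'];
            }
        } while (count($rows) === $batchSize);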

    Read the article

  • SQL: count days in date range?

    - by John Isaacks
    I have a query like this:

        SELECT COUNT(*) AS amount
        FROM daily_individual_tracking
        WHERE sales = 'YES'
        AND daily_individual_tracking_date BETWEEN '2010-01-01' AND '2010-03-31'

    I am selecting over a date range. Is there a way to also get the total number of days in the date range?
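
    A minimal sketch: DATEDIFF returns the difference in days between the two endpoints, so adding 1 makes the count inclusive of both. The table and column names are the question's own; $db is an assumed mysqli connection.

        <?php
        $sql = "SELECT COUNT(*) AS amount,
                       DATEDIFF('2010-03-31', '2010-01-01') + 1 AS days_in_range
                  FROM daily_individual_tracking
                 WHERE sales = 'YES'
                   AND daily_individual_tracking_date BETWEEN '2010-01-01' AND '2010-03-31'";

        $row = $db->query($sql)->fetch_assoc();
        // $row['amount'] is the matching row count; $row['days_in_range'] is 90 for this range.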

    Read the article

  • How do I make all the finders on the model ignorecase?

    - by Glex
    I have a model with several attributes, among them title and artist. The case of title and artist should be ignored in all the Active Record finders. Basically, if title or artist are present in the :conditions (or dynamically i.e. find_all_by_artist), then the WHERE artist = :artist should become WHERE UPPER(artist) = UPPER(:artist) or something along these lines. Is there a way of doing it with Rails?

    Read the article

  • What causes Python "Interpreter not Initialized" error?

    - by ?????
    I'm now on my third full day this week of trying to get OpenCV to work with Python (I have been trying on and off for the past 6 months). I get this error:

        Python 2.7.1 (r271:86882M, Nov 30 2010, 10:35:34)
        [GCC 4.2.1 (Apple Inc. build 5664)] on darwin
        Type "help", "copyright", "credits" or "license" for more information.
        dlopen("/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/readline.so", 2);
        import readline # dynamically loaded from /Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/lib-dynload/readline.so
        >>> import cv
        dlopen("./cv.so", 2);
        Fatal Python error: Interpreter not initialized (version mismatch?)

    and then it crashes (core dumps). python -v gives nothing after the dlopen. Any ideas from anyone who actually knows about this error?

    Read the article

  • What's wrong with this code?

    - by user329820
    Hi, this is my code, which does not work correctly. What is wrong with its data type? Thanks.

        CREATE TABLE T1 (A INTEGER NOT NULL);
        CREATE TABLE T3 (A SMALLINT NOT NULL);
        INSERT T1 VALUES (32768.5);
        SELECT * FROM T1;
        INSERT T3 SELECT * FROM T1;
        SELECT * FROM T3;
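
    As a hedged note on why the last insert fails: a signed SMALLINT tops out at 32767, and 32768.5 is stored in T1 as an integer at or just above that limit, so copying it into T3 is out of range. A sketch of one fix (assuming MySQL) is simply to widen T3:

        <?php
        // Sketch only: same statements as the question, with T3 widened so the copy fits.
        $db->query("CREATE TABLE T1 (A INTEGER NOT NULL)");
        $db->query("CREATE TABLE T3 (A INTEGER NOT NULL)");   // was SMALLINT

        $db->query("INSERT INTO T1 VALUES (32768.5)");        // rounded and stored as an integer
        $db->query("INSERT INTO T3 SELECT * FROM T1");        // now within range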

    Read the article

  • Does having to insert a record, then update the same record, warrant a 1:1 relationship design?

    - by dianovich
    Let's say an Order has many Line items, and we're storing the total cost of an order (based on the sum of prices on order lines) in the orders table.

        --------------
        orders
        --------------
        id
        ref
        total_cost
        --------------

        --------------
        lines
        --------------
        id
        order_id
        price
        --------------

    In a simple application, the order and lines are created during the same step of the checkout process. So this means:

        INSERT INTO orders ....
        -- Get ID of inserted order record
        INSERT INTO lines VALUES(null, order_id, ...), ...

    where we get the order ID after creating the order record. The problem I'm having is trying to figure out the best way to store the total cost of an order. I don't want to have to:

    1. create an order
    2. create lines on the order
    3. calculate the cost on the order based on the lines
    4. then update the record created in step 1 in the orders table

    This would mean a nullable total_cost field on orders, for starters... My solution thus far is to have an order_totals table with a 1:1 relationship to the orders table, but I think it's redundant. Ideally, since everything required to calculate total costs (lines on an order) is in the database, I would work out the value every time I need it, but this is very expensive. What are your thoughts?
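
    For what it's worth, a hedged sketch of the insert-then-fill approach inside a single transaction is below, using the question's tables; the ref value and the mysqli wrapper are illustrative only.

        <?php
        // Sketch only: insert the order with a zero total, add the lines,
        // then fill in total_cost from the lines in one UPDATE.
        $db->begin_transaction();

        $db->query("INSERT INTO orders (ref, total_cost) VALUES ('A-1001', 0)");
        $orderId = $db->insert_id;

        $stmt = $db->prepare("INSERT INTO lines (order_id, price) VALUES (?, ?)");
        foreach ($prices as $price) {
            $stmt->bind_param('id', $orderId, $price);
            $stmt->execute();
        }

        $db->query("UPDATE orders
                       SET total_cost = (SELECT SUM(price) FROM lines WHERE order_id = $orderId)
                     WHERE id = $orderId");

        $db->commit();

    Since total_cost starts at 0 rather than NULL and is corrected before the transaction commits, no reader ever sees an order without a total.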

    Read the article

  • Keeping video viewing statistics breakdown by video time in a database

    - by Septagram
    I need to keep a number of statistics about the videos being watched, and one of them is which parts of the video are being watched most. The design I came up with is to split the video into 256 intervals and keep a floating-point number of views for each of them. I receive the data as a number of intervals the user watched continuously. The problem is how to store them. There are two solutions I see.

    Row per video segment

    Let's have a database table like this:

        CREATE TABLE `video_heatmap` (
          `id` int(11) NOT NULL AUTO_INCREMENT,
          `video_id` int(11) NOT NULL,
          `position` tinyint(3) unsigned NOT NULL,
          `views` float NOT NULL,
          PRIMARY KEY (`id`),
          UNIQUE KEY `idx_lookup` (`video_id`,`position`)
        ) ENGINE=MyISAM

    Then, whenever we have to process a number of views, make sure the respective database rows exist and add the appropriate values to the views column. I found out it's a lot faster if the existence of the rows is taken care of first (SELECT COUNT(*) of rows for a given video and INSERT IGNORE if they are lacking), and then a number of update queries is used like this:

        UPDATE video_heatmap
           SET views = views + ?
         WHERE video_id = ? AND position >= ? AND position < ?

    This seems, however, a little bloated. The other solution I came up with is:

    Row per video, update in transactions

    A table will look (sort of) like this:

        CREATE TABLE video (
          id INT NOT NULL AUTO_INCREMENT,
          heatmap BINARY (4 * 256) NOT NULL,
          ...
        ) ENGINE=InnoDB

    Then, every time a view needs to be stored, it is done in a transaction with a consistent snapshot, in a sequence like this:

    1. If the video doesn't exist in the database, it is created.
    2. The row is retrieved, and heatmap, an array of floats stored in binary form, is converted into a form more friendly for processing (in PHP).
    3. Values in the array are increased appropriately and the array is converted back.
    4. The row is changed via an UPDATE query.

    So far the advantages can be summed up like this:

    First approach

    - Stores data as floats, not as some magical binary array.
    - Doesn't require transaction support, so doesn't require InnoDB; we're using MyISAM for everything at the moment, so there won't be any need to mix storage engines. (only applies in my specific situation)
    - Doesn't require a transaction WITH CONSISTENT SNAPSHOT. I don't know what the performance penalties of those are.
    - I already implemented it and it works. (only applies in my specific situation)

    Second approach

    - Uses a lot less storage space (the first approach stores the video ID 256 times and stores the position for every segment of the video, not to mention the primary key).
    - Should scale better, because of InnoDB's per-row locking as opposed to MyISAM's table locking.
    - Might generally work faster because a lot fewer requests are being made.
    - Easier to implement in code (although the other one is already implemented).

    So, what should I do? If it wasn't for the rest of our system using MyISAM consistently, I'd go with the second approach, but currently I'm leaning towards the first one. But maybe there are some reasons to favour one approach or another?

    Read the article

  • Chained selects with one table

    - by Owen
    I know I am going about this in an unusual way; every tutorial I've seen uses multiple tables, but due to the way the rest of my site works I would like to create a chained select which operates using a single table. My table structure is:

        ----------------------
        |Catagory|SubCategory|
        |01|cat1 |subcat1    |
        |02|cat1 |subcat2    |
        |03|cat2 |subcat1    |
        |04|cat2 |subcat2    |
        ----------------------

    The code I have so far looks like:

        <tr>
          <td class="shadow"><strong>Category:</strong> </td>
          <td class="shadow">
            <select id="category" name="category" style="width:150px">
              <option selected="selected" value="<?php echo $category ?>"><?php echo $category?></option>
              <?php
              include('connect.php');
              $result1 = mysql_query("SELECT DISTINCT category FROM categories") or die(mysql_error());
              while($row = mysql_fetch_array( $result1 )) {
                  $category = $row['category'];
                  echo "<option value='". $row['category'] ."'>". $row['category'] ."</option>";
              }
              ?>
            </select>
          </td>
        </tr>
        <tr>
          <td class="shadow"><strong>Sub Category:</strong> </td>
          <td class="shadow">
            <select id="sub_catgory" name="sub_category" style="width:150px;">
              <option selected="selected" value="<?php echo $sub_category ?>"><?php echo $sub_category ?></option>
              <?php
              include('connect.php');
              $result2 = mysql_query("SELECT sub_category FROM categories WHERE ") or die(mysql_error());
              while($row = mysql_fetch_array ($result2 )){
                  echo "<option value='" . $row['sub_category'] . "'>". $row['sub_category']. "</option>";
              }
              ?>
            </select>
          </td>
        </tr>

    On the second select I am not sure how to state the WHERE clause. I need it to display the subcategories which have the same category as the one selected in the first select.

    PART 2: how would I include AJAX in this to preload the data so I don't need to refresh the page? Could someone either help me finish what I've started here or point me to a good tutorial? Thanks.
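
    A hedged sketch of both halves is below: a small endpoint (the file name subcategories.php is an assumption) that returns the option tags for one category, using the same mysql_* style as the question, and a one-line jQuery handler that fills the second select from it.

        <?php
        // subcategories.php (sketch only): returns <option> tags for the chosen category.
        include('connect.php');

        $category = mysql_real_escape_string($_GET['category']);
        $result   = mysql_query(
            "SELECT DISTINCT sub_category FROM categories WHERE category = '$category'"
        ) or die(mysql_error());

        while ($row = mysql_fetch_array($result)) {
            echo "<option value='" . htmlspecialchars($row['sub_category']) . "'>"
               . htmlspecialchars($row['sub_category']) . "</option>";
        }

    On the form page, something like $('#category').change(function(){ $('#sub_catgory').load('subcategories.php?category=' + encodeURIComponent($(this).val())); }); would then refresh only the second select whenever the first one changes.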

    Read the article

  • How to add data to a table in a Jasper Report using Java

    - by Areeb Gillani
    I am here to ask just a simple question: I am trying to pass data to a Jasper report using Java, but I don't know how, because the table data is very dynamic, which is why I cannot pass an SQL query. Any idea for this? I have a 2D array of Object type where I have all the data, so how can I pass that? Thanks in advance!

        ConnectionManager con = new ConnectionManager();
        con.establishConnection();
        String fileName = "Pmc_Bill.jrxml";
        String outFileName = "OutputReport.pdf";
        HashMap params = new HashMap();
        params.put("PName", pname);
        params.put("PSerial", psrl);
        params.put("PGender", pgen);
        params.put("PPhone", pph);
        params.put("PAge", page);
        params.put("PRefer", pref);
        params.put("PDateR", dateNow);
        try {
            JasperReport jasperReport = JasperCompileManager.compileReport(fileName);
            if (jasperReport != null)
                System.out.println("so far so good ");
            // Fill the report using an empty data source
            JasperPrint jasperPrint = JasperFillManager.fillReport(jasperReport, params,
                    new JRTableModelDataSource(tbl.getModel())); //con.connection);
            try {
                JasperExportManager.exportReportToPdfFile(jasperPrint, outFileName);
                System.out.printf("File exported sucessfully");
            } catch (Exception e) {
                e.printStackTrace();
            }
            JasperViewer.viewReport(jasperPrint);
        } catch (JRException e) {
            JOptionPane.showMessageDialog(null, e);
            e.printStackTrace();
            System.exit(1);
        }

    Read the article

  • Is there a single query that can update a "sequence number" across multiple groups?

    - by Drarok
    Given a table like below, is there a single-query way to update the table from this:

        | id | type_id | created_at | sequence |
        |----|---------|------------|----------|
        | 1  | 1       | 2010-04-26 | NULL     |
        | 2  | 1       | 2010-04-27 | NULL     |
        | 3  | 2       | 2010-04-28 | NULL     |
        | 4  | 3       | 2010-04-28 | NULL     |

    To this (note that created_at is used for ordering, and sequence is "grouped" by type_id):

        | id | type_id | created_at | sequence |
        |----|---------|------------|----------|
        | 1  | 1       | 2010-04-26 | 1        |
        | 2  | 1       | 2010-04-27 | 2        |
        | 3  | 2       | 2010-04-28 | 1        |
        | 4  | 3       | 2010-04-28 | 1        |

    I've seen some code before that used an @ variable like the following, that I thought might work:

        SET @seq = 0;
        UPDATE `log` SET `sequence` = @seq := @seq + 1 ORDER BY `created_at`;

    But that obviously doesn't reset the sequence to 1 for each type_id. If there's no single-query way to do this, what's the most efficient way? Data in this table may be deleted, so I'm planning to run a stored procedure after the user is done editing to re-sequence the table.
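
    A hedged sketch of the usual user-variable trick is below: order by type_id first so each group arrives together, and reset the counter whenever the type changes. Relying on user-variable evaluation order inside one statement is not formally guaranteed by MySQL, so treat it as something to verify; the table and column names are the question's own, and $db is an assumed mysqli connection.

        <?php
        $db->query("SET @seq := 0, @prev_type := NULL");
        $db->query(
            "UPDATE `log`
                SET `sequence` = (@seq := IF(@prev_type = `type_id`, @seq + 1, 1)),
                    `type_id`  = (@prev_type := `type_id`)
              ORDER BY `type_id`, `created_at`"
        );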

    Read the article

  • Getting all database entries into organized array

    - by Industrial
    Hi everyone, I have just made the update/add/delete part for the "Closure table" way of organizing hierarchical data that is shown on page 70 in this slideshare: http://www.slideshare.net/billkarwin/sql-antipatterns-strike-back However, I have a bit of an issue getting the full tree back as a multidimensional array from a single query. Here's what I would like to get back:

        array(
            'topvalue' => array(
                'Subvalue',
                'Subvalue2',
                'Subvalue3' => array('Subvalue 1', 'Subvalue 2', 'Subvalue 3'),
            ),
        );
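
    As a hedged sketch of the assembly step: if the query can return each node together with its direct parent (in a closure table, the pairs where the path length is 1), the nesting can be built in PHP with references. The column names below are placeholders, not the real schema.

        <?php
        // Sketch only: $rows holds id, parent_id and name for every node.
        $nodes = array();
        $tree  = array();

        foreach ($rows as $row) {
            $nodes[$row['id']] = array('name' => $row['name'], 'children' => array());
        }

        foreach ($rows as $row) {
            if ($row['parent_id'] === null || !isset($nodes[$row['parent_id']])) {
                $tree[] = &$nodes[$row['id']];                                  // top-level node
            } else {
                $nodes[$row['parent_id']]['children'][] = &$nodes[$row['id']];  // attach to its parent
            }
        }

    The shape differs slightly from the example above (children sit under a 'children' key instead of being mixed in with the name), but the same reference trick works for whatever layout is preferred.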

    Read the article

  • Increment my id in my insert request

    - by Mercer
    Hello, I have a table with some columns: idClient, name, adress, country, ... I want to know how I can do an insert into this table that auto-increments my idClient in my SQL request. Thanks.

    Edit: I want to do a request like this:

        insert into Client values((select max(idClient),...)
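
    A hedged sketch of the usual MySQL approach is below: declare idClient as AUTO_INCREMENT once, then leave it out of each INSERT and let the server assign it; selecting max(idClient) is race-prone under concurrent inserts. The remaining column values, the mysqli wrapper, and the assumption that idClient is already the primary key are illustrative.

        <?php
        // One-time schema change (assumes idClient is already the primary key).
        $db->query("ALTER TABLE Client MODIFY idClient INT NOT NULL AUTO_INCREMENT");

        // Each insert simply omits idClient; MySQL fills it in.
        $stmt = $db->prepare("INSERT INTO Client (name, adress, country) VALUES (?, ?, ?)");
        $stmt->bind_param('sss', $name, $adress, $country);
        $stmt->execute();

        $newId = $db->insert_id;   // the idClient value that was just generated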

    Read the article

  • How can I use an array within a SQL query

    - by ThinkingInBits
    I'm trying to take a search string (which could be any number of words) and turn each value into a list to use in the following IN statement; in addition, I need a count of all these values to use with my HAVING COUNT filter.

        $search_array = explode(" ", $this->search_string);
        $tag_count = count($search_array);

        $db = Connect::connect();
        $query = "select p.id from photographs p
                  left join photograph_tags c on p.id = c.photograph_id and c.value IN ($search_array)
                  group by p.id
                  having count(c.value) >= $tag_count";

    This currently returns no results. Any ideas?
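
    A hedged sketch of one fix: an array dropped into a double-quoted string becomes the literal word "Array", so the IN () list has to be built explicitly from escaped, quoted terms. This assumes the connection is a mysqli object; the table and column names are the question's own.

        <?php
        $search_array = explode(' ', $this->search_string);
        $tag_count    = count($search_array);

        $db = Connect::connect();

        // Quote and escape every term, then join them into 'a','b','c' form.
        $escaped = array_map(function ($term) use ($db) {
            return "'" . $db->real_escape_string($term) . "'";
        }, $search_array);
        $in_list = implode(',', $escaped);

        $query = "SELECT p.id
                    FROM photographs p
                    JOIN photograph_tags c ON p.id = c.photograph_id
                   WHERE c.value IN ($in_list)
                GROUP BY p.id
                  HAVING COUNT(DISTINCT c.value) >= $tag_count";

    COUNT(DISTINCT c.value) is used here so a photograph tagged twice with the same word doesn't satisfy the threshold on its own; drop DISTINCT if duplicate tags should count.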

    Read the article

  • Confusion using MYSQLI

    - by user1020069
    I just started using the mysqli API for PHP. Apparently, every time an object of the MYSQLI class is instantiated, it can set up a connection to the database as it connects to the server, unlike mysql_connect, which connects to the server first and then requires you to specify the database to query. Now this is a good thing if the db exists; in my case, the db does not exist on the first ever connection to the server/execution of the program, hence I must connect without specifying the database, which is fine, since the mysqli constructor does not make the database mandatory. My challenge is essentially how to check whether the database exists before attempting the first connection. The only way to really do this would be to establish a connection to the server and then use the result of the following query to gauge whether the database exists:

        SELECT COUNT(*) AS `exists`
        FROM INFORMATION_SCHEMA.SCHEMATA
        WHERE SCHEMATA.SCHEMA_NAME = "dbname";

    If this returns true, then the database exists, but now the challenge is how to get the mysqli object to query this database rather than having to prefix the database name in every query. Thanks much.
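
    A minimal sketch of that flow with mysqli is below: connect without a database, create it if the check finds nothing, then attach the connection with select_db() so later queries don't need the prefix. The credentials and "dbname" are placeholders.

        <?php
        // Sketch only: placeholder credentials and database name.
        $db = new mysqli('localhost', 'user', 'password');   // no database selected yet

        $row = $db->query(
            "SELECT COUNT(*) FROM INFORMATION_SCHEMA.SCHEMATA WHERE SCHEMA_NAME = 'dbname'"
        )->fetch_row();

        if ((int) $row[0] === 0) {
            $db->query('CREATE DATABASE dbname');
        }

        $db->select_db('dbname');   // unqualified table names now resolve against dbname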

    Read the article
