Search Results

Search found 14348 results on 574 pages for 'planet mysql'.

Page 373/574 | < Previous Page | 369 370 371 372 373 374 375 376 377 378 379 380  | Next Page >

  • SQL validation!

    - by Filip
    I am pretty new to SQL, so this may be a stupid question... I have a PHP form that fills in a few fields of my SQL table. I have this code:

        $sql = "INSERT INTO $tbl_name (app_name, app_path, short_desc, full_desc)
                VALUES ('$_POST[app_name]', '$_POST[app_path]', '$_POST[short_desc]', '$_POST[full_desc]')";

    But even though app_name and app_path are NOT NULL columns, the query still executes when those fields are left empty in the form. So my question is: how do I stop the query from executing when the NOT NULL fields contain no text?
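    A minimal sketch of one way to handle this, assuming the mysqli extension with an open connection in $mysqli (the table name 'apps' stands in for $tbl_name): an empty string still satisfies a NOT NULL constraint, so the check has to happen in PHP before the INSERT, and a prepared statement also avoids interpolating raw $_POST values into the SQL.

        <?php
        // Hypothetical sketch: $mysqli is an open mysqli connection and the
        // table name is assumed; adjust both to the real application.
        $required = array('app_name', 'app_path');
        foreach ($required as $field) {
            if (!isset($_POST[$field]) || trim($_POST[$field]) === '') {
                die("Field '$field' must not be empty.");
            }
        }

        $stmt = $mysqli->prepare(
            "INSERT INTO apps (app_name, app_path, short_desc, full_desc)
             VALUES (?, ?, ?, ?)"
        );
        $stmt->bind_param(
            'ssss',
            $_POST['app_name'],
            $_POST['app_path'],
            $_POST['short_desc'],
            $_POST['full_desc']
        );
        $stmt->execute();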

    Read the article

  • php search database for row

    - by Brenden Morley
    Okay, I got the code to pull data based on a user's account number. Here is what I'm using (and yes, I know it isn't safe right now; that is the reason for my post):

        <?php
        include('config.php');
        $user_info = fetch_user_info($_GET['AccountNumber']);
        ?>
        <html>
        <body>
        <div>
        <?php if ($user_info === false){
            $Output = 'http://www.MyDomain.Com/';
            echo '<META HTTP-EQUIV=Refresh CONTENT="0; URL='.$Output.'">';
        }else{ ?>
        <center>
        <title><?php echo $user_info['FirstName'], ' ', $user_info['LastName'], ' - ', $user_info['City'], ', ', $user_info['State']; ?> - Name of site</title>

    So basically this code lets me have a file called Profile.php, and when a user visits that page it returns the data, like this: http://MyDomain.com/Profile.php?AccountNumber=50b9c965b7c3b How can I do this securely? Right now it uses a GET parameter, which is a really unsafe way to retrieve the account number from the URL bar.
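    A hedged sketch of a safer fetch_user_info(); only the function name and the AccountNumber parameter come from the question, while the body, the added connection parameter, and the users table are assumptions. A parameterized query keeps the value from $_GET out of the SQL text, and a format check rejects junk input early. The GET parameter itself is not the real problem; building SQL from it without escaping is.

        <?php
        // Hypothetical implementation; assumes a mysqli connection in $mysqli
        // and a 'users' table keyed by AccountNumber.
        function fetch_user_info(mysqli $mysqli, $accountNumber)
        {
            // Account numbers in the question look like 13-char hex strings;
            // adjust the pattern to the real format.
            if (!preg_match('/^[a-f0-9]{13}$/i', $accountNumber)) {
                return false;
            }

            $stmt = $mysqli->prepare(
                'SELECT FirstName, LastName, City, State
                   FROM users
                  WHERE AccountNumber = ?
                  LIMIT 1'
            );
            $stmt->bind_param('s', $accountNumber);
            $stmt->execute();
            $row = $stmt->get_result()->fetch_assoc();

            return $row ?: false;
        }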

    Read the article

  • Scalably processing large amounts of complicated database data in PHP, many times a day

    - by Eph
    I'm soon to be working on a project that poses a problem for me. It's going to require, at regular intervals throughout the day, processing tens of thousands of records, potentially over a million. Processing is going to involve several (potentially complicated) formulas and the generation of several random factors, writing some new data to a separate table, and updating the original records with some results. This needs to occur for all records, ideally, every three hours. Each new user to the site will be adding between 50 and 500 records that need to be processed in such a fashion, so the number will not be steady.

    The code hasn't been written yet, as I'm still in the design process, mostly because of this issue. I know I'm going to need to use cron jobs, but I'm concerned that processing records at this scale may cause the site to freeze up, perform slowly, or just piss off my hosting company every three hours. I'd like to know if anyone has any experience or tips on similar subjects. I've never worked at this magnitude before, and for all I know, this will be trivial to the server and not pose much of an issue.

    As long as ALL records are processed before the next three-hour period occurs, I don't care if they aren't processed simultaneously (though, ideally, all records belonging to a specific user should be processed in the same batch), so I've been wondering if I should process in batches every 5 minutes, 15 minutes, hour, whatever works, and how best to approach this (and make it scalable in a way that is fair to all users)?
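    A hedged sketch of the batching idea, with table and column names (records, user_id, processed_at) that are assumptions rather than anything from the question: a cron job every few minutes claims a handful of users whose records are due, processes each user's records together, and stamps them, so the three-hour window is met by many small runs instead of one large one.

        <?php
        // Hypothetical cron-driven batch processor; connection details and
        // schema are placeholders. Run it every few minutes from cron.
        $mysqli = new mysqli('localhost', 'user', 'pass', 'appdb');

        $batch = $mysqli->query(
            "SELECT DISTINCT user_id
               FROM records
              WHERE processed_at IS NULL
                 OR processed_at < NOW() - INTERVAL 3 HOUR
              LIMIT 10"   // tune the batch size to what the host tolerates
        );

        while ($row = $batch->fetch_assoc()) {
            $userId = (int) $row['user_id'];

            $mysqli->begin_transaction();

            $rows = $mysqli->query("SELECT * FROM records WHERE user_id = $userId");
            while ($r = $rows->fetch_assoc()) {
                // ... apply the formulas / random factors, write the results ...
            }

            $mysqli->query(
                "UPDATE records SET processed_at = NOW() WHERE user_id = $userId"
            );
            $mysqli->commit();

            usleep(100000); // brief pause between users to keep the load smooth
        }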

    Read the article

  • (database design): Which tables should be created for all kinds of files (images, attached email files, text files, etc.)?

    - by meyosef
    Hi, I am new to database design. I have a question, along with a few solutions of my own; what do you think? Which tables should be created for all the kinds of files (images, attached email files, text files that store email bodies, etc.) stored in my online store?

    Option 1: use a separate table for file types

        files {
            id
            files_types_id   (FK)
            file_path
            file_extension
        }
        files_types {
            id
            type_name   (unique)
        }

    Option 2: use a boolean field for each file type

        files {
            id
            file_path
            file_extension
            is_image_main
            is_image_icon
            is_image_logo
            is_pdf_file
            is_text_file
        }

    Option 3: use a single enum field 'file_type'

        files {
            id
            file_path
            file_extension
            file_type   ENUM('image_main', 'image_icon', 'image_logo', 'pdf', 'text')
        }

    Thank you, Yosef
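    For what it's worth, a sketch of option 1 written out as executable DDL (the column types and the InnoDB engine choice are assumptions): keeping the types in a lookup table means adding a new file type is just an INSERT, which is the usual argument against the boolean-per-type and ENUM variants.

        <?php
        // Sketch only; credentials, column types and sizes are placeholders.
        $mysqli = new mysqli('localhost', 'user', 'pass', 'store');

        $mysqli->query("
            CREATE TABLE files_types (
                id        INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
                type_name VARCHAR(50)  NOT NULL UNIQUE
            ) ENGINE=InnoDB
        ");

        $mysqli->query("
            CREATE TABLE files (
                id             INT UNSIGNED NOT NULL AUTO_INCREMENT PRIMARY KEY,
                files_types_id INT UNSIGNED NOT NULL,
                file_path      VARCHAR(255) NOT NULL,
                file_extension VARCHAR(10)  NOT NULL,
                FOREIGN KEY (files_types_id) REFERENCES files_types (id)
            ) ENGINE=InnoDB
        ");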

    Read the article

  • SQL: count days in date range?

    - by John Isaacks
    I have a query like this:

        SELECT COUNT(*) AS amount
        FROM daily_individual_tracking
        WHERE sales = 'YES'
        AND daily_individual_tracking_date BETWEEN '2010-01-01' AND '2010-03-31'

    I am selecting from a date range. Is there a way to also get the total number of days in the date range?
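    A hedged sketch of one way to get both numbers from the same query (the table and column names come from the question, the connection in $mysqli is assumed): DATEDIFF() returns the difference between the two dates in days, so adding one gives the inclusive length of the range.

        <?php
        // Assumes an open mysqli connection in $mysqli.
        $sql = "SELECT COUNT(*) AS amount,
                       DATEDIFF('2010-03-31', '2010-01-01') + 1 AS total_days
                  FROM daily_individual_tracking
                 WHERE sales = 'YES'
                   AND daily_individual_tracking_date
                       BETWEEN '2010-01-01' AND '2010-03-31'";

        $row = $mysqli->query($sql)->fetch_assoc();
        echo "{$row['amount']} sales over {$row['total_days']} days";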

    Read the article

  • Stored Queries?

    - by phpeffedup
    Is it considered crazy to store common SQL queries for my web app in a database for use in execution? Or is that common practice? Or is it impossible? My thinking is, this way, I avoid hard-coding SQL into my application files, and add another level of abstraction. Is this crazy? Is this what a stored procedure is? Or is that something else?
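    For reference, a stored procedure is the closest built-in feature to what is being described, except the query text lives in the database server itself rather than in a table you query yourself. A minimal sketch with a made-up procedure and table name (not from the question; LIMIT with a routine parameter needs MySQL 5.5.6 or later). Keeping query strings in an application table and executing them at runtime is generally avoided; most applications keep SQL in code, in a query builder, or in stored procedures like this one.

        <?php
        // Assumes an open mysqli connection in $mysqli; no DELIMITER directive
        // is needed when creating a procedure through the API.
        $mysqli->query("DROP PROCEDURE IF EXISTS get_recent_posts");
        $mysqli->query("
            CREATE PROCEDURE get_recent_posts (IN how_many INT)
            BEGIN
                SELECT id, title, created_at
                  FROM posts
              ORDER BY created_at DESC
                 LIMIT how_many;
            END
        ");

        // The application then calls it instead of hard-coding the SQL:
        $result = $mysqli->query("CALL get_recent_posts(10)");
        while ($row = $result->fetch_assoc()) {
            echo $row['title'], "\n";
        }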

    Read the article

  • Find the closest locations to a given address

    - by xtine
    I have built an application in CakePHP that lists businesses. There are about 2000 entries, and the latitude and longitude coordinates for each business are in the DB. I am now trying to tackle the search function. There will be an input box where the user can enter a street address, city, or zip code, and then I would like it to return the 11 closest businesses found in the database. How would I go about doing this?
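    One common approach, sketched below with assumptions beyond the question: a businesses table with lat/lng columns, and $userLat/$userLng obtained by geocoding the typed address through a geocoding service. Order the rows by great-circle distance and take the first 11; with roughly 2000 rows a full scan is cheap.

        <?php
        // Hypothetical sketch; $mysqli is an open mysqli connection, and the
        // coordinates below would come from geocoding the user's input.
        $userLat = 40.7128;
        $userLng = -74.0060;

        $stmt = $mysqli->prepare("
            SELECT id, name,
                   (3959 * ACOS(
                       COS(RADIANS(?)) * COS(RADIANS(lat)) *
                       COS(RADIANS(lng) - RADIANS(?)) +
                       SIN(RADIANS(?)) * SIN(RADIANS(lat))
                   )) AS distance_miles
              FROM businesses
          ORDER BY distance_miles
             LIMIT 11
        ");
        $stmt->bind_param('ddd', $userLat, $userLng, $userLat);
        $stmt->execute();
        $closest = $stmt->get_result()->fetch_all(MYSQLI_ASSOC);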

    Read the article

  • Is a clear and replace more efficient than a loop checking all records?

    - by Matt
    I have a C# List that is filled from a database. So far it's only 1400 records, but I expect it to grow a lot. Routinely I do a check for new data on the entire list. What I'm trying to figure out is this: is it faster to simply clear the List and reload all the data from the table, or would checking each record be faster? Intuition tells me that the dump-and-reload method would be faster, but I thought I should check first...

    Read the article

  • Store LAST_INSERT_ID() in a transaction

    - by Oden
    Hi, I use CodeIgniter's database abstraction, and I'm doing a transaction with it. My problem is that I have several inserts into several tables, but I need the insert ID from the first insert query. Is there any way to keep that first insert ID around for more than one following insert?
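    A hedged sketch using CodeIgniter's query builder (the table and column names are made up): LAST_INSERT_ID() changes after every insert, but the value returned by $this->db->insert_id() can be kept in an ordinary PHP variable and reused for as many later inserts as needed.

        <?php
        // Hypothetical table/column names; runs inside a CodeIgniter model or
        // controller where $this->db is available.
        $this->db->trans_start();

        $this->db->insert('orders', array('customer' => 'Jane'));
        $order_id = $this->db->insert_id();   // grab it once, right away

        // The captured value survives any number of later inserts.
        $this->db->insert('order_lines', array('order_id' => $order_id, 'price' => 9.99));
        $this->db->insert('order_notes', array('order_id' => $order_id, 'note' => 'rush'));

        $this->db->trans_complete();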

    Read the article

  • How to pass an array of objects through a jQuery $.post?

    - by majc
    Hi, I want to pass the result of a query through a $.post.

        function GetAllTasks() {
            $sql = "select t.id as task_id, description, createdat, createdby,
                           max_requests, max_duration, j.name as job_name
                    from darkfuture.tasks t, darkfuture.jobs j
                    where t.job_id = j.id";
            $sqlresult = mysql_query($sql)
                or die("The list of works failed: ".mysql_error($this->con));

            $result = array();
            while($row = mysql_fetch_assoc($sqlresult)) {
                $task = new TasksResult();
                $task->id           = $row["task_id"];
                $task->description  = $row["description"];
                $task->createdat    = $row["createdat"];
                $task->createdby    = $row["createdby"];
                $task->max_requests = $row["max_requests"];
                $task->max_duration = $row["max_duration"];
                $task->job_id       = $row["job_name"];
                array_push($result, $task);
            }
            mysql_free_result($sqlresult);
            return $result;
        }

    Here is how I call it:

        $tasksDB = new TasksDB();
        $tasks = $tasksDB->GetAllTasks();

    Now I want to pass $tasks through here:

        $.post("views/insert_tasks.php", {'tasks[]': $tasks}, function(data) {
        });

    I know {'tasks[]': $tasks} is wrong, but I don't know how to do it right. Some help would be appreciated. Thanks in advance!

    Read the article

  • Hibernate/Spring: getHibernateTemplate().save(...) Freezes/Hangs

    - by ashes999
    I'm using Hibernate and Spring with the DAO pattern (all Hibernate dependencies in a *DAO.java class). I have nine unit tests (JUnit) which create some business objects, save them, and perform operations on them; the objects are in a hash (so I'm reusing the same objects all the time). My JUnit setup method calls my DAO.deleteAllObjects() method, which calls getSession().createSQLQuery("DELETE FROM <tablename>").executeUpdate() for my business object table (just one).

    One of my unit tests (#8 of 9) freezes. I presumed it was a database deadlock, because the Hibernate log file shows my delete statement last. However, debugging showed that it's simply HibernateTemplate.save(someObject) that's freezing. (Eclipse shows that it's freezing on HibernateTemplate.save(Object), line 694.) Also interesting to note is that running this test by itself (not in the suite of 9 tests) doesn't cause any problems. How on earth do I troubleshoot and fix this? Also, I'm using @Entity annotations, if that matters.

    Edit: I removed reuse of my business objects (unique objects in every method); it didn't make a difference (still freezes).

    Edit: This started trickling into other tests, too (I can't run more than one test class without getting something freezing).

    Transaction configuration:

        <bean id="txManager"
              class="org.springframework.jdbc.datasource.DataSourceTransactionManager">
            <property name="dataSource" ref="dataSource" />
        </bean>

        <tx:advice id="txAdvice" transaction-manager="txManager">
            <!-- the transactional semantics... -->
            <tx:attributes>
                <!-- all methods starting with 'get' are read-only -->
                <tx:method name="get*" read-only="true" />
                <tx:method name="find*" read-only="true" />
                <!-- other methods use the default transaction settings (see below) -->
                <tx:method name="*" />
            </tx:attributes>
        </tx:advice>

        <!-- my bean which is exhibiting the hanging behavior -->
        <aop:config>
            <aop:pointcut id="beanNameHere"
                          expression="execution(* com.blah.blah.IMyDAO.*(..))" />
            <aop:advisor advice-ref="txAdvice" pointcut-ref="beanNameHere" />
        </aop:config>

    Read the article

  • Keeping video viewing statistics breakdown by video time in a database

    - by Septagram
    I need to keep a number of statistics about the videos being watched, and one of them is which parts of the video are being watched most. The design I came up with is to split the video into 256 intervals and keep the floating-point number of views for each of them. I receive the data as a number of intervals the user watched continuously. The problem is how to store them. There are two solutions I see.

    Row per video segment. Let's have a database table like this:

        CREATE TABLE `video_heatmap` (
            `id` int(11) NOT NULL AUTO_INCREMENT,
            `video_id` int(11) NOT NULL,
            `position` tinyint(3) unsigned NOT NULL,
            `views` float NOT NULL,
            PRIMARY KEY (`id`),
            UNIQUE KEY `idx_lookup` (`video_id`,`position`)
        ) ENGINE=MyISAM

    Then, whenever we have to process a number of views, make sure the respective database rows exist and add appropriate values to the views column. I found out it's a lot faster if the existence of the rows is taken care of first (SELECT COUNT(*) of rows for a given video and INSERT IGNORE if they are lacking), and then a number of update queries is used like this:

        UPDATE video_heatmap
        SET views = views + ?
        WHERE video_id = ? AND position >= ? AND position < ?

    This seems, however, a little bloated. The other solution I came up with is:

    Row per video, update in transactions. A table will look (sort of) like this:

        CREATE TABLE video (
            id INT NOT NULL AUTO_INCREMENT,
            heatmap BINARY (4 * 256) NOT NULL,
            ...
        ) ENGINE=InnoDB

    Then, every time a view needs to be stored, it will be done in a transaction with a consistent snapshot, in a sequence like this:

    1. If the video doesn't exist in the database, it is created.
    2. The row is retrieved, and heatmap, an array of floats stored in binary form, is converted into a form more friendly for processing (in PHP).
    3. Values in the array are increased appropriately and the array is converted back.
    4. The row is changed via an UPDATE query.

    So far the advantages can be summed up like this:

    First approach:
    - Stores data as floats, not as some magical binary array.
    - Doesn't require transaction support, so doesn't require InnoDB; we're using MyISAM for everything at the moment, so there won't be any need to mix storage engines (only applies in my specific situation).
    - Doesn't require a transaction WITH CONSISTENT SNAPSHOT. I don't know what the performance penalties of those are.
    - I already implemented it and it works (only applies in my specific situation).

    Second approach:
    - Uses a lot less storage space (the first approach stores the video ID 256 times and stores the position for every segment of the video, not to mention the primary key).
    - Should scale better, because of InnoDB's per-row locking as opposed to MyISAM's table locking.
    - Might generally work faster because there are a lot fewer requests being made.
    - Easier to implement in code (although the other one is already implemented).

    So, what should I do? If it weren't for the rest of our system using MyISAM consistently, I'd go with the second approach, but currently I'm leaning towards the first one. But maybe there are some reasons to favour one approach or the other?

    Read the article

  • PHP displaying error for already used Username and empty field

    - by Pixel Reaper
    I want PHP to make sure the username is not already used and also check to see if the field is empty. Sorry, I am a huge noob when it comes to PHP. Here is my code:

        // Check for an Username:
        $dup = mysql_query("SELECT user_username FROM users WHERE user_username='".$_POST['user_username']."'");
        if(mysql_num_rows($dup) >0){
            $errors[] = 'Username already used.';
        }
        else{
            $un = mysqli_real_escape_string($dbc, trim($_POST['user_username']));
            echo '<b>Congrats, You are now Registered.</b>';
        }
        else {
            $errors[] = 'You forgot to enter your Username.';
        }
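    A hedged sketch of one way to restructure this, assuming $dbc is a mysqli connection and the users table from the question: check the empty field first, then check for a duplicate with a prepared statement. As written, the original also attaches two else branches to one if, which PHP will reject as a parse error.

        <?php
        // Sketch only; assumes a mysqli connection in $dbc.
        $errors = array();
        $un = isset($_POST['user_username']) ? trim($_POST['user_username']) : '';

        if ($un === '') {
            $errors[] = 'You forgot to enter your Username.';
        } else {
            $stmt = $dbc->prepare(
                'SELECT 1 FROM users WHERE user_username = ? LIMIT 1'
            );
            $stmt->bind_param('s', $un);
            $stmt->execute();
            $stmt->store_result();

            if ($stmt->num_rows > 0) {
                $errors[] = 'Username already used.';
            }
        }

        if (empty($errors)) {
            // ... proceed with the INSERT and show the success message ...
            echo '<b>Congrats, you are now registered.</b>';
        }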

    Read the article

  • How to add data to a table in a Jasper Report using Java

    - by Areeb Gillani
    I am here to ask a simple question: I am trying to pass data to a Jasper report using Java, but I don't know how to, because the table data is very dynamic; that's why I cannot pass an SQL query. Any ideas? I have a 2D array of type Object, where I have all the data, so how can I pass that? Thanks in advance!

        ConnectionManager con = new ConnectionManager();
        con.establishConnection();
        String fileName = "Pmc_Bill.jrxml";
        String outFileName = "OutputReport.pdf";
        HashMap params = new HashMap();
        params.put("PName", pname);
        params.put("PSerial", psrl);
        params.put("PGender", pgen);
        params.put("PPhone", pph);
        params.put("PAge", page);
        params.put("PRefer", pref);
        params.put("PDateR", dateNow);
        try {
            JasperReport jasperReport = JasperCompileManager.compileReport(fileName);
            if (jasperReport != null)
                System.out.println("so far so good ");
            // Fill the report using an empty data source
            JasperPrint jasperPrint = JasperFillManager.fillReport(jasperReport, params,
                    new JRTableModelDataSource(tbl.getModel())); //con.connection);
            try {
                JasperExportManager.exportReportToPdfFile(jasperPrint, outFileName);
                System.out.printf("File exported sucessfully");
            } catch (Exception e) {
                e.printStackTrace();
            }
            JasperViewer.viewReport(jasperPrint);
        } catch (JRException e) {
            JOptionPane.showMessageDialog(null, e);
            e.printStackTrace();
            System.exit(1);
        }

    Read the article

  • How do I make all the finders on the model case-insensitive?

    - by Glex
    I have a model with several attributes, among them title and artist. The case of title and artist should be ignored in all the Active Record finders. Basically, if title or artist are present in the :conditions (or dynamically i.e. find_all_by_artist), then the WHERE artist = :artist should become WHERE UPPER(artist) = UPPER(:artist) or something along these lines. Is there a way of doing it with Rails?

    Read the article

  • Getting all database entries into organized array

    - by Industrial
    Hi everyone, I have just made the update/add/delete part for the "closure table" way of organizing and querying hierarchical data shown on page 70 of this slideshare: http://www.slideshare.net/billkarwin/sql-antipatterns-strike-back However, I'm having a bit of an issue getting the full tree back as a multidimensional array from a single query. Here's what I would like to get back:

        array (
            'topvalue' => array (
                'Subvalue',
                'Subvalue2',
                'Subvalue3' => array (
                    'Subvalue 1',
                    'Subvalue 2',
                    'Subvalue 3',
                ),
            ),
        );
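    A hedged sketch of one way to build a nested array in PHP, under heavier assumptions than usual because the question doesn't show the query: it assumes a query that returns each node with its direct parent (for a closure table, typically a self-join on depth = 1), and all table and column names below are placeholders.

        <?php
        // Sketch; assumes an open mysqli connection in $mysqli and placeholder
        // tables categories(id, name) and category_paths(ancestor, descendant, depth).
        $sql = "SELECT c.id, c.name, p.ancestor AS parent_id
                  FROM categories c
             LEFT JOIN category_paths p
                    ON p.descendant = c.id AND p.depth = 1";

        $result = $mysqli->query($sql);

        // First pass: create a node for every row.
        $nodes = array();
        while ($row = $result->fetch_assoc()) {
            $nodes[$row['id']] = array('name' => $row['name'], 'children' => array());
        }

        // Second pass: attach each node to its parent, or to the root list.
        $tree = array();
        $result->data_seek(0);
        while ($row = $result->fetch_assoc()) {
            if ($row['parent_id'] === null) {
                $tree[] = &$nodes[$row['id']];
            } else {
                $nodes[$row['parent_id']]['children'][] = &$nodes[$row['id']];
            }
        }
        // $tree now holds the nested structure.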

    Read the article

  • what's wrong with this code?

    - by user329820
    Hi, this is my code, which will not work correctly! What is wrong with its data types? :( Thanks.

        CREATE TABLE T1 (A INTEGER NOT NULL);
        CREATE TABLE T3 (A SMALLINT NOT NULL);
        INSERT T1 VALUES (32768.5);
        SELECT * FROM T1;
        INSERT T3 SELECT * FROM T1;
        SELECT * FROM T3;

    Read the article

  • Is there a single query that can update a "sequence number" across multiple groups?

    - by Drarok
    Given a table like below, is there a single-query way to update the table from this:

        | id | type_id | created_at | sequence |
        |----|---------|------------|----------|
        | 1  | 1       | 2010-04-26 | NULL     |
        | 2  | 1       | 2010-04-27 | NULL     |
        | 3  | 2       | 2010-04-28 | NULL     |
        | 4  | 3       | 2010-04-28 | NULL     |

    To this (note that created_at is used for ordering, and sequence is "grouped" by type_id):

        | id | type_id | created_at | sequence |
        |----|---------|------------|----------|
        | 1  | 1       | 2010-04-26 | 1        |
        | 2  | 1       | 2010-04-27 | 2        |
        | 3  | 2       | 2010-04-28 | 1        |
        | 4  | 3       | 2010-04-28 | 1        |

    I've seen some code before that used an @ variable like the following, that I thought might work:

        SET @seq = 0;
        UPDATE `log` SET `sequence` = @seq := @seq + 1 ORDER BY `created_at`;

    But that obviously doesn't reset the sequence to 1 for each type_id. If there's no single-query way to do this, what's the most efficient way? Data in this table may be deleted, so I'm planning to run a stored procedure after the user is done editing to re-sequence the table.
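    A hedged sketch of the user-variable approach, wrapped in PHP so both statements run on the same connection (user variables are per-connection): sorting by type_id first and resetting the counter whenever the type changes yields a per-group sequence. It relies on MySQL evaluating the SET assignments left to right, so treat it as a sketch and verify on your version; the table and column names come from the question.

        <?php
        // Sketch; assumes an open mysqli connection in $mysqli.
        $mysqli->query("SET @seq := 0, @prev_type := NULL");

        $mysqli->query("
            UPDATE `log`
               SET `sequence` = (@seq := IF(@prev_type = type_id, @seq + 1, 1)),
                   `type_id`  = (@prev_type := type_id)
             ORDER BY type_id, created_at
        ");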

    Read the article

  • Confusion using MYSQLI

    - by user1020069
    I just started using the mysqli API for PHP. Apparently, every time an object of the class mysqli is instantiated, it can connect to the database at the same time it connects to the server, unlike mysql_connect, which connects to the server first and then requires you to specify the database to query. Now this is all fine if the database exists; in my case, the database does not exist on the first-ever connection to the server (the first execution of the program), hence I must connect without specifying the database, which is fine, since the mysqli constructor does not make that parameter mandatory. My challenge is essentially: how do I check whether the database exists before attempting that first connection? The only way I can see is to establish a connection to the server and then use the result of the following query to gauge whether the database exists:

        SELECT COUNT(*) AS `exists`
        FROM INFORMATION_SCHEMA.SCHEMATA
        WHERE SCHEMATA.SCHEMA_NAME = "dbname";

    If this returns true, then the database exists, but now the challenge is: how do I get the mysqli object to query this database rather than having to prefix the database name in every query? Thanks much.
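    A hedged sketch of the whole flow, with a placeholder database name and credentials: connect without a default database, check INFORMATION_SCHEMA with a prepared statement, create the schema if it is missing, then attach it to the same connection with mysqli::select_db() so later queries need no prefix.

        <?php
        // Sketch; 'dbname' and the credentials are placeholders.
        $mysqli = new mysqli('localhost', 'user', 'pass');   // no database yet

        $stmt = $mysqli->prepare(
            'SELECT COUNT(*) FROM INFORMATION_SCHEMA.SCHEMATA WHERE SCHEMA_NAME = ?'
        );
        $dbname = 'dbname';
        $stmt->bind_param('s', $dbname);
        $stmt->execute();
        $stmt->bind_result($exists);
        $stmt->fetch();
        $stmt->close();

        if (!$exists) {
            $mysqli->query("CREATE DATABASE `dbname`");
            // ... run the schema-creation script here ...
        }

        // From here on, unqualified table names resolve against dbname.
        $mysqli->select_db('dbname');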

    Read the article

  • Does having to insert a record, then update the same record, warrant a 1:1 relationship design?

    - by dianovich
    Let's say an Order has many Line items and we're storing the total cost of an order (based on the sum of prices on order lines) in the orders table.

        --------------
        orders
        --------------
        id
        ref
        total_cost
        --------------

        --------------
        lines
        --------------
        id
        order_id
        price
        --------------

    In a simple application, the order and lines are created during the same step of the checkout process. So this means:

        INSERT INTO orders ....
        -- Get ID of inserted order record
        INSERT INTO lines VALUES(null, order_id, ...), ...

    where we get the order ID after creating the order record. The problem I'm having is trying to figure out the best way to store the total cost of an order. I don't want to have to:

    1. create an order,
    2. create lines on the order,
    3. calculate the cost of the order based on its lines,
    4. then update the record created in step 1 in the orders table.

    This would mean a nullable total_cost field on orders, for starters... My solution thus far is to have an order_totals table with a 1:1 relationship to the orders table. But I think it's redundant. Ideally, since everything required to calculate total costs (the lines on an order) is in the database, I would work out the value every time I need it, but that is very expensive. What are your thoughts?
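    One way to avoid both the nullable column and the extra 1:1 table, sketched below with the column names from the question and placeholder checkout data: do the whole checkout step in a single transaction and fill total_cost with one UPDATE derived from the lines, so the order row is never visible in a half-finished state.

        <?php
        // Sketch; assumes an open mysqli connection in $mysqli and the
        // orders/lines tables from the question. The ref and prices are
        // stand-ins for the real checkout input.
        $prices = array(19.99, 5.00);

        $mysqli->begin_transaction();

        $mysqli->query("INSERT INTO orders (ref, total_cost) VALUES ('ABC123', 0)");
        $order_id = $mysqli->insert_id;

        $stmt = $mysqli->prepare('INSERT INTO lines (order_id, price) VALUES (?, ?)');
        foreach ($prices as $price) {
            $stmt->bind_param('id', $order_id, $price);
            $stmt->execute();
        }

        // Derive the total from the lines just written, in the same transaction.
        $mysqli->query("
            UPDATE orders o
               SET o.total_cost = (SELECT SUM(l.price) FROM lines l WHERE l.order_id = o.id)
             WHERE o.id = $order_id
        ");

        $mysqli->commit();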

    Read the article

  • Linking Post Title to Specific Page ID

    - by ThatMacLad
    I've created a form to update my website's homepage with content, but I want to know how I can set it up so that a post's title links to a specific post ID. I'd also like to add a Read More link that directs anybody reading the blog to the correct post. Here is my PHP code:

        <html>
        <head>
        <title>Blog Name</title>
        </head>
        <body>
        <?php
        mysql_connect ('localhost', 'root', 'root') ;
        mysql_select_db ('tmlblog');
        $sql = "SELECT * FROM php_blog ORDER BY timestamp DESC LIMIT 5";
        $result = mysql_query($sql) or print ("Can't select entries from table php_blog.<br />" . $sql . "<br />" . mysql_error());
        while($row = mysql_fetch_array($result)) {
            $date = date("l F d Y", $row['timestamp']);
            $title = stripslashes($row['title']);
            $entry = stripslashes($row['entry']);
            $password = $row['password'];
            $id = $row['id'];
            if ($password == 1) {
                echo "<p><strong>" . $title . "</strong></p>";
                printf("<p>This is a password protected entry. If you have a password, log in below.</p>");
                printf("<form method=\"post\" action=\"post.php?id=%s\"><p><strong><label for=\"username\">Username:</label></strong><br /><input type=\"text\" name=\"username\" id=\"username\" /></p><p><strong><label for=\"pass\">Password:</label></strong><br /><input type=\"password\" name=\"pass\" id=\"pass\" /></p><p><input type=\"submit\" name=\"submit\" id=\"submit\" value=\"submit\" /></p></form>",$id);
                print "<hr />";
            } else {
        ?>
        <p><strong><?php echo $title; ?></strong><br /><br />
        <?php echo $entry; ?><br /><br />
        Posted on <?php echo $date; ?>
        <hr /></p>
        <?php
            }
        }
        ?>
        </body>
        </html>

    Thanks for any help. I really appreciate any input!
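    A hedged sketch of the change inside the else branch of the loop (post.php and $id come from the question; post.php is assumed to display a single post selected by its id): wrap the title in a link that carries the post ID, and add a Read More link to the same URL.

        <?php
        // Sketch of the else branch only; $id, $title, $entry and $date are
        // already set by the surrounding loop.
        ?>
        <p>
            <strong>
                <a href="post.php?id=<?php echo urlencode($id); ?>"><?php echo $title; ?></a>
            </strong><br /><br />
            <?php echo $entry; ?><br /><br />
            Posted on <?php echo $date; ?><br />
            <a href="post.php?id=<?php echo urlencode($id); ?>">Read More</a>
            <hr />
        </p>

    post.php would then read $_GET['id'], ideally cast to an integer or bound through a prepared statement, and fetch just that one row.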

    Read the article

  • XAMPP errors on Windows

    - by Deepak Kumar
    My problem is that when I use XAMPP I see many errors, but when I use my web host there are no errors:

        Notice: Undefined index: action in C:\xampp\htdocs\xyz\index.php on line 3
        Notice: Undefined index: usNick in C:\xampp\htdocs\xyz\config.php on line 11
        Notice: Use of undefined constant setname - assumed 'setname' in C:\xampp\htdocs\xyz\config.php on line 31
        Notice: Use of undefined constant setname - assumed 'setname' in C:\xampp\htdocs\xyz\config.php on line 31
        Notice: Undefined index: usNick in C:\xampp\htdocs\xyz\config.php on line 34
        Notice: A session had already been started - ignoring session_start() in C:\xampp\htdocs\xyz\data.php on line 2
        Notice: Undefined index: r in C:\xampp\htdocs\xyz\data.php on line 4
        Notice: Undefined index: ucNick in C:\xampp\htdocs\xyz\data.php on line 8

    I have tried many times changing things in Settings, Security, Privileges, etc., but nothing changed. I want to know if I'm missing something. Thanks.
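    These messages are E_NOTICE diagnostics: the XAMPP php.ini has display_errors on and error_reporting set to include notices, while the live host most likely suppresses them, so the same code only looks clean there. A hedged sketch of the usual per-notice fixes (the superglobals and defaults chosen below are assumptions based on the variable names):

        <?php
        // Sketch of typical fixes for the notices above.

        // "Undefined index: action" -- test before reading a request key:
        $action = isset($_GET['action']) ? $_GET['action'] : '';

        // "Undefined index: usNick" -- same pattern for whichever array it
        // comes from ($_SESSION, $_POST, $_COOKIE, ...):
        $usNick = isset($_SESSION['usNick']) ? $_SESSION['usNick'] : null;

        // "Use of undefined constant setname" -- quote array keys
        // ($row stands in for whatever array config.php reads):
        $value = $row['setname'];   // not $row[setname]

        // "A session had already been started" -- only call session_start() once:
        if (session_id() === '') {
            session_start();
        }

    Rather than hiding the notices by lowering error_reporting locally, it is usually worth fixing them, since they point at real assumptions in the code.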

    Read the article

  • How can I use an array within a SQL query

    - by ThinkingInBits
    So I'm trying to take a search string (which could be any number of words) and turn each value into a list to use in the following IN statement. In addition, I need a count of all these values to use with my HAVING COUNT filter.

        $search_array = explode(" ", $this->search_string);
        $tag_count = count($search_array);

        $db = Connect::connect();
        $query = "select p.id from photographs p
                  left join photograph_tags c
                    on p.id = c.photograph_id and c.value IN ($search_array)
                  group by p.id
                  having count(c.value) >= $tag_count";

    This currently returns no results. Any ideas?
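    One likely cause, with a hedged sketch of a fix: interpolating a PHP array into a double-quoted string produces the literal word Array (plus a notice), so the IN (...) clause never contains the search terms. Building a quoted, escaped list first fixes that; the sketch assumes Connect::connect() returns a mysqli connection, so swap in the matching escaping helper if it does not.

        <?php
        // Sketch; assumes $db is a mysqli connection and the tables from the
        // question. Each term is escaped and quoted before joining.
        $search_array = explode(' ', $this->search_string);
        $tag_count    = count($search_array);

        $db = Connect::connect();

        $quoted = array();
        foreach ($search_array as $term) {
            $quoted[] = "'" . $db->real_escape_string($term) . "'";
        }
        $in_list = implode(',', $quoted);

        $query = "SELECT p.id
                    FROM photographs p
               LEFT JOIN photograph_tags c
                      ON p.id = c.photograph_id AND c.value IN ($in_list)
                GROUP BY p.id
                  HAVING COUNT(c.value) >= $tag_count";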

    Read the article
