Search Results

Search found 31356 results on 1255 pages for 'database backups'.

Page 633/1255 | < Previous Page | 629 630 631 632 633 634 635 636 637 638 639 640  | Next Page >

  • Open Source PHP search engine

    - by Ravi Gupta
    I am looking for an open source search engine plugin written in PHP for my website (eCommerce). Before anybody answers, I have a doubt regarding the search engine. Usually a search engine crawls web pages, creates indexes and then uses them when looking for content. But will the same model work for eCommerce websites? Yes, it can crawl product pages and index them, but wouldn't it be better if it queried the database directly and indexed the products stored there? Then, when a user searches for any product, it would simply return the rows of the table that match the query. Maybe this is a stupid question, but I am new to web development, so kindly help me understand the concept. I have looked at a search engine called Sphider but didn't understand what I have to do to make it work with an eCommerce website.
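
    A minimal sketch of the database-side search the question describes, assuming a products table with name and description columns (the table, column names and PDO usage are illustrative assumptions, not Sphider's API):

        <?php
        // Minimal sketch: query the products table directly instead of
        // crawling rendered pages. Table/column names are assumptions.
        $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

        $term = '%' . $_GET['q'] . '%';
        $stmt = $pdo->prepare(
            "SELECT id, name, price FROM products
             WHERE name LIKE :name OR description LIKE :descr"
        );
        $stmt->execute(array('name' => $term, 'descr' => $term));

        // Each matching row is a search hit
        foreach ($stmt->fetchAll(PDO::FETCH_ASSOC) as $row) {
            echo $row['name'], "\n";
        }

    For larger catalogues, MySQL's FULLTEXT indexes (MATCH ... AGAINST) provide the relevance ranking that a crawler-built index would otherwise give you.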

    Read the article

  • SQL To Filter A Subset Of Data

    - by Nick LaMarca
    I have an ArrayList that holds a subset of names found in my database. I need to write a query that counts the people in the ArrayList by certain sections. For example, there is a field "City" in my database; of the people in the ArrayList of names, I want to know how many live in Chicago, how many live in New York, etc. Can someone help me set up an SQL statement to handle this? I think I have to pass the subset of names to SQL somehow.
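
    Whatever the client language, the usual shape is a single GROUP BY query with the names passed as an IN list; a minimal sketch in PHP/PDO for illustration (the people table and its name/city columns are assumptions):

        <?php
        // Minimal sketch: count people per city for a given subset of names.
        $names = array('Alice Smith', 'Bob Jones', 'Carol White');

        $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

        // One positional placeholder per name: IN (?, ?, ?)
        $in = implode(',', array_fill(0, count($names), '?'));
        $stmt = $pdo->prepare(
            "SELECT city, COUNT(*) AS people_count
             FROM people
             WHERE name IN ($in)
             GROUP BY city"
        );
        $stmt->execute($names);

        foreach ($stmt as $row) {
            echo $row['city'], ': ', $row['people_count'], "\n"; // e.g. Chicago: 12
        }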

    Read the article

  • Common optimization rules

    - by mafutrct
    This is a dangerous question, so let me try to phrase it correctly. Premature optimization is the root of all evil, but if you know you need it, there is a basic set of rules that should be considered. This set is what I'm wondering about. For instance, imagine you have a list of a few thousand items. How do you look up an item with a specific, unique ID? Of course, you simply use a Dictionary to map the ID to the item. And if you know that there is a setting stored in a database that is required all the time, you simply cache it instead of issuing a database request a hundred times a second. I guess there are a few even more basic ideas. I am specifically not looking for "don't do it; for experts: don't do it yet" or "use a profiler" answers, but for really simple, general hints. If you feel this is an argumentative question, you probably misunderstood my intention.
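
    The two rules named in the question, sketched in PHP for illustration (the settings table and the object's id property are assumptions): index the list by ID once, and memoize the database-backed setting.

        <?php
        // Rule 1: build the index once, then every lookup is O(1)
        // instead of scanning a few thousand items per query.
        $byId = array();
        foreach ($items as $item) {      // $items: the loaded list of objects
            $byId[$item->id] = $item;    // ->id is the unique ID property
        }
        $wanted = isset($byId[42]) ? $byId[42] : null; // constant-time lookup

        // Rule 2: cache a setting that is needed all the time, so it
        // costs one query per key instead of hundreds per second.
        function getSetting(PDO $pdo, $key)
        {
            static $cache = array();
            if (!array_key_exists($key, $cache)) {
                $stmt = $pdo->prepare("SELECT value FROM settings WHERE name = ?");
                $stmt->execute(array($key));
                $cache[$key] = $stmt->fetchColumn();
            }
            return $cache[$key];
        }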

    Read the article

  • Drupal Error After Logging In

    - by Kim
    Hello everyone. I'm kind of new to using Drupal and I'm wondering why I get this error on my new site. I have a website under WampServer running Drupal 6.16. Every time I log in with my pre-created admin account 'admin01' (pass: 'admin01'), I get redirected to the WampServer localhost page, which appears unusual since the header does not have the WampServer logo. I already tried creating a new Drupal website with the same database, and the same thing happens. I also tried creating another website with a new database, copying the other website's theme and other contents, and the same thing happens. Help me please, I am losing my grip on this. :( Note: I have the same website running on one PC and I am just trying to run it on another PC by copying all its contents. The original copy is working perfectly, but I can't seem to get my new copies to work on other PCs.

    Read the article

  • iPhone SDK: Data Synchronization

    - by buzzappsoftware
    I am looking for an overview of data synchronization techniques available on the iPhone platform. We need to be able to sync a subset of content from a server to a local database residing on the iPhone. On other projects I have worked on, the data synchronization was handled by the database. Is that available in SQLite? If not, any suggestions on techniques? Rolling our own would not be my first choice. Thanks in advance.

    Read the article

  • How do I manage a limit of executions per hour (max: 1000 requests per hour) without a database?

    - by cslavoie
    I am currently developing a script in PHP to fetch web pages. In doing so, I occasionally make too many requests to a particular website. In order to control any overflow, I would like to keep track of how many requests have been made in the last hour or so for each domain. It doesn't need to be perfect, just a good estimate. I don't have access to a database, except SQLite 2. I would really like something simple, because there will typically be a lot of updates, which is kind of heavy for an SQLite database. If no one has a magical solution, I'll go for SQLite, but I was curious what you can come up with. Thank you very much.
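
    One scheme that avoids the database entirely, as a hedged sketch: keep one tiny counter file per domain and per clock hour, so old hours expire simply by never being read again (the temp directory and the 1000 limit are assumptions):

        <?php
        // Minimal sketch: approximate requests-per-hour per domain using
        // one small counter file per (domain, hour) pair.
        function requestsThisHour($domain)
        {
            $file = sys_get_temp_dir() . '/rate_' . md5($domain) . '_' . date('YmdH');
            $count = is_file($file) ? (int) file_get_contents($file) : 0;
            file_put_contents($file, $count + 1, LOCK_EX);
            return $count + 1;
        }

        if (requestsThisHour('example.com') > 1000) {
            exit; // over the hourly budget: skip or queue this fetch
        }

    A cron job can sweep away counter files older than a day; the count is approximate under heavy concurrency, which matches the "good estimate" requirement.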

    Read the article

  • Why does Hibernate ignore the name attribute of the @Column annotation?

    - by svachon
    Using Hibernate 3.3.1 and Hibernate Annotations 3.4; the database is DB2/400 V6R1, running on WebSphere 7.0.0.9. I have the following class: @Entity public class Ciinvhd implements Serializable { @Id private String ihinse; @Id @Column(name="IHINV#") private BigDecimal ihinv; .... } For reasons I can't figure out, Hibernate ignores the specified column name and uses 'ihinv' to generate the SQL: select ciinvhd0_.ihinse as ihinse13_, ciinvhd0_.ihinv as ihinv13_, ... Which of course gives me the following error: Column IHINV not in table CIINVHD. Has anyone had this problem before? I have other entities that are very similar, in that they use # in their database field names as part of the PK, and I don't have this problem with them.

    Read the article

  • Entity Framework update fails when an object is linked to a missing child

    - by McKay
    I'm having trouble updating an object's child when the object has a reference to a nonexistent child record, e.g. tables Car and CarColor are related through Car.CarColorId referencing CarColor.CarColorId. If I load the car with its color record like so: var result = from x in database.Car.Include("CarColor") where x.CarId == 5 select x; I'll get back the Car object and its Color object. Now suppose that some time ago a CarColor was deleted, but the Car record in question still contains the CarColorId value. So when I run the query, the Color object is null because the CarColor record doesn't exist. My problem here is that when I attach another Color object that does exist, I get a "store update, insert" error when saving: Car.Color = newColor; Database.SaveChanges(); It's like the context is trying to delete the nonexistent color. How can I get around this?

    Read the article

  • Run a site on Scheme

    - by Lajla
    I can't find this on Google (so maybe it doesn't exist), but I'd basically like to install something on a web server such that I can run a site on Scheme; PHP is starting to annoy me and I want to get rid of it. What I want is: to run Scheme sources towards UTF-8 output (duh); support for SXML, SXLT et cetera (I plan to compose the damned thing in SXML and convert it to the normal representation at the end); the ability to read other files from the server, write them, set permissions et cetera; also some things to, for instance, determine the file size of files, the height of images, MIME types and all that mumbo-jumbo; and (optionally) connect to a database, though for what I want to do, storing the entire database in S-expressions itself is feasible enough. I don't need any fancy libraries and other things that come with it like CMSes and what-not, except the support for SXML, but I'm sure I can find a lib for that anyway that I can load.

    Read the article

  • How do I split the output from mysqldump into smaller files?

    - by lindelof
    I need to move entire tables from one MySQL database to another. I don't have full access to the second one, only phpMyAdmin access. I can only upload (compressed) sql files smaller than 2MB. But the compressed output from a mysqldump of the first database's tables is larger than 10MB. Is there a way to split the output from mysqldump into smaller files? I cannot use split(1) since I cannot cat(1) the files back together on the remote server. Or is there another solution I have missed? Edit: The --extended-insert=FALSE option to mysqldump suggested by the first poster yields a .sql file that can then be split into importable files, provided that split(1) is called with a suitable --lines option. By trial and error I found that bzip2 compresses the .sql files by a factor of 20, so I needed to figure out how many lines of SQL code correspond roughly to 40MB.
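
    If split(1) is not available anywhere in the chain, the same line-based chunking is a few lines in PHP (the chunk size is an assumption to tune against the 2MB limit; this relies on --extended-insert=FALSE so every INSERT sits on its own line):

        <?php
        // Minimal sketch: split dump.sql into numbered chunks of whole lines.
        $linesPerChunk = 50000;
        $in = fopen('dump.sql', 'r');
        $out = null;
        $part = 0;
        $lineNo = 0;

        while (($line = fgets($in)) !== false) {
            if ($lineNo % $linesPerChunk === 0) {
                if ($out) fclose($out);
                $out = fopen(sprintf('dump_part_%03d.sql', ++$part), 'w');
            }
            fwrite($out, $line);
            $lineNo++;
        }
        if ($out) fclose($out);
        fclose($in);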

    Read the article

  • SQL Server 2008 - Conditional Query

    - by Villager
    Hello, SQL is not one of my strong suits. I have a SQL Server 2008 database. This database has a stored procedure that takes in eight int parameters. For the sake of keeping this question focused, I will use one of these parameters for reference: @isActive int Each of these int parameters will be -1, 0, or 1. -1 means "Unknown" or "Don't Care". Basically, I need to query a table such that if the int parameter is NOT -1, I need to consider it in my WHERE clause. Because there are eight int parameters, an IF-ELSE statement does not seem like a good idea. At the same time, I do not know how else to do this? Is there an elegant way in SQL to add a WHERE conditional if a parameter does NOT equal a value? Thank you!

    Read the article

  • I can't delete a record in CodeIgniter

    - by jomblo
    I'm learning CRUD in CodeIgniter. I have a table named "posting" and the columns are like this: (id, title, post). I succeeded in creating a new post (both inserting into the database and displaying in the view). But I have a problem when I delete a post from the front-end. Here is my code: Model Class Post_Model extends CI_Model{ function index(){ //Here is my homepage code } function delete_post($id) { $this->db->where('id', $id); $this->db->delete('posting'); } } Controller Class Post extends CI_Controller{ function delete() { $this->load->model('Post_Model'); $this->Post_Model->delete_post("id"); redirect('Post/index/', 'refresh'); } } After clicking "delete" on the homepage, nothing happens. When I look in my database, the records are still there. Note: (1) to delete the record, I'm following the CodeIgniter manual / user guide; (2) I get an error message (Undefined variable: id) after hitting the "delete" button on the front-end. Any help or suggestions, please.
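
    The "Undefined variable: id" notice is the clue: the controller's delete() takes no parameter and passes the literal string "id" to the model. A minimal sketch of the usual CodeIgniter pattern, assuming the delete link points at something like site_url('post/delete/5') so the id arrives as a URI segment:

        class Post extends CI_Controller {

            // CodeIgniter maps /post/delete/5 to delete(5)
            function delete($id)
            {
                $this->load->model('Post_Model');
                $this->Post_Model->delete_post($id); // the real id, not "id"
                redirect('Post/index/', 'refresh');
            }
        }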

    Read the article

  • Mac OS X - Time Machine backup fails verification - What can I do to save the history?

    - by usermac75
    Hi, how do I make Time Machine create a new complete backup without losing older versions of backed-up files? Verbose: I am using the Time Machine backup on OS X (Snow Leopard) to back up the whole computer to an external drive. I especially like the "history", i.e. the feature that allows you to restore an older version of a file. Problem: I had some data corruption on my external backup drive and repaired it with the system tool for doing that (Disk Utility), which found some faults. After that, the external drive was OK and I could use Time Machine again. I let Time Machine do one more backup. Then I ran a verification according to http://superuser.com/questions/47628/verifying-time-machine-backups, namely sudo diff -qr . $HOME/Desktop 2>&1 | tee $HOME/timemachine-diff.log However, after running the command above, several differences and missing files were reported, approx. 200 files in sum. While some of the missing files were caches or excluded directories, the differences do bother me, especially as some of my important documents are listed as differing. How can I make sure that the data on the external drive is synced correctly? Is it possible to have Time Machine do a complete new backup without losing the history? Or to have Time Machine compare all files for differences and re-write all files that are different? Or can I set some flag on the files that do not match to have them copied again (like the archive flag in Windows/DOS)? I'd rather not touch the files, because I would like to keep the date of last change/date of creation. Thank you for your thoughts!

    Read the article

  • Is it a problem if I query SQL Server 2005 and 2000 again and again?

    - by learner
    The Windows app I am constructing is for very low-end machines (Celeron with at most 128 MB RAM). Of the following two approaches, which one is best? (I don't want the application to become a memory hog on low-end machines.) Approach one: query the database with Select GUID from Table1 where DateTime <= @givendate, which returns more than 300 thousand records (but only one field, the GUID: 300 thousand GUIDs). Then run a loop over the GUIDs to perform the next step of the software. Approach two: query the database with Select Top 1 GUID from Table1 where DateTime <= @givendate again and again until all 300 thousand records are done. It returns only one GUID at a time, and I can then do my next step of the operation. Which approach do you suggest will use fewer memory resources? (Speed / performance is not the issue here.)

    Read the article

  • How to avoid storing a repeated field of a Symfony form several times?

    - by user454760
    Hello everybody, I am working with Symfony 1.4 and Doctrine. I have a model A with an email field. The form of A displays an input in which the user should insert the email correctly. But as everybody knows, sometimes they don't. To fix this I have inserted an extra field in the model (and in the form), called *repeat_email*, to prevent misspellings. Then, in the validation process, after validating all the fields, I use a global validator to compare the data of the two fields. This works, but I don't want the email stored twice in the database (I don't want *repeat_email*). Is there any mechanism to use it in the validation process, but not store it in the database? Thanks,
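
    One mechanism for this in Symfony 1.4, as a hedged sketch: leave repeat_email out of the model entirely and declare it only on the form, so Doctrine has nothing to persist (the form class name is an assumption):

        // Minimal sketch: the extra field lives on the form, not the model.
        class AForm extends BaseAForm
        {
            public function configure()
            {
                // Form-only field: validated but never saved, because it
                // does not correspond to a column of A.
                $this->widgetSchema['repeat_email'] = new sfWidgetFormInputText();
                $this->validatorSchema['repeat_email'] = new sfValidatorEmail();

                // Global validator comparing the two fields after
                // field-level validation has passed.
                $this->mergePostValidator(new sfValidatorSchemaCompare(
                    'email', sfValidatorSchemaCompare::EQUAL, 'repeat_email',
                    array(),
                    array('invalid' => 'The email addresses do not match.')
                ));
            }
        }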

    Read the article

  • Error comparing hash to hashed MySQL password (output values are equal)

    - by Charlie
    I'm trying to compare a hashed password value in a MySQL database with the hashed value of a password entered through a login form. However, when I compare the two values it says they aren't equal. I removed the salt to simplify, and then tested what the outputs were and got the same values: $password1 = $_POST['password']; $hash = hash('sha256', $password1); ...connect to database, etc... $query = "SELECT * FROM users WHERE username = '$username1'"; $result = mysql_query($query); $userData = mysql_fetch_array($result); if($hash != $userData['password']) //incorrect password { echo $hash."|".$userData['password']; die(); } ...other code... Sample output: 7816ee6a140526f02289471d87a7c4f9602d55c38303a0ba62dcd747a1f50361| 7816ee6a140526f02289471d87a7c4f9602d55c38303a0ba62dcd747a1f50361 Any thoughts?
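
    A hedged diagnostic sketch: two hashes that print identically but compare unequal usually differ by invisible characters, e.g. padding from a CHAR column or a stray newline, so compare the lengths first (the rtrim() fix is an assumption about the cause):

        <?php
        // Minimal diagnostic sketch: reveal invisible differences.
        $stored = $userData['password'];
        var_dump(strlen($hash), strlen($stored)); // unequal lengths = padding

        if (rtrim($stored) === $hash) {
            // Matches once trailing whitespace is removed: store the hash
            // in a VARCHAR(64) column, or TRIM() the value on the way in.
        }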

    Read the article

  • Sync Framework 2.0 smart device to server

    - by Oll
    We have a requirement very similar to that shown in the Occasionally Connected Application (OCA) diagram in the Introduction to Sync Framework Database Synchronization article. We can't, however, find any examples of how the clients at the bottom sync between each other; particularly, how a smart device syncs with another client when each has a local .sdf database. There are lots of examples of a smart device syncing to a server over WCF where the server is running full SQL Server, but not of smart device to server with a local cache. Does anyone have any ideas or examples? Thanks

    Read the article

  • PHP class crash course

    - by rabidmachine9
    Hello people, I'm sorry for this question but I'm going crazy trying to write my first PHP class. It is supposed to connect me to a server and to a database, but I'm always getting this error: Parse error: syntax error, unexpected ',', expecting T_PAAMAYIM_NEKUDOTAYIM in /Applications/XAMPP/xamppfiles/htdocs/classTest/test.php on line 9 Thanks in advance; here is the code: <?php // include 'DBConnect.php'; class DBConnect { function connection($hostname = "localhost", $myname = "root", $pass = ''){ mysql_connect(&hostname, &myname, &pass) or die("Could not connect!"); //Connect to mysql } function choose($dbnam){ mysql_select_db(&dbnam) or die("Couldn't find database!"); //fnd specific db } } $a = new DBConnect(); $connect = $a->connection("localhost", "root"); $a->choose('mydb'); ?>
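
    The parse error comes from using & where PHP expects $ in the argument lists. A corrected sketch of the same class, keeping the original mysql_* API for fidelity (it is deprecated in later PHP versions; mysqli or PDO is the modern route):

        <?php
        class DBConnect
        {
            // Arguments are variables ($hostname), not references to
            // undefined constants (&hostname).
            function connection($hostname = "localhost", $myname = "root", $pass = '')
            {
                mysql_connect($hostname, $myname, $pass)
                    or die("Could not connect!"); // connect to MySQL
            }

            function choose($dbnam)
            {
                mysql_select_db($dbnam)
                    or die("Couldn't find database!"); // select specific db
            }
        }

        $a = new DBConnect();
        $a->connection("localhost", "root");
        $a->choose('mydb');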

    Read the article

  • How to store matrix information in MySQL?

    - by dedalo
    Hi, I'm working on an application that analyzes music similarity. In order to do that I process audio data and store the results in txt files. For each audio file I create 2 files: one containing 16 values (each value can look like this: 2.7000023942731723) and the other containing 16 rows, each row containing 16 values like the one previously shown. I'd like to store the contents of these 2 files in a table of my MySQL database. My table looks like: Name varchar(100) Author varchar(100) In order to add the content of those 2 files I think I need to use the BLOB data type: file1 blob file2 blob My question is: how should I store this info in the database? I'm working with Java, where I have a double array containing the 16 values (for file1) and a matrix containing the file2 info. Should I process the values as strings and add them to the columns in my database? Thanks

    Read the article

  • GitHub: keep file but don't track changes

    - by Mike
    I have a CodeIgniter framework that's using GitHub. Within this application I have several files that I want in the repo without tracking any changes on them. Example: I deploy a new installation of this framework to a new client; I want the following files to be downloaded (they have default values, 'CHANGEME') so I just have to make changes specific to this client, i.e. database credentials, email address info, custom CSS styling. // the production config files: i want the files but they need to be updated to specific client needs application/config/production/config.php application/config/production/database.php application/config/production/tank_auth.php // index page, defines the environment (production|development) /index.php // all of the css/js cache (keep the folder but not the contents) /assets/cache/* // production user based styling (color, fonts etc), needs to be updated to specific client needs /assets/frontend/css/user/frontend-user.css Currently, if I run git clone [email protected]:user123/myRepo.git httpdocs and then edit the files above, all is great... until I release a hotfix or patch and run git pull. All of my changes are then overwritten.

    Read the article

  • Intersection() and Except() is too slow with large collections of custom objects

    - by Theo
    I am importing data from another database. My process imports data from a remote DB into a List<DataModel> named remoteData and also imports data from the local DB into a List<DataModel> named localData. I am then using LINQ to create a list of records that are different, so that I can update the local DB to match the data pulled from the remote DB. Like this: var outdatedData = this.localData.Intersect(this.remoteData, new OutdatedDataComparer()).ToList(); I am then using LINQ to create a list of records that no longer exist in remoteData, but do exist in localData, so that I can delete them from the local database. Like this: var oldData = this.localData.Except(this.remoteData, new MatchingDataComparer()).ToList(); I am then using LINQ to do the opposite of the above to add the new data to the local database. Like this: var newData = this.remoteData.Except(this.localData, new MatchingDataComparer()).ToList(); Each collection imports about 70k records, and each of the 3 LINQ operations takes between 5 and 10 minutes to complete. How can I make this faster? Here is the object the collections are using: internal class DataModel { public string Key1{ get; set; } public string Key2{ get; set; } public string Value1{ get; set; } public string Value2{ get; set; } public byte? Value3{ get; set; } } The comparer used to check for outdated records: class OutdatedDataComparer : IEqualityComparer<DataModel> { public bool Equals(DataModel x, DataModel y) { var e = string.Equals(x.Key1, y.Key1) && string.Equals(x.Key2, y.Key2) && ( !string.Equals(x.Value1, y.Value1) || !string.Equals(x.Value2, y.Value2) || x.Value3 != y.Value3 ); return e; } public int GetHashCode(DataModel obj) { return 0; } } The comparer used to find old and new records: internal class MatchingDataComparer : IEqualityComparer<DataModel> { public bool Equals(DataModel x, DataModel y) { return string.Equals(x.Key1, y.Key1) && string.Equals(x.Key2, y.Key2); } public int GetHashCode(DataModel obj) { return 0; } }

    Read the article

  • Limiting the rate of emails using Python

    - by Ali
    I have a Python script which reads email addresses from a database for a particular date (for example, today) and sends out an email message to each of them, one by one. It reads data from MySQL using the MySQLdb module, stores all results in a dictionary and sends out emails using: rows = cursor.fetchall() #All email addresses returned that are supposed to go out on today's date. for row in rows: #send email However, my hosting service only lets me send out 500 emails per hour. How can I make my script send only 500 emails in an hour, then check the database whether more emails are left for today, and send them in the next hour? The script is activated using a cron job.

    Read the article

  • Stop Entity Framework from updating the edmx model with a column that isn't needed

    - by TMan
    I have rowguid columns in all my tables to help with change tracking. I don't want/need these columns in my edmx or my entities. However, I do still sometimes need to make changes to other things, so every time I update the model from the database, the edmx adds all the rowguid columns in all my tables and I have to delete each one manually. Is there a way to keep this from happening? Is there a way I can maybe edit the T4 template to ignore that 'rowguid' column? Database-first Entity Framework.

    Read the article

  • Need advice on cron job'ing a very large process

    - by Arms
    I have a PHP script that grabs data from an external service and saves it to my database. I need this script to run once every minute for every user in the system (of which I expect thousands). My question is: what's the most efficient way to run this per user, per minute? At first I thought I would have a function that grabs all the user IDs from my database, iterates over the IDs and performs the task for each one, but I think that as the number of users grows, this will take longer and no longer fit within 1-minute intervals. Perhaps I should queue the user IDs and perform the task individually for each one? In which case, I'm actually unsure of how to proceed. Thanks in advance for any advice.
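
    A minimal sketch of one batching approach, assuming a users table with a last_synced column and a hypothetical fetch_and_save() worker: each cron run claims the most-stale users first, so the work spreads across invocations instead of looping over every user in one pass:

        <?php
        // Minimal sketch: process only the N most-stale users per run.
        $batchSize = 200; // tune so one batch finishes inside a minute

        $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

        $stmt = $pdo->prepare(
            "SELECT id FROM users
             WHERE last_synced < (NOW() - INTERVAL 1 MINUTE)
             ORDER BY last_synced ASC
             LIMIT :n"
        );
        $stmt->bindValue(':n', $batchSize, PDO::PARAM_INT);
        $stmt->execute();

        foreach ($stmt->fetchAll(PDO::FETCH_COLUMN) as $userId) {
            fetch_and_save($userId); // hypothetical: external call + save
            $pdo->prepare("UPDATE users SET last_synced = NOW() WHERE id = ?")
                ->execute(array($userId));
        }

    If one process cannot keep up, several cron entries (or a small worker pool) can run this loop concurrently, at the cost of adding a claimed-by column so two workers never grab the same user.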

    Read the article
