Search Results

Search found 33223 results on 1329 pages for 'database firewall'.

Page 681/1329 | < Previous Page | 677 678 679 680 681 682 683 684 685 686 687 688  | Next Page >

  • Need help with formulating LINQ query

    - by eponymous23
    I'm building a word anagram program that uses a database which contains one simple table:

        Words
        ---------------------
        varchar(15) alphagram
        varchar(15) anagram
        (other fields omitted for brevity)

    An alphagram is the letters of a word arranged in alphabetical order. For example, the alphagram for OVERFLOW would be EFLOORVW. Every Alphagram in my database has one or more Anagrams. Here's a sample data dump of my table:

        Alphagram   Anagram
        EINORST     NORITES
        EINORST     OESTRIN
        EINORST     ORIENTS
        EINORST     STONIER
        ADEINRT     ANTIRED
        ADEINRT     DETRAIN
        ADEINRT     TRAINED

    I'm trying to build a LINQ query that would return a list of Alphagrams along with their associated Anagrams. Is this possible?

    Read the article

  • MySQL Hibernate sort on 2 columns

    - by sammichy
    I have a table as follows:

        Table item {
            ID             - primary key
            content        - String
            published_date - when the content was published
            create_date    - when this database entry was created
        }

    Every hour (or at a specified interval) I run a process to update this table with data from different sources (websites). I want to display the results according to the following rules:

    1. The entries created by each run of the process should be grouped together. So the entries from the 2nd run will always come after the entries from the 1st run, even if the published_date of an entry from the 1st run is later than the published_date of an entry from the 2nd run.
    2. Within each run's group, the entries should be sorted by published_date.
    3. A further restriction is that I prefer data from the same source not be grouped together. If I sort by create_date, published_date I end up with all the data from source a, then all the data from source b, and so on. I prefer that the data within each hour be mixed up for better presentation.

    If I add a column to this table and store a counter which increments each time the process runs, I can create a query that sorts first by the counter and then by published_date. Is there a way to do it without adding a field? I'm using Hibernate over MySQL.

    For example:

        Hour 1 (run 1)
            4 rows collected from site a (rows 1-4)
            3 rows collected from site b (rows 5-7)
        Hour 2 (run 2)
            2 rows collected from site a (rows 8-9)
            3 rows collected from site b (rows 10-12)

    After each run, new records are added to the database from each website. The create date is the time the record was created in the database; the published date is part of the content and is read in from the external source. When the results are displayed, I would like rows to be grouped by the hour they were collected in, so rows 1-7 would be displayed before rows 8-12. Within each hourly grouping, I would like to sort the results by published date (timestamp). This is necessary so that the posts from all the sites collected in that hour are not grouped together by site but rather mixed in with each other.

    Read the article

  • Delete Entity in Many to Many Relation. Don't get error but entity is not deleted

    - by Shapper
    In EF5 I have two entities, User and Role, with a many-to-many relation between them. I don't have an entity for the UserRoles join table that defines the relation. I have a User and I want to delete a role without loading it from the database:

        Context context = new Context();
        User user = context.Users.First(x => x.Id == 4);
        user.Roles = new List<Role>();
        Role role = new Role { Id = 20 };
        context.Roles.Attach(role);
        user.Roles.Remove(role);
        context.SaveChanges();

    I don't get any error, but the role is not removed. Any idea why? Thank you, Miguel

    Read the article

  • Need help with Excel 2007 Formula - Many to many update

    - by Monica
    I'm experienced with database development, but not so much with Excel. I'm looking for help writing an Excel formula that would make my client's spreadsheet behave like a database. This is what I'm trying to do, but I can't figure out how to write it in Excel 2007:

        "If Q4 (on sheet 2) contains A2 (on sheet 1), append A1 (on sheet 1) with Q5 (on sheet 2)"

    Some factors:

    1) This formula may find multiple instances of A2, so it should not stop after finding the first match.
    2) The values, as they are created in A1, should be separated with a comma and a space.
    3) This is a many-to-many relationship between Q4 and A2.

    Thanks for any help with this. I've tried VLOOKUP, MATCH, and IF statements, but they all fall short in one way or another.

    Read the article

  • How to deploy updates to .NET website in cluster

    - by royappa
    We are operating a corporate web application on a load-balanced cluster that consists of two identical IIS servers talking to a single MSSQL database. To deploy updates I am using this primitive process:

    1) Make a copy of the entire site folder (wwwroot\inetpub\whatever) on each IIS box.
    2) Download the updated, compiled files onto each IIS box from our development area.
    3) Shut down IIS on both web servers.
    4) Copy the new and updated files into the wwwroot folder (overwriting any files with the same name).
    5) Restart IIS on both machines.

    When there are database changes involved there are a few other steps. The whole process is fairly quick, but it is ugly and fraught with danger, so it has to be done with full concentration. I would like to just push one button to make it all happen, and I want a one-click rollback in case there is a problem (that's the reason I make the copy in step 1). I am looking for tools to manage and improve this process. If it also helped us maintain a changelog, that would be nice. Thanks.

    Read the article

  • Github + keep file but don't track changes

    - by Mike
    I have a CodeIgniter application that's hosted on GitHub. Within this application I have several files that I want to keep in the repo but not track any changes on. For example: when I deploy a new installation of this framework for a new client, I want the following files to be downloaded (they ship with default values of 'CHANGEME') so that I only have to make the changes specific to that client, i.e. database credentials, email address info, custom CSS styling:

        // the production config files: I want the files, but they need to be updated to specific client needs
        application/config/production/config.php
        application/config/production/database.php
        application/config/production/tank_auth.php

        // index page, defines the environment (production|development)
        /index.php

        // all of the css/js cache (keep the folder but not the contents)
        /assets/cache/*

        // production user-based styling (color, fonts etc) needs to be updated to specific client needs
        /assets/frontend/css/user/frontend-user.css

    Currently, if I run git clone [email protected]:user123/myRepo.git httpdocs and then edit the files above, all is great... until I release a hotfix or patch and run git pull. All of my changes are then overwritten.

    Read the article

  • Creating user generated reports

    - by Marquinio
    Hello everyone, I need to let users create custom reports. These users have no technical skills such as SQL. We currently have a custom database report design, so basically, whatever the user does in the GUI, the application has to generate the appropriate SQL to build the report. Has anyone done this before? I know there are reporting solutions out there, but we already have our own database tables for reporting, and we already have a section where users can view reports rendered as HTML. For example, if a user selects the "UserID" and "Accounts" fields in the GUI, how would I know that my SQL has to join the USER and ACCOUNTS tables? I guess I'm just looking for some ideas to help me solve this problem. Thanks.
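
    One way to approach the field-to-join problem described above is a whitelist that maps each GUI field to its column and the joins it requires. The sketch below illustrates the idea in PHP with hypothetical table and column names (users, accounts), not the asker's actual schema.

        <?php
        // Minimal sketch: map GUI field names to columns and the joins they require.
        // Table and column names here are hypothetical, not the asker's schema.
        $fieldMap = [
            'UserID'   => ['column' => 'u.user_id',      'joins' => []],
            'UserName' => ['column' => 'u.user_name',    'joins' => []],
            'Accounts' => ['column' => 'a.account_name', 'joins' => ['JOIN accounts a ON a.user_id = u.user_id']],
        ];

        function buildReportSql(array $selectedFields, array $fieldMap)
        {
            $columns = [];
            $joins   = [];
            foreach ($selectedFields as $field) {
                if (!isset($fieldMap[$field])) {
                    continue; // ignore anything not in the whitelist
                }
                $columns[] = $fieldMap[$field]['column'];
                foreach ($fieldMap[$field]['joins'] as $join) {
                    $joins[$join] = true; // de-duplicate joins
                }
            }
            return 'SELECT ' . implode(', ', $columns)
                 . ' FROM users u '
                 . implode(' ', array_keys($joins));
        }

        echo buildReportSql(['UserID', 'Accounts'], $fieldMap);
        // SELECT u.user_id, a.account_name FROM users u JOIN accounts a ON a.user_id = u.user_id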

    Read the article

  • PHP+MYSQL Server Config

    - by Matias
    Hi guys, I am parsing an XML file with PHP and inserting the rows into a MySQL database. I am using simplexml_load_file to load the XML and a foreach to loop through the result and insert the rows into my database. It works perfectly fine with the small files I am testing, but in reality I need to parse a large 500 MB XML file, and then nothing happens. I was wondering what the right php.ini configuration is for this case. I have a CentOS Linux VPS with 256 MB of dedicated memory and MySQL 5.0.5. I have also set memory_limit = 256M in php.ini (the maximum for my server). Any suggestions or similar experiences will be greatly appreciated. Thanks.
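
    For a file that large, simplexml_load_file has to hold the whole document in memory, so raising memory_limit alone may not be enough on a 256 MB VPS. A common alternative is to stream the file with PHP's XMLReader and handle one record at a time; the sketch below assumes a hypothetical <item> element, child fields, and table name.

        <?php
        // Sketch: stream a large XML file instead of loading it all at once.
        // The <item> element, its children and the `items` table are hypothetical.
        $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $stmt = $pdo->prepare('INSERT INTO items (title, price) VALUES (?, ?)');

        $reader = new XMLReader();
        $reader->open('bigfile.xml');

        while ($reader->read()) {
            if ($reader->nodeType === XMLReader::ELEMENT && $reader->name === 'item') {
                // Parse only the current <item> subtree into a small SimpleXML object.
                $item = new SimpleXMLElement($reader->readOuterXml());
                $stmt->execute(array((string) $item->title, (string) $item->price));
            }
        }
        $reader->close();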

    Read the article

  • Using a large list of terms, search through page text and replace words with links

    - by dunc
    A while ago I posted this question asking if it's possible to convert text to HTML links if it matches a list of terms from my database. I have a fairly huge list of terms - around 6,000. The accepted answer on that question was superb, but having never used XPath, I was at a loss when problems started occurring. At one point, after fiddling with the code, I somehow managed to add over 40,000 random characters to our database - the majority of which required manual removal. Since then I've lost faith in that idea, and the simpler PHP solutions simply weren't efficient enough to deal with the amount of data and the number of terms.

    My next attempt at a solution is to write a JS script which, once the page has loaded, retrieves the terms and matches them against the text on the page. This answer has an idea which I'd like to attempt. I would use AJAX to retrieve the terms from the database, to build an object such as this:

        var words = [
            { word: 'Something', link: 'http://www.something.com' },
            { word: 'Something Else', link: 'http://www.something.com/else' }
        ];

    When the object has been built, I'd use this kind of code:

        // for each array element
        $.each(words, function() {
            // store it ("this" is gonna become the dom element in the next function)
            var search = this;
            $('.message').each(function() {
                // if it's exactly the same
                if ($(this).text() === search.word) {
                    // do your magic tricks
                    $(this).html('<a href="' + search.link + '">' + search.link + '</a>');
                }
            });
        });

    Now, at first sight, there is a major issue here: with 6,000 terms, will this code be in any way efficient enough to do what I'm trying to do? One option would be to perform some of the overhead within the PHP script that the AJAX call talks to. For instance, I could send the ID of the post, and the PHP script could use SQL statements to retrieve all of the information from the post and match it against all 6,000 terms; the return call to the JavaScript could then simply be the matching terms, which would significantly reduce the number of matches the above jQuery has to make (around 50 at most). I have no problem with the script taking a few seconds to "load" in the user's browser, as long as it isn't impacting their CPU usage or anything like that.

    So, two questions in one: Can I make this work? What steps can I take to make it as efficient as possible? Thanks in advance.
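
    A rough sketch of the server side described in the last paragraph - given a post ID, return only the terms that actually occur in that post's text - assuming hypothetical posts and terms tables. The JSON it returns is what the jQuery above would then loop over.

        <?php
        // Sketch: return only the terms that occur in one post's text.
        // Table and column names (posts, terms, body, word, link) are hypothetical.
        $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

        $postId = isset($_GET['post_id']) ? (int) $_GET['post_id'] : 0;
        $text   = $pdo->query('SELECT body FROM posts WHERE id = ' . $postId)->fetchColumn();

        $matches = [];
        foreach ($pdo->query('SELECT word, link FROM terms') as $term) {
            // Case-insensitive substring check; ~6,000 iterations is cheap server-side.
            if ($text !== false && stripos($text, $term['word']) !== false) {
                $matches[] = ['word' => $term['word'], 'link' => $term['link']];
            }
        }

        header('Content-Type: application/json');
        echo json_encode($matches);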

    Read the article

  • Open Source PHP search engine

    - by Ravi Gupta
    I am looking for an open source search engine plugin written in PHP for my website (eCommerce). Before anybody answers, I have a doubt regarding the search engine. Usually a search engine crawls web pages, creates indexes, and then uses them when looking for content. But will the same model work for eCommerce websites? Yes, it can crawl product pages and index them, but don't you think it would be better if it indexed the products stored in the database directly? Then, when a user searches for a product, it would simply return the rows of the table that match the user's query. Maybe what I am asking is a stupid question, but I am new to web development, so kindly help me understand the concept. I have looked at a search engine called Sphider but didn't understand what I have to do to make it work with an eCommerce website.
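
    For the "search the database directly" idea, here is a minimal sketch of querying a product table with a MySQL full-text index. The products table, its columns, and the index are assumptions for illustration, not Sphider's API.

        <?php
        // Sketch: query the product table directly instead of crawling pages.
        // Assumes a hypothetical `products` table with a full-text index, e.g.:
        //   ALTER TABLE products ADD FULLTEXT (name, description);
        // (FULLTEXT needs MyISAM, or InnoDB on MySQL 5.6+.)
        $pdo = new PDO('mysql:host=localhost;dbname=shop', 'user', 'pass');

        $q = isset($_GET['q']) ? $_GET['q'] : '';

        $stmt = $pdo->prepare(
            'SELECT id, name, price,
                    MATCH(name, description) AGAINST (:q1) AS score
               FROM products
              WHERE MATCH(name, description) AGAINST (:q2)
              ORDER BY score DESC
              LIMIT 20'
        );
        $stmt->execute(['q1' => $q, 'q2' => $q]);

        foreach ($stmt as $row) {
            echo htmlspecialchars($row['name']) . ' - ' . $row['price'] . "<br>\n";
        }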

    Read the article

  • Entity framework Update fails when object is linked to a missing child

    - by McKay
    I'm having trouble updating an object's child when the object has a reference to a nonexistent child record. For example, tables Car and CarColor have a relationship through Car.CarColorId and CarColor.CarColorId. If I load the car with its color record like this:

        var result = from x in database.Car.Include("CarColor")
                     where x.CarId == 5
                     select x;

    I'll get back the Car object and its Color object. Now suppose that some time ago a CarColor had been deleted, but the Car record in question still contains the CarColorId value. So when I run the query, the Color object is null because the CarColor record didn't exist. My problem is that when I attach another Color object that does exist, I get a "Store update, insert" error when saving:

        Car.Color = newColor;
        Database.SaveChanges();

    It's like the context is trying to delete the nonexistent color. How can I get around this?

    Read the article

  • I can't delete record in Codeigniter

    - by jomblo
    I'm learning CRUD in CodeIgniter. I have a table named "posting" and its columns are (id, title, post). I managed to create a new post (both inserting into the database and displaying it in the view), but I have a problem when I delete a post from the front-end. Here is my code:

    Model:

        class Post_Model extends CI_Model {

            function index() {
                // Here is my homepage code
            }

            function delete_post($id) {
                $this->db->where('id', $id);
                $this->db->delete('posting');
            }
        }

    Controller:

        class Post extends CI_Controller {

            function delete() {
                $this->load->model('Post_Model');
                $this->Post_Model->delete_post("id");
                redirect('Post/index/', 'refresh');
            }
        }

    After clicking "delete" on the homepage, nothing happens; when I look in my database, the record is still there. Notes: (1) to delete the record, I'm following the CodeIgniter manual / user guide; (2) I get an error message (Undefined variable: id) after hitting the "delete" button on the front-end. Any help or suggestions, please.
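
    For reference, a sketch of the usual CodeIgniter convention: the URI segment after the method name (e.g. /post/delete/5) is passed into the controller method as a parameter, which is then handed to the model. This is a sketch of that convention, not the asker's exact code.

        // Sketch: the URL /post/delete/5 passes 5 into the method.
        class Post extends CI_Controller {

            function delete($id)
            {
                $this->load->model('Post_Model');
                $this->Post_Model->delete_post($id);   // the real id, not the literal string "id"
                redirect('Post/index/', 'refresh');
            }
        }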

    Read the article

  • SQL To Filter A SubSet Of Data

    - by Nick LaMarca
    I have an arraylist that holds a subset of the names found in my database. I need to write a query to get a count of the people in the arraylist broken down by certain fields. For example, there is a field "City" in my database; for the people in the arraylist of names, I want to know how many of them live in Chicago, how many live in New York, etc. Can someone help me with how I might set up a SQL statement to handle this? I think I somehow have to pass the subset of names to SQL.
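
    The usual pattern is to pass the names as a parameterized IN list and let the database do the counting with GROUP BY. Sketched below in PHP/PDO purely for illustration (the same SQL works from any client language); the people table and its columns are hypothetical.

        <?php
        // Sketch: count people per city for a given subset of names.
        // `people`, `name` and `city` are hypothetical names.
        $names = ['Alice Smith', 'Bob Jones', 'Carol White'];

        $pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

        $placeholders = implode(',', array_fill(0, count($names), '?'));
        $stmt = $pdo->prepare(
            "SELECT city, COUNT(*) AS people
               FROM people
              WHERE name IN ($placeholders)
              GROUP BY city"
        );
        $stmt->execute($names);

        foreach ($stmt as $row) {
            echo $row['city'] . ': ' . $row['people'] . "\n";
        }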

    Read the article

  • How to avoid storing a repeated field of a Symfony form several times?

    - by user454760
    Hello everybody, I am working with Symfony 1.4 and Doctrine. I have a model A with an email field. The form for A displays an input in which the user should enter the email correctly, but as everybody knows, sometimes they don't. To fix this I have added an extra field to the model (and to the form), called repeat_email, to catch misspellings. Then, in the validation process, after validating all the fields, I use a global validator to compare the data of the two fields. This works, but I don't want the email stored twice in the database (I don't want repeat_email persisted). Is there any mechanism to use it in the validation process but not store it in the database? Thanks.
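
    One common Symfony 1.4 approach, sketched from memory (treat the exact class names as an assumption): keep repeat_email out of the schema entirely and declare it only in the form class as an extra widget and validator, with a post-validator comparing the two fields. Since it is not a column, nothing extra gets persisted.

        // Sketch (Symfony 1.4 form class): repeat_email lives only in the form,
        // never in the schema, so it is never saved to the database.
        class AForm extends BaseAForm
        {
            public function configure()
            {
                $this->widgetSchema['repeat_email']    = new sfWidgetFormInputText();
                $this->validatorSchema['repeat_email'] = new sfValidatorEmail();

                $this->validatorSchema->setPostValidator(
                    new sfValidatorSchemaCompare(
                        'email', sfValidatorSchemaCompare::EQUAL, 'repeat_email',
                        array(), array('invalid' => 'The two email addresses do not match.')
                    )
                );
            }
        }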

    Read the article

  • php class crash course

    - by rabidmachine9
    Hello people, I'm sorry for this question, but I'm going crazy trying to write my first PHP class. It is supposed to connect to a server and to a database, but I'm always getting this error:

        Parse error: syntax error, unexpected ',', expecting T_PAAMAYIM_NEKUDOTAYIM in /Applications/XAMPP/xamppfiles/htdocs/classTest/test.php on line 9

    Thanks in advance. Here is the code:

        <?php
        // include 'DBConnect.php';
        class DBConnect {

            function connection($hostname = "localhost", $myname = "root", $pass = ''){
                mysql_connect(&hostname, &myname, &pass) or die("Could not connect!"); //Connect to mysql
            }

            function choose($dbnam){
                mysql_select_db(&dbnam) or die("Couldn't find database!"); //fnd specific db
            }
        }

        $a = new DBConnect();
        $connect = $a->connection("localhost", "root");
        $a->choose('mydb');
        ?>
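
    The parse error points at the &hostname-style arguments: those should be ordinary $variables. For comparison, a minimal working sketch of the same idea using $variables and the mysqli extension (connection details are placeholders):

        <?php
        // Minimal working sketch of the same idea, using $variables and mysqli.
        class DBConnect
        {
            private $link;

            public function connection($hostname = 'localhost', $username = 'root', $pass = '')
            {
                $this->link = mysqli_connect($hostname, $username, $pass)
                    or die('Could not connect!');
            }

            public function choose($dbname)
            {
                mysqli_select_db($this->link, $dbname)
                    or die("Couldn't find database!");
            }
        }

        $a = new DBConnect();
        $a->connection('localhost', 'root');
        $a->choose('mydb');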

    Read the article

  • How do I split the output from mysqldump into smaller files?

    - by lindelof
    I need to move entire tables from one MySQL database to another. I don't have full access to the second one, only phpMyAdmin access. I can only upload (compressed) SQL files smaller than 2 MB, but the compressed output from a mysqldump of the first database's tables is larger than 10 MB. Is there a way to split the output from mysqldump into smaller files? I cannot use split(1) since I cannot cat(1) the files back on the remote server. Or is there another solution I have missed?

    Edit: The --extended-insert=FALSE option to mysqldump suggested by the first poster yields a .sql file that can then be split into importable files, provided that split(1) is called with a suitable --lines option. By trial and error I found that bzip2 compresses the .sql files by a factor of 20, so I needed to figure out how many lines of SQL code correspond roughly to 40 MB.
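
    If only PHP happens to be available on the machine doing the splitting, a small script can stand in for split(1): write a new chunk file every N lines. A rough sketch (the file names and chunk size are arbitrary):

        <?php
        // Rough sketch: split a large .sql dump into chunks of $linesPerChunk lines.
        // Chunk boundaries fall between whole lines, so pair this with
        // --extended-insert=FALSE so every INSERT is a complete line.
        $linesPerChunk = 50000;   // tune so each compressed chunk stays under 2 MB
        $in   = fopen('dump.sql', 'r');
        $part = 0;
        $line = 0;
        $out  = null;

        while (($row = fgets($in)) !== false) {
            if ($line % $linesPerChunk === 0) {
                if ($out) {
                    fclose($out);
                }
                $out = fopen(sprintf('dump.part%03d.sql', ++$part), 'w');
            }
            fwrite($out, $row);
            $line++;
        }
        if ($out) {
            fclose($out);
        }
        fclose($in);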

    Read the article

  • How do you display a phpMyAdmin table in a web browser/web page?

    - by user1390754
    Just a simple question; I've been searching around and for some reason I cannot find an answer to this. After creating a database/table in phpMyAdmin using XAMPP, what command do I need to put into my web page (PHP) to show the table I made? I know the first step involves connecting to the database, and I think I've done that properly. This is the code I found somewhere about connecting (w3 tutorials, I believe):

        $con = mysql_connect("localhost", "xxxxxx", "");
        if (!$con) {
            die('Could not connect: ' . mysql_error());
        }
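
    Once connected, the usual pattern is: select a database, run a query, and loop over the result rows, emitting HTML table rows. A minimal sketch using mysqli (the database and table names are placeholders):

        <?php
        // Sketch: fetch a table and print it as HTML. `my_database` and `my_table` are placeholders.
        $con = mysqli_connect('localhost', 'xxxxxx', '', 'my_database');
        if (!$con) {
            die('Could not connect: ' . mysqli_connect_error());
        }

        $result = mysqli_query($con, 'SELECT * FROM my_table');

        echo '<table border="1">';
        // Header row from the column names.
        echo '<tr>';
        foreach (mysqli_fetch_fields($result) as $field) {
            echo '<th>' . htmlspecialchars($field->name) . '</th>';
        }
        echo '</tr>';
        // One row per record.
        while ($row = mysqli_fetch_assoc($result)) {
            echo '<tr>';
            foreach ($row as $value) {
                echo '<td>' . htmlspecialchars($value) . '</td>';
            }
            echo '</tr>';
        }
        echo '</table>';

        mysqli_close($con);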

    Read the article

  • Error comparing hash to hashed mysql password (output values are equal)

    - by Charlie
    I'm trying to compare a hashed password value in a MySQL database with the hashed value of a password entered in a login form. However, when I compare the two values, it says they aren't equal. I removed the salt to simplify, then tested what the outputs were and got the same values:

        $password1 = $_POST['password'];
        $hash = hash('sha256', $password1);
        // ...connect to database, etc...
        $query = "SELECT * FROM users WHERE username = '$username1'";
        $result = mysql_query($query);
        $userData = mysql_fetch_array($result);
        if ($hash != $userData['password']) // incorrect password
        {
            echo $hash . "|" . $userData['password'];
            die();
        }
        // ...other code...

    Sample output:

        7816ee6a140526f02289471d87a7c4f9602d55c38303a0ba62dcd747a1f50361|
        7816ee6a140526f02289471d87a7c4f9602d55c38303a0ba62dcd747a1f50361

    Any thoughts?
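
    Two things worth checking when visually identical hashes fail a comparison: that the password column is wide enough for a 64-character SHA-256 hex digest, and that neither value carries stray whitespace. A small self-contained sketch of the second case (the trailing newline here is a hypothetical culprit, not something confirmed from the question):

        <?php
        // Sketch: why two hashes that print identically can still fail a comparison.
        // A SHA-256 hex digest is 64 characters, so the column must be at least VARCHAR(64),
        // and stray whitespace around the stored value breaks the check.
        $submitted = hash('sha256', 'secret');
        $stored    = $submitted . "\n";           // e.g. a trailing newline picked up on insert

        var_dump($stored === $submitted);         // bool(false) - yet both echo identically
        var_dump(trim($stored) === $submitted);   // bool(true)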

    Read the article

  • SQL Server 2008 - Conditional Query

    - by Villager
    Hello, SQL is not one of my strong suits. I have a SQL Server 2008 database. This database has a stored procedure that takes in eight int parameters. For the sake of keeping this question focused, I will use one of these parameters for reference: @isActive int Each of these int parameters will be -1, 0, or 1. -1 means "Unknown" or "Don't Care". Basically, I need to query a table such that if the int parameter is NOT -1, I need to consider it in my WHERE clause. Because there are eight int parameters, an IF-ELSE statement does not seem like a good idea. At the same time, I do not know how else to do this? Is there an elegant way in SQL to add a WHERE conditional if a parameter does NOT equal a value? Thank you!

    Read the article

  • Parsing a string to date gives 01/01/0001 00:00:00

    - by kawtousse
        String dateimput = request.getParameter("datepicker");
        System.out.println("datepicker:" + dateimput);
        DateFormat df = new SimpleDateFormat("MM/dd/yyyy");
        Date dt = null;
        try {
            dt = df.parse(dateimput);
            System.out.println("date imput is:" + dt);
        } catch (ParseException e) {
            e.printStackTrace();
        }

    Notes:

    * datepicker: 04/29/2010 (the value I currently selected from the datepicker).
    * The field in the database is of type date.
    * The output is "date imput is: Thu Apr 29 00:00:00 CEST 2010", yet at the database level it is inserted as 01/01/0001 00:00:00.

    Read the article

  • limiting the rate of emails using python

    - by Ali
    I have a Python script which reads email addresses from a database for a particular date (for example, today) and sends an email message to them one by one. It reads data from MySQL using the MySQLdb module, stores all results in a dictionary, and sends out emails using:

        rows = cursor.fetchall()  # all email addresses that are supposed to go out on today's date
        for row in rows:
            # send email

    However, my hosting service only lets me send out 500 emails per hour. How can I make my script send only 500 emails in an hour, then check the database whether more emails are left for today and send them in the next hour? The script is activated by a cron job.

    Read the article

  • iPhone SDK: Data Synchronization

    - by buzzappsoftware
    I am looking for an overview of data synchronization techniques available on the iPhone platform. We need the ability to be able to sync a subset of content from a server to a local database residing on the iPhone. On other projects I have worked on, the data synchronization was handled by the database. Is that available in SQLite? If not, any suggestions on techniques? Rolling our own would not be my first choice. Thanks in advance.

    Read the article

  • Sync framework 2.0 smart device to server

    - by Oll
    We have a requirement very similar to that shown in the Occasionally Connected Application (OCA) diagram in the Introduction to Sync Framework Database Synchronization article. We can't, however, find any examples of how the clients at the bottom sync with each other - in particular, how a smart device syncs with another client when each has a local .sdf database. There are lots of examples of a smart device syncing to a server over WCF where the server is running full SQL Server, but none of a smart device syncing to a server while keeping a local cache. Does anyone have any ideas or examples? Thanks.

    Read the article

  • Is it a problem if I query again and again to SQL Server 2005 and 2000?

    - by learner
    The Windows app I am constructing is for very low-end machines (Celeron with at most 128 MB RAM). Which of the following two approaches is best? (I don't want the application to become a memory hog on low-end machines.)

    Approach one: query the database with

        SELECT GUID FROM Table1 WHERE DateTime <= @givendate

    which returns more than 300 thousand records (but only one field, i.e. GUID - 300 thousand GUIDs), then run a loop over the GUIDs to carry out the next step of the software.

    Approach two: query the database with

        SELECT TOP 1 GUID FROM Table1 WHERE DateTime <= @givendate

    again and again until all 300 thousand records are done. It returns only one GUID at a time, and I can do my next step of the operation on each.

    Which approach do you suggest will use the least memory? (Speed / performance is not the issue here.)

    Read the article
