Search Results

Search found 40567 results on 1623 pages for 'database performance'.

Page 222/1623 | < Previous Page | 218 219 220 221 222 223 224 225 226 227 228 229  | Next Page >

  • secure data transport between web server and database server

    - by atypicalgeek
    I asked this question on Stack Overflow and it was suggested to try here, so here goes... I'm planning on provisioning a web server and a database server in a server farm environment. They will be in the same network but not in the same domain, both running Windows Server 2008, and the database server is SQL Server 2008. My question is: what is the best way to secure data in transit between the servers? I've looked into IPsec and SSL, but I'm not sure how to go about implementing either.
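
    For the SSL route specifically, here is a minimal client-side sketch (an illustration only; the SQL Server machine still needs a server certificate configured, and the connection string values are hypothetical). SQL Server connection encryption can be requested from the ADO.NET client:

      using System.Data.SqlClient;

      // Encrypt=True requests SSL/TLS on the wire; TrustServerCertificate=False
      // means the certificate installed on the SQL Server box must be trusted
      // by the web server, which matters here since the machines share no domain.
      var connStr = "Data Source=dbserver;Initial Catalog=MyDb;" +
                    "User ID=appUser;Password=...;" +
                    "Encrypt=True;TrustServerCertificate=False";

      using (var conn = new SqlConnection(connStr))
      {
          conn.Open();   // fails if an encrypted channel cannot be negotiated
      }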

    Read the article

  • Performance of fopen vs stat

    - by Alex Marshall
    Hello, I'm writing several C programs for an embedded system where every bit of performance we can squeeze out will matter. Part of that is accessing log files. When determining whether a file exists, is there any performance difference between using open/fopen and stat? I've been using stat on the assumption that it only has to do a quick check against the file system, whereas fopen would have to actually gain access to the file and manipulate internal data structures before returning. Is there any merit to this?

    Read the article

  • Grails benchmarks compared to other web MVC platform (Rails, Django, ASP MVC)?

    - by fabien7474
    I have been searching the web for recent benchmarks measuring Grails' overall performance compared to its competitors (Rails, Django, ASP.NET MVC...), but I didn't find anything more recent than a three-year-old article based on an obsolete Grails version (0.5). See here and here. So, starting from Grails 1.2, are there any more recent Grails benchmarks you are aware of? Or do you have your own performance tests for Grails (compared to others if possible)?

    Read the article

  • Database Design: A proper table design for large number of column values.

    - by Jake
    I wish to perform an experiment many different times. After every trial, I am left with a "large" set of output statistics -- let's say, 1,000. I would like to store the outputs of my experiments in a table, but what's the best way?

    Option 1: Have a table with 1,000 columns. This seems like a bad idea. What if the number of statistics one day exceeds the maximum number of columns?

    Option 2: Have a table with three columns, say ID, StatisticType, and StatisticValue. That way, you can have as many statistics as you want. However, reading a single experiment's statistics becomes more complicated. Moreover, what if different statistics have different data types?

    Any suggestions?

    Read the article

  • Dump Microsoft SQL Server database to an SQL script

    - by Matt Sheppard
    Is there any way to export a Microsoft SQL Server database to an SQL script? I'm looking for something which behaves similarly to mysqldump: taking a database name and producing a single script which will recreate all the tables, stored procedures, reinsert all the data, etc. I've seen http://vyaskn.tripod.com/code.htm#inserts, but ideally I want something that recreates everything (not just the data) and works in a single step to produce the final script.
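
    One programmatic route worth mentioning (an assumption on my part, not the only option) is SQL Server Management Objects: SMO's Scripter can emit both schema and INSERT statements in one pass. A rough sketch with a hypothetical server and database name:

      using System;
      using Microsoft.SqlServer.Management.Common;
      using Microsoft.SqlServer.Management.Smo;

      class ScriptDatabase
      {
          static void Main()
          {
              var server = new Server(new ServerConnection("localhost"));
              var db = server.Databases["MyDb"];        // hypothetical database name

              var scripter = new Scripter(server);
              scripter.Options.ScriptSchema = true;     // CREATE TABLE etc.
              scripter.Options.ScriptData = true;       // INSERT statements
              scripter.Options.ScriptDrops = false;

              // Tables only here; stored procedures would need a similar loop
              // over db.StoredProcedures.
              foreach (Table table in db.Tables)
              {
                  // EnumScript (rather than Script) is needed when ScriptData is set.
                  foreach (string line in scripter.EnumScript(new SqlSmoObject[] { table }))
                      Console.WriteLine(line);
              }
          }
      }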

    Read the article

  • ADO.NET Multiple simultaneous reads from an open database.

    - by Deverill
    Answer not needed - my logic was wrong and this question is invalid. Charles helped me see where I went off track. Thanks!

    I have a utility that moves data from one source to another. In the process of writing a record I check to see if it exists and do an update/insert as necessary. The difficulty I have is that as I'm writing the main record info, there is a second table for "custom data" that I have to check and update/insert as well. Example: I may be loading a pencil sharpener that may or may not exist. While I'm writing it into the destination, it has characteristics such as style, color, etc., and each of them may or may not exist. As written, I seem to need two DataReaders open: one for the sharpener and one to check for and update the color. I am new to ADO.NET but not to programming, and it's more complicated than I listed, but for sanity's sake I didn't put in all the details. My question is: what am I missing? You can't have two readers open at the same time on a connection, yet I can't close the first if I'm stepping through all the products. It seems inefficient to have two connections, readers, etc. for this. Is there a feature of ADO.NET databases that I'm missing? How would you do it? Thanks!
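
    For SQL Server 2005 and later, one option relevant here is Multiple Active Result Sets (MARS), which allows more than one open reader on a single connection; whether it fits this particular utility is an assumption, and the table and column names below are hypothetical. A minimal sketch:

      using System.Data.SqlClient;

      class MarsSketch
      {
          static void Main()
          {
              // MARS is enabled in the connection string; without it, opening a second
              // reader on the same connection throws an InvalidOperationException.
              var connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True;" +
                            "MultipleActiveResultSets=True";

              using (var conn = new SqlConnection(connStr))
              {
                  conn.Open();

                  var productCmd = new SqlCommand("SELECT Id, Name FROM Product", conn);
                  using (var products = productCmd.ExecuteReader())
                  {
                      while (products.Read())
                      {
                          // Second command and reader on the same open connection.
                          var attrCmd = new SqlCommand(
                              "SELECT Color FROM ProductAttribute WHERE ProductId = @id", conn);
                          attrCmd.Parameters.AddWithValue("@id", products.GetInt32(0));

                          using (var attrs = attrCmd.ExecuteReader())
                          {
                              // ... check what exists and decide insert vs. update ...
                          }
                      }
                  }
              }
          }
      }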

    Read the article

  • Linq to Entities performance within ASP.NET Development Server

    - by tster
    I've been evaluating LINQ to Entities and LINQ to SQL for a project. Obviously each has its own advantages and disadvantages, which have been discussed plenty of times here. However, one thing I am seeing with L2E is kind of odd. Using L2S with the ASP.NET Development Server, the performance is a little slower for my web service calls: I'm looking at 300 ms vs. 250 ms. However, when using L2E with the ASP.NET Dev Server, the performance is awful. I'm talking 1,250 ms vs. 220 ms. I know I should probably just use local IIS for development, but I'm curious if anyone else has seen this, or knows what is causing it.
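
    One factor worth ruling out (an assumption, not a confirmed cause) is per-request LINQ to Entities query compilation, which can dominate timings in a development environment; precompiling the query moves that cost to the first call. A sketch with hypothetical context and entity names:

      using System;
      using System.Data.Objects;   // ObjectContext and CompiledQuery (EF 1/4)
      using System.Linq;

      static class Queries
      {
          // Compiled once and reused on every call; MyEntities and Order are
          // hypothetical names standing in for the real model.
          public static readonly Func<MyEntities, int, IQueryable<Order>> OrdersByCustomer =
              CompiledQuery.Compile((MyEntities ctx, int customerId) =>
                  ctx.Orders.Where(o => o.CustomerId == customerId));
      }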

    Read the article

  • Duplicate partitioning key performance impact

    - by Anshul
    I've read in some posts that having a duplicate partitioning key can have a performance impact. I have two tables like:

    CREATE TABLE "Test1" (
        key text,
        column1 text,
        value text,
        PRIMARY KEY (key, column1)
    )

    CREATE TABLE "Test2" (
        key text,
        name text,
        age text,
        PRIMARY KEY (key, name, age)
    )

    In Test1, column1 will contain a column name and value will contain its corresponding value. The main advantage of Test1 is that I can add any number of column/value pairs to it without altering the table, by just providing the same partitioning key each time. Now my question is: how will each of these table schemas impact read/write performance if I have millions of rows and the number of columns can be up to 50 in each row? How will it impact compaction/repair time if I'm writing duplicate entries frequently?

    Read the article

  • Is there something like "New Relic" for Perl apps?

    - by Cninroh
    We have successfully migrated all of our PHP and Ruby apps to use New Relic RPM, both for application performance measurements and server monitoring. We are very pleased with the results, which have enabled us to improve the overall performance of the platform several times. We still have a lot of Perl applications which we need to support for legacy purposes, but in comparison to our New Relic-powered apps we are completely blind to what's happening inside those apps, especially in peak hours. Is there something like "New Relic" for Perl apps?

    Read the article

  • XSD generation from a MS SQL database using schemas

    - by madprog
    I want to use NDbUnit on an MS SQL database which uses schemas. I have to generate the XSD schema from the database. Visual Studio has a tool to do that, but Visual Studio 2005 doesn't include the schema information in the generated XSD. Therefore, NDbUnit fails because the generated SQL queries do not match the database. Worse, when I try to use Proteus, the XSD schema doesn't validate against the database, and Proteus fails with a warning saying that data could be lost if this check were skipped. So, my question is: is there any tool that would generate my XSD schema properly from the database information?
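
    As one possible workaround (an assumption on my part, not something the NDbUnit documentation prescribes), the ADO.NET provider itself can emit an XSD that keeps the schema-qualified table names. A minimal sketch with a hypothetical connection string and table list:

      using System.Data;
      using System.Data.SqlClient;

      class XsdFromDatabase
      {
          static void Main()
          {
              var connStr = "Data Source=.;Initial Catalog=MyDb;Integrated Security=True";
              var tables = new[] { "Sales.Order", "Sales.OrderLine" };   // schema-qualified names

              var ds = new DataSet("MyDb");
              using (var conn = new SqlConnection(connStr))
              {
                  conn.Open();
                  foreach (var table in tables)
                  {
                      var adapter = new SqlDataAdapter("SELECT * FROM " + table, conn);
                      // FillSchema copies column types, keys and constraints, but no rows.
                      adapter.FillSchema(ds, SchemaType.Mapped, table);
                  }
              }

              ds.WriteXmlSchema("MyDb.xsd");
          }
      }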

    Read the article

  • measuring performance - using real clicks vs "ab" command

    - by shanyu
    I have a web site in closed beta, developed in Django, running with MySQL on Debian. In the last few days, the main page has been showing a slowdown. For every ten clicks, one or two receive an extremely slow response (10 seconds or more); the others are as fast as they used to be. While searching for the problem, I ran into an issue I couldn't grasp: the top command shows that when I request the main page, mysql shoots up to 90-100% CPU usage, and I get the page just as the CPU usage returns to normal. So, I thought, it is the db. Then I ran ab with the parameters -n 1000 -c 5 and got decent performance, about 100 pages per second, just as it was before the slowdown. I would have expected worse performance, since 10-20% of real requests take 10 seconds to load. Is this discrepancy between ab and "real" clicks normal, or am I using ab with a wrong configuration?

    Read the article

  • C# ASP.NET update SQL database with values from text boxes

    - by Sir Graystar
    Here's what I have. The user enters values into text boxes (personal information etc.) and then presses a save changes button. The values in these text boxes get stored in an SQL database. The problem I have is that when updating the database using the values from the text boxes, the page refreshes and the values in the text boxes are lost (or rather they return to the values that are already in the database, as the data from the database is loaded into the text boxes on Page_Load). When I update the database using values stored in variables it all works fine. What is the best way to update with the values from the text boxes?
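
    A common cause of this symptom is rebinding the text boxes on every request, including the postback that carries the edited values; guarding the initial load with IsPostBack is the usual fix. A minimal code-behind sketch (the control, column, and helper names are hypothetical):

      using System;
      using System.Data.SqlClient;

      // In the page's code-behind class:
      protected void Page_Load(object sender, EventArgs e)
      {
          // Only populate from the database on the first request; on postbacks
          // the text boxes keep the values the user just typed.
          if (!IsPostBack)
          {
              LoadUserIntoTextBoxes();
          }
      }

      protected void SaveChanges_Click(object sender, EventArgs e)
      {
          using (var conn = new SqlConnection(connStr))
          using (var cmd = new SqlCommand(
              "UPDATE Users SET FirstName = @first, Email = @mail WHERE UID = @id", conn))
          {
              cmd.Parameters.AddWithValue("@first", txtFirstName.Text);
              cmd.Parameters.AddWithValue("@mail", txtEmail.Text);
              cmd.Parameters.AddWithValue("@id", currentUserId);
              conn.Open();
              cmd.ExecuteNonQuery();
          }
      }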

    Read the article

  • Do i need to insert one fake row in database ?

    - by Ankit Rathod
    Hello, I have a few tables, for example:

    Users: UID, UName, Password, Email
    Books: BookId, Name, Price
    UsersBookPurchase: UserId, BookId

    This is fine. I have my own login system, but I am also using a third party to validate, like OpenID or Facebook authentication. My question is: if the user is able to log in successfully using OpenID or Facebook authentication, what steps do I need to take? Do I have to insert one fake row in the Users table? Because if I do not insert one, how will integrity be maintained? I mean, what UserId should I insert into UsersBookPurchase when the person who has logged in using Facebook authentication has made a purchase, given that UserId is a reference key to the Users table? Please give me a high-level overview of what I need to do, because this is a fairly common scenario. Thanks in advance :)
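
    One common approach (my assumption, not the only possible design) is to still create a real Users row for externally-authenticated users and record the external identifier alongside it, so foreign keys such as UsersBookPurchase.UserId keep working. A sketch that assumes a hypothetical ExternalId column on Users and an integer UID:

      using System.Data.SqlClient;

      static int EnsureLocalUser(SqlConnection conn, string externalId, string email)
      {
          // Creates the local row the first time this external identity is seen,
          // then returns its UID so purchases can reference it.
          const string sql = @"
              IF NOT EXISTS (SELECT 1 FROM Users WHERE ExternalId = @ext)
                  INSERT INTO Users (UName, Email, ExternalId) VALUES (@name, @mail, @ext);
              SELECT UID FROM Users WHERE ExternalId = @ext;";

          using (var cmd = new SqlCommand(sql, conn))
          {
              cmd.Parameters.AddWithValue("@ext", externalId);
              cmd.Parameters.AddWithValue("@name", email);   // placeholder display name
              cmd.Parameters.AddWithValue("@mail", email);
              return (int)cmd.ExecuteScalar();
          }
      }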

    Read the article

  • ASP .NET page runs slow in production

    - by Brandi
    I have created an ASP.NET page that works flawlessly and quickly from Visual Studio. It does a very large database read from a database on our network to load a GridView inside of an UpdatePanel, and it displays progress in an Ajax ModalPopupExtender. Of course I don't expect it to be instant, what with the large db reads, but it takes on the order of seconds, not on the order of minutes. This all works great until I put it up on the server - it is very, VERY slow when I access it via the internet, taking several minutes to load the database information into the GridView. I'm baffled why it would not perform the same as it does from Visual Studio. (It is in release mode and I have taken off the debug flag.) I have since been trying things like eliminating unneeded update panels and throwing out the Ajax tools; nothing has made it any faster in production. It is not the database as far as I know, since it has been consistently fast from my computer (from Visual Studio) and consistently slow from the server. I am wondering: where do I look next? Has anyone else had this problem before? Could this be caused by update panels or Ajax ModalPopupExtenders in different parts of the application? Why would the live behaviour differ so much from the localhost behaviour? Both the server with the ASP.NET page and the server with the database are servers on our network. I'm using Visual Studio 2008. Thank you in advance for any insight or advice.

    Read the article

  • How to Connect to a Mysql database using PHP?

    - by Karthik
    <?php
    mysql_connect("localhost", "username", "password") or die(mysql_error());
    echo "Connected to MySQL<br />";
    ?>

    This is the code I am using to check whether I am able to connect to MySQL, and I am having a problem connecting with it. Should I change localhost to the name of the website? I tried ("www.abc.com", "login username", "password"), but even that is not working.

    Read the article

  • Upload database backup from mysql to Amazon S3 or Glacier without creating local file

    - by Rubem Azenha
    Is there a tool that makes it possible to back up a MySQL database to Amazon S3 or Amazon Glacier without having to create a local file with the database contents? Something like this:

    mysqldump -u root -ppass -h host --all-databases | magical-s3-tool s3-bucket backup-yyyy-mm-dd.sql

    This magical tool would read the piped data and transfer the backup directly to S3, without creating a local file.

    Read the article

  • Changing MSSQL Database sorting

    - by plaisthos
    I have a request to change the collation of an MS SQL database:

    ALTER DATABASE solarwind95 COLLATE SQL_Latin1_General_CP1_CI_AS

    but I get this strange error:

    Meldung 5075, Ebene 16, Status 1, Zeile 1
    Das 'Spalte'-Objekt 'CustomPollerAssignment.PollerID' ist von 'Datenbanksortierung' abhängig. Die Datenbanksortierung kann nicht geändert werden, wenn ein schemagebundenes Objekt von ihr abhängig ist. Entfernen Sie die Abhängigkeiten der Datenbanksortierung, und wiederholen Sie den Vorgang.

    Sorry for the German error message; I do not know how to switch the language to English, but here is a translation:

    Message 5075, Level 16, State 1, Line 1
    The column object 'CustomPollerAssignment.PollerID' depends on the database collation. The database collation cannot be changed while a schema-bound object depends on it. Remove the dependencies on the database collation and retry the operation.

    I got a ton more errors like that.

    Read the article

  • Which jsPerf-test should I consider as standard for checking the performance of javascript template-engines

    - by bhargav
    I am searching for a JavaScript template engine that has good performance when used in large JS applications and is also well suited to mobile applications. So I have gone through the various jsPerf tests for these. There seem to be a lot of them, showing different results, and it is confusing to work out which is the standard test. Can someone point me to a standard jsPerf test that I can refer to, one that also includes the following templates: dust, underscore, hogan, mustache, handlebars? From what I have observed, dot.js is a consistent performer with good rendering speed, but is it mature enough for larger applications? What are the "with" and "no with" variants shown in the jsPerf tests? Can someone explain? In all the tests I have seen, popular templates like mustache, handlebars, dust, hogan, etc. seem to perform worse than other templates, so why do people use them and leave out the top performers? Is it because of the maturity of these template engines? Thanks in advance.

    Read the article

  • Code First / Database First / Model First: are they just personal preferences?

    - by Antoine M
    Even knowing the internal functionality of each approach, and after reading a lot of posts, I still can't figure out whether each one of them is just a matter of personal preference for the developer or whether they serve different productivity goals. Should one of them be applied for specific productivity needs, or is MS just being kind by offering three different flavours? Should we consider Code First a sort of improvement over Database First or Model First, and think of it as a future standard worth a particular intellectual investment? Is there a link showing a sort of summary table with dispassionate pros and cons for each approach, a little bit like the ones comparing Web Forms and MVC? Sorry to those who will find this question redundant. I know it is.

    Read the article
