Search Results

Search found 40567 results on 1623 pages for 'database performance'.


  • jQuery drag and drop gets slower with more DIV items

    Hi there, I have hierarchical tags (with parent-child relationships) on my page, numbering between 500 and 4,500 (and the count can grow). When I bound draggable and droppable to all of them, I saw very bad performance in IE7 and IE6: the custom helper wouldn't move smoothly and was very, very slow. Based on another post, I now bind/unbind the droppable dynamically on mouseover and mouseout events, which helps. But the custom helper still doesn't move smoothly: there is a gap between the mouse cursor and the helper as they move, and it gets much worse when I access the site remotely. Please help me address this performance issue. I'm totally stuck here.. :(
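
    A minimal sketch of the mouseover/mouseout binding described above (this assumes jQuery UI; the .tag-node selector and the onDrop handler are hypothetical):

        // Bind droppable lazily when the mouse enters a node, and tear
        // it down on the way out, so only a handful of nodes are live.
        $('.tag-node').bind('mouseover', function () {
            var $el = $(this);
            if (!$el.data('dropBound')) {
                $el.droppable({ drop: onDrop });  // onDrop: your drop handler
                $el.data('dropBound', true);
            }
        }).bind('mouseout', function () {
            var $el = $(this);
            if ($el.data('dropBound')) {
                $el.droppable('destroy');         // release the widget
                $el.removeData('dropBound');
            }
        });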

    Read the article

  • Hibernate overriding database modifications with detached object state

    - by EugeneP
    I'm going to go with this design: create an object and keep it alive for the whole web-app session, and keep its state synchronized with the database. What I want to achieve is this: if, between my db operations (modifications that I persist to the db), someone intentionally spoils table rows, then on the next save to the database all those changes WOULD BE OVERWRITTEN with the object's state, which always contains valid data. Which Hibernate methods do you recommend for persisting the modifications? saveOrUpdate() is a possible solution, but maybe there's something better? Again, here is how it looks. First I create an object without collections and persist it (save()). Then the user provides additional data. In the service layer we modify the object in memory again (say, populate it with collections), then persist it again. So every service-layer operation at the next step must simply guarantee that the database contains an exact persistent copy of the object we have in memory. If the data in the database differs, it MUST BE OVERRIDDEN with the state of the object kept in memory. Which Session operations do you recommend?
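
    A minimal sketch of that last-write-wins flow using saveOrUpdate() on the detached object (the session boilerplate and the order variable are hypothetical; with no version column, Hibernate simply overwrites the row with the in-memory state):

        Session session = sessionFactory.openSession();
        Transaction tx = session.beginTransaction();
        try {
            // Reattach the detached object; the UPDATE carries the full
            // in-memory state and clobbers any out-of-band edits.
            session.saveOrUpdate(order);
            tx.commit();
        } catch (RuntimeException e) {
            tx.rollback();
            throw e;
        } finally {
            session.close();
        }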

    Read the article

  • Checking if data does not exist in a MySQL database

    - by Ben Sinclair
    I have my songs set up in my MySQL database. Each song is either assigned multiple locations or has no locations at all. Only the songs that either have no locations assigned in the database or have one of the locations specified below should be pulled from the database. Hopefully my query below will make sense:

        SELECT s.*
        FROM roster_songs AS s
        LEFT JOIN roster_songs_locations AS sl ON sl.song_id = s.id
        WHERE EXISTS (
                SELECT sl2.*
                FROM roster_songs_locations AS sl2
                WHERE s.id != sl2.song_id
              )
           OR (sl.location_id = '88fb5f94-aaa6-102c-a4fa-1f05bca0eec6'
            OR sl.location_id = '930555b0-a251-102c-a245-1559817ce81a')
        GROUP BY s.id

    The query almost works, except it also pulls songs that are assigned to location_ids that aren't specified in the query above. I think it has something to do with my EXISTS clause picking them up... Any ideas how I can get this to work?
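
    For comparison, a hedged sketch of how the stated intent ("no locations at all, or one of the two listed locations") could be expressed with a null check on the LEFT JOIN instead of EXISTS (untested against the real schema):

        SELECT s.*
        FROM roster_songs AS s
        LEFT JOIN roster_songs_locations AS sl ON sl.song_id = s.id
        WHERE sl.song_id IS NULL              -- songs with no locations
           OR sl.location_id IN (
                '88fb5f94-aaa6-102c-a4fa-1f05bca0eec6',
                '930555b0-a251-102c-a245-1559817ce81a')
        GROUP BY s.id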

    Read the article

  • How to save a big "database-like" class in Python

    - by Rafal
    Hi there, I'm doing a project with a reasonably big database. It's not a proper DB file, but a class with a format as follows:

        DataBase.Nodes.Data = [[] for i in range(1, 1000)]

    Altogether this DataBase is something like a few thousand rows. First question: is the way I'm doing this efficient, or is it better to use SQL or some other "proper" DB, which I've actually never used? And the main question: I'd like to save my DataBase class with all its records, and then re-open it with Python in another session. Is that possible, and what tool should I use? cPickle seems to be only for strings; is there anything else? In MATLAB there's a very useful feature named "save workspace": it saves all your variables to a file that you can open in another session. This would be very useful in Python!
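
    On the cPickle point: pickle handles arbitrary picklable objects, not just strings, so a minimal sketch of the save/reload workflow (the file name is arbitrary) looks like this:

        import cPickle as pickle

        # Save the whole object graph to disk at the end of a session.
        with open('database.pkl', 'wb') as f:
            pickle.dump(DataBase, f, pickle.HIGHEST_PROTOCOL)

        # In a later session, load it back.
        with open('database.pkl', 'rb') as f:
            DataBase = pickle.load(f)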

    Read the article

  • How to improve the performance of a query on BKPF

    - by rachu patil
    Hi gurus, I want to get BELNR (Accounting Document Number) from the BKPF table by passing BKPF-XBLNR = VBRP-VGBEL (this is the requirement), but it is taking so long that it ends in a timeout error. How can I make this perform well? If there is even a BAPI for this, please let me know. Thanks in advance. Regards,
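
    A hedged sketch of the usual remedy: XBLNR by itself is not an index field on BKPF, so restricting on the leading key fields as well (or creating a secondary index on XBLNR) lets the database avoid a full table scan. The variable names below are hypothetical:

        " Restrict by company code and fiscal year alongside XBLNR so an
        " index can be used; otherwise consider a secondary index on XBLNR.
        SELECT belnr bukrs gjahr
          FROM bkpf
          INTO TABLE lt_bkpf
          WHERE bukrs = p_bukrs
            AND gjahr = p_gjahr
            AND xblnr = lv_xblnr.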

    Read the article

  • ASP.NET MVC 2 Performance

    - by HeavyWave
    What is the latest data on ASP.NET MVC performance? How does it scale and perform under heavy load? I have profiled my ASP.NET MVC 1 application, and most of the time is spent in the System.Web.Mvc assembly, so I thought it might be a concern.

    Read the article

  • Need to find latitude and longitude of postcodes and store them in my database

    - by Matt
    Hey guys, I've got a database full of UK postcodes, and I'd like to store the latitude and longitude of each of these postcodes along with its record in the database. Is there any way I can obtain this data for free without violating any T&Cs? I know I could do this using the Google Maps API for each postcode, but I have well over 20,000 postcodes in this database, and fetching the lat and lng for each of them every time is not really an option. Thanks in advance, M

    Read the article

  • jQuery mousemove performance

    - by Colby77
    Hi, when I bind a mousemove event to an element, it works smoothly in every browser except Internet Explorer. With IE the CPU usage is far too high, and some associated things (e.g. the tooltip) look ugly. Is there any way I can get rid of the performance problem? (Yeah, I know: don't use IE :))
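
    One common mitigation, sketched minimally below, is to throttle the handler so IE only does real work every few milliseconds (updateTooltip is a hypothetical stand-in for the handler's expensive part):

        var lastRun = 0;
        $('#target').mousemove(function (e) {
            var now = new Date().getTime();
            if (now - lastRun < 50) { return; }  // skip most events
            lastRun = now;
            updateTooltip(e.pageX, e.pageY);     // the expensive work
        });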

    Read the article

  • MySQL efficiency as it relates to the database/table size

    - by mlissner
    I'm building a system using Django, Sphinx and MySQL that's very quickly becoming quite large. The database currently has about 2,000 rows, and I've written a program that's going to populate it with another 40,000 rows in a couple of days. Since the database is live right now, and since I've never had a database with this much information in it, I'm worried about some things:

    Is adding all these rows going to seriously degrade the efficiency of my Django app? Will I need to go back through it and optimize all my database calls so they're doing things more cleverly? Or will this make the database slow all around, to the extent that I can't do anything about it at all?

    If you scoff at my 40k rows, then my next question is: at what point SHOULD I be concerned? I will likely be adding another couple hundred thousand soon, so I worry, and I fret.

    How is Sphinx going to feel about all this? Is it going to freak out when it realizes it has to index all this data? Or will it be fine? Is this normal for it? If it is, at what point should I be concerned that it's too much data for Sphinx?

    Thanks for any thoughts.
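
    As a rough yardstick, tens of thousands of rows are small for MySQL; the usual first check, sketched below with hypothetical table and column names, is whether the hot queries are covered by indexes:

        -- Does the hot query use an index? Check the "key" column.
        EXPLAIN SELECT * FROM myapp_document WHERE author_id = 42;

        -- If not, add one; this is often all a 40k-row table needs.
        CREATE INDEX idx_document_author ON myapp_document (author_id);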

    Read the article

  • Mongoid with Rails - "Database should be a Mongo::DB, not NilClass"

    - by Adam T
    Greetings. I am trying to get Mongoid to work with my Rails app and I am getting an error: "Mongoid::Errors::InvalidDatabase in 'Shipment bol should be unique' - Database should be a Mongo::DB, not NilClass". I have created the mongoid.yml file in my config directory and have mongodb running as a daemon. The config file is like so:

        defaults: &defaults
          host: localhost

        development:
          <<: *defaults
          database: ship-it-development

        test:
          <<: *defaults
          database: ship-it-test

        production:
          <<: *defaults
          host: <%= ENV['MONGOID_HOST'] %>
          port: <%= ENV['MONGOID_PORT'] %>
          database: <%= ENV['MONGOID_DATABASE'] %>

    All of my specs fail with the above error. I am using Rails 2.3.8. Anyone have ideas?

    Read the article

  • What's a good way to synchronize a SQL Server 2008 database from a 2005 database automatically?

    - by Keith Nicholas
    OK, the scenario is... two servers, on completely different parts of the internet. The SQL 2008 database just needs to get data updates and schema changes; it doesn't need to send anything to the 2005 database. Basically it should just suck in data and schema as efficiently as possible, automatically, as a scheduled task. The database is quite huge, but the changes per day are probably only around 20-30 megabytes of data. I can't run any of the built-in replication on the 2005 database. I've had a wee look at the Sync Framework; I think it might do what I want, but it seems a bit painful and requires a bit of work to get going. I'm wondering if there is tooling out there to make this easier? Or...? I'm not quite sure what my options are.

    Read the article

  • How to associate static entity instances in a Session without database retrieval

    - by Michael Hedgpeth
    I have a simple Result class that used to be an enum but has evolved into its own class with its own table:

        public class Result
        {
            public static readonly Result Passed = new Result(StatusType.Passed) { Id = [Predefined] };
            public static readonly Result NotRun = new Result(StatusType.NotRun) { Id = [Predefined] };
            public static readonly Result Running = new Result(StatusType.Running) { Id = [Predefined] };
        }

    Each of these predefined values has a row in the database at its predefined Guid Id. There is then a FailedResult that has an instance per failure:

        public class FailedResult : Result
        {
            public FailedResult(string description) : base(StatusType.Failed) { . . . }
        }

    I then have an entity that has a Result:

        public class Task
        {
            public Result Result { get; set; }
        }

    When I save a Task whose Result is one of the predefined ones, I want NHibernate to know that it doesn't need to save that Result to the database, nor fetch it from the database; I just want it to save by Id. The way I get around this is to call a method when setting up the session that loads the static entities:

        protected override void OnSessionOpened(ISession session)
        {
            LockStaticResults(session, Result.Passed, Result.NotRun, Result.Running);
        }

        private static void LockStaticResults(ISession session, params Result[] results)
        {
            foreach (var result in results)
            {
                session.Load(result, result.Id);
            }
        }

    The problem is that the session.Load call appears to hit the database (something I don't want to do). How can I avoid the fetch, but make the session trust that my static (immutable) Result instances are both up to date and part of the session?
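
    One hedged alternative worth sketching: NHibernate's ISession.Lock with LockMode.None reassociates a detached instance with the session without issuing a SELECT, unlike Load, which may fetch:

        private static void LockStaticResults(ISession session, params Result[] results)
        {
            foreach (var result in results)
            {
                // Reattach the detached instance; LockMode.None means
                // no version check and no database hit.
                session.Lock(result, LockMode.None);
            }
        }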

    Read the article

  • Opening read-only OLEDB connection to MS Access back-end database while allowing updates via separate front-end

    - by djdilicious
    I have a back-end MS Access 2002-2003 database which stores blog entries. I created a separate front-end database with the forms for entering blog posts into the back-end database. Finally, I have a website using ASP to display the blog entries. The website connects directly to the back-end database using an OLEDB connection object. Whenever I open the form for creating a new post in MS Access, loading the blog post page on the website displays the error: Could not use "; file already in use. I would like to be able to display the older blog posts even while the newest one is in the process of being added.
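
    A hedged sketch of one thing to try on the website side: open the ADO connection explicitly in shared, read-only mode, so the page can read while Access holds the file open (the path is hypothetical):

        ' 17 = adModeRead (1) + adModeShareDenyNone (16)
        Set conn = Server.CreateObject("ADODB.Connection")
        conn.Mode = 17
        conn.Open "Provider=Microsoft.Jet.OLEDB.4.0;" & _
                  "Data Source=C:\data\blog.mdb;"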

    Read the article

  • Finding key Solr performance metrics

    - by Mike Malloy
    To improve the performance of Solr: find your slowest searches; monitor query results, cache hit rate and cache size (including the document cache and filter cache); and find problems with Solr update handlers by tracking index operations and document operations. There is a tool from New Relic which may help: http://www.newrelic.com/solr.html

    Read the article

  • A very basic auto-expanding list/array

    - by MainMa
    Hi, I have a method which returns an array of fixed-type objects (let's say MyObject). The method creates a new empty Stack<MyObject>. Then it does some work and pushes some number of MyObjects onto the end of the Stack. Finally, it returns Stack.ToArray(). It does not change already-added items or their properties, nor remove them. Only the number of elements to add will cost performance. There is no need to sort/order the elements. Is Stack the best thing to use, or should I switch to Collection or List to ensure better performance and/or lower memory cost?
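
    For comparison, a minimal sketch of the List<T> version (MoreWorkToDo and NextItem are hypothetical placeholders for the method's real work). List<T> grows by doubling its backing array, so appends are amortized O(1), and ToArray() copies once at the end:

        var items = new List<MyObject>();
        while (MoreWorkToDo())
        {
            items.Add(NextItem());   // amortized O(1) append
        }
        return items.ToArray();      // single copy into a fixed array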

    Read the article

  • Issue adding data into a database using Hibernate

    - by sarah
    Hi, I am getting the exception "org.hibernate.HibernateException: The database returned no natively generated identity value" while adding data into the database. I am using the following code; please let me know what is wrong:

        Session session = HibernateUtil.getSession();
        Transaction tx = session.beginTransaction();
        session.save(user);
        logger.info("Successfully inserted data into database");
        tx.commit();
        isSaved = true;

    Thanks
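
    For context, this exception usually means the id generator expected the database to hand back an auto-generated key and it didn't. A hedged sketch of the mapping this implies (assuming JPA annotations and an AUTO_INCREMENT primary key column in MySQL):

        // The primary key column must actually be AUTO_INCREMENT for
        // identity generation to return a value on insert.
        @Id
        @GeneratedValue(strategy = GenerationType.IDENTITY)
        private Long id;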

    Read the article

  • Performance tuning of a Hibernate+Spring+MySQL project operation that stores images uploaded by users

    - by Umar
    Hi, I am working on a web project that is Spring+Hibernate+MySQL based. I am stuck at a point where I have to store images uploaded by a user into the database. Although I have written some code that works well for now, I believe things will mess up when the project goes live. Here's my domain class that carries the image bytes:

        @Entity
        public class Picture implements java.io.Serializable {
            long id;
            byte[] data;
            ... // getters and setters
        }

    And here's my controller that saves the file on submit:

        public class PictureUploadFormController extends AbstractBaseFormController {
            ...
            protected ModelAndView onSubmit(HttpServletRequest request,
                    HttpServletResponse response, Object command,
                    BindException errors) throws Exception {
                MultipartFile file;
                // getting MultipartFile from the command object
                ...
                // beginning hibernate transaction
                ...
                Picture p = new Picture();
                p.setData(file.getBytes());
                pictureDAO.makePersistent(p); // simply calls getSession().saveOrUpdate(p)
                // committing hibernate transaction
                ...
            }
            ...
        }

    Obviously a bad piece of code. Is there any way I could use an InputStream or Blob to save the data, instead of first loading all the bytes from the user into memory and then pushing them into the database? I did some research on Hibernate's support for Blob, and found this in the Hibernate in Action book:

        java.sql.Blob and java.sql.Clob are the most efficient way to handle
        large objects in Java. Unfortunately, an instance of Blob or Clob is
        only useable until the JDBC transaction completes. So if your
        persistent class defines a property of java.sql.Clob or java.sql.Blob
        (not a good idea anyway), you'll be restricted in how instances of
        the class may be used. In particular, you won't be able to use
        instances of that class as detached objects. Furthermore, many JDBC
        drivers don't feature working support for java.sql.Blob and
        java.sql.Clob. Therefore, it makes more sense to map large objects
        using the binary or text mapping type, assuming retrieval of the
        entire large object into memory isn't a performance killer. Note you
        can find up-to-date design patterns and tips for large object usage
        on the Hibernate website, with tricks for particular platforms.

    Now, since apparently Blob cannot be used (it is not a good idea anyway), what else could be used to improve the performance? I couldn't find any up-to-date design pattern or any useful information on the Hibernate website, so any help/recommendations from stackoverflowers will be much appreciated. Thanks
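
    One hedged possibility, assuming Hibernate 3.x and a java.sql.Blob field on Picture instead of byte[]: Hibernate.createBlob can wrap the upload's InputStream so the bytes stream to the JDBC driver rather than being materialized in memory first (this helper was later replaced by the Session's LobHelper, so check the API of your version):

        // Stream the upload instead of materializing file.getBytes().
        Picture p = new Picture();
        p.setData(Hibernate.createBlob(file.getInputStream(), (int) file.getSize()));
        pictureDAO.makePersistent(p);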

    Read the article

  • Check for Duplicates in a Database Before Entering Data

    - by gamerzfuse
    Before entering data into a database, I just want to check that the database doesn't already contain the same username. I have the username column in SQL set as a key, so it can't be duplicated on that end, but I am looking for a more user-friendly error message than "KEY already exists". Is there a simple way to check whether the value already exists in a row? Thanks!
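
    A minimal sketch of the usual pattern (table and column names are hypothetical): run a cheap existence check first and show your own message, keeping the unique key as the safety net against races:

        -- If this returns a row, report "username already taken"
        -- instead of attempting the INSERT.
        SELECT 1 FROM users WHERE username = 'bob' LIMIT 1;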

    Read the article

  • .NET TDD with a Database and ADO.NET Entity Framework - Integration Tests

    - by Brian
    Hello, I'm using the ADO.NET Entity Framework with an AdventureWorks database attached to my local database server. For unit testing, what approaches have people taken to working with a database? Obviously, the database has to be in a pre-defined state so that the tests have some isolation from each other... so I need to be able to run through the inserts and updates, then roll back either between tests or after the batch of tests is done. Any advice? Thanks.
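
    One common approach, sketched under the assumption of MSTest and System.Transactions: open a TransactionScope per test and never call Complete(), so every change rolls back when the scope is disposed:

        using System.Transactions;
        using Microsoft.VisualStudio.TestTools.UnitTesting;

        [TestClass]
        public class RepositoryTests
        {
            private TransactionScope _scope;

            [TestInitialize]
            public void Setup()
            {
                _scope = new TransactionScope();
            }

            [TestCleanup]
            public void Teardown()
            {
                _scope.Dispose();   // no Complete() call => rollback
            }

            // ... tests that insert and update freely go here ...
        }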

    Read the article

  • DataView Vs DataTable.Select()

    - by Aseem Gautam
    Consider the code below:

        DataView someView = new DataView(sometable);
        someView.RowFilter = someFilter;
        if (someView.Count > 0)
        {
            ...
        }

    Quite a number of articles say DataTable.Select() is better than using DataViews, but these are prior to VS2008:

    Solved: The Mystery of DataView's Poor Performance with Large Recordsets
    Array of DataRecord vs. DataView: A Dramatic Difference in Performance

    So in a situation where I just want a subset of DataRows based on some filter criteria (a single query), which is better: DataView or DataTable.Select()?
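
    For reference, a minimal sketch of the Select() form of the same check:

        DataRow[] rows = sometable.Select(someFilter);
        if (rows.Length > 0)
        {
            // work with the matching subset directly
        }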

    Read the article

  • How to create a custom ADO Multidimensional catalog with no database

    - by Alan Clark
    Does anyone know of an example of how to dynamically define and build ADO MD (ActiveX Data Objects Multidimensional) catalogs and cube definitions from a data set other than a database? Background: we have a huge amount of data in our application that we export to a database and then query using the usual SQL joins, groups, sums, etc. to produce reports. The data in the application originally lives in objects and arrays. The problem is that the amount of data is so large the export can take 2 hours. So I am trying to figure out a good way of querying the objects in memory, either with a custom OLAP algorithm or library, or with ADO MD. But I haven't been able to find an example of using ADO MD without a database behind it. We are using Delphi 2010, so we would use the ADO ActiveX, but I imagine ADO.NET MD is similar. I realize that if the application data were already stored in a database, the problem would solve itself. Also, if Delphi had LINQ capability, I could query the objects and arrays that way.

    Read the article

  • Best practices or tools for installing a SQL Server database

    - by Maestro1024
    I have a SQL Server database designed with the SQL Server GUI database editor/Visual Studio. What is the best way to "install" that database on other systems? Said another way: how should I ship this thing? I know I can save the scripts and set the primary/foreign keys with T-SQL, but I suspect there is something better. I guess you could have people restore from a backup, but that does not seem very professional. What other choices are there, and what are the pluses and minuses?

    Read the article

  • SQL script to create a database

    - by Blanca
    Hi! I have this file, create_mysql.sql:

        DROP DATABASE IF EXISTS playence_media;
        CREATE DATABASE playence_media;
        USE playence_media;

        GRANT ALL PRIVILEGES ON *.* TO 'media'@'localhost'
            IDENTIFIED BY 'media' WITH GRANT OPTION;

    But I don't know how to run it to create the database. I would like to do it from my terminal, with no graphical interface. Thanks
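
    A minimal sketch of the terminal side (this assumes the mysql client is on your PATH and you connect as root; you will be prompted for the password):

        mysql -u root -p < create_mysql.sql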

    Read the article
