MySQL efficiency as it relates to the database/table size

Posted by mlissner on Stack Overflow, 2010-04-22.

I'm building a system using Django, Sphinx and MySQL that's very quickly becoming quite large. The database currently has about 2,000 rows, and I've written a program that's going to populate it with another 40,000 rows in a couple of days. Since the database is live right now, and since I've never had a database with this much information in it, I'm worried about a few things:
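As an aside on that loader program: if it inserts rows one save() at a time, batching the inserts usually makes a big difference. Here is a minimal sketch, assuming a recent Django release (bulk_create isn't in very old ones) and a hypothetical Document model that is not from the original question:

```python
# Hypothetical loader -- Document and its fields are illustrative only.
from myapp.models import Document

def load_rows(rows):
    # Build unsaved model instances, then insert them with batched
    # INSERT statements instead of issuing one query per row.
    objs = [Document(title=r["title"], body=r["body"]) for r in rows]
    Document.objects.bulk_create(objs, batch_size=1000)
```

Wrapping the load in a single transaction, or using MySQL's LOAD DATA INFILE for raw dumps, is the other common way to speed up a bulk load like this.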

  1. Is adding all these rows going to seriously degrade the performance of my Django app? Will I need to go back and optimize all my database calls so they do things more cleverly (see the sketch after this list), or will the database become slow across the board in a way I can't do much about?

  2. If you scoff at my 40k rows, at what point SHOULD I be concerned? I will likely be adding another couple hundred thousand rows soon, so I worry, and I fret.

  3. How is Sphinx going to feel about all this? Is it going to freak out when it realizes it has to index all this data, or will it be fine? Is this normal for it? If so, at what point should I be concerned that it's too much data for Sphinx?
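On question 1, the usual first steps (not from the original post, just a hedged sketch with hypothetical model and field names) are to index the columns you filter and sort on and to avoid the N+1 query pattern with select_related:

```python
# Hypothetical models -- names and fields are illustrative only.
from django.db import models

class Author(models.Model):
    name = models.CharField(max_length=100)

class Entry(models.Model):
    author = models.ForeignKey(Author, on_delete=models.CASCADE)
    # db_index=True asks MySQL for a B-tree index on this column, which
    # keeps filtered and ordered queries fast as the table grows.
    pub_date = models.DateTimeField(db_index=True)
    title = models.CharField(max_length=200)

# select_related() fetches the related Author in the same JOINed query,
# avoiding one extra query per row (the classic N+1 pattern).
recent = (Entry.objects
          .select_related("author")
          .order_by("-pub_date")[:50])
```

At 40k, or even a few hundred thousand rows, a properly indexed MySQL table is generally not the bottleneck; unindexed scans and per-row queries are.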

Thanks for any thoughts.
