Web page database query optimization

Posted by morpheous on Stack Overflow, 2010-06-05

I am putting together a web page that is quite 'expensive' in terms of database hits. I don't want to start optimizing at this stage; in fact, with a deadline looming, I may end up not optimizing at all.

Currently the page requires 18 (that's right, eighteen) hits to the db. I am already using joins, and some of the queries are UNIONed to minimize the trips to the db. My local dev machine can handle this (the page is not slow); however, I suspect that if I release this into the wild, the volume of queries will quickly overwhelm my database (MySQL).
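As an illustration of the UNION approach, here is a minimal sketch of collapsing two round trips into one query. The table and column names (comments, ratings, page_id) and the connection details are hypothetical, not taken from the actual schema:

    import mysql.connector

    # Hypothetical connection details -- adjust for your environment.
    conn = mysql.connector.connect(
        host="localhost", user="app", password="secret", database="site"
    )
    cur = conn.cursor()

    # Two lookups that would otherwise be separate round trips, combined
    # into a single UNION ALL query. The literal 'kind' column lets the
    # caller tell the two result sets apart; CAST keeps column types
    # compatible across the union.
    cur.execute(
        """
        SELECT 'comment' AS kind, id, body FROM comments WHERE page_id = %s
        UNION ALL
        SELECT 'rating' AS kind, id, CAST(score AS CHAR) FROM ratings WHERE page_id = %s
        """,
        (42, 42),
    )
    for kind, id_, payload in cur.fetchall():
        print(kind, id_, payload)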

I could always use memcache or something similar, but I would much rather continue with the other dev work that needs to be completed before the deadline. At least retrieving the page works; it's simply a matter of optimization now (if required).
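For what it's worth, if caching does become necessary later, a cache-aside wrapper is often enough. A minimal sketch using pymemcache; the key scheme, the TTL, and the loader callable are all hypothetical:

    import json
    from pymemcache.client.base import Client

    cache = Client(("localhost", 11211))

    def get_page_data(page_id, loader, ttl=300):
        """Return cached page data, falling back to the database loader."""
        key = f"page:{page_id}"
        cached = cache.get(key)          # None on a cache miss
        if cached is not None:
            return json.loads(cached)
        data = loader(page_id)           # the expensive 18-query path
        cache.set(key, json.dumps(data), expire=ttl)
        return data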

My question, therefore, is: is 18 db queries for a single page retrieval completely outrageous (i.e. should I put everything on hold and optimize the hell out of the retrieval logic), or should I continue as normal, meet the deadline, release on schedule, and see what happens?

[Edit]

Just to clarify, I have already done the 'obvious' things, like adding (single and composite) indexes on the fields used in the queries. What I haven't yet done is run a query analyzer to see whether my indexes etc. are optimal.
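When that analysis happens, MySQL's EXPLAIN is the usual first step. A sketch of running it from Python and printing the columns that matter most; the query is a hypothetical stand-in for one of the eighteen:

    import mysql.connector

    # Hypothetical connection details -- adjust for your environment.
    conn = mysql.connector.connect(
        host="localhost", user="app", password="secret", database="site"
    )
    cur = conn.cursor()

    # EXPLAIN reports how MySQL plans to execute the query without running it.
    cur.execute("EXPLAIN SELECT id, body FROM comments WHERE page_id = %s", (42,))
    cols = [d[0] for d in cur.description]
    for row in cur.fetchall():
        info = dict(zip(cols, row))
        # type = 'ALL' means a full table scan; key names the index chosen.
        print(info["table"], info["type"], info["key"], info["rows"])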

