Optimizing a DB table for use as a queue

Posted by anony on Stack Overflow
Published on 2010-06-09
We have a table called worktable with the following columns: key (primary key), ptime, aname, status, content.

We have a producer that inserts rows into this table, and a consumer that does an ORDER BY on the key column and fetches the first row whose status is 'pending'. The consumer then does some processing on this row (sketched below):

1. updates status to 'processing'
2. does some processing using content
3. deletes the row
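To make the discussion concrete, here is a minimal PL/SQL sketch of that consumer loop. The column types and the process_content call are assumptions, not part of the original schema:

    -- Naive single-row dequeue, as described above.
    -- process_content is a hypothetical placeholder for step 2.
    DECLARE
      CURSOR c_next IS
        SELECT key, content
          FROM worktable
         WHERE status = 'pending'
         ORDER BY key
           FOR UPDATE;                              -- locks the selected rows
      v_key     worktable.key%TYPE;
      v_content worktable.content%TYPE;
    BEGIN
      OPEN c_next;
      FETCH c_next INTO v_key, v_content;           -- first pending row by key
      IF c_next%FOUND THEN
        UPDATE worktable SET status = 'processing'
         WHERE key = v_key;                         -- step 1
        -- process_content(v_content);              -- step 2 (placeholder)
        DELETE FROM worktable WHERE key = v_key;    -- step 3
      END IF;
      CLOSE c_next;
      COMMIT;
    END;
    /

Note that a plain FOR UPDATE locks every row the cursor identifies as soon as it is opened, so a second consumer opening the same cursor simply blocks; that is one likely source of the contention described next.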

We are facing contention issues when we try to run multiple consumers, probably due to the ORDER BY, which does a full table scan...
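Two changes commonly relieve exactly this kind of contention in Oracle; a sketch under the same schema assumptions: a composite index so the 'pending' probe does not have to scan the whole table, and FOR UPDATE SKIP LOCKED so consumers leapfrog each other's locked rows (SKIP LOCKED is what AQ uses internally, and was reportedly only documented from 11g although it works earlier):

    -- 1. Give the consumers an index to walk instead of a full scan.
    CREATE INDEX worktable_pending_ix ON worktable (status, key);

    -- 2. Let each consumer skip rows another consumer has locked
    --    instead of blocking on them.
    SELECT key, content
      FROM worktable
     WHERE status = 'pending'
     ORDER BY key
       FOR UPDATE SKIP LOCKED;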

Using Oracle Advanced Queuing would be our next step, but before we go there we want to check the maximum throughput we can achieve with multiple consumers and a producer on a plain table.
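For comparison, the AQ setup itself is small. A minimal sketch; the queue and queue-table names are made up, and a RAW payload stands in for content:

    BEGIN
      DBMS_AQADM.CREATE_QUEUE_TABLE(
        queue_table        => 'work_qt',
        queue_payload_type => 'RAW');
      DBMS_AQADM.CREATE_QUEUE(
        queue_name  => 'work_q',
        queue_table => 'work_qt');
      DBMS_AQADM.START_QUEUE(
        queue_name  => 'work_q');
    END;
    /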

What optimizations can we do to get the best numbers possible? Can we do in-memory batch processing, where a consumer fetches 1000 rows at a time, processes them, and deletes them (a batch-dequeue sketch is below)? Would that improve throughput? What other possibilities are there: partitioning the table? Parallelization? Index-organized tables?...
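On the 1000-rows-at-a-time question: in PL/SQL that is a BULK COLLECT ... LIMIT fetch. A sketch combining it with the SKIP LOCKED cursor from above; the processing step is again a placeholder:

    -- Batch dequeue: fetch up to 1000 pending rows, process, delete in bulk.
    DECLARE
      CURSOR c_batch IS
        SELECT key, content
          FROM worktable
         WHERE status = 'pending'
         ORDER BY key
           FOR UPDATE SKIP LOCKED;
      TYPE t_keys     IS TABLE OF worktable.key%TYPE;
      TYPE t_contents IS TABLE OF worktable.content%TYPE;
      v_keys     t_keys;
      v_contents t_contents;
    BEGIN
      OPEN c_batch;
      FETCH c_batch BULK COLLECT INTO v_keys, v_contents LIMIT 1000;
      CLOSE c_batch;
      FOR i IN 1 .. v_keys.COUNT LOOP
        NULL;  -- process v_contents(i) here (placeholder)
      END LOOP;
      FORALL i IN 1 .. v_keys.COUNT
        DELETE FROM worktable WHERE key = v_keys(i);
      COMMIT;  -- releases the row locks and makes the deletes visible
    END;
    /

Fewer round trips and one commit per 1000 rows should raise throughput, at the cost of redoing up to 1000 rows of work if a consumer dies mid-batch.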

