Predicting advantages of database denormalization

Posted by Janus Troelsen on Programmers
Published on 2012-10-13T15:05:12Z

I was always taught to strive for the highest Normal Form of database normalization, and we were taught Bernstein's Synthesis algorithm to achieve 3NF. This is all very well and it feels nice to normalize your database, knowing that fields can be modified while retaining consistency.

However, performance may suffer. That's why I am wondering whether there is any way to predict the speedup or slowdown of denormalizing. That way, you could build your list of FDs in 3NF and then denormalize as little as possible. I imagine that denormalizing too much would waste space and time, because, e.g., giant blobs get duplicated, or it becomes harder to maintain consistency because you have to update multiple fields in a transaction.
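To make the tradeoff concrete, here is a back-of-the-envelope sketch (not a real optimizer cost model; the unit costs, join width, and fanout are all made-up assumptions) that counts row accesses per workload. Reads in the normalized schema pay one lookup per joined table; writes in the denormalized schema pay one update per duplicated copy of the field:

```python
def io_cost(reads, writes, lookups_per_read, rows_per_write):
    """Crude cost: each index lookup or row update counts as 1 unit."""
    return reads * lookups_per_read + writes * rows_per_write

def compare(reads, writes, join_width=2, fanout=10):
    # Normalized: a read joins `join_width` tables; a write touches 1 row.
    norm = io_cost(reads, writes, lookups_per_read=join_width, rows_per_write=1)
    # Denormalized: a read hits 1 wide row; a write must fix every duplicate
    # (`fanout` = how many rows hold a copy of the duplicated field).
    denorm = io_cost(reads, writes, lookups_per_read=1, rows_per_write=fanout)
    return norm, denorm

# Read-heavy workload: denormalization wins.
print(compare(reads=10_000, writes=100))    # (20100, 11000)
# Write-heavy workload: normalization wins.
print(compare(reads=100, writes=10_000))    # (10200, 100100)
```

Even this toy model shows the answer depends on the read/write mix and the fanout of the duplicated data, which is presumably why a general prediction needs the query set as input.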

Summary: Given a 3NF FD set and a set of queries, how do I predict the speedup/slowdown of denormalization? Links to papers are appreciated too.

