Is data integrity possible without normalization?

Posted by shuniar on Programmers
Published on 2012-10-31T00:00:09Z

I am working on an application that requires the storage of location information such as city, state, zip code, latitude, and longitude. I would like to ensure:

  1. Location data is accurate
    • Detroit, CA
      • Detroit IS NOT in California
    • Detroit, MI
      • Detroit IS in Michigan
  2. Cities and states are spelled correctly
    • California not Calefornia
    • Detroit not Detriot
  3. Cities and states are named consistently
    • Valid:
      • CA
      • Detroit
    • Invalid:
      • Cali
      • california
      • DET
      • d-town
      • The D

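Assuming a normalized reference set of states and their cities, all three requirements above could be enforced at a single check point. A minimal sketch (the reference data and function name are illustrative, not a real gazetteer):

```python
# Illustrative reference data: canonical state codes mapped to canonical
# city spellings. A real reference table would be far larger and grow over time.
STATE_CITIES = {
    "MI": {"Detroit", "Ann Arbor"},
    "CA": {"Los Angeles", "Sacramento"},
}

def is_valid_location(city: str, state: str) -> bool:
    """Accept only canonical spellings and real city/state pairings."""
    cities = STATE_CITIES.get(state)  # "Cali", "california" -> no entry
    return cities is not None and city in cities  # "Detriot", "d-town" -> rejected

print(is_valid_location("Detroit", "MI"))  # True
print(is_valid_location("Detroit", "CA"))  # False: Detroit is not in California
print(is_valid_location("d-town", "MI"))   # False: not the canonical name
```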
Also, since city/zip data is not guaranteed to be static, keeping it up to date in a normalized schema could be difficult, whereas a denormalized design could simply store each location as entered and treat it as the de facto location for that record.
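For contrast, a denormalized design might store the full location inline on each record, so whatever was written becomes the de facto location for that row. A sketch (the record type and field names are assumptions):

```python
from dataclasses import dataclass

@dataclass
class CustomerRecord:
    # Denormalized: every row carries its own copy of the location fields.
    # Updating city/zip reference data means touching every affected row,
    # but each row preserves the location exactly as it was captured.
    name: str
    city: str
    state: str
    zip_code: str
    latitude: float
    longitude: float

row = CustomerRecord("A. Customer", "Detroit", "MI", "48201", 42.3314, -83.0458)
print(row.city, row.state)  # Detroit MI
```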

A couple of thoughts come to mind:

  1. A collection of reference tables storing all states plus the most common cities and zip codes, growing over time. The application would search these tables for an exact or similar match and recommend corrections.
  2. Use some sort of service to validate the location data before it is stored in the database.
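The two ideas could be combined: a reference lookup suggests corrections for near misses (option 1), and nothing reaches the database until it validates (option 2). A hedged sketch using the standard library's difflib; the city list and function names are illustrative:

```python
import difflib

# Illustrative canonical city list; a real one would come from reference tables.
KNOWN_CITIES = ["Detroit", "Dearborn", "Denver", "Dallas"]

def suggest_corrections(city: str, n: int = 3) -> list[str]:
    """Option 1: recommend close matches for a misspelled city name."""
    return difflib.get_close_matches(city, KNOWN_CITIES, n=n, cutoff=0.6)

def store_location(city: str) -> str:
    """Option 2: validate before storing; reject with suggestions otherwise."""
    if city in KNOWN_CITIES:
        return f"stored {city}"  # placeholder for the real INSERT
    raise ValueError(f"unknown city {city!r}; did you mean {suggest_corrections(city)}?")

print(store_location("Detroit"))       # stored Detroit
print(suggest_corrections("Detriot"))  # ['Detroit']
```

Whether the check runs in the application, in a trigger, or in an external service, the key design choice is that only canonical values ever reach storage.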

Is it possible to fulfill these requirements without normalization, and if so, should I denormalize this data?

© Programmers or respective owner
