
Denormalization - definitions

Denormalization - To store normalized data in duplicate locations, thereby optimizing the performance of the system.

[Category=Database Management]

Source: Northwest Database Services, 08 November 2009 09:57:53, External


De-normalization - A database design activity that restructures a database by introducing derived data, replicated data, and/or repeating data to tune an application system and increase performance.

[Category=Data Warehousing]

Source: Aexis Business Intelligence, 28 November 2010 13:33:05, External

denormalization - In a relational database, denormalization is an approach to speeding up read performance (data retrieval) in which the administrator selectively adds back specific instances of redundant data after the data structure has been normalized. A denormalized database should not be confused with a database that has never been normalized.

During normalization, the database designer stores different but related types of data in separate logical tables called relations. When a query combines data from multiple tables into a single result table, it is called a join. Multiple joins in the same query can have a negative impact on performance. Introducing denormalization and adding back a small number of redundancies can be useful for cutting down on the number of joins.
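The trade-off described above can be sketched with SQLite's in-memory database. The schema and table names below (customers, orders, orders_denorm) are illustrative, not from the source: the normalized read path needs a join, while the denormalized table answers the same question from a single relation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Normalized design: customer data lives in one relation,
# orders in another; every read of "order + customer name" needs a join.
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customers(id),
    total REAL
);
INSERT INTO customers VALUES (1, 'Acme');
INSERT INTO orders VALUES (10, 1, 99.5);
""")

# Read path on the normalized schema: one join per query.
joined = cur.execute("""
    SELECT o.id, c.name, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id
""").fetchall()

# Denormalized alternative: copy the customer name into the orders
# table so the hot read path needs no join at all.
cur.executescript("""
CREATE TABLE orders_denorm (id INTEGER PRIMARY KEY, customer_name TEXT, total REAL);
INSERT INTO orders_denorm
    SELECT o.id, c.name, o.total
    FROM orders o JOIN customers c ON c.id = o.customer_id;
""")
flat = cur.execute("SELECT id, customer_name, total FROM orders_denorm").fetchall()
```

Both queries return the same rows; the denormalized version simply pre-pays the join cost at write time.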

After data has been duplicated, the database designer must take into account how multiple instances of the data will be maintained. One way to denormalize a database is to allow the database management system (DBMS) to store redundant information on disk. This has the added benefit of ensuring the consistency of redundant copies. Another approach is to denormalize the actual logical data design, but this can quickly lead to inconsistent data. Rules called constraints can be used to specify how redundant copies of information are synchronized, but they increase the complexity of the database design and also run the risk of impacting write performance.
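One common way to express the synchronization rules mentioned above is a database trigger. The sketch below, again using an illustrative SQLite schema, propagates a customer rename into every redundant copy of the name held in the orders table; note how the write now touches two tables, which is the write-performance cost the paragraph warns about.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE orders (
    id INTEGER PRIMARY KEY,
    customer_id INTEGER,
    customer_name TEXT,   -- redundant, denormalized copy
    total REAL
);

-- Synchronization rule: when a customer is renamed, push the new
-- name into every denormalized copy in orders.
CREATE TRIGGER sync_customer_name
AFTER UPDATE OF name ON customers
BEGIN
    UPDATE orders SET customer_name = NEW.name
    WHERE customer_id = NEW.id;
END;

INSERT INTO customers VALUES (1, 'Acme');
INSERT INTO orders VALUES (10, 1, 'Acme', 99.5);
""")

# A single logical write now fans out to two tables via the trigger.
cur.execute("UPDATE customers SET name = 'Acme Corp' WHERE id = 1")
name = cur.execute(
    "SELECT customer_name FROM orders WHERE id = 10"
).fetchone()[0]
```

Without the trigger, the rename would leave the redundant copy stale, which is exactly the inconsistency risk the definition describes.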

See also: object-relational mapping (ORM), association rules

Related glossary terms: columnar database, Not Only SQL (NoSQL), sparsity and density, Sparse and dense, Amazon Dynamo Database (DDB), On-Line Analytical Processing (OLAP), in-memory database, Google BigTable, virtual cube, Cassandra (Apache Cassandra), data classification

[Category=Data Management]

Source:, 27 July 2013 10:57:07, External

Data Quality Glossary. A free resource from GRC Data Intelligence. For comments, questions or feedback: