
Normalisation / Normalization


Normalisation / Normalization - definitions

Normalisation - The process of associating attributes with the entity types for which they are inherent characteristics. The decomposition of data structures according to a set of dependency rules, designed to give simpler, more stable structures in which certain forms of redundancy are eliminated. A step-by-step process to remove anomalies in data integrity caused by add, delete, and update actions. Also called non-loss decomposition.

[Category=Data Quality ]

Source: Larry English, External, 03-Feb-2009 13:45


Normalization - 1) The process of breaking up a table into multiple tables, each of which has a single theme, thereby reducing data redundancy.

2) The technique that reduces or eliminates the possibility that a database is subject to modification anomalies.

See also: Data Redundancy.

[Category=Database Management ]

Source: Northwest Database Services, 12 November 2009 11:05:00, External

Normalization - The process of converting complex data structures into simple, stable data structures.

[Category=Data Governance ]

Source: The Data Governance Institute, 17 December 2009 09:47:39, External

Normalization - The process of organizing data in accordance with the rules of a relational database.

In a completely de-normalized database the customer name and address information would be stored every time a customer made a purchase.

In a normalized database each customer's name and address would be stored only once, in a separate table. Every purchase record would have a reference to the customer table to indicate which customer was involved.
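The design described above can be sketched with Python's standard sqlite3 module. This is a minimal illustration with invented table and column names, not a schema from any of the quoted sources: customer details are stored once, each purchase carries only a reference to the customer row, and a join recovers the combined view when it is needed.

```python
import sqlite3

# In-memory database; table and column names are invented for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE customer (
    customer_id INTEGER PRIMARY KEY,
    name TEXT,
    address TEXT)""")
conn.execute("""CREATE TABLE purchase (
    purchase_id INTEGER PRIMARY KEY,
    customer_id INTEGER REFERENCES customer(customer_id),
    item TEXT)""")

conn.execute("INSERT INTO customer VALUES (1, 'Thomas', '12 Oak St')")
# Two purchases by the same customer: name and address are not repeated.
conn.execute("INSERT INTO purchase VALUES (1, 1, 'Shirt')")
conn.execute("INSERT INTO purchase VALUES (2, 1, 'Trousers')")

# A join rebuilds the de-normalized view for reporting.
rows = conn.execute("""SELECT c.name, c.address, p.item
                       FROM purchase p JOIN customer c USING (customer_id)
                       ORDER BY p.purchase_id""").fetchall()
print(rows)
# [('Thomas', '12 Oak St', 'Shirt'), ('Thomas', '12 Oak St', 'Trousers')]
```

If the customer moves, the address is corrected in one place (the customer table) rather than in every purchase record.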

Many individual decisions have to be made in the process of normalizing a de-normalized database. How do we know which customer information refers to the same person? When there is contradictory address information, how do we choose between the various alternatives?

A fully normalized database is usually the most efficient design for an On-Line Transaction Processing System. A data warehouse, with its emphasis on efficient retrieval of data, often benefits from some intentional de-normalization. See the discussion of the Star Schema.

[Category=Data Warehousing ]

Source: SDG Computing Inc., 10 May 2010 09:41:35, SDG Computing, now offline

Normalization - The process of reducing a complex data structure into its simplest, most stable structure. In general, the process entails the removal of redundant attributes, keys and relationships from a conceptual data model.

[Category=Information Management ]

Source: 29 June 2010 08:10:03, External

Normalization - The process of reducing a complex data structure into its simplest, most stable structure. In general, the process entails the removal of redundant attributes, keys, and relationships from a conceptual data model.

[Category=Data Warehousing ]

Source: 09 August 2010 07:08:12, External

Normalization - The process of reducing a complex data structure into its simplest, most stable structure. In general, the process entails the removal of redundant attributes, keys, and relationships from a conceptual data model.

[Category=Data Management ]

Source: DataMentors, 24 August 2010 09:18:03, External

Normalization - A technique to eliminate data redundancy.

[Category=Data Warehousing ]

Source: Aexis Business Intelligence, 21 December 2010 08:06:48, External

Normalisation - Another name for relational data analysis.

[Category=Data Quality ]

Source: DAMA UK, 18 July 2011 11:22:56, External

normalization - (1) [Category=Data Management ] The process of organizing, analyzing, and cleaning data to increase efficiency for data use and sharing. Normalization usually includes data structuring and refinement, redundancy and error elimination, and standardization.

(2) [statistics] The process of dividing one numeric attribute value by another to minimize differences in values based on the size of areas or the number of features in each area. For example, normalizing (dividing) total population by total area yields population per unit area, or density.
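The statistical sense of normalization in (2) is plain arithmetic: divide one attribute by another to remove the effect of area size. A short sketch, with region names and figures invented for illustration:

```python
# Two regions with equal populations but very different areas.
regions = [
    {"name": "A", "population": 50000, "area_km2": 25.0},
    {"name": "B", "population": 50000, "area_km2": 500.0},
]

# Normalize population by area to get density (people per square km).
for r in regions:
    r["density"] = r["population"] / r["area_km2"]

print(regions[0]["density"])  # 2000.0
print(regions[1]["density"])  # 100.0
```

The raw counts are identical, but the normalized values show that region A is twenty times as densely populated as region B.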

[Category=Geospatial ]

Source: esri, 21 June 2012 09:41:12, External 

normalization - In creating a database, normalization is the process of organizing it into tables in such a way that the results of using the database are always unambiguous and as intended. Normalization may have the effect of duplicating some values within the database (for example, the key values used to link tables) and often results in the creation of additional tables. (While normalization tends to increase this kind of duplication, it does not introduce redundancy, which is unnecessary duplication.) Normalization is typically a refinement process applied after the initial exercise of identifying the data objects that should be in the database, identifying their relationships, and defining the tables required and the columns within each table.

A simple example of normalizing data might consist of a table showing:

Customer | Item purchased | Purchase price
---------|----------------|---------------
Thomas   | Shirt          | $40
Maria    | Tennis shoes   | $35
Evelyn   | Shirt          | $40
Pajaro   | Trousers       | $25

If this table is also used to keep track of the price of items, then deleting one of the customers also deletes a price. Normalizing the data means recognizing this and solving the problem by dividing the table into two: one with information about each customer and the product they bought, and a second with each product and its price. Making additions or deletions to either table would then not affect the other.
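The split described above can be sketched with plain Python dictionaries (the structures and names are invented for illustration). Prices live in their own table, so deleting a customer's purchase no longer deletes the only record of an item's price:

```python
# Two "tables" after the split: one for product prices, one for purchases.
prices = {"Shirt": 40, "Tennis shoes": 35, "Trousers": 25}
purchases = [("Thomas", "Shirt"), ("Maria", "Tennis shoes"),
             ("Evelyn", "Shirt"), ("Pajaro", "Trousers")]

# Delete Maria's purchase (e.g. the customer record is removed)...
purchases = [p for p in purchases if p[0] != "Maria"]

# ...and the price of tennis shoes is still known.
print(prices["Tennis shoes"])  # 35
```

In the single-table design, removing Maria's row would have removed the only row recording that tennis shoes cost $35; after the split, that fact survives.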

Degrees of normalization for relational database tables have been defined; they include:

First normal form (1NF). This is the "basic" level of normalization and generally corresponds to the definition of any database, namely:

   * It contains two-dimensional tables with rows and columns.
   * Each column corresponds to a sub-object or an attribute of the object represented by the entire table.
   * Each row represents a unique instance of that sub-object or attribute and must be different in some way from any other row (that is, no duplicate rows are possible).
   * All entries in any column must be of the same kind. For example, in the column labeled "Customer," only customer names or numbers are permitted.

Second normal form (2NF). At this level of normalization, every column that is not part of the key must depend on the whole key, not on just part of it. For example, in a table with three columns containing customer ID, product sold, and price of the product when sold, the price is acceptable as a column because it depends on both parts of the key: the customer ID (who may be entitled to a discount) and the specific product.

Third normal form (3NF). At second normal form, modification anomalies are still possible, because a change to one row in a table may affect data that other rows depend on. For example, using the customer table just cited, removing a row describing a customer purchase (because of a return, perhaps) will also remove the fact that the product has a certain price. In third normal form, such a table is divided into two tables so that product pricing is tracked separately.

Domain/key normal form (DKNF). A key uniquely identifies each row in a table. A domain is the set of permissible values for an attribute. By enforcing key and domain restrictions, the database is guaranteed to be free of modification anomalies. DKNF is the normalization level that most designers aim to achieve.

Related glossary terms: Binary Large Object (BLOB), data structure, catalog, data mart, ECMAScript (European Computer Manufacturers Association Script), Visual FoxPro, segment, block, flat file

[Category=Data Management ]

Source: 28 August 2013 07:30:47, External


Data Quality Glossary.  A free resource from GRC Data Intelligence. For comments, questions or feedback: