
Big Data


Big Data - definition(s)

Big Data - Big data are datasets that grow so large that they become awkward to work with using on-hand database management tools. Difficulties include capture, storage, search, sharing, analytics, and visualization. This trend continues because working with larger and larger datasets allows analysts to "spot business trends, prevent diseases, combat crime." Though a moving target, current limits are on the order of terabytes, exabytes and zettabytes of data. Scientists regularly encounter this problem in meteorology, genomics, biological research, Internet search, finance and business informatics. Data sets also grow in size because they are increasingly being gathered by ubiquitous information-sensing mobile devices, "software logs, cameras, microphones, RFID readers, wireless sensor networks and so on."

Wikipedia, 09 March 2011 11:12:41


Big data - Big data (also spelled Big Data) is a general term used to describe the voluminous amount of unstructured and semi-structured data a company creates -- data that would take too much time and cost too much money to load into a relational database for analysis. Although Big data doesn't refer to any specific quantity, the term is often used when speaking about petabytes and exabytes of data.

A primary goal for looking at big data is to discover repeatable business patterns. It’s generally accepted that unstructured data, most of it located in text files, accounts for at least 80% of an organization’s data. If left unmanaged, the sheer volume of unstructured data that’s generated each year within an enterprise can be costly in terms of storage. Unmanaged data can also pose a liability if information cannot be located in the event of a compliance audit or lawsuit.

Big data analytics is often associated with cloud computing because the analysis of large data sets in real-time requires a framework like MapReduce to distribute the work among tens, hundreds or even thousands of computers.
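To make the MapReduce idea concrete, here is a toy, single-process sketch of its map, shuffle, and reduce phases applied to word counting. This is illustrative only and is not the Hadoop API; a real framework runs these phases in parallel across many machines.

```python
from collections import defaultdict

def map_phase(documents):
    """Map: emit a (word, 1) pair for every word in every document."""
    for doc in documents:
        for word in doc.split():
            yield word.lower(), 1

def shuffle_phase(pairs):
    """Shuffle: group all emitted values by their key (the word)."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Reduce: sum the grouped counts for each word."""
    return {word: sum(counts) for word, counts in grouped.items()}

docs = ["big data is big", "data grows fast"]
counts = reduce_phase(shuffle_phase(map_phase(docs)))
print(counts["big"])   # 2
print(counts["data"])  # 2
```

The point of the three-phase structure is that the map and reduce steps are independent per key, which is what lets a cluster split the work among tens, hundreds, or thousands of computers.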

See also: kilo, mega, giga, tera, peta, text mining, Hadoop, cloud audit

[Category=Data Management]

Source:, 07 July 2013 09:01:49

big data - This term has been defined in many ways, but along similar lines. Doug Laney, then an analyst at the META Group, first defined big data in a 2001 report called "3-D Data Management: Controlling Data Volume, Velocity and Variety." Volume refers to the sheer size of the datasets. The McKinsey report, "Big Data: The Next Frontier for Innovation, Competition, and Productivity," expands on the volume aspect by saying that "'big data' refers to datasets whose size is beyond the ability of typical database software tools to capture, store, manage, and analyze."

Velocity refers to the speed at which the data is acquired and used. Not only are companies and organizations collecting more and more data at a faster rate, they want to derive meaning from that data as soon as possible, often in real time.

Variety refers to the different types of data that are available to collect and analyze in addition to the structured data found in a typical database. Barry Devlin of 9sight Consulting identifies four categories of information that constitute big data:

1. Machine-generated data. This includes RFID data, geolocation data from mobile devices, and data from monitoring devices such as utility meters.

2. Computer log data, such as clickstreams from websites.

3. Textual social media information from sources such as Twitter and Facebook.

4. Multimedia social and other information from Flickr, YouTube, and other similar sites.

IDC analyst Benjamin Woo has added a fourth V to the definition: value. He says that because big data is about supporting decisions, you need the ability to act on the data and derive value.

[Category=Big Data]

Source: DataInformed, 16 October 2013 09:11:06

Big data - An all-encompassing term that refers to a collection of data sets that are too large or change too frequently to process on a single machine.

[Category=Big Data]

Jamie Turner, 15 December 2014 11:13:45

Data Quality Glossary.  A free resource from GRC Data Intelligence. For comments, questions or feedback: