Data is created constantly, and at an ever-increasing rate. Mobile phones, social media, imaging technologies used to determine a medical diagnosis—all these and more create new data that must be stored somewhere for some purpose. Devices and sensors automatically generate diagnostic information that needs to be stored and processed in real time. Merely keeping up with this huge influx of data is difficult, but substantially more challenging is analyzing vast amounts of it to identify meaningful patterns and extract useful information, especially when the data does not conform to traditional notions of structure. These challenges of the data deluge present the opportunity to transform business, government, science, and everyday life.
Several industries have led the way in developing their ability to gather and exploit data:
Credit card companies monitor every purchase their customers make and can identify fraudulent purchases with a high degree of accuracy, using rules derived by processing billions of transactions (a minimal sketch of such rules appears after this list).
Mobile phone companies analyze subscribers' calling patterns to determine, for example, whether a caller's frequent contacts are on a rival network. If that rival network is offering an attractive promotion that might cause the subscriber to defect, the mobile phone company can proactively offer the subscriber an incentive to remain in her contract.
For companies such as LinkedIn and Facebook, data itself is their primary product. The valuations of these companies are derived heavily from the data they gather and host, which gains more intrinsic value as it grows.
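To make the first example concrete, here is a minimal sketch of what issuer-derived fraud rules might look like in code. The field names, thresholds, and rules are purely hypothetical illustrations, not drawn from any real card issuer's system; in practice such rules would be mined from billions of historical transactions rather than hand-written.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    card_id: str
    amount: float           # purchase amount in dollars
    merchant_country: str   # country where the purchase occurred
    home_country: str       # cardholder's home country

def is_suspicious(txn: Transaction, typical_max: float) -> bool:
    """Apply two hypothetical rules of the kind an issuer might derive
    from historical transaction data."""
    # Rule 1: amount far above this cardholder's historical maximum
    if txn.amount > 10 * typical_max:
        return True
    # Rule 2: purchase made outside the cardholder's home country
    if txn.merchant_country != txn.home_country:
        return True
    return False

# A $9,000 purchase abroad on a card whose purchases normally top out at $400
txn = Transaction("card-123", 9000.0, "BR", "US")
print(is_suspicious(txn, typical_max=400.0))  # True
```

A production system would combine hundreds of such rules, or a statistical model learned from them, and score each transaction in real time; the point here is only the shape of a rule derived from large-scale transaction history.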
Three attributes stand out as defining Big Data characteristics:
Huge volume of data: Rather than thousands or millions of rows, Big Data can be billions of rows and millions of columns.
Complexity of data types and structures: Big Data reflects the variety of new data sources, formats, and structures, including digital traces left on the web and in other digital repositories for subsequent analysis.
Speed of new data creation and growth: Big Data can describe high-velocity data, with rapid data ingestion and near-real-time analysis.
Although the volume of Big Data tends to attract the most attention, generally the variety and velocity of the data provide a more apt definition of Big Data. (Big Data is sometimes described as having 3 Vs: volume, variety, and velocity.) Due to its size or structure, Big Data cannot be efficiently analyzed using only traditional databases or methods. Big Data problems require new tools and technologies to store, manage, and realize the business benefit. These new tools and technologies enable the creation, manipulation, and management of large datasets and the storage environments that house them.
Another definition of Big Data comes from the 2011 McKinsey Global Institute report (McKinsey & Co., Big Data: The Next Frontier for Innovation, Competition, and Productivity):
Big Data is data whose scale, distribution, diversity, and/or timeliness require the use of new technical architectures and analytics to enable insights that unlock new sources of business value.
McKinsey's definition of Big Data implies that organizations will need new data architectures and analytic sandboxes, new tools, new analytical methods, and an integration of multiple skills into the new role of the data scientist, which will be discussed in more detail later. The next figure highlights several sources of the Big Data deluge.
The rate of data creation is accelerating, driven by many of the sources highlighted in that figure.