
Defining 'big data' depends on who's doing the defining

When does data become big? AWS, IBM and research firms each have their own definitions.

Network World
May 10, 2012 06:05 AM ET

Network World - Big data is an IT buzzword nowadays, but what does it really mean? When does data become big?

At a recent Big Data and High Performance Computing Summit in Boston hosted by Amazon Web Services (AWS), data scientist John Rauser offered a simple definition: any amount of data that's too big to be handled by one computer.

Some say that's too simplistic. Others say it's spot on.

"Big data has to be one of the most hyped technologies since, well the last most hyped technology, and when that happens, definition become muddled," says Jeffrey Breen of Atmosphere Research Group.

The lack of a standard definition points to the immaturity of the market, says Dan Vesset, program vice president of business analytics at research firm IDC. But he isn't quite buying the definition floated by AWS. "I'd like to see something that actually talks about data instead of the infrastructure needed to process it," he says.

Others agree with the AWS definition.

"It may not be all inclusive, but I think for the most part that's right," says Jeff Kelly, a big data analytics analyst at the Wikibon project. Part of the idea of big data is that it's so big that analyzing it needs to be spread across multiple workloads, hence AWS's definition. "When you're hitting the limits of your technology, that's when data gets big," Kelly says.

One of the most common definitions of big data uses three terms, all of which happen to start with the letter V: volume, velocity and variety. Many analyst firms, such as IDC, and companies, such as IBM, have coalesced around this definition. Volume means the massive amount of data organizations generate and collect; velocity refers to the speed at which the data must be analyzed; and variety means the vast array of data types collected, from text to audio, video, web logs and more.

But some are skeptical of that definition, too. Breen adds a fourth "v": vendor.

Companies such as AWS and IBM tailor definitions to support their products, Breen says. AWS, for example, offers a variety of big data analytics tools, such as Elastic MapReduce, a cloud-based big data processing service.
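Elastic MapReduce runs Hadoop jobs, and the shape of such a job is easy to show with Hadoop Streaming, which lets the mapper and reducer be ordinary scripts that read stdin and write stdout. The word count below is the stock textbook example, offered only as a sketch of the pattern, not as AWS-specific code. The mapper emits a pair for every word:

#!/usr/bin/env python
# mapper.py: emit a "word<TAB>1" pair for every word seen on stdin.
import sys

for line in sys.stdin:
    for word in line.split():
        print(word + "\t1")

And the matching reducer sums the counts:

#!/usr/bin/env python
# reducer.py: stdin arrives sorted by key, so counts for each word are
# contiguous and can be summed with a single running total.
import sys

current, total = None, 0
for line in sys.stdin:
    word, count = line.rstrip("\n").rsplit("\t", 1)
    if word != current and current is not None:
        print(current + "\t" + str(total))
        total = 0
    current = word
    total += int(count)
if current is not None:
    print(current + "\t" + str(total))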

"The cloud provides instant scalability and elasticity and lets you focus on analytics instead of infrastructure," Amazon spokesperson Tera Randall wrote in an e-mail. "It enhances your ability and capability to ask interesting questions about your data and get rapid, meaningful answers." Randall says Rauser's big data definition is not an official AWS definition of the term, but was being used to describe the challenges facing business management of big data.

Big data analytics in the cloud is still an emerging market, though, Kelly says. Google, for example, recently released BigQuery, the company's cloud-based data analytics tool. IBM, for its part, says information is "becoming the petroleum of the 21st century," fueling business decisions across a variety of industries.
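For a sense of what cloud-based data analytics looks like from the user's side, here is a minimal sketch of a BigQuery query issued from Python. It assumes Google's present-day google-cloud-bigquery client library, an authenticated project and one of Google's public sample tables; none of these details come from the article, and the service's original 2012 interface was a bare REST API.

from google.cloud import bigquery

client = bigquery.Client()  # picks up project and credentials from the environment
sql = """
    SELECT word, SUM(word_count) AS n
    FROM `bigquery-public-data.samples.shakespeare`
    GROUP BY word
    ORDER BY n DESC
    LIMIT 10
"""
# The heavy lifting happens in Google's infrastructure; only result rows come back.
for row in client.query(sql).result():
    print(row.word, row.n)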
