
The coming datacosm: Big data, big business and the future of enterprise computing

By John Schroeder, CEO, MapR Technologies Inc., special to Network World
January 31, 2013 10:02 AM ET

Network World - This vendor-written tech primer has been edited by Network World to eliminate product promotion, but readers should note it will likely favor the submitter's approach.

The challenges and promise of big data are front and center for CIOs and other business leaders. Early big data applications have delivered significant returns and offered a glimpse of how the technology can be used to disrupt an organization's competitive landscape.

These business leaders are beginning to recognize that by better leveraging a variety of data sources -- everything from transactional data to trading data, genomics, smart meter output and sensor data -- they can dramatically change their organization's market position and profitability.

For example, telecommunications companies are looking to exploit big data to deliver targeted content to set-top boxes, tailor rate plans and bundles, improve quality of service and offload expensive data warehouses. What's interesting about these examples is that, while big data is being used for new applications with new data, it can also be used to transform existing applications and use cases.


One of the drivers for big data solutions is the need to harness fast-growing data. In the past, a telecom billing application dealt with one phone per customer and something in the neighborhood of 100 calls a month. Today, telecom billing applications must deal with an explosion of devices, voice, text and data plans.

In his 1990 book, "Microcosm: The Quantum Revolution in Economics and Technology," George Gilder wrote about how technology was changing business, economics and even the very nature of how markets function through the introduction of ever smaller, more powerful yet affordable, microchips and computing systems.

Gilder followed up with the book "Telecosm," which discusses how the telecommunications revolution, including broadband, would connect people, computers and businesses in entirely new ways. Each of these waves predicted how exploiting an increasingly abundant and "free" resource (microprocessors in "Microcosm" and bandwidth in "Telecosm") would be an engine of change and wealth creation.

We are now on the cusp of a third wave -- a "Datacosm" -- that will enable organizations to exploit abundant and fast-growing data. To be ready to leverage big data, enterprises need the best and latest tools in order to analyze and exploit all of their unstructured and structured data. Hadoop, a framework that enables the analysis of vast amounts of structured and unstructured data on a cluster of commodity servers, has emerged as the most important technology for the Datacosm.

When talking about big data, we are really talking about new architectures -- in essence, a paradigm shift that's needed to process all that information. In the Datacosm, it doesn't make sense to store data separately from the processing. Rather, users of big data need to simplify how they handle and scale data. So what's required is an architecture that can scale linearly and easily. That's really what Hadoop provides.
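That architecture rests on the MapReduce model: records are mapped into key/value pairs, grouped by key, then reduced independently per key, which is why adding nodes scales the work roughly linearly. The sketch below is a hedged, pure-Python illustration of that model only, not the actual Hadoop API; the function names and the sample records are invented for the example.

```python
from collections import defaultdict

def map_phase(records):
    # Map: emit (key, value) pairs from each raw record.
    # In Hadoop, mappers run on the nodes that already hold the data,
    # so processing moves to the data rather than the other way around.
    for line in records:
        for word in line.split():
            yield (word.lower(), 1)

def shuffle(pairs):
    # Shuffle: group values by key. Hadoop performs this grouping
    # across the cluster between the map and reduce stages.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values independently -- since keys
    # don't depend on each other, reducers can run in parallel.
    return {key: sum(values) for key, values in groups.items()}

records = ["big data big business", "big returns"]
counts = reduce_phase(shuffle(map_phase(records)))
print(counts)  # each word mapped to its total count
```

Because each stage only needs local data plus a grouping step, the same program runs unchanged whether the "cluster" is one laptop or a thousand commodity servers.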
