World's servers process 9.57ZB of data a year

UC San Diego research report estimates that the average worker processes about 3TB of data annually

By Lucas Mearian, Computerworld
May 09, 2011 06:22 AM ET

Computerworld - Three years ago, the world's 27 million business servers processed 9.57 zettabytes, or 9,570,000,000,000,000,000,000 bytes of information.

Researchers at the School of International Relations and Pacific Studies and the San Diego Supercomputer Center at the University of California, San Diego, estimate that the total is equivalent to a 5.6-billion-mile-high stack of books stretching from Earth to Neptune and back to Earth, repeated about 20 times.

By 2024, business servers worldwide will annually process the digital equivalent of a stack of books extending more than 4.37 light-years to Alpha Centauri, according to a report compiled by the scientists.

The report, titled "How Much Information?: 2010 Report on Enterprise Server Information," was released at the SNW conference last month.

Roger Bohn, one of the report's co-authors, compared the world's business servers to the underwater portion of an iceberg "that runs the world that we see."

"Most of this information is incredibly transient: it is created, used and discarded in a few seconds without ever being seen by a person," said Bohn, a professor of technology management at UC San Diego.

The study included estimates of the amount of data processed as input and delivered by servers as output. For example, one email message may flow through multiple servers and would thus be counted multiple times, he said.

The workload of the 27 million or so enterprise servers in use worldwide in 2008 was estimated by using cost and performance benchmarks for online transaction processing, Web services and virtual machine processing tasks.

The scientists estimate there were 3.18 billion workers in the world's labor force at the time, each of whom received an average of 3TB of information per year.
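As a rough sanity check, the per-worker figure follows directly from the report's totals; a minimal sketch in Python, assuming decimal units (1ZB = 10^21 bytes, 1TB = 10^12 bytes), which is the convention the figures here appear to use:

    # Back-of-the-envelope check of the per-worker figure from the report's totals
    total_bytes = 9.57e21      # 9.57 zettabytes processed by business servers in 2008
    workers = 3.18e9           # estimated global labor force in 2008

    per_worker_tb = total_bytes / workers / 1e12   # convert bytes to terabytes
    print(f"{per_worker_tb:.1f} TB per worker per year")   # prints roughly 3.0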

The analysis relied heavily on data and estimates from researchers at IDC and Gartner, which compile regular reports on server sales.

As large as the numbers may seem, the three scientists who worked on the report stated that their server workload figures may be low because server industry sales figures don't fully account for the millions of servers built in-house from individual components by Google, Microsoft, Yahoo and other companies.

The report estimates that Google runs the largest installed base of servers -- more than a million -- in the world. It estimates that Microsoft has between 500,000 and 750,000 servers running worldwide.

"The exploding growth in stored collections of numbers, images and other data is well known, but mere data becomes more important when it is actively processed by servers as representing meaningful information delivered for an ever-increasing number of uses," said James Short, who served as research director of the project.

Short, a research scientist at UC San Diego, said that as the capacity of servers used to process the explosion of data increases, there are "unprecedented challenges and opportunities for corporate information officers."

For example, the study pointed to a sharp increase in the use of server virtualization technology beginning in 2006, as well as the more recent use of cloud computing systems, where server-processing power is provided as a centrally administered commodity doled out on a pay-as-needed basis.

Originally published on www.computerworld.com.
