While computing capacity has exploded in recent years, power consumption is growing more slowly thanks to greater energy efficiency. Credit: Google

A predicted explosion in power consumption by data centers has not materialized, thanks to advances in power efficiency and, ironically enough, the move to the cloud, according to a new report.

The study, published in the journal Science last week, notes that while global data-center energy consumption has risen over the past decade, the growth is negligible compared with the rise in workloads and deployed hardware over that time.

Data centers accounted for about 205 terawatt-hours of electricity usage in 2018, roughly 1% of all electricity consumption worldwide, according to the report. (That's well below the often-cited statistic that data centers consume 2% of the world's electricity.) The 205 terawatt-hours represent a 6% increase in total power consumption since 2010, but global data center compute instances rose by 550% over the same period. To drive the point home: considerably more compute is being deployed, yet the amount of power consumed is holding steady.

The paper cites a number of reasons for this. For starters, hardware power efficiency is vastly improved. The move to server virtualization has meant a six-fold increase in compute instances with only a 25% increase in server energy use. And a shift to faster, more energy-efficient port technologies has brought a 10-fold increase in data center IP traffic with only a modest increase in the energy use of network devices.

More interesting still, the report claims the rise of and migration to hyperscalers have helped curtail power consumption. Hyperscale and cloud data centers are generally more energy efficient than company-owned data centers because the incentive for efficiency is stronger: the less power Amazon, Microsoft, Google, etc., have to buy, the more their bottom line grows. Hyperscalers are also big on cheap, renewable energy, such as hydro and wind. So when a company trades its own old, inefficient data center for AWS or Google Cloud, it reduces the overall power draw of data centers as a whole.

"Total power consumption held steady as computing output has risen because of improved efficiency of both IT and infrastructure equipment, and a shift from corporate data centers to more efficient cloud data centers (especially hyperscale)," said Jonathan Koomey, a Stanford professor and one of the authors of the research, in an email to me. He has spent years researching data center power and is an authority on the subject.

"As always, the IT equipment progresses most quickly. In this article, we show that the peak output efficiency of computing doubled every 2.6 years after 2000. This doesn't include the reduced idle power factored into the changes for servers we document," he added.

Koomey notes that there is additional room for efficiency improvements to cover the next doubling of computing output over the next few years, but he is reluctant to make projections much further out. "We avoid projecting the future of IT because it changes so fast, and we are skeptical of those who think they can project IT electricity use 10-15 years hence," he said.
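Taken together, the report's figures imply a steep drop in energy used per unit of compute. The following is a minimal back-of-the-envelope sketch, using only the numbers cited above (the 2010 baseline is inferred from the reported 6% growth, and the 2010-2018 window is assumed for the doubling-time comparison); it is illustrative arithmetic, not a reproduction of the study's methodology:

```python
import math

# Figures as reported in the article (Masanet et al., Science, 2020).
ENERGY_GROWTH = 0.06      # ~6% rise in total data center electricity use since 2010
COMPUTE_GROWTH = 5.50     # compute instances rose by ~550% over the same period

energy_ratio = 1 + ENERGY_GROWTH     # 2018 energy use relative to 2010
compute_ratio = 1 + COMPUTE_GROWTH   # 2018 compute instances relative to 2010

# Implied energy per compute instance, 2018 vs. 2010
energy_per_instance = energy_ratio / compute_ratio
print(f"Energy per compute instance in 2018 is ~{energy_per_instance:.2f}x the 2010 level "
      f"(an ~{(1 - energy_per_instance) * 100:.0f}% reduction).")

# Koomey's figure: peak output efficiency doubling every 2.6 years.
DOUBLING_TIME_YEARS = 2.6
years = 2018 - 2010   # assumed comparison window for illustration
efficiency_gain = 2 ** (years / DOUBLING_TIME_YEARS)
print(f"A 2.6-year doubling time implies roughly a {efficiency_gain:.0f}x gain "
      f"in peak output efficiency over {years} years.")
```

Run as-is, the sketch shows energy per compute instance falling to roughly a sixth of its 2010 level, consistent with the article's point that compute has grown far faster than the electricity needed to power it.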