8 radical ways to cut data center power costs


But 2009 brought a new line of data center equipment that converts 13,000 VAC utility power directly to 575 VDC (volts of direct current), which can then be distributed directly to racks, where a final step-down converter takes it to 48 VDC for consumption by the servers in the rack. Each conversion is roughly twice as efficient as older AC transformer technology and emits far less heat. Although vendors claim as much as a 50 percent savings when electrical and cooling reductions are combined, most experts say 25 percent is a more credible number.
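A quick back-of-the-envelope calculation shows why fewer, better conversion stages matter: per-stage efficiencies multiply. The Python sketch below compares a conventional AC chain against the DC chain described above; the individual stage efficiencies are illustrative assumptions, not vendor specifications.

# Cascaded power-conversion efficiency: per-stage efficiencies multiply.
# All stage efficiencies below are illustrative assumptions.

def chain_efficiency(stages):
    eff = 1.0
    for stage in stages:
        eff *= stage
    return eff

# Conventional AC path: transformer, double-conversion UPS, PDU transformer, server PSU
ac_chain = [0.97, 0.88, 0.97, 0.85]
# DC path: 13,000 VAC -> 575 VDC rectifier, rack-level 575 -> 48 VDC converter, DC server PSU
dc_chain = [0.97, 0.96, 0.92]

print(f"AC chain efficiency: {chain_efficiency(ac_chain):.0%}")  # ~70%
print(f"DC chain efficiency: {chain_efficiency(dc_chain):.0%}")  # ~86%

The gap widens further once you count the cooling power no longer needed to carry away conversion heat.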

This radical approach does require some expenditure on new technology, but the technologies involved are not complex and have been demonstrated to be reliable. One potential hidden cost is the heavier copper cabling required for 48 VDC distribution. As Joule's law dictates, lower voltages require heavier conductors to carry the same power as higher voltages, because the current, and therefore the resistive loss, is higher. Another cost factor is the steep voltage drop low-voltage DC incurs over distance (about 20 percent per 100 feet), which is why the 48 VDC conversion is done in the rack rather than back at the utility power closet.
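The arithmetic behind that placement decision is simple Ohm's-law territory. In the sketch below, the rack load and cable resistance are assumed values chosen only to illustrate the trend; real conductor sizing belongs to an electrician.

# Why 48 VDC runs only the last few feet: I = P/V, and loss = I^2 * R.
# The load and cable resistance are assumed, illustrative values.

P_WATTS = 10_000    # one rack's load (assumed)
R_OHMS = 0.02       # round-trip resistance of a 100-ft copper run (assumed)

for volts in (575, 48):
    amps = P_WATTS / volts
    loss_watts = amps ** 2 * R_OHMS
    drop_pct = amps * R_OHMS / volts * 100
    print(f"{volts:3d} VDC: {amps:6.1f} A, {loss_watts:6.1f} W lost, "
          f"{drop_pct:4.1f}% drop per 100 ft")

Even with these gentle assumptions, the 48 VDC run dissipates more than a hundred times the power of the 575 VDC run, so the low-voltage segment is kept as short as possible.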

Of course, converting to direct current requires servers that can accommodate 48 VDC power supplies. For some, converting to DC is a simple power supply swap. Chassis-based servers, such as blade servers, may be cheaper to convert because many servers share a single power supply. Google used the low-tech expedient of strapping a 12-volt battery to each server in place of a centralized UPS (uninterruptible power supply), claiming 99 percent efficiency versus a traditional AC-powered UPS infrastructure.

If you're planning a server upgrade, you might want to consider larger systems that can be powered directly from 575 VDC, such as IBM's Power 750, which recently demolished human competitors as Watson on the "Jeopardy" game show. Brand-new construction enjoys the advantage of starting with a clean sheet of paper, as Syracuse University did when building out a data center last year, powering IBM Z and Power mainframes with 575 VDC.

Radical energy savings method 7: Bury heat in the earth

In warmer regions, free cooling may not be practical all year long. Iowa, for example, has moderate winters but blistering summers, with air temperatures in the 90- to 100-degree range, unsuitable for air-side economization.

But dig down a few feet and the ground holds a steady, relatively low temperature year-round. The subsurface earth is also insulated from outdoor weather conditions, such as rain or heat waves, that can overload traditional cooling equipment. By running pipes into the earth, water carrying server-generated heat can be circulated to depths where the surrounding ground carries the heat away by conduction.
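How much heat such a loop can move follows from the standard relation Q = flow x specific heat x temperature difference. A minimal sketch, with the flow rate and loop temperature drop assumed purely for illustration:

# Heat carried by a geothermal water loop: Q = mass_flow * c_p * delta_T.
# Flow rate and temperature drop are assumed for illustration.

C_P_WATER = 4186        # J/(kg*K), specific heat of water
flow_kg_per_s = 5.0     # about 80 gal/min (assumed)
delta_t_k = 8.0         # temperature drop across the buried loop (assumed)

heat_kw = flow_kg_per_s * C_P_WATER * delta_t_k / 1000
print(f"Loop rejects about {heat_kw:.0f} kW")   # ~167 kW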

Again, the technology is not rocket science, but geothermal cooling does require a fair amount of pipe. A successful geothermal installation also requires careful advance analysis. Because a data center generates heat continuously, pumping that heat into a single earth sink could lead to local saturation and a loss of cooling. An analysis of ground capabilities near the data center will determine how much a given area can absorb, whether heat-transfer assistance from underground aquifers will improve heat dissipation, and what, if any, environmental impacts might ensue.
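That saturation risk can be bounded with a first-pass energy balance before commissioning any professional study: soil absorbs only so many joules per degree of allowable warming. Every number in this sketch is an assumption, and it deliberately ignores conduction and groundwater flow, which is precisely what the professional analysis quantifies.

# Crude ground-saturation check: joules the soil can absorb vs. a constant load.
# Every value here is assumed; heat conducted away from the field is ignored.

SOIL_HEAT_CAPACITY = 2.0e6   # J/(m^3*K), typical for moist soil
soil_volume_m3 = 50_000      # earth volume swept by the bore field (assumed)
allowed_rise_k = 5.0         # tolerable local temperature rise (assumed)
load_watts = 150_000         # continuous heat rejected (assumed)

joules_available = SOIL_HEAT_CAPACITY * soil_volume_m3 * allowed_rise_k
days = joules_available / load_watts / 86_400
print(f"About {days:.0f} days before conduction alone must carry the load")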

Speaking of Iowa, the ACT college testing nonprofit deployed a geothermal heat sink for its Iowa City data center. Another Midwestern company, Prairie Bunkers near Hastings, Neb., is pursuing geothermal cooling for its Data Center Park facility, converting several 5,000-square-foot ammo bunkers into self-contained data centers.

Radical energy savings method 8: Move heat to the sea via pipes

Unlike geothermal heat sinks, the ocean is effectively an infinite heat sink for data center purposes. The trick is being near one, but that is more likely than you might think: any sufficiently large body of water, such as the Great Lakes between the United States and Canada, can serve as a coolant reservoir.

The ultimate seawater cooling scenario is a data center island, which could use the ocean in the immediate area to cool the data center using sea-to-freshwater heat exchangers. The idea is so good that Google patented it back in 2007. Google's approach falls far afield of the objectives in this article, however, since the first step is to either acquire or construct an island.

But the idea isn't so farfetched if you're already located reasonably close to an ocean shore, large lake, or inland waterway. Nuclear plants have used sea and lake water cooling for decades. As reported in Computer Sweden (Google's English translation) last fall, Google took this approach for its Hamina, Finland, data center, a converted paper pulp mill. Using chilly Baltic Sea water as the sole means to cool its new mega data center, as well as to supply water for emergency fire protection, demonstrates a high degree of trust in the reliability of the approach. The pulp mill has an existing water inlet from the Baltic, with two-foot-diameter piping, reducing the project's implementation costs.

Freshwater lakes have been used successfully to cool data centers. Cornell University's Ithaca, N.Y., campus uses water from nearby 2.5-trillion-gallon Cayuga Lake to cool not just its data centers but the entire campus. The first-of-its-kind cooling facility, called Lake Source Cooling and built in 2000, pumps 35,000 gallons per hour, distributing water at 39 degrees Fahrenheit to campus buildings located 2.5 miles away.
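The published figures make it easy to estimate how much heat the system can remove, using the same flow x specific heat x temperature rise relation. The supply temperature comes from the article; the return-water temperature below is an assumption.

# Cooling capacity implied by the published flow rate and supply temperature.
# The return-water temperature is an assumed value.

C_P_WATER = 4186                  # J/(kg*K)
KG_PER_GALLON = 3.785             # mass of a US gallon of water

flow_kg_per_s = 35_000 * KG_PER_GALLON / 3600   # gal/hour -> kg/s (~36.8)
supply_f, return_f = 39.0, 55.0                 # deg F; return temp assumed
delta_t_k = (return_f - supply_f) * 5 / 9       # ~8.9 K

megawatts = flow_kg_per_s * C_P_WATER * delta_t_k / 1e6
print(f"Roughly {megawatts:.1f} MW of heat removal")   # ~1.4 MW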

Both salt- and freshwater cooling systems require one somewhat expensive component: a heat exchanger to isolate natural water from the water used to directly chill the data center. This isolation is necessary to protect both the environment and sensitive server gear, should a leak occur in the system. Beyond this one expensive component, however, sea (and lake) water cooling requires nothing more complex than ordinary water pipe.
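Sizing that exchanger is a textbook exercise: the required plate area is the heat load divided by the product of the overall heat-transfer coefficient and the log-mean temperature difference. The temperatures, heat load, and U value below are all assumptions for illustration.

import math

# Plate heat exchanger sizing: area = Q / (U * LMTD).
# The heat load, temperatures, and U value are assumed for illustration.

Q_WATTS = 1_000_000   # heat moved to the natural-water side (assumed)
U = 4000              # W/(m^2*K), typical water-to-water plate exchanger (assumed)

# Counterflow temperatures in deg C (assumed):
# data center loop 18 -> 10, seawater 6 -> 14
dt_hot_end = 18 - 14
dt_cold_end = 10 - 6
if dt_hot_end == dt_cold_end:
    lmtd = dt_hot_end
else:
    lmtd = (dt_hot_end - dt_cold_end) / math.log(dt_hot_end / dt_cold_end)

print(f"About {Q_WATTS / (U * lmtd):.0f} square meters of plate area")  # ~63 m^2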

How much money do you want to save?

The value of these techniques is that none are mutually exclusive: You can mix and match cost-saving measures to meet your short-term budget and long-term objectives. You can start with the simple expedient of raising the data center temperatures, then assess the value of other techniques in light of the savings you achieve with that first step.



This story, "8 radical ways to cut data center power costs" was originally published by InfoWorld.

Copyright © 2011 IDG Communications, Inc.
