Secrets of successful data centers

Security by obscurity

You don't see flashing neon signs on today's data centers. The goal is to keep as low a profile as possible.

Security and biometrics go hand in hand

For example, at Navisite's data center in Andover, Mass., everyone entering the data center must swipe a smart card and pass a sophisticated palm reader.

Here comes the sun

At Emerson's new data center in St. Louis, a 7,800-square-foot rooftop solar array can generate 100 kilowatts of power.
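
For context, that works out to roughly 13 watts of generating capacity per square foot of roof. A quick back-of-the-envelope check, using only the figures quoted above:

    # Back-of-the-envelope: power density of the rooftop array,
    # using only the figures quoted in the article.
    array_area_sqft = 7_800   # rooftop solar array, square feet
    peak_output_kw = 100      # rated output, kilowatts

    watts_per_sqft = peak_output_kw * 1_000 / array_area_sqft
    print(f"{watts_per_sqft:.1f} W of capacity per square foot")  # ~12.8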

Brrrrrrrring it on.

At Thomson Reuters' new data center in Eagan, Minn., ambient-air cooling is used between 3,300 and 3,500 hours a year, or roughly 140 days.
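
The day count follows directly from those hours. A quick sanity check:

    # Convert the quoted free-cooling hours into equivalent days per year.
    low_hours, high_hours = 3_300, 3_500
    print(low_hours / 24, high_hours / 24)  # 137.5 to ~145.8, i.e. roughly 140 days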

Hold the HOH

If a fire breaks out in a data center, a traditional sprinkler system would put it out, but it would also put the company out of commission by destroying the servers, storage equipment and the data stored on them. A waterless suppression system fights the fire with a clean-agent gas instead, leaving the equipment intact.

Hang 'em high

If you're blowing cold air up from under the floor, think about putting all of the cables overhead in the ceiling. That way the cables don't interfere with the airflow under the floor.

Up on the roof

Putting heat exchangers on the roof lets data center managers save on the underground copper piping that typically connects the air conditioners to heat exchangers installed at ground level near the building.

Cool chips

The source of all that unwanted heat, after all, is the CPU, so if you want to tackle the problem at the source, look for the latest, more energy-efficient chips, such as those built on Intel's Nehalem microarchitecture.

Attack the rack

Sticking to the theory that attacking server-generated heat closest to the source is the most efficient approach, IBM has designed a product it calls the Rear-Door Heat eXchanger. This four-inch-wide, liquid-cooled device fits on the back of a standard server rack and passively removes an estimated 55% of the heat generated by a fully loaded rack.
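
To put that 55% in perspective, here is a quick calculation under an assumed rack load; the 30-kilowatt figure is an illustrative assumption, not a number from the article:

    # Illustrative only: assume a fully loaded rack dissipating 30 kW.
    rack_load_kw = 30.0
    removed_kw = 0.55 * rack_load_kw          # share the rear-door exchanger removes passively
    remaining_kw = rack_load_kw - removed_kw  # left for the room-level cooling to handle
    print(f"removed: {removed_kw:.1f} kW, remaining: {remaining_kw:.1f} kW")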

Map it

You can't develop a comprehensive plan to reduce data center energy costs without first analyzing where your hot spots and cold spots are today.
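
A minimal sketch of what that analysis can look like, assuming you already collect temperature readings tagged with floor-grid coordinates; the readings, grid and thresholds below are illustrative assumptions:

    # Label each floor tile hot, cold or OK from (row, column, temperature in C) readings.
    # The sample readings and the 20/27 C thresholds are illustrative assumptions.
    readings = [(0, 0, 21.5), (0, 1, 27.3), (1, 0, 19.8), (1, 1, 29.1)]

    HOT_C, COLD_C = 27.0, 20.0
    for row, col, temp_c in readings:
        if temp_c >= HOT_C:
            label = "HOT SPOT"
        elif temp_c <= COLD_C:
            label = "COLD SPOT"
        else:
            label = "ok"
        print(f"tile ({row},{col}): {temp_c:.1f} C  {label}")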

Sensor overload

The key to setting up and operating a successful data center is continually monitoring the temperature at both the ceiling and rack levels. For example, IBM has deployed 100 sensors in a 2,000-square-foot data center, with all of that data fed into an automated monitoring system.
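
That density works out to roughly one sensor per 20 square feet. Below is a stripped-down sketch of the kind of automated check such a monitoring system runs; the sensor names, readings and temperature band are illustrative assumptions:

    # Flag any sensor whose latest reading falls outside the allowed band.
    # Sensor names, readings and the 18-27 C band are illustrative assumptions.
    latest_readings = {"ceiling-07": 29.4, "rack-12-inlet": 23.1, "rack-30-inlet": 17.2}

    LOW_C, HIGH_C = 18.0, 27.0
    for sensor, temp_c in latest_readings.items():
        if not LOW_C <= temp_c <= HIGH_C:
            print(f"ALERT {sensor}: {temp_c:.1f} C is outside {LOW_C}-{HIGH_C} C")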

Blowing hot and cold

One of the core concepts in today's data center is the hot aisle/cold aisle architecture. Cold air is pumped up from under the floor into the front of the servers, and hot air is vented out the back. The hot air rises into a venting/air-conditioning system that cools it and recirculates it back under the floor. But data centers are now getting extremely granular, using variable-speed fans linked to sophisticated sensor networks to dynamically adjust the cold-air flow based on CPU usage.
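
A bare-bones illustration of that kind of granular control, using a simple linear rule that maps an aisle's peak CPU utilization to a fan speed; the utilization figures, speed limits and the rule itself are illustrative assumptions, not a vendor algorithm:

    # Map each cold aisle's peak CPU utilization to a variable-speed fan setting.
    # Utilization figures, speed limits and the linear rule are illustrative assumptions.
    aisle_peak_cpu = {"aisle-A": 0.35, "aisle-B": 0.80, "aisle-C": 0.10}

    MIN_SPEED, MAX_SPEED = 0.30, 1.00  # fraction of full fan speed
    for aisle, load in aisle_peak_cpu.items():
        speed = MIN_SPEED + (MAX_SPEED - MIN_SPEED) * load
        print(f"{aisle}: CPU at {load:.0%} -> fan at {speed:.0%}")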

Redundant WAN links

If your WAN links go down, your data center is kaput. The trick is to go with multiple ISPs and to run redundant physical WAN links.
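
A toy sketch of the reachability check such a setup depends on; the far-end addresses (placeholder documentation addresses here) and the plain TCP probe are illustrative assumptions, and production networks typically lean on routing protocols or dedicated link monitors instead:

    # Probe the far end of each WAN link and report which links answer.
    # The addresses below are placeholder documentation addresses, not real endpoints.
    import socket

    links = {"isp-a": "198.51.100.1", "isp-b": "203.0.113.1"}

    for name, far_end in links.items():
        try:
            with socket.create_connection((far_end, 443), timeout=3):
                print(f"{name}: up")
        except OSError:
            print(f"{name}: DOWN - shift traffic to the surviving link")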

Back it up

We know that things can go wrong, so it's critical to have multiple backup systems in place. That means offsite storage; it means battery backup; it means backup generators.

Get virtual

The underlying trend in today's data center is server consolidation, driven by virtualization technology. Companies are drastically reducing the number of data centers they operate and the number of physical servers inside them. On the hardware side, blade servers are letting companies squeeze more computing power into less space.