PwC packs a punch

Innovative design includes thermal cooling, high-powered electrical infrastructure

ATLANTA -- When PricewaterhouseCoopers U.S. CIO Stuart Fulton walks through the company's spankin' new data center, opened this month, he finds "cool things around just about every corner."

"Particularly when compared to the existing data center, the contrast is quite amazing," Fulton says.

One immediately noticeable feature is the white thermoplastic olefin (TPO) roof topping the building and reducing heat entering the facility, he says. Made from a blend of polypropylene and ethylene-propylene rubber, TPO single-ply roof "membranes" combine the durability of rubber with the proven performance of hot-air weldable seams. Also visually impressive -- not to mention efficiency-smart -- are six fan wall units, each with 24 fans, that will keep cool air circulating around the IT gear, he adds.

Marching order

PwC, a professional services firm with U.S. headquarters in New York, set out to build a state-of-the-art data center almost three years ago. By January 2007 the company had already extended the life of its existing data center by three years with power, cooling and IT upgrades, but with little capacity left, the clock was ticking loudly.

John Regan, PwC's director of data center services, challenged his multidisciplinary team with this goal: "Build a Tier 3 data center that runs like a Tier 4 at the cost of a Tier 2."

Data center construction finished slightly ahead of schedule and a bit below budget, which PwC declined to specify. "Having the team work toward executing along those lines made them very sensitive to the overall cost while driving the best functionality into the environment. Ultimately, we ended up with a Tier 3-plus data center," Regan adds.

As classified by the Uptime Institute, Tier 2 data centers have a single, non-redundant distribution path serving the IT infrastructure, while Tier 3 facilities have multiple paths. Tier 4 data centers have multiple, independent, physically isolated systems, each having redundant capacity components and multiple distribution paths serving the IT racks.

PwC expects the data center to receive Gold certification for Leadership in Energy and Environmental Design from the U.S. Green Building Council. Largely because of the new data center, PwC already has earned designation as a top green IT organization for 2009 from Computerworld, a Network World sister publication.

With its certificate of occupancy in hand, the team anticipates beginning to load up the data center with IT gear late this year. Initially the facility will house more than 4,000 physical and virtual servers and support about 2.8 petabytes of storage -- enough to accommodate current and future needs, Fulton says. Dual carrier OC-192 -- 10Gbps SONET -- connections feed into the data center. The internal network fabric switching operates between 1Gbps and 40Gbps at the port level, and the storage-area network (SAN) backbone will operate at 4Gbps at the edge and 8Gbps at the core.

PwC will support all lines of business and back-office applications from the center, as well as the network and communications needs for its 31,000 U.S. employees. It is working on an application migration plan that will stretch over an 18- to 24-month window, Fulton says.

Apples to apples

Housed in an 80,000-square-foot building, the data center initially will comprise more than 20,000 square feet of raised floor, with more than 13,000 square feet of white space active on day one for servers and SANs. That contrasts with 9,000 square feet of usable space in the current environment, Regan says.

That extra 4,000 square feet is only the beginning of an apples-to-apples comparison. The dramatic difference comes in power density, he says.

When PwC built the existing data center in 1999, it did so for an IT power load of 40 watts per square foot. Over time, with the increased capacity and densification of servers, plus the addition of SANs, PwC upgraded the plant to accommodate a power load of 130W per square foot -- "where we've basically hit our facility limitations," Regan says.

The new data center solves that problem with its ability to carry a power load of at least 250W per square foot. That means UPS and electrical infrastructure sized to support the space, matched by the cooling capacity needed to remove the resulting heat load.

Commissioning exercises show the data center is capable of supporting even higher power densities. "We've been able to achieve over 400W per square foot in some instances in our testing," Regan says. "That's a pretty substantial power density capacity gain over our current environment."

In addition, PwC designed the data center in pods for ease of expandability. It will begin with two pods, with a third -- 6,500 square feet of space -- physically assembled and ready to be loaded and lit up quickly, Regan says. The team expects the third pod to come online in three to five years.

"When you total up all that space, we're at over 23,000 square feet of raised floor with future potential to reach over 35,000 square feet. So not only does the new data center double the capacity of the current environment, but in a sense with the power density factored in, it essentially quadruples it," Regan says.

Coolness factors

The team hopes to get 20 years out of this data center and has selected several technologies it expects will make those two decades cost-effective and efficient, Regan says.

For example, the team decided to drive higher voltage into the building as a way to lower its spending on copper infrastructure. It is using 575 volts for distribution, with 13,200 volts coming into the building. "I don't believe you'll see too many data centers in the U.S. with this type of electrical design," Regan says.

More typical is 480V electrical distribution, which costs a bit more in copper cabling: the higher the distribution voltage, the lower the current for a given load, so thinner copper conductors can be used.
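The copper savings follow from basic power arithmetic: for the same load, a higher distribution voltage means lower current, and conductor size is driven by current. A minimal sketch of that relationship is below; the 100kW branch load and the power factor are hypothetical values for illustration, not figures from PwC:

```python
import math

def line_current_amps(load_kw: float, volts: float, power_factor: float = 0.95) -> float:
    """Approximate line current for a balanced three-phase load."""
    return load_kw * 1000.0 / (math.sqrt(3) * volts * power_factor)

load_kw = 100.0  # hypothetical branch load, for illustration only
for volts in (480, 575):
    amps = line_current_amps(load_kw, volts)
    print(f"{volts}V distribution: ~{amps:.0f} A per phase")

# ~127 A at 480V vs ~106 A at 575V for the same load: roughly 17% less
# current, which is what lets the higher-voltage design use smaller copper
# conductors for the same power delivered.
```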

In addition, PwC is using thermal storage, 192,000 gallons worth of chilled water, in the data center. "This would help us ride through any potential glitches within the cooling infrastructure. It'll give us an hour to two hours worth of free cooling," Regan says.
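As a rough illustration of how that ride-through figure can be estimated, the sketch below converts the stored chilled-water volume into cooling energy and divides by an assumed IT heat load. The 192,000-gallon figure is from the article; the supply/return temperature difference and the heat load are illustrative assumptions only:

```python
# Rough ride-through estimate for a chilled-water thermal storage tank.
# The 192,000-gallon volume is from the article; the delta-T and IT load
# are illustrative assumptions, not PwC's actual design numbers.

GALLON_LB = 8.34      # pounds of water per gallon
BTU_PER_KWH = 3412.0

def ride_through_hours(gallons: float, delta_t_f: float, load_kw: float) -> float:
    """Hours of cooling available from stored chilled water.

    Energy = mass x specific heat (1 BTU/lb-F for water) x delta-T.
    """
    btu = gallons * GALLON_LB * 1.0 * delta_t_f
    return (btu / BTU_PER_KWH) / load_kw

# Assume a 12F supply/return delta and a 3,250 kW heat load (the day-one
# capacity from the earlier back-of-the-envelope sketch).
print(f"~{ride_through_hours(192_000, 12, 3_250):.1f} hours")  # about 1.7 hours
```

With those assumed numbers the result lands in the same "hour to two hours" range Regan describes.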

PwC's data center team also has taken innovative approaches with the chilled water plant, allowing for water temperatures of 55 to 60 degrees Fahrenheit. When coupled with water-side economizers, PwC will be able to achieve significant free cooling, Regan says. It also uses hot-aisle containment to maximize airflow efficiency. "Overall we expect more free cooling in the environment and that will reduce operating costs for us," Regan says.
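One way to see why the warmer chilled-water setpoint matters: a water-side economizer can carry the cooling load whenever the outdoor wet-bulb temperature, plus the cooling-tower and heat-exchanger approach, stays below the chilled-water setpoint, so a higher setpoint means more qualifying hours. The sketch below counts those hours from a list of hourly wet-bulb readings; the 55-degree setpoint is from the article, while the approach temperature and the weather data are placeholders, not site data:

```python
def free_cooling_hours(wet_bulb_temps_f, chw_setpoint_f=55.0, approach_f=7.0):
    """Count hours where a water-side economizer alone can meet the load.

    wet_bulb_temps_f: iterable of hourly outdoor wet-bulb readings (deg F).
    approach_f: combined cooling-tower + heat-exchanger approach (assumed).
    """
    return sum(1 for wb in wet_bulb_temps_f if wb + approach_f <= chw_setpoint_f)

# Placeholder data: a real analysis would use a full year of hourly weather
# readings for the site rather than this repeated one-week sample.
sample_week = [38, 41, 45, 52, 57, 60, 48] * 24
print(free_cooling_hours(sample_week), "of", len(sample_week), "hours on economizer")
```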

"At the end of the day," Regan adds, "there are many important piece parts but it's the culmination of all these different elements being brought together and what the team actually achieved that so impresses me about this data center."

Schultz is an IT writer in Chicago. She can be reached at bschultz5824@gmail.com.

Learn more about this topic

What does a real green data center look like?

How to build your next data center

10 ways to make your data center more efficient
