Clemson IT team embraces call to be entrepreneurial

Network World
August 15, 2011 06:07 AM ET

Network World - Five years ago, Clemson University named James Bottum chief information officer and gave him a mandate to overhaul the school's IT infrastructure and build out a high-performance computing environment. The goal: catapult the school into the ranks of leading research universities and help attract faculty and students.

Mission accomplished. The South Carolina school is now among the top five non-federally funded university supercomputing sites. Just as important, the environment Bottum helped create is driving creative funding efforts, everything from attracting partners that want to use the high-performance computing (HPC) system to selling commercial software to winning new grants that benefit both the school and IT.

BACKGROUND: HPC experts look past petaflop to the exascale

"Last year the Clemson president told us our best years of public sector funding from the state were most likely behind us because of the financial crisis, and we needed to rethink our business model," Bottum says. "The encouragement was to become entrepreneurial."

Fortunately, many of the changes Bottum's team made positioned Clemson well for the new normal. The university has seen 180% growth in revenue from external sources, which helps supplement the school's IT budget, and a 250% increase in federal grants, part of which helps offset IT costs.

"The main goal is to continue to run and support a robust set of services and infrastructure for Clemson University," Bottum says, "but do it in a way where we can grow and leverage what we're doing and create a stronger set of infrastructure and services that also contributes to the state economic development."

Bottum has unique qualifications that are helping him get it all done. He spent more than 20 years in the research sector, including a stint at the National Science Foundation and 15 years at the National Center for Supercomputing Applications, and he has spent the last 10 years as a CIO, at Purdue before coming to Clemson.

James Bottum

Bottum's team at Clemson has a lot of recent achievements to be proud of, but it also gets to investigate leading-edge technology, everything from the huge HPC grid to new OpenFlow tools to the school's own Orange File System. It's a rich environment.
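
OpenFlow's core idea is that switches forward packets according to match-action rules installed by a central controller, rather than logic baked into the switch firmware. The toy Python sketch below illustrates only that match-action model; it is not a real controller API, and the simplified rule fields (ingress port, destination MAC) are illustrative assumptions.

    # Toy illustration of OpenFlow's match-action model.
    # Not a real controller API; fields and actions are simplified assumptions.
    # A flow table maps packet header fields to forwarding actions; in a real
    # deployment, a controller installs these rules on switches over OpenFlow.

    # Hypothetical flow table: (ingress port, destination MAC) -> action
    flow_table = {
        (1, "aa:bb:cc:dd:ee:01"): ("output", 2),  # forward out port 2
        (2, "aa:bb:cc:dd:ee:02"): ("output", 1),  # forward out port 1
    }

    def handle_packet(in_port: int, dst_mac: str) -> tuple:
        """Match a packet against the flow table; misses go to the controller."""
        action = flow_table.get((in_port, dst_mac))
        if action is None:
            # In OpenFlow, a table miss is punted to the controller, which
            # decides what to do and may install a new rule on the switch.
            return ("send_to_controller",)
        return action

    print(handle_packet(1, "aa:bb:cc:dd:ee:01"))  # ('output', 2)
    print(handle_packet(3, "aa:bb:cc:dd:ee:99"))  # ('send_to_controller',)

The appeal for a campus like Clemson's is that forwarding behavior becomes programmable from one place instead of being configured switch by switch.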

Early goings

When Bottum (pictured at right) arrived at Clemson, the school had 48 IT groups, each of which had its own servers and storage and many of which ran their own networks.

"I saw a departmental IT person in a room with fans blowing on a server," he says. "All of the high-performance computing was in a little data center in the engineering science college. They had about six or seven clusters but didn't have enough juice to power them all up at the same time. It was a real belt and suspenders kind of operation, a cluster in the closet model."

A couple of other surprises: The university was buying commodity 100Mbps Internet service at a much-inflated price from local telecom companies, and the school had a large data center 10 miles off campus with expansion potential to 30,000 square feet. The former meant the university could make a big leap forward by joining Internet2, and the latter was going to make it easier to aggregate the IT operations and modernize.

While the initial funding for the overhaul came from the school itself, the new HPC capabilities attracted outside money along the way, and Clemson won many grants, including an NSF Research Infrastructure Improvement Award.

MORE ON NETWORK RESEARCH: Follow our Alpha Doggs blog

Job one was rehabbing the data center in the Information Technology Center and aggregating most of the IT groups and resources. The building was more than 20 years old and was upgraded in two phases.

"We had 7,000 or 8,000 square feet of space, half a megawatt, and 20-something-year-old power and air conditioning when I got here," says CTO Jim Pepin, who came over from the University of Southern California (USC). "We went up to 2 megawatts and filled that up in less than two years as we consolidated operations and started to build our HPC cluster."

From left to right in front of the HPC cluster: Jay Harris, director of operations; Boyd Wilson, executive director of computing, systems and operations; Mike Cannon (front), data storage architect; Jim Pepin (back), CTO; Lanae Neild, HPC administrator; Becky Ligon, file system developer. (Photo by Zac Wilson)

The first phase ended in December 2007, and in the second phase, which was completed in December 2010, the data center space was built out to 16,000 square feet and split between two environments, one for enterprise gear -- everything from email and student systems to a mainframe to support the state's Medicaid system -- and the other for the HPC system, a 1,629-node Linux cluster. "So now we have two physically separate rooms with different air conditioning profiles and 4.5 megawatts," Pepin says.
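
To give a sense of the kind of job a 1,629-node Linux cluster hosts, here is a minimal sketch of a parallel sum in Python using MPI, assuming the mpi4py library; the article doesn't say what software Clemson's researchers actually run, so treat this as illustrative only.

    # Minimal MPI sketch of the kind of parallel job an HPC cluster runs.
    # Assumes the mpi4py library; illustrative only -- the article does not
    # specify what software runs on Clemson's cluster.
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank = comm.Get_rank()   # this process's ID within the job
    size = comm.Get_size()   # total number of processes in the job

    # Each process sums a strided slice of the range; rank 0 gathers the total.
    partial = sum(range(rank, 1_000_000, size))
    total = comm.reduce(partial, op=MPI.SUM, root=0)

    if rank == 0:
        print(f"Sum across {size} processes: {total}")

Launched with, say, mpirun -n 64 python sum.py, the same script scales from a handful of cores to thousands, which is what makes commodity Linux clusters attractive for research computing.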
