Ch-Ch-Chatting with the South Pole's IT manager

From the start, Henry Malmgren was determined to get to the South Pole. After graduating from Texas Tech University in 1998 with a degree in MIS, he applied for a job in the Antarctic every year until NSF contractor Raytheon finally hired him as a network engineer in 2001. Since then, he has alternated between the Denver headquarters and the Amundsen-Scott South Pole Station, spending two summers and two winters there before working his way up to IT manager. Staying over is a commitment: Once the winter starts, there's no way in or out of the base until summer begins eight to nine months later. "I thought I would just do this for a single season, but somehow it always seemed too easy to keep coming back," he says.

Ok, I have to ask: How's the weather down there?

Right now it's pretty nice. We're at a nice -64 degrees Fahrenheit with only about a 6.3-knot wind, so it's what we would consider a really good day.

Whatever possessed you to work in Antarctica?

I had never traveled out of the U.S. at all until after I'd graduated from college. I had dated a girl who had been an exchange student in Europe, and after hearing all of her stories, I knew I wanted to get a job outside the U.S. Somehow, the Antarctic job popped up on a job search, and I latched onto it as the perfect ticket to travel for a while.

Is there really an official South Pole?

Absolutely. It looks like a barber pole. It's right outside our front door.

If you're on an ice sheet that moves about 30 feet a year, how do you know that the pole is really in the right spot?

We have a ceremony every New Year's Day where we relocate the actual pole marker. Some years the United States Geological Survey comes out and does an actual measurement with the sun, and other times we just use a surveyor-quality GPS to get a pretty close approximation. We actually put up a new marker, so you can see the line of previous markers stretching out away from the station.

The pole gets closer to the station every year. In about 20 years, it'll be right under our power plant.

What is your role at the station?

My domain includes everything from the IT infrastructure to the satellites to the telephone systems and handheld radios. Anything telecom- or computer-related, I'm ultimately responsible for. I have a staff of about seven supporting about 250 to 270 people during the summer season, which is where we are right now. During the winter, which runs from about mid-February to mid-October, we drop down to just four [staff] supporting 60 to 70 people.

What is a typical work day like for you?

We work nine hours a day, minimum, six days a week. I spend the first couple of hours answering e-mails from the Denver folks. [Then] I go out and talk to the scientists to see if they're getting what they need and if there are any problems we can solve. I love the fact that, although I'm on the management side, I get to stick my hands in the technology a whole lot. I very much feel like a player-coach, because I have five years' worth of experience with these systems while most of my employees are here for the first time.

What takes up most of your time?

Information security has really come to the forefront in our priorities. Right now keeping up with security vulnerabilities and patches and things like that is taking a good third of our time. That's a change from even two years ago.

What is your data center like?

We have a brand new station that we just completed [in 2005], and we have a full-on data center with raised floors and everything you would have anywhere else in the world. We have about 30 servers. We also have what we call the RF [radio frequency] building, which is about a kilometer away from the main station, and we have a backup emergency data center out there where we keep extra file servers and a SAN. That's where the satellite dishes are. If anything happens to the primary data center, we can switch our operations out there.

Do you work with an IT counterpart back in the States?

There's a whole team I can call on back in Denver, and we depend on them quite a bit. If we have problems we can't solve down here, we rely on our Denver staff to provide troubleshooting.

What's a little known fact about life at the South Pole Station?

People would be surprised to know how well we eat down here. We have a little greenhouse. We have enough greenhouse production with hydroponics to get us a salad every couple of days.

Is it tough to hire staff to work there?

We usually have a high number of applicants -- it took me four years of applying before I was hired. But there are times when we just don't have enough candidates. It fluctuates with the state of the economy. Right now I'm in a huge competition for satellite communications guys with contractors in Iraq and Afghanistan. Those guys are able to pay so much more than we can. Of course, the advantage down here is that nobody is shooting at you.

What technical challenges do you face?

Our biggest challenge is bandwidth. We only have it 12 hours a day, at anywhere from T-1 (1.544 Mbit/sec) to 3 Mbit/sec speeds. We also have a transponder that we can use to send 60 Mbit/sec unidirectionally from the pole to the real world. We use that to upload scientific data. Our record was 94 Gbytes out in one day.

We have three different satellites we use to provide our Internet, and all of them are pretty ancient. We have a weather satellite, an old maritime communications satellite and an old NASA satellite, the first of its series, launched back in 1981. The others were launched in 1976 or 1977.

Basically, we're scavenging whatever we can find, and we can only see each satellite for 3 to 4 hours a day. Other than that, we're almost a typical network. We use Cisco gear, we've got land lines to all of the bedrooms, and we've got fiber optic distributed throughout the building, so if fiber to the desktop ever becomes a reality, the building is prewired for it. We're trying to be as future-proof as possible.
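A quick sanity check shows how those figures fit together (a rough sketch in Python; the constants are the interview's own numbers, and decimal gigabytes are assumed):

    # How long does the record 94-Gbyte day take at 60 Mbit/sec?
    record_bits = 94 * 10**9 * 8   # 94 Gbytes, decimal units assumed
    link_bps = 60 * 10**6          # 60 Mbit/sec science transponder
    hours = record_bits / link_bps / 3600
    print(f"{hours:.1f} hours at full rate")  # ~3.5 hours

At roughly 3.5 hours of full-rate transmission, that record day fits neatly inside the 3-to-4-hour visibility window of a single satellite pass.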

What is the most interesting project you've worked on lately?

In the past year we put up a really cool system where we're using the Iridium satellite network. We have 12 modems multiplexed together and have a total of 28.8Kbit/sec connectivity 24 x 7. Nobody thought it would work. Nobody ever thought we would have 24 x 7 connectivity at the South Pole. Now that's our last resort: When our broadband satellites are down, we switch to the Iridium system automatically.
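The arithmetic is simply 12 modems at Iridium's 2,400 bit/sec each. Below is a minimal sketch of the kind of automatic failover he describes, in Python; the probe address, link names, Linux-style ping flags and polling interval are all hypothetical, and the station's actual mechanism is not documented here:

    import subprocess
    import time

    # 12 Iridium modems x 2,400 bit/sec each = 28,800 bit/sec aggregate
    IRIDIUM_BPS = 12 * 2400
    print(f"Iridium backup capacity: {IRIDIUM_BPS} bit/sec")

    PROBE_HOST = "192.0.2.1"  # hypothetical far end of the broadband link

    def broadband_up() -> bool:
        """Probe the broadband link with a single ping."""
        result = subprocess.run(
            ["ping", "-c", "1", "-W", "5", PROBE_HOST],
            capture_output=True,
        )
        return result.returncode == 0

    def set_default_route(link: str) -> None:
        """Placeholder: repoint the default route at the given link."""
        print(f"switching default route to {link}")

    while True:
        set_default_route("broadband" if broadband_up() else "iridium")
        time.sleep(60)  # re-evaluate once a minute

A production version would add hysteresis -- requiring several consecutive failed probes before switching -- so a single dropped ping doesn't flap the route.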

What happens if the satellite link goes down on your end? Do you draw straws to see who goes out to the RF building to wiggle the antenna?

Everyone here has had the experience of walking out to the remote data center in -100-degree temperatures when it's pitch black outside. That's part of the adventure of coming down here -- these extreme situations you're faced with. If you can handle swapping out a router at -100 degrees here, you can handle it anywhere.

What scientific projects do you support?

There are several really big science projects here, and they all generate huge volumes of data. Our 800-pound gorilla right now is the South Pole Telescope, a 10-meter-diameter radio telescope that scans the cosmic microwave background. They're looking for things like dark matter, and it generates tons of data that they want to send back to the States as quickly as possible.

If we didn't have [satellite broadband] they would have to store the data for the nine-month winter. This way they can see the results much quicker. They can also analyze any issues with the telescope and correct them while the winter observing season is going on, rather than having to wait a full year.

How do you support the researchers and scientists?

Generally they provide their own equipment and we provide the back end. But when something breaks we do step forward and help them get it repaired.

Many scientists -- and I can't say that I blame them -- don't want to trust their data to anyone else. Our support is really just providing the communications they need, providing the technical expertise.

Do the conditions affect reliability and uptime at all?

It is incredibly dry [so] static electricity is a huge problem for us. We lose more laptops and hard drives to static electricity than anything else. Our biggest failures are things like power supplies and hard drives. We're at an altitude of 12,000 feet, and with the thin air here, cooling fans don't push a lot of air. Anything heat-related tends to need a lot of extra TLC.

Hard drives also have a problem with the high altitude. Most hard drive heads float on a cushion of air above the platter. We have fewer air molecules for hard drives to float on so we have more hard drive crashes than anywhere else.
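To put a rough number on that, here is an illustrative sketch using the standard isothermal barometric formula, which is only an approximation; the scale height is a textbook value, not a measurement from the station:

    import math

    SCALE_HEIGHT_M = 8_400        # approximate atmospheric scale height
    altitude_m = 12_000 * 0.3048  # 12,000 ft in meters

    pressure_ratio = math.exp(-altitude_m / SCALE_HEIGHT_M)
    print(f"air pressure is ~{pressure_ratio:.0%} of sea level")  # ~65%

Roughly a third fewer air molecules under the flying head goes a long way toward explaining the elevated crash rate.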

What happens when something breaks?

Getting service is a little tough. We try to maintain at least a year's worth of spare parts.

What's the most fun --and least fun -- part of your job?

The most fun is working with the scientists. Everybody down here -- from the dishwashers to the scientists to the construction guys -- has an interesting back story. Nobody here is your average person.

The worst part is being away from your friends and family. I've done two full years down here, and being apart, especially for the holidays, can be really tough sometimes.

What unique management issues do you face?

We have to order and plan and do everything a year and a half in advance to get supplies on time. We get supplies by airplane, but those come through McMurdo Station, [which] is supplied once a year by a cargo vessel.

It's also a physically demanding job. A lot of people who come down here immediately get sick from the altitude.

Other than the cold and the environment and the lack of fresh food and vegetables, sometimes it almost feels like I could be somewhere in the real world. But then I look out the window and see the South Pole in my front yard and realize that I have one of the best jobs in the world.

In many data centers in the U.S. heat density is becoming a problem. Surely that's not an issue for you?

You would think that at the South Pole cooling wouldn't be a problem, but with the amount of heat we generate [in the data center], getting rid of it actually can be quite an issue. We try to pipe some of that heat to other parts of the building to recover it. The data center in the old station just had a hole cut in the wall with a fan [to the outside] to cool the systems. Sometimes you'd be sitting there in a parka trying to get something done.

How do you address disaster recovery?

That's a big deal. Fire is a huge danger in the Antarctic because it is so dry and because liquid water is such a hot commodity. We try to make sure that all of our data is backed up independently and we run as many systems as possible in parallel in the two locations.

What happens if there is a fire?

We have a wet sprinkler system. That's pretty pioneering for the South Pole. The building is designed to be modular and is broken into sections. If the main station ever goes out, there's a survival pod with emergency communications, an emergency generator and an emergency kitchen that we can retreat to if necessary. And we do have a backup data center about a kilometer away from the station that allows the data to survive.

In the summer, the plan is that if there is a fire, we'll get people out of here. In the winter, we have to survive for four or five months, depending on how long it takes to get a plane in.

Do you work with any promising new technologies?

This is not the place to deploy emerging technologies. When you're on the cutting edge you require a lot of support and down here support is hard to get. It's just not the place to do it.

What have you learned from your time at South Pole Station that might help you in future endeavors?

If nothing else, I would say diplomacy. You're working with a customer base that's everything from PhD scientists to a guy who could be an ironworker, who has never been here before and just wants to know how to turn on a computer and e-mail his family. It's such an interesting experience to learn how to educate different kinds of people.

What's the most outrageous experience you had?

We have this tradition called the 300 Club. When the temperature drops below -100, we hike the sauna up to 200 degrees and stay in there as long as we can stand it. Then we run outside, naked, around the geographic pole and back inside, so we get that total 300-degree change in temperature. That happens every year, and it's absolutely amazing. Just the feel of that cold on your skin is like nothing else. People always wonder if you can feel the difference between 60 below and 100 below, and the answer is absolutely.

Doesn't your skin freeze in those temperatures?

That's why you spend as much time as possible in the sauna. The trick is to pace yourself. You can't run, because if you do you're going to get frostbite in your lungs. But you don't want to walk too slowly or you will lose all of your body heat and get frostbite in various sensitive spots. There's definitely a happy medium of a fast walk.

But even at that, when everyone comes back inside, with all of the hacking and coughing going on, the place sounds like a tuberculosis ward for the next couple of days.

With all the talk about global warming these days I have to ask: Is your real estate down there shrinking?
