Argonne's MIRA supercomputer tackles jet engines, Pringles and everything in between

SC13: Argonne Leadership Computing Facility boss shares what's running on world's fifth-fastest supercomputer.

Packing more than 786,000 processor cores and putting out more than 8.5 petaflops of sustained computing performance, Argonne National Laboratory's MIRA supercomputer is among the world's most powerful.

So what, exactly, are such supercomputers for? According to the director of the Argonne Leadership Computing Facility, Mike Papka, almost anything.

One project, Papka recalls, dealt with Pringles – though he notes this wasn't under the auspices of the ALCF.

“[Then-owner Procter & Gamble] wanted to use supercomputers to design the conveyor belts at Pringles, because they wanted the conveyor belt going as fast as it could possibly go – they don’t want the Pringles flying off the belt,” he recounts.

The ALCF, which is part of the U.S. Department of Energy, focuses on a comparatively small number of projects that demand outsized amounts of capacity, according to Papka. The INCITE program – which recently announced grants of computing time to 59 computational research projects – takes up about 60% of capacity at both Argonne and Oak Ridge National Laboratory, home to the even more powerful Titan supercomputer.

The new projects are averaging about 78 million CPU hours each, according to Papka. “That’s a pretty significant chunk of time,” he says, with some understatement.
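To put that figure in rough perspective, a back-of-envelope sketch (assuming the roughly 786,000 cores cited above, and reading “CPU hours” as core-hours) suggests an average award would occupy the entire machine for about four days:

```python
# Back-of-envelope sketch (assumptions: MIRA's core count rounded to 786,000,
# and "CPU hours" read as core-hours). Not an official ALCF calculation.
award_core_hours = 78_000_000   # average INCITE award cited by Papka
mira_cores = 786_000            # "more than 786,000 processor cores"

full_machine_hours = award_core_hours / mira_cores
print(f"~{full_machine_hours:.0f} machine-hours, "
      f"about {full_machine_hours / 24:.1f} days of the entire system per project")
# prints: ~99 machine-hours, about 4.1 days of the entire system per project
```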

The projects are, in a word, diverse, ranging from simulations of earthquakes and complex aeronautical phenomena to a dark energy project that Papka says will be the largest cosmology simulation ever performed.

Another will attempt to study the properties of concrete in depth, in hopes of creating a standardized reference material. One will simulate the complexities of managing a modern electrical grid powered by a wide range of alternative energy sources. Others will tackle lithium-air batteries, exascale computing, and supernovae.

Getting time on MIRA is an exacting process, requiring in-depth reviews of both the computer code that will be run and the project’s scientific merits. INCITE receives requests for about five times more computing resources than it can provide.

“You don’t just pick up your code that you’re running on your laptop and drop it on this machine and start working,” says Papka. “And then we hold a number of scientific review panels, where we bring in the best scientists in a given area.”

“These are not trivial resources. You want to be using them effectively,” he adds.

Plans are already afoot for the next generation of government supercomputers, according to Papka. By 2017 or 2018, he hopes that a new pair of systems – running at 200 petaflops each – will be operating at Argonne and Oak Ridge.

“We expect that, late this year, we will actually issue an RFP, a request for proposals, to the community,” he says. “So we’re actually spending a fair amount of time, because once we go into that RFP process, we’re not allowed to talk to the vendors … having a dialogue with them.”

Astrophysicists and snack-food makers, eat your hearts out.

Email Jon Gold at jgold@nww.com and follow him on Twitter at @NWWJonGold.
