Researchers race to produce 3D models of BP oil spill

NSF approves supercomputing time as researchers apply storm surge models to oil spread

The National Science Foundation has allocated 1 million hours on the Ranger supercomputer to help scientists forecast the effects of the massive BP oil spill on the Gulf Coast.

As the oil from BP's massive spill affects an ever-expanding area, scientists have embarked on a crash effort to use one of the world's largest supercomputers to forecast, in 3D, how the Gulf of Mexico oil spill will affect coastal areas.

Acting within 24 hours of receiving the request, the National Science Foundation late last week made an emergency allocation of 1 million compute hours on a supercomputer at the Texas Advanced Computing Center at the University of Texas to study how the spreading oil from BP's gusher will affect coastlines.

The goal of this effort is to produce models that can forecast how the oil may spread in environmentally sensitive areas by showing in detail what happens when it interacts with marshes, vegetation and currents.

What may be as important are models that forecast what might happen if a hurricane carries the oil miles inland, researchers said in interviews.

The computer model they are working on "has the potential to advise and undergird many emergency management decisions that may be made along the way, particularly if a hurricane comes through the area," said Rick Luettich, a professor of marine sciences and head of the Institute of Marine Sciences at the University of North Carolina in Chapel Hill, and one of the researchers on this project.

The computer models now being used to track the oil's spread aren't finely tuned enough to show just what happens as the oil nears the coastline, Luettich said.

"I don't think that they have any idea how this oil is predicted to move through the marshes and the nearshore zone," said Luettich.

The scientists aren't starting from scratch. They are using storm models developed after Katrina and other storms and adding oil to the calculations. Creating such complex simulations involves a massive amount of data, and processing it takes a powerful system.

The Texas supercomputer, called Ranger, is the ninth most powerful supercomputer in the world, according to the Top 500, a twice-a-year ranking maintained by an international group of supercomputing scientists.

A compute hour equals one hour of processing time on a single CPU core, so a laptop with one core delivers one compute hour per hour. Ranger has about 63,000 compute cores and is capable of speeds of 579 TFLOPS (one teraflop equals one trillion floating-point operations per second). The system is used primarily for academic research; the $59 million awarded for it in 2008 also covered building the system and four years of support.
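A rough back-of-envelope calculation, sketched below in Python, shows what an allocation that size means in practice. The numbers come from the figures above; treating the job as running across the entire machine at once is an illustrative assumption, not how the allocation will actually be scheduled.

# Illustrative only: how long would 1 million compute hours last if a
# job somehow ran across all of Ranger's cores at once?
ALLOCATION_HOURS = 1_000_000   # NSF emergency allocation, in compute hours
RANGER_CORES = 63_000          # approximate core count cited for Ranger

# One compute hour = one core running for one hour, so a full-machine
# job would burn about 63,000 compute hours per wall-clock hour.
wall_clock_hours = ALLOCATION_HOURS / RANGER_CORES
print(f"Full-machine wall-clock time: ~{wall_clock_hours:.1f} hours")
# -> roughly 16 hours of exclusive use of the whole system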

The National Science Foundation is funding the project.

Whether 1 million compute hours will be enough for the project remains to be seen. Katrina research may have used as many as 20 million compute hours.

The model being used is called ADCIRC, the Advanced Circulation Model for Oceanic, Coastal and Estuarine Waters. What this storm model can do "is actually track the oil spill into the marshes and the wetlands" because of its fine scale of resolution, said Clint Dawson, a professor of aerospace engineering and engineering mechanics at the University of Texas, and one of the researchers.

The models now tracking the oil spill have resolutions of 500 meters to a kilometer, but the researchers' model works at a resolution of 40 to 50 meters, which is fine enough to show, for instance, simulations of currents moving up channels, said Dawson.
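To get a sense of why that jump in resolution is so demanding, the sketch below compares cell counts for uniform square grids at the two quoted resolutions. It is illustrative only: ADCIRC actually uses an unstructured mesh, and the 10 km x 10 km patch is a hypothetical area chosen for the example.

# Illustrative only: compare cell counts for uniform square grids at the
# coarse (500 m) and fine (50 m) resolutions quoted in the article.
AREA_KM2 = 100.0  # hypothetical 10 km x 10 km coastal patch

def cells_for_resolution(area_km2: float, cell_size_m: float) -> int:
    """Number of square cells of the given edge length covering the area."""
    cell_area_km2 = (cell_size_m / 1000.0) ** 2
    return round(area_km2 / cell_area_km2)

coarse = cells_for_resolution(AREA_KM2, 500.0)  # existing spill trackers
fine = cells_for_resolution(AREA_KM2, 50.0)     # the researchers' model
print(f"500 m grid: {coarse:,} cells; 50 m grid: {fine:,} cells "
      f"({fine // coarse}x more)")
# -> 400 vs 40,000 cells: 100x more cells (and far more compute) in 2D,
#    before adding the vertical layers a 3D model needs

The 100x figure is only the horizontal cost; resolving depth layers for 3D modeling multiplies it further.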

The 3D modeling can show what happens to the oil at various depths and how it travels as it comes in contact with underwater surfaces such as vegetation, as well as tides and other conditions "that are invisible to the 2D model," said Gordon Wells, the program manager for space research at the University of Texas and a science and technology advisor for state emergency management, who is part of the project.

"The hope, and I'm being optimistic is that it would you give you a much more accurate forecast of a potential impact by geography and potentially by what kind of impact is going to occur," said Wells. The 2D models "haven't done very well to date," he said.

Patrick Thibodeau covers SaaS and enterprise applications, outsourcing, government IT policies, data centers and IT workforce issues for Computerworld. Follow Patrick on Twitter at @DCgov, or subscribe to Patrick's RSS feed. His e-mail address is pthibodeau@computerworld.com.


This story, "Researchers race to produce 3D models of BP oil spill" was originally published by Computerworld.
