Computerworld - As oil from BP's massive Gulf of Mexico spill affects an ever-expanding area, scientists have embarked on a crash effort to use one of the world's largest supercomputers to forecast, in 3D, how the spill will affect coastal areas.
The National Science Foundation late last week made an emergency allocation of 1 million compute hours - acting within 24 hours of receiving the request - on a supercomputer at the Texas Advanced Computing Center at the University of Texas to study how oil spreading from BP's gusher will affect coastlines.
The goal of this effort is to produce models that can forecast how the oil may spread in environmentally sensitive areas by showing in detail what happens when it interacts with marshes, vegetation and currents.
What may be just as important are models that forecast what might happen if a hurricane carries the oil miles inland, researchers said in interviews.
The computer model they are working on "has the potential to advise and undergird many emergency management decisions that may be made along the way, particularly if a hurricane comes through the area," said Rick Luettich, a professor of marine sciences and head of the Institute of Marine Sciences at the University of North Carolina in Chapel Hill, and one of the researchers on this project.
The computer models now being used to track the oil's spread aren't finely tuned enough to show what happens as the oil nears the coastline, Luettich said.
"I don't think that they have any idea how this oil is predicted to move through the marshes and the nearshore zone," said Luettich.
The scientists aren't starting from scratch. They are using storm models developed after Katrina and other hurricanes, and adding oil to the calculations. Creating such complex simulations involves a massive amount of data, and it takes a powerful system to process it.
The Texas supercomputer, called Ranger, is the ninth most powerful supercomputer in the world, according to the Top 500, a twice-a-year ranking maintained by an international group of supercomputing scientists.
A compute hour equals one hour of processing on a single CPU core, such as a one-core laptop. Ranger has about 63,000 compute cores and is capable of 579 TFLOPS (one teraflop is one trillion floating-point operations per second). The system is used primarily for academic research and cost $59 million in 2008, a figure that also covered building the system and four years of support.
The National Science Foundation is funding the project.
Whether one million compute hours will be enough for the project remains to be seen. Katrina research may have used as many as 20 million compute hours.
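To put those allocation figures in perspective, here is a back-of-the-envelope sketch using the numbers reported above (1 million compute hours, roughly 63,000 cores, and the up-to-20-million-hour Katrina estimate). This is illustrative arithmetic only, and it idealizes by assuming every core is busy for the full run:

```python
# Rough conversion of compute-hour allocations into wall-clock time,
# assuming (unrealistically) full utilization of all of Ranger's cores.

RANGER_CORES = 63_000          # approximate core count cited in the article

def wall_clock_hours(compute_hours: float, cores: int = RANGER_CORES) -> float:
    """One compute hour = one core running for one hour, so dividing an
    allocation by the core count gives the ideal wall-clock time."""
    return compute_hours / cores

spill_allocation = wall_clock_hours(1_000_000)    # NSF emergency allocation
katrina_estimate = wall_clock_hours(20_000_000)   # upper-end Katrina figure

print(f"Oil-spill allocation: about {spill_allocation:.1f} hours")      # ~15.9 hours
print(f"Katrina-scale usage:  about {katrina_estimate / 24:.1f} days")  # ~13.2 days
```

In other words, the emergency grant buys well under a day of the full machine, which helps explain the researchers' doubt that 1 million hours will be enough.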
The model being used is called ADCIRC, or the Advanced Circulation Model for Oceanic, Coastal and Estuarine Waters. What this storm model can do "is actually track the oil spill into the marshes and the wetlands" because of its fine scale of resolution, said Clint Dawson, a professor of aerospace engineering and engineering mechanics at the University of Texas, and one of the researchers.
Originally published on www.computerworld.com.