Stanford uses million-core supercomputer to predict supersonic jet noise

Researchers used the Sequoia IBM Blue Gene/Q system at Lawrence Livermore National Laboratory to run complex simulations that work out the physics of jet noise

Stanford researchers said this week they had used a supercomputer with more than one million computing cores to predict the noise generated by a supersonic jet engine.

The researchers used the 1,572,864-core Sequoia IBM Blue Gene/Q system at Lawrence Livermore National Laboratory to run complex simulations that capture the physics of jet noise, measurements that are often impossible to make in the harsh exhaust environment of massive and powerful jet engines.


"The exhausts of high-performance aircraft at takeoff and landing are among the most powerful human-made sources of noise. For ground crews, even for those wearing the most advanced hearing protection available, this creates an acoustically hazardous environment. To the communities surrounding airports, such noise is a major annoyance and a drag on property values.  Understandably, engineers are keen to design new and better aircraft engines that are quieter than their predecessors. New nozzle shapes, for instance, can reduce jet noise at its source, resulting in quieter aircraft," Stanford stated.

The researchers noted that with the advent of massive supercomputers boasting hundreds of thousands of computing cores, engineers have been able to model jet engines and the noise they produce with accuracy and speed. Such fluid dynamics simulations test all aspects of a supercomputer: the waves propagating through the simulation require a carefully orchestrated balance between computation, memory and communication. Supercomputers like Sequoia divvy up the complex math into smaller parts so they can be computed simultaneously. The more cores you have, the faster and more complex the calculations can be, the researchers said.
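To get a feel for what that divvying-up looks like, the sketch below splits a toy one-dimensional wave problem into chunks the way a parallel solver splits its grid across cores, with each chunk borrowing a "ghost" point from its neighbor much as real codes exchange halo data over the network. It is a minimal, serial Python illustration of the general technique, not the Stanford/LLNL solver; the grid size, chunk count, and update scheme are assumptions chosen for clarity.

```python
import numpy as np

# Toy 1-D wave propagation split across "cores" (here, plain Python chunks).
# Production solvers typically do this with MPI across many nodes; this serial
# sketch only illustrates the decomposition idea, not the Stanford code.

N_POINTS = 400   # total grid points in the whole domain (assumed)
N_PARTS = 4      # pretend each part runs on its own core (assumed)
C = 0.4          # CFL-like number for the simple upwind update

# Initial condition: a Gaussian pressure pulse on a periodic domain.
x = np.linspace(0.0, 1.0, N_POINTS, endpoint=False)
u = np.exp(-300.0 * (x - 0.5) ** 2)

# Split the domain into contiguous chunks, one per "core".
chunks = np.array_split(u, N_PARTS)

for step in range(200):
    new_chunks = []
    for rank, local in enumerate(chunks):
        # Each "core" needs one ghost point from its left neighbor; on a real
        # machine this is a halo exchange, i.e. communication between cores.
        ghost = chunks[(rank - 1) % N_PARTS][-1]
        padded = np.concatenate(([ghost], local))
        # Local computation: first-order upwind advection update.
        new_chunks.append(local - C * (padded[1:] - padded[:-1]))
    chunks = new_chunks

u = np.concatenate(chunks)
print("pulse peak after 200 steps:", float(u.max()))
```

The point of the exercise is the structure, not the physics: every step mixes local arithmetic with neighbor communication, which is why adding cores speeds things up only as long as that exchange stays cheap relative to the computation.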

"And yet, despite the additional computing horsepower, the difficulty of the calculations only becomes more challenging with more cores. At the one-million-core level, previously innocuous parts of the computer code can suddenly become bottlenecks," the researchers stated.

