IDG News Service - Many attempts have been made over the last 46 years to rewrite Amdahl's law, a theory describing the performance limits of combining parallel and serial computing. One scientist hopes to prove that Amdahl's law can be surpassed, and that it doesn't apply in certain parallel computing models.
A presentation titled "Breaking the Law" at the International Supercomputing Conference this week in Leipzig, Germany, will show how "pitfalls of Amdahl's law can be avoided in specific situations," according to a blog entry previewing the presentation.
The presentation will "challenge Amdahl's generalized law by exposing it to a new class of experiments in parallel computing," wrote Thomas Lippert, director of the Jülich Supercomputing Centre in Jülich, Germany, in the blog entry. Lippert will lead the presentation.
Amdahl's law, established in 1967 by noted computer scientist Gene Amdahl when he was with IBM, provides an understanding of the scaling, limitations and economics of parallel computing based on certain models. The theory states that computational tasks can be decomposed into portions that are parallel, which helps execute tasks and solve problems more quickly. However, the speed of task execution is limited by the tasks -- in the case of computers, serial tasks -- that cannot be parallelized.
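The relationship the article describes can be stated compactly. A minimal sketch of the standard Amdahl's law formula, where p is the parallelizable fraction of the work and n is the number of processors (the function name and example values are illustrative, not from the article):

```python
def amdahl_speedup(p, n):
    """Speedup predicted by Amdahl's law.

    p: fraction of the work that can be parallelized (0 <= p <= 1)
    n: number of processors
    The serial fraction (1 - p) is never sped up, so it bounds the total gain.
    """
    return 1.0 / ((1.0 - p) + p / n)

# A task that is 95% parallelizable on 1,000 processors:
print(amdahl_speedup(0.95, 1000))  # roughly 19.6x, not 1000x
```

Note that as n grows without bound, the speedup approaches 1 / (1 - p): with a 5% serial portion, no number of processors can deliver more than a 20x speedup.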
"If you throw enough hardware at parallel you can solve a problem; you still have to do some in serial, which is a limiting factor in speeding up tasks," said Nathan Brookwood, principal analyst at Insight 64.
The mathematics of Amdahl's law imply a limit to parallel speedup, provided certain factors are held constant, such as the problem size and the nature of the processors doing the computation.
Amdahl's law has been challenged before. It was re-evaluated by John Gustafson, who offered a view of parallelism in which the size of a problem can be meaningfully increased along with the number of processors. The corollary, called Gustafson's law, assumes that problem size is not constant, so parallel computer speed can scale up accordingly. Gustafson now works at Advanced Micro Devices as senior fellow and chief graphics product architect.
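Gustafson's corollary can be sketched the same way. In its standard form, the scaled speedup on n processors is n - (n - 1) * s, where s is the serial fraction of the workload as measured on the parallel machine (the function name and example values below are illustrative):

```python
def gustafson_speedup(s, n):
    """Scaled speedup under Gustafson's law.

    s: serial fraction of the workload on the parallel system
    n: number of processors
    Because the problem grows with n, speedup scales nearly linearly.
    """
    return n - (n - 1) * s

# Same 5% serial fraction, 1,000 processors:
print(gustafson_speedup(0.05, 1000))  # about 950x scaled speedup
```

The contrast with the fixed-size Amdahl model (roughly 20x for the same serial fraction) is exactly the point of the corollary: when the problem is allowed to grow, the serial portion stops dominating.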
The mathematical equations resulting from Amdahl's law and its corollaries have become reference points as chip and software makers try to scale supercomputing performance. Such computing power is necessary to find scientific solutions in fields such as biotechnology and meteorology. Countries are also developing faster supercomputers for economic forecasting and national security reasons.
The ISC presentation grew out of a history of optimizing simple and efficient systems for simulations in high-performance computing, such as Blue Gene/L, which took over as the world's fastest computer in 2004, Lippert said in an email. Such systems are highly scalable and energy efficient, but their use has been restricted to problems that are well adapted to parallel processing.