The U.S. is falling behind Japan in supercomputing because federal research agencies have shifted their focus toward grid computing over the past decade, witnesses told a congressional hearing Wednesday.
As a result, U.S. companies have less access to supercomputing resources, since government demand has traditionally driven the American supercomputing industry, critics of federal high-performance computing efforts told the House Science Committee.
"The federal government cannot rely on fundamental economic forces to advance high-performance computing capability," said Vincent Scarafino, manager of numerically intensive computing at Ford Motor. "The federal government should help with the advancement of high-end processor design and other fundamental components necessary to develop well-balanced, highly capable machines. U.S. leadership is currently at risk."
The Science Committee hearing on the status of supercomputing in the U.S. turned into an argument over the relative merits of expensive stand-alone supercomputers vs. networked computing grids made up of cheaper commodity computers. Supercomputing, or high-performance computing, is the use of high-end computers on scientific, industrial, national defense, and other computing-intensive applications.
Grid, or parallel, computing, which harnesses many networked computers to perform a single task, has been useful to Ford for experiments such as vehicle safety analysis, Scarafino said, but it cannot handle every kind of analysis. Stand-alone supercomputers are still needed for work such as occupant injury analysis, he added.
Worried about the launch of Japan's Earth Simulator in March 2002, members of the U.S. House Science Committee called for more cooperation between federal agencies and a renewed U.S. government push for high-performance computing. The Earth Simulator is ranked as the world's fastest supercomputer, although the next five fastest supercomputers are in the U.S., according to the Top500.org supercomputer list.
"Supercomputers help design our cars, predict our weather, and deepen our understanding of the natural forces that govern our lives," said Rep. Sherwood Boehlert (R-N.Y.), chairman of the committee. "So when we hear that the U.S. may be losing its lead in supercomputing, that Japan now has the fastest supercomputer, that the U.S. may be returning to a time when our top scientists didn't have access to the best machines, that our government may have too fragmented a supercomputing policy -- well, those issues are a red flag that should capture the attention of all of us."
But the launch of the Japanese Earth Simulator isn't entirely bad news for the U.S., said Raymond Orbach, director of the Office of Science at the Department of Energy. Instead, it shows that scientific research computations faster than 25 teraflops at sustained speeds are possible, he said.
"We think that the range of 25 to 50 teraflops opens up a whole new set of opportunities for scientists that have never been realized before," Orbach said. "So what we have, thanks to the Japanese now, is an existence proof."
Boehlert questioned whether the U.S. National Science Foundation (NSF) still has supercomputing as a priority after the agency has seemed to move toward supporting grid computing. "Lethargy is setting in, and I'm getting concerned," he said. "I don't want to be second to anybody."
Supercomputing remains a priority at NSF, said Peter Freeman, assistant director for computer and information science and engineering at the NSF. But an NSF Advisory Panel on Cyberinfrastructure, in a report released in February, recommended the agency also focus on other technologies, including grid computing. Supercomputers need to be integrated into computing grids, he said.
Freeman called supercomputers essential, but he said they should be one piece of a cyber infrastructure that includes networks and databases. Without those other pieces, supercomputers "will not deliver their potential," he said.
Freeman attempted to answer the question posed by the committee: Is the U.S. government on the right path with supercomputing? "My answer is yes, if we keep in perspective that supercomputers must be embedded in a cyber infrastructure that also includes massive storage, high-performance networks, databases, lots of software, well-trained people, and that the entire ensemble be open to all scientists and engineers," he said.
However, Scarafino and Daniel Reed, director of the National Center for Supercomputing Applications at the University of Illinois at Urbana-Champaign, seemed to disagree. Reed called on the NSF to continue its role in pushing supercomputing research, a role that began in the 1980s but began to wane in the mid-1990s, according to witnesses.
"Grids, which were pioneered by the high-end computing community, are not a substitute for high-end computing," Reed said. "Many problems of national importance can only be solved by tightly coupled high-end computers."
Reed asked that Congress spend more money on supercomputing research and development. "I know that everyone who comes before you pleads poverty and asks for more money," he told the committee. "But there is a clear need for additional resources."
However, Freeman noted that federal agency budgets for supercomputing R&D have increased significantly in the last 10 years. U.S. government funding of supercomputing went from $421.9 million in fiscal year 1992 to a high of $906.7 million in 2001. Supercomputing R&D requests for federal agencies in fiscal year 2003 are $846.5 million, up from $788.9 million in 2002, according to the Science Committee.
Funding is expected to continue to increase at the NSF, Freeman added. When Boehlert asked if the NSF was optimistic about future budget increases, Freeman answered, "That is certainly our intention."
Asked how federal agencies can better work with each other to promote supercomputing, Freeman and Orbach said agencies are already working together frequently.
But Reed questioned whether the U.S. has the commitment to the kind of long-term, sustained supercomputing funding that produced the Japanese machine, rather than shifting money toward other technologies such as grid computing. "They sustained an investment in machines," he said. "We went in a different direction, which yielded some real benefits, but we stopped going in a direction that still had significant opportunities."