by Network World Staff and IDG News Service

Microsoft, others eye supercomputing

Nov 21, 2005

Increased processing power coupled with lower-cost hardware is making supercomputing a more viable platform for corporate buyers running data-intensive business applications.

That was a major theme at the 18th annual Supercomputing conference held last week at the Washington State Convention and Trade Center in Seattle as vendors combined forces in an effort to spread supercomputing beyond its academic and research roots. Leading the charge was Microsoft, which has had a presence at the high-performance computing conference since 2003, but stepped up its profile this year with a keynote address by Bill Gates (transcript of Gates’s address).

In addition, the software giant used the conference as a launch pad for its formal entry into a market dominated by Linux.

“The [big] message is that Microsoft is entering scientific computing, technical computing, and we will bring all the attention and commitment that we have given to business and consumer computing in the past to this new area where computation is poised to make a significant impact,” said Kyril Faenov, director of high-performance computing at Microsoft.

As part of that commitment, Microsoft is readying an operating system tuned for high-performance computing: Windows Compute Cluster Server 2003. The software is in its second beta release and should be generally available in the first half of next year, the company said.

Microsoft also said it has set up 10 high-performance computing institutes in universities around the world to support research and drive the company’s efforts to meet the software demand for computing-intensive environments.

At the same time, vendors such as HP and InfiniBand switch-maker Voltaire announced support for Microsoft’s cluster efforts. Platform Computing, which makes resource-management and orchestration software for high-performance computing clusters, said it was partnering with Microsoft to integrate its work scheduling Platform LSF software into Windows Compute Cluster Server 2003.

Industry experts note that Microsoft has a tough battle ahead as it tries to break into a market that has been dominated by more specialized, higher-end software and systems, such as IBM’s Blue Gene, a commercial version of the supercomputer ranked as the fastest in the world.

Platform Computing also announced an expanded relationship with IBM to integrate Platform LSF into Blue Gene.

“This is the very high end of supercomputing vs. the Cluster Compute Server from Microsoft for the entry level,” says Songnian Zhou, Platform Computing’s CEO. “These are the bookends of the entire technology spectrum of the [high-performance computing] market, and this general trend [of integrating the management of HPC resources] is expanding into the data center for enterprise IT.”

Supercomputers are appearing in enterprise data centers as the technology moves away from specialized, proprietary systems to clusters of low-end machines based on standard processors from Advanced Micro Devices (AMD) and Intel running Linux, industry observers say.

Even supercomputing giant Cray is moving to a standard platform, announcing at the Supercomputing conference that it was extending its partnership with AMD and would build its next-generation supercomputers on the Opteron platform. Sun also used the show to unveil upgraded Opteron-based Sun Fire servers designed for high-performance computing clusters.

“More industries are using high-performance computing for their data-driven tasks. One reason is that cluster computers have become more reasonably priced,” said Kathryn Kelley, communications chair for the conference. “The hardware, software and algorithms have become easier to use.”

As a result, the Supercomputing conference is getting a closer look from corporate IT professionals, and not just those in the energy, finance, manufacturing and pharmaceutical industries.

“The truth of it is, there are parallel problems that companies like Google face that, as I understand it, are as challenging as the scientific problems that we are trying to solve with high-end scientific computing,” said Richard Loft, deputy director of the scientific computing division at the National Center for Atmospheric Research in Boulder, Colo.

He said Gates’ keynote address is one example of the expansion of supercomputing beyond niche deployments into broader enterprise use. “Microsoft has been notable by its absence in the high-performance computing game,” he said. “The fact that [Gates] is giving a keynote signals something of a shift. It represents the broadening of the definition of high-performance computing from being simply about scientific calculations to including a lot of large-scale computing associated with business.”

It’s no surprise that attendance at the Supercomputing conference is growing, Kelley said. She estimates the show drew more than 9,000 attendees this year, compared with fewer than 8,000 last year. Nearly 270 exhibitors, including EMC, HP and Microsoft, crowded the two-level show floor.

“I personally think it’s healthy for the scientific computing community and what you would call the large-scale business-computing community to swap ideas,” Loft said. “There are likely to be a lot of ways we could end up cross-fertilizing one another.”