Microsoft faces 64-bit question

News
Oct 07, 2002
Enterprise Applications

New Datacenter release to offer improved performance over 32-bit version.

REDMOND, WASH. – Microsoft’s first operating system designed for the corporate data center has garnered sparse acceptance since its release two years ago, but the software is on the verge of a performance upgrade that experts say could put it on the road to corporate recognition.

Early next year, Microsoft plans to release Windows .Net Server 2003 Datacenter, a 64-bit operating system that will offer more addressable memory for high-end transaction processing and up to 25% better performance than the 32-bit version.
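The arithmetic behind the 64-bit pitch is simple: a flat 32-bit address space tops out at 4GB, while a 64-bit address space is billions of times larger, which is what lets a database keep far more of its working set in memory. The short Python sketch below is illustrative only; the function name and the printed figures are ours, not Microsoft's.

# Illustrative sketch: why 64-bit addressing matters for in-memory workloads.
# An n-bit flat address space can reference at most 2**n bytes.

def addressable_bytes(pointer_bits: int) -> int:
    """Maximum number of bytes a flat address space of this width can reference."""
    return 2 ** pointer_bits

GIB = 2 ** 30  # bytes per gibibyte

print(f"32-bit: {addressable_bytes(32) // GIB:,} GiB")   # 4 GiB
print(f"64-bit: {addressable_bytes(64) // GIB:,} GiB")   # 17,179,869,184 GiB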

The 32-bit version of Datacenter was intended to be Microsoft’s answer to 64-bit Unix systems and mainframes in corporate data centers. It’s sold as a hardware and software combination through OEMs, such as Hewlett-Packard, IBM, NEC or Unisys, which guarantee the stability of the package through rigorous testing and certification.

“The 64-bit version will be a very solid step forward for them,” says Tony Iams, senior analyst with D.H. Brown Associates. “Until they have the full 64-bit stack – processor, operating system, applications – it will be a challenge for them to match Unix systems or the mainframe.”

A 64-bit version of SQL Server will closely follow the .Net Datacenter server release.

However, the 64-bit operating system won’t vault Microsoft into data center glory, observers say.

“This is a mindset shift from Microsoft’s high-volume business model. The Microsoft ecosystem is not built for the low-volume business that is the data center. They will need three to four years to mature and re-create such things as the familiar Windows software library and add support,” Iams says.

And on top of that, Microsoft must demonstrate capabilities it has lacked in the past, capabilities that took Unix years to develop before it was accepted in the data center.

“Microsoft has to prove it can play nice and that means the hardest nuts of computing have to be cracked, including interoperability with everything, reliability, performance, manageability down to individual cycles and high-level service-level agreements,” says Dan Kusnetzky, vice president of system software with IDC.

Microsoft will start to address some of those issues with .Net Server 2003 Datacenter.

On top of support for Intel 64-bit chips, Microsoft will add dynamic partitioning, eight-node clustering, support for nonuniform memory access (NUMA) and the Windows System Resource Manager (WSRM), a workload management tool.

“We had process control in Windows 2000 Datacenter, but with WSRM now you can specify resources for individual applications,” says Bob Ellsworth, director of Windows server marketing for Microsoft. “This new version of Datacenter will really show progress in reliability and scale.”
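WSRM's actual policy format and interface aren't described here, but the idea Ellsworth describes, reserving a slice of a machine's resources for each application, can be sketched conceptually. The Python below is a toy model only; the application names and percentages are hypothetical, and it is not WSRM's real API.

# Conceptual toy model only -- not WSRM's real API or policy format.
# It captures the idea of allotting a fixed share of CPU to individual
# applications, the way a workload manager divides up a consolidated server.

from dataclasses import dataclass

@dataclass
class Allocation:
    app: str          # application name (hypothetical examples below)
    cpu_percent: int  # share of total CPU reserved for this application

policy = [
    Allocation("SQLServer", 60),
    Allocation("IIS", 30),
    Allocation("NightlyBackup", 10),
]

# A valid policy never promises more than the whole machine.
assert sum(a.cpu_percent for a in policy) <= 100

for a in policy:
    print(f"{a.app}: reserve {a.cpu_percent}% of CPU")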

Ellsworth’s contention was bolstered last month when the new Datacenter operating system posted Microsoft’s highest score ever for a nonclustered system on the Transaction Processing Performance Council’s TPC-C benchmark for online transaction processing.

NEC, using its TX7 server with 32 Itanium 2 processors running beta code of .Net Server 2003 Datacenter and 64-bit SQL Server 2000 Enterprise Edition, more than doubled the previous high mark and placed fifth on the TPC-C list behind four Unix servers.

But benchmarks won’t drive corporate demand.

Today, Datacenter’s 32-bit architecture, certification requirements, limited number of supported applications (43 to date) and price tag (in the millions of dollars depending on configuration) have combined to dampen corporate interest.

There are fewer than 1,000 installations of Datacenter, which supports eight- to 32-way configurations, according to analysts, although they acknowledge it is difficult to get an accurate measure given that Microsoft doesn’t reveal shipment numbers.

Early adopters

Users today are taking it slow with Win 2000 Datacenter by first consolidating servers and reaping reliability gains. The city of Minneapolis did that slightly more than a year ago, consolidating 18 servers onto one 32-way Unisys ES7000 machine running Datacenter server to ensure 24-7 uptime to support an e-government initiative.

The move eliminated the $20,000 the city spent annually per technician to ensure someone was always on call and yielded nearly $50,000 in savings on licensing fees.

The city stayed away from Unix mainly because it lacked expertise with the platform, but using Datacenter has not been without its setbacks.

Those setbacks have involved the Datacenter Program, which requires hardware vendors that sell Datacenter-equipped machines to test and certify that add-on software or devices that touch the kernel or modify kernel-mode dynamic link libraries (DLLs) won’t disrupt the operation of Datacenter. Windows has a history of installed software creating memory leaks or DLL collisions that result in system crashes.

Datacenter’s certification program is designed to prevent that, but it also alters the traditional dynamics of running a Windows operating system.

“Certification is an issue you deal with when you go to Datacenter,” says Ray Zabilla, CEO of consulting firm Bitsolutions and an ex-mainframe guru who helped deploy Datacenter for the city.

“It seems we have to get the [software] vendors to understand the difference between Windows 2000 and Datacenter. Some of our vendors don’t understand that we’re unable to load just any software on the machine,” he adds.

Zabilla says that creates problems for the city, which consolidated Internet Information Server, SQL Server 2000 and Exchange Server 2000 into clusters on one box: It must sometimes wait months for a Datacenter version of a service pack.

And there are other areas for improvement, Zabilla says, mainly in that he’d like more support resources for the platform. “There isn’t an Exchange support queue dedicated to Datacenter,” he says.

But for the most part, users have experienced a level of stability and reliability they had not previously enjoyed with a Windows operating system.

“We’ve had some support and patch issues, but we’ve seen many months with 100% uptime,” says John Benzinger, vice president of IT for FreeMarkets, which provides a combination of software and hosted services that multinational companies use to purchase goods and services. “We’ve seen a reduction in preventative maintenance, including taking the server down, rebooting and cleaning log files.”

FreeMarkets, the first customer to go live on Datacenter, runs eight Compaq 8500 eight-way servers loaded with Datacenter and 16GB of RAM. Benzinger has two two-node clusters for production applications that run online auctions and business intelligence for multinational corporations, and two two-node clusters for testing and quality assurance.

Benzinger says his two main gripes are that Microsoft must work more closely with its OEM hardware partners to ensure the timely release and certification of patches, and that the Datacenter pricing model should be revised. The current model requires him to pay the same price for machines he runs for quality assurance and development as for those he runs in production.

Those are just a few of the milestones of maturity Microsoft must pass before it can grow up into the corporate data center.

“The first version of this software has been a very extended market test,” IDC’s Kusnetzky says. “Microsoft still has to meet a lot of requirements Unix took a decade to meet. Sixty-four-bit support is old news in the Unix world.”