New IT model gives Morgan Stanley flexibility

News
Jan 22, 2003 | 4 mins
Data Center

In mid-2001, Morgan Stanley began overhauling its IT infrastructure, aiming for a new management architecture based on a thin-client model in which all data, applications and even operating systems would be hosted on network servers.

The guiding goal was flexibility, said implementer Jeffrey Birnbaum, managing director and global head of enterprise computing in Morgan Stanley’s institutional securities division. With thousands of internally developed applications and 36,500 supported PCs spread throughout 20 countries, Morgan Stanley wanted an architecture that would let it quickly deliver anything that needed to flow to end users, including data, applications, software patches and system configuration changes.

It also wanted the ability to add new applications and machines to its network for the lowest possible cost, Birnbaum said.

Over the past 18 months, Red Hat has helped Morgan Stanley work toward that flexible architecture. Birnbaum and Red Hat CTO Michael Tiemann were in New York Tuesday for an afternoon keynote they dubbed “100 Million Reasons Why Architecture Matters.”

Morgan Stanley’s definition of enterprise computing is “any application on any box at any time,” Birnbaum said.

“We had to get away from the idea of having local installs,” he said. For organizations with thousands of applications and PCs, the client/server model is a synchronization mess.
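As a purely illustrative sketch of that idea, and not Morgan Stanley’s actual tooling, the small Java launcher below shows what “no local installs” can look like in practice: the desktop holds no application binaries of its own and simply runs whatever version is published on a network-mounted application tree. The /apps mount point, the CURRENT manifest file and the directory layout are hypothetical, invented here for illustration.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

/**
 * Illustrative launcher for a network-hosted application tree.
 * Nothing is installed locally: the version a user gets is whatever
 * the manifest on the shared mount currently points to.
 * The /apps mount and manifest layout are hypothetical.
 */
public class NetworkAppLauncher {

    // Hypothetical network mount where all application releases live.
    private static final Path APP_ROOT = Path.of("/apps");

    public static void main(String[] args) throws IOException, InterruptedException {
        if (args.length == 0) {
            System.err.println("usage: NetworkAppLauncher <app-name> [app args...]");
            System.exit(1);
        }
        String appName = args[0];

        // The shared tree publishes the active version in a one-line manifest,
        // e.g. /apps/risk-viewer/CURRENT containing "2.3.1".
        Path manifest = APP_ROOT.resolve(appName).resolve("CURRENT");
        String version = Files.readString(manifest).trim();

        // Launch the published binary straight from the network share.
        Path executable = APP_ROOT.resolve(appName).resolve(version)
                .resolve("bin").resolve(appName);
        List<String> command = new ArrayList<>();
        command.add(executable.toString());
        command.addAll(List.of(args).subList(1, args.length));

        Process process = new ProcessBuilder(command).inheritIO().start();
        System.exit(process.waitFor());
    }
}
```

Under this kind of scheme, rolling out a new version or patch amounts to updating what the server publishes; nothing has to be synchronized out to thousands of individual desktops.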

While a low operating cost was one of his goals, Birnbaum said he knew he would be adding hardware to create the system he envisioned. The vendor-led consolidation push is the wrong way to go, he argued.

“This is the anti-consolidation speech in some sense,” he said. “Decisions should not be driven by, ‘I need to lower the number of boxes I have in the environment to manage it.’ … It’s inevitable. You have to add boxes to manage the environment.”

A good architecture means you can add components without adding to management burdens, he said.

Tiemann mapped out the year-and-a-half process of developing the technology needed to get Morgan Stanley to its “any application, any box, any time” set-up. The company still isn’t 100% there, but the project has been sufficiently successful that its timeframe is advancing: By 2005, two years ahead of schedule, the company plans to be running 80% of its systems on commodity hardware, such as thin clients, that can be used for different tasks, as opposed to hardware configured for specific tasks.

A key advance was the release of version 2.4 of the Linux kernel, Tiemann said, calling it the first kernel version ready to support a massive enterprise deployment. Slowly, in concert with other hardware and software makers, Red Hat and Morgan Stanley developed the tools, new Linux features, and integration required for Morgan Stanley’s architecture.

The payoff for Morgan Stanley has been performance and reliability increases along with cost decreases, Tiemann said. For Red Hat and other Linux developers, the project presented new scale and complexity obstacles, the solutions to which have advanced the Linux kernel and related vendor technologies, he said.

One pair of attendees said they were drawn to the keynote by Red Hat’s participation; they use Red Hat’s software throughout their organization, the government of Jefferson County, Colo.

Steve O’Brien, the county’s IT operations director, said he appreciated Birnbaum’s remarks about the misleading mantra of consolidation.

“I’ve never thought that’s something you should be doing for its own sake, so it’s nice to hear someone else say that,” O’Brien said.

David Gallaher, Jefferson County’s director of IT development, said the speakers’ comments about the cost advantages of an open architecture resonated with him. Burned in the past by what he calls “the horrors of client/server,” he has for the past three years used a development model of applications written in Java, run on Linux, and delivered to users via a Web browser.
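A minimal stand-in for that delivery model might look like the sketch below, which uses the JDK’s built-in HTTP server rather than whatever servlet stack Jefferson County actually runs; the port, endpoint and page content are invented for illustration. The point is the shape of the architecture: the application logic lives in Java on a Linux server, and the only thing users need locally is a browser.

```java
import com.sun.net.httpserver.HttpServer;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetSocketAddress;
import java.nio.charset.StandardCharsets;

/**
 * Minimal sketch of the "Java on Linux, delivered through a browser" model.
 * All logic runs server-side; clients only need a URL.
 * The port and endpoint are illustrative, not Jefferson County's actual setup.
 */
public class BrowserDeliveredApp {

    public static void main(String[] args) throws IOException {
        HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);

        // A single page rendered on the server; no code is installed on the client.
        server.createContext("/report", exchange -> {
            byte[] body = "<html><body><h1>Server-rendered report</h1></body></html>"
                    .getBytes(StandardCharsets.UTF_8);
            exchange.getResponseHeaders().set("Content-Type", "text/html; charset=utf-8");
            exchange.sendResponseHeaders(200, body.length);
            try (OutputStream out = exchange.getResponseBody()) {
                out.write(body);
            }
        });

        server.start();
        System.out.println("Serving on http://localhost:8080/report");
    }
}
```

Upgrading such an application means redeploying once on the server; no client machine has to be touched.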

The lower cost of working with open technologies has freed him to tackle projects that would previously have been prohibitively expensive, he said.

“Can you imagine the cost to run a development server, a transition server, a test server, and then to deploy?” he said. “We could never have afforded it. We can now.”