Virtualization hits the big time

Deploying virtualization everywhere increases benefits but creates management headaches.

With VMware's success, virtualization has taken on a life of its own. Beyond the server, vendors are now touting products for virtualizing any and every layer of the infrastructure.

"It's very difficult to apply virtualization to one part of your infrastructure unless you apply it to many or most parts of your infrastructure," says Andreas Antonopolous, senior vice president of Nemertes Research. "If you decouple some of your resources from the physical, yet they interact with other resources that are coupled to the physical, it lessens the benefits you achieve."

Antonopoulos offers the example of implementing server virtualization without network or storage virtualization. "Some of the biggest benefits you get from server virtualization, like the ability to boot a given server in a different data center for disaster-recovery purposes, you can only do if your storage is virtualized and you have a [storage-area network] replicated between the two sites. Once you have those pieces in place, the benefits from server virtualization become huge."

The problem is that not all virtualization technologies are equally mature. Whereas server virtualization seems to have hit its stride, other areas are not as far along, especially in the management and security realms. And getting the various virtualized pieces to work together cohesively can be a big challenge.

Virtual frauds

Watch for application vendors that say their applications are "virtualization-ready." Application vendors have been known to overplay the virtualization card, says Paul Winkeler, founder of PBnJ Solutions, an IT consulting firm. "Application vendors realize customers are thinking about virtualization, so they will happily say their app runs fine in virtualized environments," Winkeler says. "But that's the whole idea behind virtualization - the application can't tell whether or not it's virtualized. So they're not saying anything."

Be on the lookout for application vendors that say their isolation tools are virtualization. "Some application vendors use the term virtualization when they are really just isolating," says Andy Gerringer, senior network administrator at Alamance Regional Medical Center, in Burlington, N.C. (see Alamance's award-winning virtualization project). "To isolate an application means that files are still installed and simply redirected or shielded from the operating system. That's not virtualization," he says.

Neal Tisdale, vice president of software development at New Energy Associates in Atlanta, agrees. "Sun says Solaris Containers is virtualization, and it's not full virtualization - it's more isolation," he says. "At the application level it is because your application thinks it has its own machine, but full virtualization allows you to change even network settings and [basic input/output system] settings and operating system settings and have entire copies of the operating system running."
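The distinction is easier to see in a toy model. The sketch below is illustrative Python, not any vendor's implementation, and every class and attribute name in it is hypothetical: an isolated application still rides on the host's kernel and network settings, while a full virtual machine carries an entire operating system copy with its own network and BIOS-level configuration.

```python
# Toy model of isolation vs. full virtualization.
# All names here are hypothetical; this illustrates the distinction
# Tisdale draws, not Solaris Containers or VMware internals.

class HostOS:
    def __init__(self, hostname, ip):
        self.hostname = hostname
        self.ip = ip
        self.kernel = "shared-kernel-5.10"

class IsolatedApp:
    """Isolation (container-style): files are redirected or shielded,
    but the app still runs on the host's kernel and network stack."""
    def __init__(self, name, host):
        self.name = name
        self.host = host              # shares kernel, IP, OS settings

    def network_settings(self):
        return self.host.ip           # cannot differ from the host's

class FullVM:
    """Full virtualization: an entire OS copy with its own network,
    BIOS and OS settings, decoupled from the physical host."""
    def __init__(self, name, ip, bios_settings):
        self.name = name
        self.guest_os = HostOS(name, ip)   # its own OS instance
        self.bios = dict(bios_settings)    # independently configurable

host = HostOS("blade01", "10.0.0.5")
app = IsolatedApp("billing", host)
vm = FullVM("billing-vm", "10.0.1.7", {"boot_order": "disk"})

print(app.network_settings())   # 10.0.0.5 -- pinned to the host
print(vm.guest_os.ip)           # 10.0.1.7 -- its own settings
```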

Eating the layer cake

Baptist Healthcare System in Louisville, Ky., has struggled with this challenge firsthand. It uses VMware's ESX Server to consolidate as many as five Citrix servers onto one hardware box. It then lays Softricity's SoftGrid on top of the Citrix servers to isolate applications and deliver them to users on the fly.


"So now we have multiple points of virtualization. We have SoftGrid on top of Citrix, running on top of ESX," says Tom Taylor, corporate manager for client/server infrastructure at the hospital group. "That's all running on [virtual LANs] and connected through a VPN and running on a SAN."

For the most part, the architecture works well and runs smoothly, Taylor says. But when performance issues crop up, pinpointing the problem through all those layers of virtualization is difficult.

"It's been a struggle eating that layer cake, if you will," he says. "The drawback to virtualization is added complexity. If all these different layers are virtualized, and there's a problem, who owns it? Ultimately, it falls on the poor guy putting it into the enterprise, and in my environment, that's me. It's my responsibility to work with the vendors to find root causes, and when you're dealing with all these different layers, it's complex and it's frustrating."

How do you know it's virtualization?

Decoupling and independence. Virtualization is the decoupling of operating systems from hardware, says Andy Gerringer, senior network administrator at Alamance Regional Medical Center in Burlington, N.C. "As far as a universal definition goes, I guess you would use the term independent," he says. "If one thing is dependent or reliant on another, then there isn't true virtualization.

"For example, VMware virtualizes an operating system that is independent of the hardware much the same as Softricity virtualizes an application that is independent from the operating system," says Gerringer, who oversaw a virtualization project that won a 2006 Enterprise All-Star Award.

Logical vs. physical, plus benefits. Virtualization is separating the logical from the physical, with the aim of more flexibility, says Andreas Antonopoulos, senior vice president of Nemertes Research. "Through a looser association between the logical and physical, virtualization should provide flexibility in the form of management, allocation and discovery," he says. "It should provide gains you can't get in the physical environment."

Replacement technology. Others offer more practical guidelines. Tom Taylor, corporate manager for client/server infrastructure at Baptist Healthcare System in Louisville, Ky., says he defines virtualization as "using technology to replace the physical attributes of a provided service." He uses a credit-card analogy: "A credit card is money virtualization. The money gets moved around, but there's no actual cash involved. That's how it is with IT virtualization as well."

Security issues

Beyond such complexity, there also are problems implementing virtualization in an environment in which not all layers are virtualization-ready. This is especially true for security, which Antonopoulos calls a virtualization laggard.

"Virtualized servers rely on security resources that are usually tied very absolutely to the physical through the IP address, thereby creating problems," he says. "If you have a firewall that says IP address A can talk to IP address B, but both of those IP addresses are virtualized and both of the servers behind them are virtualized, yet that firewall still assumes a static association, it makes it difficult to move resources around. It makes it harder to manage the infrastructure."

This is especially problematic when users look to virtualize a typical three-tier architecture consisting of a Web server, application server and database server. In traditional environments, the Web server might be on the DMZ, separated from the application server and database server on the internal network by firewalls. But once the servers are virtualized and consolidated onto one large blade server, for example, the idea of the physical DMZ and perimeter goes away.

"You should have firewalls between the servers, but you can't physically put a firewall there, because they're all running on the same blade frame," Antonopolous says. "You should be able to logically put a firewall between them, but you don't have security virtualization software to do that."

Baptist Healthcare's Taylor has run into this issue as well. "We wanted to virtualize some of the servers and services in our DMZ, so I proposed that we take an ESX Server and dedicate a couple of the network cards on it to the DMZ, leaving the rest of the network cards VLAN-ed to our private network," he says. "In theory, they'd be separate, because we'd have virtually separated the two even though they'd be riding on the same server."

The hospital's network and security groups nixed the idea, however. "They said that if we wanted to virtualize those services, then that ESX Server had to be fully in the DMZ and could not have any ties to the private network except through the firewall. It's a huge limitation."

Technologies such as federated ID and digital certificates are heading in the right direction, as are recent moves by Cisco and others to add more security within switches and routers, Antonopoulos says. But more interesting would be placing the security in the server virtualization itself, within the hypervisor. "That has a significant advantage, because it's software and can be moved around. And it can be provided to all virtual machines residing on that hypervisor, which provides flexibility," he explains. "In fact, it can even enforce policies for how those machines communicate with each other within that hypervisor." Malware designed to attack the operating system would not be able to reach it.

VMware has indicated it will be adding security to its hypervisor over time, and Antonopoulos says other vendors are sure to follow.

The bigger picture

Still, some companies are finding they can implement virtualization across the IT infrastructure and keep the management and security headaches in check by sticking to standards and designing around overall business objectives. Wachovia Bank in Charlotte, N.C., is one such user. The bank used DataSynapse's GridServer as the basis for its enterprisewide grid-computing architecture. From there, it implemented DataSynapse's FabricServer to virtualize its Java applications, enabling it to reduce overall hardware and programming costs, resulting in seven-figure savings annually and at least a 300% ROI. (Wachovia earned a 2006 Enterprise All-Star Award for this project.)

Wachovia looks at virtualization not as a tactic but as an overall business strategy, says Tony Bishop, senior vice president and director of product management at the bank. Bishop says he doesn't get caught up in specifics such as server or storage virtualization. Instead, he says, the bank aims to virtualize demand and supply across the whole infrastructure, building what he calls a service-oriented infrastructure.

"Demand virtualization is at run-time when [a user or system] says, 'Do this for me; calculate this for me; fetch this for me; look this up for me,'" Bishop says. "On the supply side, you need virtualization to give you the flexibility and control to abstract and alleviate any constraints of a hard-wired environment, so it lets you move stuff around and adjust, allocate or partition things based on efficiencies at run-time. FabricServer is the broker between the virtualized supply-and-demand environments, he says.

"FabricServer deals with the execution," he says. "It recognizes who I am, what kind of service level I'm supposed to get and what kind of priority, and it gives me the right resources to fulfill that."

This top-down approach lends itself to huge business efficiencies, Bishop says. "Very quickly, you see there are many business processes and systems that could leverage a virtualized environment like this," he says. "If I can pick up and virtualize the movement of my processing to available capacity, wherever that is, I can alleviate bottlenecks, be more cost-effective, improve my performance or resiliency and so on."

Because each service, be it on the demand or supply side, communicates with FabricServer in a standard rules-based way, managing performance is more straightforward. Plus, FabricServer brokers each service, thus ensuring strict security.

"FabricServer provides logical security. It acts as the logical controlling broker mechanism," Bishop says. "It knows my profile, whether I've been authenticated, whether I'm entitled to get service, what I'm entitled to, and so on and so forth. It can even ensure the encryption of the message back and forth."

The only drawback for Wachovia, Bishop says, is that the true, open standards for getting all the layers talking and working together aren't ready yet. And, because "we're not completely sure what the final ones will be, we're relying on de facto standards, which I think bring you about two-thirds of the way there. It's just something to keep an eye on," he says.

In the end, organizations that want to reap the most benefits from virtualization can't do it piecemeal, analysts and users say. "You can start small, with one particular application or by pooling a set of resources," Bishop says. "But don't stop there. If you're going to do it, do it. Don't just play at it. Do your research, do your homework, but do it. If you go just halfway, you get in trouble, and you won't get the gains of your investments."

Cummings is a freelance writer in North Andover, Mass. She can be reached at jocummings@comcast.net.


< Previous story: FAQ on virtualization | Next story: Linux virtualization heats up >


Copyright © 2006 IDG Communications, Inc.
