Seven things that will sink virtualization

Experts reveal potential pitfalls

IT executives and industry watchers offer tips on what not to do when deploying server virtualization.


Many IT professionals measure the success of a technology deployment by the things they did (see "7 tips for succeeding with virtualization"), but some say a successful virtualization implementation is often the result of things IT didn't do.

"Many factors play here, such as the skill set of one's team, budget allocation and anticipated timeline. If you don't do your homework ahead of time [with virtualization], you can pay the consequences longer term," says Nick Portolese, senior manager of data center operations at Nielsen Mobile in San Francisco. "I learned this quickly enough midstream so I could benefit from it, but I do wish I had someone to tell me about all of this at the outset."

Here, industry watchers and enterprise IT managers reveal what not to do when starting or expanding a virtualization initiative.

1. Don't commit a false start.

Jumping the gun and rushing a virtualization project across IT will serve only to bombard staff with more work than necessary and fewer results, Portolese says.

Many IT shops start tinkering with virtualization on a small scale, but larger deployments require extensive research on servers, workloads, applications and business demands to reveal where the technology will benefit the company most. Portolese's advice starts with this: Don't rush a rollout until you have done your homework and know your environment inside out.

"An essential piece of this is to understand and categorize the workloads and existing management of each of the applications you want to virtualize -- before you start," Portolese says. "This has been a learning curve for me and my team."

2. Don't delay management plans.

Putting a detailed management approach in place after a deployment will result in performance problems across the virtual environment and headaches for IT staff, industry watchers say.

"Moving forward to expand virtualization without implementing good management discipline -- in areas like capacity, performance, configuration or automation -- is a recipe for disaster," says Andi Mann, research director at Enterprise Management Associates (EMA). "Once a deployment gets over a certain size -- around 100 or more virtual machines, depending on workloads -- or once it gets out of test and development and moves to production, the typically unmanaged environment is just not going to work."

3. Don't neglect the human factor.

It's naïve to pretend virtualization is a simple matter of having staff manage more server instances, and doing so could wreak havoc, industry watchers say. Many argue that virtual servers can be managed alongside physical servers, but virtualization technology probably will reach across IT domains and require staff to become adept at deploying and managing virtual servers, desktops, storage and applications.

"Don't delay in assessing the personnel impact," says Cameron Haight, a Gartner research vice president. "Virtualization is becoming pervasive, so organizations need to adapt to meet the new realities that it presents -- developing a virtualization competency center, for example."

Virtualization champions can't limit their campaigns to the server group. Virtualization will touch all IT departments, and neglecting to get everybody's input will result in pushback.

"Trying to shoehorn a broad virtualization initiative into an unreceptive IT department, especially when the server group ignores the mandates from other departments like security, networking or storage, is another big problem," EMA's Mann says.

In addition to bringing virtual know-how into the current technical staff's skill set, IT cannot ignore the cultural changes virtualization will impose upon business units.

"There are also the human issues of dealing with business departments and assuming they will be happy to share 'their servers' -- the ones they already paid for -- with other departments for no other reason than reducing IT's operating costs," Mann says.

4. Don't overload the virtual infrastructure.

Virtualization seems to perform magic across overworked infrastructure by partitioning resources to handle more workloads, but that doesn't mean virtual servers are infallible.

The technology still has its limits, warns Jeremy Gill, CIO at Michael Baker Corp., a civil engineering firm in the Pittsburgh area. "Don't put more virtual machines on infrastructure than it can handle," he says, citing his firm's practice of adding infrastructure once server utilization hits 70% to 75%.
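Gill's 70%-to-75% trigger amounts to a simple capacity rule. A minimal sketch of that rule follows; the function names, the per-cluster figures and the threshold default are all illustrative assumptions, not part of any vendor tool described in the article.

```python
# Hypothetical sketch: expand capacity once cluster utilization crosses
# the 70%-75% band Gill describes. All numbers here are invented.

def cluster_cpu_utilization(used_ghz: float, total_ghz: float) -> float:
    """Return aggregate cluster CPU utilization as a percentage."""
    return 100.0 * used_ghz / total_ghz

def needs_more_capacity(utilization_pct: float,
                        threshold_pct: float = 70.0) -> bool:
    """True once utilization reaches the expansion threshold."""
    return utilization_pct >= threshold_pct

# Example: 84 GHz consumed out of 112 GHz available across the cluster.
util = cluster_cpu_utilization(used_ghz=84.0, total_ghz=112.0)
print(f"utilization: {util:.1f}%  expand: {needs_more_capacity(util)}")
# -> utilization: 75.0%  expand: True
```

In practice the same check would run against memory, storage and network headroom as well, since whichever resource saturates first caps how many virtual machines the hosts can carry.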

5. Don't assume one size fits all.

Virtualization offers speedy provisioning of resources, but that doesn't mean IT departments can skimp on capacity planning and testing before they roll out virtual machines or assign applications to virtual infrastructure, experts say.

"Don't move a new application or server type without first having tested I/O, utilization and more," Gill says.

Capacity planning tools, such as those from Cirba, ISaccountable and PlateSpin (part of Novell), could help IT managers understand how their current infrastructure consumes resources and choose the appropriate applications to migrate to a virtual platform, says James Staten, principal analyst at Forrester Research.

"Don't assume one size fits all for virtual machines. Each one will have its own unique performance characteristics," Staten says. "Be sure to use a capacity-planning and management tool to help make the right sizing decisions."

6. Don't let your guard down.

There hasn't been a widely publicized attack on virtual infrastructure to date, but that doesn't mean enterprise IT managers can rest easy or assume that virtual machines aren't vulnerable.

Such vendors as Altor Networks, Blue Lane Technologies, Catbird and Reflex Security are working to add a layer of security to the virtual environment, but the underlying platform still could suffer an attack, industry watchers point out. The market for virtual security products is nascent, and analysts argue that hypervisor providers should take the opportunity to weave tighter security controls into their products to ensure more secure environments.

"Adding a layer of security to virtualized deployments is different from securing the virtualization platform itself. The risk is huge: The hypervisor stands in a very strong position to augment, if not outright replace, the operating system as the underlying platform," says Scott Crawford, research director at EMA.

Enterprise IT managers have to realize that approaches to securing known environments still can fail, so virtualization presents more opportunity for attackers.

"Despite a decade of experience in securing IT, it is still possible, for example, to directly modify DLL files critical to the Windows operating system environment. Hooks into the platform for adding on security is not the same as building it in," Crawford says.

7. Don't set it and forget it.

A large part of virtualization's appeal is that it's dynamic by nature. But that also means the technology demands more care and feeding than physical machines.

IT can't configure virtual infrastructure and assume it will work as expected for a lengthy period of time, Nielsen Mobile's Portolese says. IT managers must plan for virtual-machine life-cycle management from the outset, he says. Products from vendors like Embotics and Fortisphere offer features to deprovision virtual resources that are no longer needed and audit the environment to rediscover underutilized resources.

"Having a mechanism to audit and decommission [virtual machines] is useful because there is no point in having a system consuming precious ESX resources," Portolese explains. "It is a bit of a no-no if you don't have some type of life-cycle management tools in place to assist here."

Essentially, the homework that Portolese suggests IT managers do at the start of a virtualization project must be an ongoing process to maximize the return on virtual resources and prevent wasting time correcting errors left untouched for too long in the environment. "If one does not take the majority of time developing and planning the initial deployment or ongoing expansion, they will spend a lot more time and effort maintaining and correcting past mistakes."


Copyright © 2008 IDG Communications, Inc.
