Next-generation storage networks are all about dynamic allocation of resources, and thin provisioning can get that job done quickly and easily - but not without care.
As thin provisioning - also called dynamic provisioning or flex volumes - becomes a standard feature in virtual storage arrays, IT executives and other experts warn that dynamic resource allocation is not a one-size-fits-all proposition. Applying the technology in the wrong way could create a major disaster, they caution.
"Vendors have made it so that IT teams think, 'Thin provisioning is so easy, why wouldn't we use it?' But some major thinking has to go into how your storage network is actually architected and deployed to benefit from the technology," says Noemi Greyzdorf, research manager at IDC.
In traditional storage networks, IT has to project the amount of storage a particular application will need over time, then cordon off that disk space. This means buying more hardware than is needed immediately, as well as keeping poorly utilized disks spinning ceaselessly - a waste of money and energy resources.
With thin provisioning, IT can keep storage growth in check because it need not commit physical disk space for the projected amount of storage an application requires. Instead, IT relies on a pool of disk space that it draws from as the application needs more storage. Having fewer idling disks means better capacity management, increased utilization, and lower power and cooling consumption.
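The contrast between traditional and thin provisioning can be sketched in a few lines of code. The classes below are purely illustrative - they model no vendor's array - but they show the core idea: a thin volume advertises a large logical size while committing physical space from a shared pool only as data is actually written.

```python
# Illustrative sketch (hypothetical classes, not any vendor's API) of
# thin provisioning: volumes advertise generous logical sizes, but the
# shared pool commits physical space only when data is written.

class SharedPool:
    def __init__(self, physical_gb):
        self.physical_gb = physical_gb
        self.used_gb = 0

    def draw(self, gb):
        if self.used_gb + gb > self.physical_gb:
            raise RuntimeError("pool exhausted: add physical disks")
        self.used_gb += gb


class ThinVolume:
    """Advertises a large logical size but consumes pool space on demand."""
    def __init__(self, pool, logical_gb):
        self.pool = pool
        self.logical_gb = logical_gb   # what the application sees
        self.written_gb = 0            # what is physically committed

    def write(self, gb):
        self.pool.draw(gb)             # physical space committed only now
        self.written_gb += gb


pool = SharedPool(physical_gb=1000)
# Three volumes each advertise 500 GB: 1,500 GB logical on 1,000 GB physical.
vols = [ThinVolume(pool, 500) for _ in range(3)]
vols[0].write(120)
print(pool.used_gb)  # 120 -- only written data consumes the pool
```

The overcommit (1,500 GB promised against 1,000 GB owned) is exactly what raises utilization - and exactly why the monitoring discussed below matters.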
Hold on a sec . . .
IT executives shouldn't let these benefits, attractive as they are, blind them to the technology's requirements, early adopters say. You can't just flip the switch on your storage pool and walk away, says Matthew Yotko, senior IT director at New York media conglomerate IAC, speaking about one of the biggest misconceptions surrounding thin provisioning.
You've got to take the critical step of setting threshold alerts within your thin-provisioning tools because you're allowing applications to share resources, Yotko says. Otherwise, you can max out your storage space, and that can lead to application shutdowns and lost productivity because users can't access their data.
"You can get pretty close to your boundary, fast, and that can lead to panicked calls asking your vendor to rush you a bunch of disks. Alerting is an important aspect that we originally missed," Yotko says.
Yotko has since integrated threshold alerts for IAC's 3Par array into a central network management system, he says. Doing that lets him keep tabs on how myriad file, e-mail, domain-controller and Web-application servers for 15 business units are handling the shared resource pool. That pool supports more than 25TB of data, he adds.
Yotko also calculates the time between an application reaching its threshold and storage being drained, and adjusts his alerts accordingly. If the window is too close, he pads it. The goal is to allow sufficient time for adding capacity - and avoiding a disaster, he says.
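Padding the window, as Yotko describes, amounts to two small calculations: how many days of runway remain at the current growth rate, and how low the alert threshold must sit so the alert fires with enough lead time to procure disks. The sketch below is illustrative, not his actual tooling.

```python
# Illustrative burn-rate math for padding threshold alerts. The numbers
# (growth rate, lead time) are hypothetical examples.

def days_until_full(capacity_gb, used_gb, daily_growth_gb):
    """Days of runway left at the current burn rate."""
    if daily_growth_gb <= 0:
        return float("inf")
    return (capacity_gb - used_gb) / daily_growth_gb


def alert_threshold(capacity_gb, daily_growth_gb, lead_time_days, floor=0.50):
    """Set the alert early enough to leave lead_time_days of headroom."""
    headroom_gb = daily_growth_gb * lead_time_days
    threshold = 1 - headroom_gb / capacity_gb
    return max(threshold, floor)   # never alert below a sanity floor


# A 25 TB pool growing 100 GB/day, with a 30-day procurement lead time:
print(days_until_full(25_000, 20_000, 100))   # 50.0 days of runway
print(alert_threshold(25_000, 100, 30))       # alert at 0.88 (88%) utilization
```

If growth accelerates, the threshold drops automatically - which is the "padding" effect: the faster the pool drains, the earlier the alarm sounds.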
By setting and fine-tuning his alerts, Yotko reports utilization rates of more than 80% across his array, up from the less than 20% he saw before thin provisioning.
No dumping here
Another common mistake IT teams make is thinking that every application is a candidate for thin provisioning, says Scott McCullough, manager of technical operations at manufacturer Mine Safety Appliances in Pittsburgh.
"The only applications that can take advantage of thin provisioning are those for which you can predict storage growth," McCullough says. For that reason, he uses NetApp's Provisioning Manager to oversee resource pools for his Web servers, domain controllers and Oracle database, but not the company's high-volume SQL Server. That server would quickly drain any projected resource allotment, he says.
Before he thin-provisions an application, McCullough studies its performance to make sure it won't endanger the pool. "It doesn't make sense to take up all your resources and potentially starve other applications," he says.
"You definitely need to be able to forecast or trend the application's trajectory," IDC's Greyzdorf says. Biomedical and other science programs can be particularly tricky for thin provisioning, she notes, because they can start off needing 200GB of storage and quickly skyrocket to 3TB with the addition of a single project.
Choosing the wrong applications to thin-provision not only endangers your entire storage pool, but also negates any management and budget relief you might gain otherwise. Done correctly, thin provisioning should reduce the overall time spent configuring and deploying storage arrays. If applications continuously hit their thresholds, however, and you're forced to add capacity on the fly, that benefit is quickly negated, costing you in terms of personnel and budget.
This concern has ResCare rolling out thin provisioning piecemeal, says Daryl Walls, manager of system administration at the Louisville-based healthcare support and services provider. "We are cautious about our deployments. We evaluate each implementation to see whether it makes sense from an application, server and storage point of view," he says.
Once his applications have been thin-provisioned, Walls closely monitors them to make sure that usage patterns don't change dramatically. In the worst case, that would require them to be removed from the pool. "A few times we've underestimated, usage has crept up on us, and we've received alerts saying, 'You're at 70% to 80% utilization,'" he says. In those instances, IT teams must decide whether to expand the application's allotment, procure more resources or move the application off the system.
What goes where
Thin provisioning can wreak havoc on your network if you don't have proper allotment policies in place, says Matt Vance, CIO at Nutraceutical, a health supplements company in Park City, Utah.
"IT has always managed and controlled space utilization, but with thin provisioning you can get a false sense of security. We've found that even with a resource pool, you still need to take responsibility in managing the way people receive and use storage. Otherwise you wind up wasting space, and that's hard to clean up after the fact," Vance says.
For instance, being lax about monitoring the amount of space users and applications are absorbing can lead to overspending on hardware and software, and necessitate an increase in system management. This is particularly concerning in Vance's environment, where the principal driver for moving to virtualization and thin provisioning was the need to bring high-performance database applications online quickly without breaking the bank on storage requirements.
Reporting tools have become essential at Nutraceutical. Each time an application nears its threshold, Vance turns to Compellent Technologies' Storage Center software tools to analyze how the server used its storage space. "We then decide whether it was used appropriately or if we need to tweak our policies," he says.
Vance says he is a solid proponent of thin provisioning, but he cautions his peers to stave off the complacency that automation can bring on: "We can't let the pendulum swing so far toward automation that we forget to identify where IT still has to be managing its resources."
Gittlen is a freelance technology editor in the greater Boston area. She can be reached at email@example.com.