The data center is transforming, modernizing to meet business demand as technologies such as software-defined architecture, cloud and virtualization take hold. This modernization is also being driven by CIOs and IT executives taking a hard look at their computing needs and asking whether they still want to own or operate data centers at all, industry experts say.
It's a big issue. According to new research from Synergy Research Group, spending on enterprise data center equipment is flat while spending on service provider data centers is booming. And Gartner predicts that by 2020, 75% of Global 2000 organizations seeking to adopt modern IT approaches such as DevOps will require the programmatic capabilities of a software-defined data center, specifically application programming interfaces and command-line interfaces.
As the workloads companies are trying to support become increasingly linked to analytics and other data-intensive applications, those with data centers face a major dilemma, observes Rick Villars, IDC's vice president of data center and cloud. Customers are struggling over whether it is better to buy servers and storage and run them in their own data center, or whether it makes more sense to buy infrastructure in a data center run by a vendor that can optimize the apps for the workloads they need.
More and more companies are recognizing that they don't want to become service providers and don't want to build a new data center, Villars says, noting that IDC frequently hears from clients that "this is not where our strategic investments should be spent."
At the same time, "most companies are saying ... we want to take advantage of converged infrastructures and solid-state storage and virtualization to run those existing apps for a lower cost ... and do more with less," he says. Some of the newer technologies are more advanced than the data centers they are housed in, Villars adds. "We call that data center obsolescence."