What will AI mean to the traditional data center?

High-performance computing and machine learning could be a double-edged sword for data centers.

The recent decision by US-Norwegian business Kolos to build a giant, 600,000m² data center in the town of Ballangen, Norway, is, on the surface at least, a wise one. Ballangen, which incidentally is the birthplace of singer Frida from 70s pop group Abba, lies inside the Arctic Circle, which sort of solves the cooling issue. Maybe.

As demand to store and manage data accelerates, driven by increased machine connectivity and data analytics, so the pressure on traditional data centers grows, certainly in terms of capacity and cooling. The Kolos site, chosen primarily for its access to clean, renewable energy, is expected to house 70MW of IT equipment, eventually scaling up to 1,000MW within 10 years of construction.

The quest for bigger and better is understandable, but when does size become an issue? Surely we can’t keep building huge data centers in the Arctic to keep up with the insatiable demand for data? There is a school of thought that the demand for AI-enabled machines will alter the course of data center development. Machine learning is, after all, capacity and power hungry.

“A combination of deep machine learning and analytics on growing data sets is driving adoption of IT systems that are more similar to high-performance computing clusters,” says Daniel Bizo, a senior analyst for data center technologies at 451 Research. “Such systems are high-powered and require much more power and cooling capacity than the average rack. Many traditional facilities will be stretched to meet such requirements at cost.”

The pressure is understandable, and Kolos is in many ways a reaction to it, but as we know, things move fast in the data center world. No sooner have you built a state-of-the-art complex than the technology shifts up a gear. Yet this can also lead to innovation. Certainly the demand for machine learning capability is already having an impact. So-called AI chips are being deployed in data centers to give businesses that cannot afford their own high-powered servers access to AI capability. This is the thinking behind Nvidia’s recent deal with Chinese firm Baidu.

Aaditya Sood, senior director of engineering and products at Nutanix, believes machine learning can significantly influence data center development. Sood, who sold his DevOps automation firm Calm.io to Nutanix 12 months ago, argues that giving businesses access to machine learning GPUs through the data center has to be a good thing.
