What will AI mean to the traditional data center?

High-performance computing and machine learning could be a double-edged sword for data centers.

The recent decision by US-Norwegian business Kolos to build a giant, 600,000-square-meter data center in the city of Ballangen, Norway, looks, on the surface at least, like a wise one. Ballangen, which incidentally is the birthplace of singer Frida from 70s pop group Abba, is inside the Arctic Circle, which sort of solves the cooling issue. Maybe.

As demand to store and manage data accelerates, driven by increased machine connectivity and data analytics, the pressure on traditional data centers grows, certainly in terms of capacity and cooling. The Kolos site, chosen primarily for its clean, renewable energy, is expected to house 70MW of IT equipment, eventually scaling up to 1,000MW within 10 years of construction.

The quest for bigger and better is understandable, but when does size become an issue? Surely we can’t keep building huge data centers in the Arctic to keep up with the insatiable demand for data? There is a school of thought that the demand for AI-enabled machines will alter the course of data center development. Machine learning is, after all, capacity- and power-hungry.

“A combination of deep machine learning and analytics on growing data sets is driving adoption of IT systems that are more similar to high-performance computing clusters,” says Daniel Bizo, a senior analyst for data center technologies at 451 Research. “Such systems are high-powered and require much more power and cooling capacity than the average rack. Many traditional facilities will be stretched to meet such requirements at cost.”

The pressure is understandable, and Kolos is in many ways a reaction to it, but as we know, things move fast in the data center world. No sooner have you built a state-of-the-art complex than the technology shifts up a gear. But this can also lead to innovation. Certainly the demand for machine learning capability is already having an impact. So-called AI chips are being used in data centers to give businesses that cannot afford their own high-powered servers access to a measure of AI capability. This is the thinking behind Nvidia’s recent deal with Chinese firm Baidu.

For Aaditya Sood, senior director of engineering and products at Nutanix, machine learning can influence data center development significantly. Sood, who sold his DevOps automation firm Calm.io to Nutanix 12 months ago, believes that giving businesses access to machine learning GPUs through the data center has to be a good thing.

“This shouldn’t be limited to the big seven—the Googles and Facebooks and so on,” he says, adding that while machine learning in data centers is opening doors for potential innovation for business users, it can also have its own impact on how data centers are designed and managed.

“Where AI—I prefer machine learning—is going to help is in two things. Firstly, the modelling system—discovering how my data center applications start better, how they are connected and, secondly, the more important part is how do I take corrective actions when things go wrong,” Sood says.
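Sood doesn’t describe a specific algorithm, but the “corrective action” idea he sketches usually starts with spotting that something has gone wrong. A minimal, purely illustrative version is a rolling-baseline anomaly detector over a telemetry stream: flag any reading that drifts several standard deviations from recent history, then hand the flagged index to whatever remediation logic the operator has. The metric names and numbers below are invented for illustration, not Nutanix’s implementation.

```python
import statistics

def detect_anomalies(readings, window=20, threshold=3.0):
    """Return indices of readings more than `threshold` standard
    deviations away from the rolling mean of the previous `window`
    samples. A real system would feed these to remediation logic."""
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mean = statistics.mean(baseline)
        stdev = statistics.stdev(baseline)
        if stdev and abs(readings[i] - mean) > threshold * stdev:
            anomalies.append(i)
    return anomalies

# Simulated rack-temperature telemetry: steady around 22 C, one spike.
telemetry = [22.0 + 0.1 * (i % 5) for i in range(40)]
telemetry[30] = 35.0  # a cooling failure would look like this
print(detect_anomalies(telemetry))  # the spike at index 30 is flagged
```

In practice a data center management layer would track many metrics at once and learn correlations between them, but the shape of the loop — baseline, deviation, action — is the same.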

Sood admits that the make-up of any future data center will remain heterogeneous, in the sense that “it will never be 90 percent one vendor, there will always be different vendors at different layers fighting it out.” That heterogeneity breeds complexity, and managing that complexity is always a challenge, which is where machine learning algorithms could come in and help.

If anything, it’s an opportunity. Bizo at 451 Research agrees that demand for increased machine learning will reinvigorate the data center space and lead to further startups and acquisitions.

“Technology shifts always invite new entrants and data centers are no different,” says Bizo. “In the foreseeable future, data center capacity demand is already forecast to grow. Advancements in deep learning and big data analytics will create appetite for an increased use of these techniques and in turn the systems on which they run. This will represent net new capacity requirements over the coming ten years. Even though there is a lot of technology and clever design involved, the most valuable component may still be location: access to and ownership of strategically positioned real estate.”

Bizo suggests that high-performance deep learning will naturally gravitate toward locations where power cost is lower and availability is higher. This could influence location but may also lead to some innovation.

“We expect some novel approaches, such as location next to electrical utility distribution substations and remote sites close to hydro and thermal power plants, to gain currency with some major customers training their AI or performing data mining on vast data archives,” says Bizo. “Bandwidth and its cost remain an issue, however.”

So where will this innovation be? Are there any signs of companies trying to tackle data center issues at the moment?

Ray Chohan, senior vice president of corporate strategy at global intellectual property analytics company PatSnap, says AI can have a significant impact on how the infrastructure is run and supported, everywhere from energy usage and cooling technology to robotic handling of day-to-day maintenance and security.

“For example, this patent published by EMC in 2014 describes the use of AI to improve the cost/performance of storage and the energy savings achieved by data center owners,” says Chohan.

Another interesting application of AI in the data center comes from Oracle, in a patent that was published in May 2017. The patent describes a system and method for Distributed Denial of Service (DDoS) identification and prevention. Among other things, the author presents this machine learning approach as a step up from traditional intrusion detection methods, which “typically require human interaction to improve rule-based binary classifications.”

As IoT botnets become more prevalent, Cisco expects DDoS attacks to reach approximately 17 million per year by 2020. With so much IT infrastructure moving toward cloud, the ability to leverage AI to minimise the damage of these attacks will become extremely valuable.
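The Oracle patent doesn’t publish its model, but the contrast with hand-written, rule-based classification can be sketched with a toy learned classifier: instead of an engineer writing “more than N requests per second means attack,” the system learns a profile for each class from labeled traffic and assigns new traffic windows to the nearest profile. Every feature name and number below is invented for illustration; real systems use far richer features and models.

```python
import math

def centroid(samples):
    """Mean feature vector of a list of equal-length samples."""
    dims = len(samples[0])
    return [sum(s[d] for s in samples) / len(samples) for d in range(dims)]

def train(benign, attack):
    """Learn one centroid per class from labeled traffic windows.
    Each sample is (requests_per_sec, unique_src_ratio, avg_payload_bytes)."""
    return {"benign": centroid(benign), "attack": centroid(attack)}

def classify(model, sample):
    """Assign a traffic window to the nearest class centroid (Euclidean)."""
    return min(model, key=lambda label: math.dist(model[label], sample))

# Toy labeled windows: a few benign and a few flood-like examples.
benign = [(120, 0.8, 900), (150, 0.7, 850), (100, 0.9, 1000)]
attack = [(9000, 0.05, 60), (12000, 0.03, 64), (8000, 0.04, 70)]

model = train(benign, attack)
print(classify(model, (11000, 0.02, 64)))  # prints "attack"
```

The point of the learned approach is that the thresholds are derived from data rather than maintained by hand, so the classifier can be retrained as attack patterns shift, which is exactly the limitation the patent ascribes to rule-based binary classification.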

Chohan adds that the companies innovating most in this area include Microsoft, Numenta, Amazon, Harris Corporation and IBM. The US is by far the most authoritative in this field, accounting for 84 percent of the patents filed. But if we have learned one thing from history, it’s that a shift in technology, and machine learning/AI is that shift, levels the playing field for new entrants to stake their claim. One thing is certain: the data center is already changing and will no doubt be the source of a new wave of innovation and acquisition targets over the next 10 years.

This story, "What will AI mean to the traditional data center?" was originally published by IDG Connect.

Copyright © 2017 IDG Communications, Inc.
