
Using a machine learning system developed by its self-proclaimed "Boy Genius" Jim Gao, Google says it calculates power usage effectiveness (PUE), a measure of energy efficiency, every 30 seconds. In a blog post, Joe Kava, vice president of the company's Data Centers, says Google constantly tracks variables such as total IT load (the amount of energy servers and networking equipment are using at any time), outside air temperature (which affects how cooling towers work), and the levels at which Google sets its mechanical and cooling equipment.
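PUE itself is a simple ratio: total facility power divided by the power consumed by the IT equipment alone, with 1.0 as the theoretical ideal. A minimal sketch of that calculation, using made-up example numbers rather than any of Google's figures:

```python
# Sketch of the PUE (power usage effectiveness) calculation.
# The kW figures below are hypothetical, purely for illustration.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """PUE = total facility energy / IT equipment energy.

    Measures how much overhead (cooling, power distribution,
    lighting) the facility adds on top of the IT load itself.
    """
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# A facility drawing 1,200 kW to power 1,000 kW of IT load:
print(pue(1200.0, 1000.0))  # 1.2 -- 20% of the power is non-IT overhead
```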
"After some trial and error, Jim's models are now 99.6 percent accurate in predicting PUE. This means he can use the models to come up with new ways to squeeze more efficiency out of our operations. For example, a couple months ago we had to take some servers offline for a few days, which would normally make that data center less energy efficient. But we were able to use Jim's models to change our cooling setup temporarily, reducing the impact of the change on our PUE for that time period. Small tweaks like this, on an ongoing basis, add up to significant savings in both energy and money," Kava wrote.
"...a comprehensive DC efficiency model enables operators to simulate the DC operating configurations without making physical changes. Currently, it's very difficult for an operator to predict the effect of a plant configuration change on PUE prior to enacting the changes. This is due to the complexity of modern DCs, and the interactions between multiple control systems. A machine learning approach leverages the plethora of existing sensor data to develop a mathematical model that understands the relationships between operational parameters and the holistic energy efficiency. This type of simulation allows operators to virtualize the data center for the purpose of identifying optimal plant configurations while reducing the uncertainty surrounding plant changes," Gao wrote in a Google white paper outlining the details of his work.
A typical large-scale data center generates millions of data points across thousands of sensors every day, yet this data is rarely used for anything beyond monitoring. Advances in processing power and monitoring capabilities create a large opportunity for data-driven, machine learning approaches to guide best practices and improve data center efficiency, Gao said.
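The idea above can be sketched in a few lines: fit a model relating logged operational parameters to PUE, then use the fitted model to predict efficiency under different conditions. Gao's actual model is a neural network trained on Google's real telemetry; the sketch below instead uses a plain least-squares fit on synthetic data, with invented features and coefficients, purely to illustrate the shape of the approach.

```python
# Hedged sketch: a data-driven model predicting PUE from sensor readings.
# All data and relationships here are synthetic and illustrative only;
# Gao's real model is a neural network trained on Google's telemetry.
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Hypothetical parameters a data center might log every 30 seconds.
it_load_kw = rng.uniform(800, 1200, n)       # total IT load
outside_temp_c = rng.uniform(5, 35, n)       # outside air temperature
cooling_setpoint_c = rng.uniform(18, 27, n)  # cooling equipment setpoint

# Synthetic "ground truth": PUE rises with outside temperature, dips
# slightly at higher cooling setpoints, plus measurement noise.
pue = (1.10
       + 0.004 * outside_temp_c
       - 0.002 * cooling_setpoint_c
       + 0.00001 * (it_load_kw - 1000)
       + rng.normal(0, 0.005, n))

# Ordinary least squares: the simplest curve fit relating the logged
# operational parameters to holistic energy efficiency.
X = np.column_stack([np.ones(n), it_load_kw, outside_temp_c, cooling_setpoint_c])
coef, *_ = np.linalg.lstsq(X, pue, rcond=None)
pred = X @ coef

r2 = 1 - np.sum((pue - pred) ** 2) / np.sum((pue - pue.mean()) ** 2)
print(f"R^2 on training data: {r2:.3f}")
```

Once fitted, such a model lets an operator ask "what if?" questions (what happens to PUE if the setpoint moves two degrees?) without touching the physical plant, which is the simulation use case Gao describes.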
"Machine learning applications are limited by the quality and quantity of the data inputs. As such, it is important to have a full spectrum of DC operational conditions to accurately train the mathematical model. The model accuracy may decrease for conditions where there is less data. As with all empirical curve fitting, the same predictive accuracy may be achieved for multiple model parameters. It is up to the analyst and DC operator to apply reasonable discretion when evaluating model predictions," Gao wrote.
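That caveat is easy to demonstrate: a model trained on a narrow band of operating conditions can look accurate in-sample yet degrade when asked about conditions it rarely saw. The sketch below is entirely synthetic (an invented temperature/PUE relationship, not Google data) and just illustrates the failure mode Gao warns about.

```python
# Hedged sketch of Gao's caveat: model accuracy may decrease for
# conditions where there is less data. Everything here is synthetic.
import numpy as np

rng = np.random.default_rng(1)

def true_pue(temp_c):
    # Hypothetical, slightly nonlinear temperature/PUE relationship.
    return 1.12 + 0.003 * temp_c + 0.00004 * (temp_c - 20) ** 2

# Train only on mild weather (10-25 C); evaluate on a heat wave (30-35 C).
train_t = rng.uniform(10, 25, 2000)
test_t = rng.uniform(30, 35, 500)
train_y = true_pue(train_t) + rng.normal(0, 0.005, train_t.size)
test_y = true_pue(test_t) + rng.normal(0, 0.005, test_t.size)

# A straight-line fit looks fine in-sample but misses the curvature
# that only matters at temperatures the training data barely covers.
coef = np.polyfit(train_t, train_y, 1)
train_err = np.mean(np.abs(np.polyval(coef, train_t) - train_y))
test_err = np.mean(np.abs(np.polyval(coef, test_t) - test_y))
print(f"mean abs error, well-covered conditions: {train_err:.4f}")
print(f"mean abs error, sparse conditions:       {test_err:.4f}")
```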
Follow Michael Cooney on Twitter: nwwlayer8 and on Facebook