Apr 28, 2014
Google Using Machine Learning to Boost Data Center Efficiency
Google is using machine learning and artificial intelligence to wring even more efficiency out of its mighty data centers.
In a presentation today at Data Centers Europe 2014, Google’s Joe Kava said the company has begun using a neural network to analyze the oceans of data it collects about its server farms and to recommend ways to improve them. Kava is the Internet giant’s vice president of data centers.
In effect, Google has built a computer that knows more about its data centers than even the company’s engineers. The humans remain in charge, but Kava said the use of neural networks will allow Google to reach new frontiers in efficiency in its server farms, moving beyond what its engineers can see and analyze.
Google already operates some of the most efficient data centers on earth. Using artificial intelligence will allow Google to peer into the future and model how its data centers will perform in thousands of scenarios.
In early usage, the neural network has been able to predict Google’s Power Usage Effectiveness (PUE) with 99.6 percent accuracy. Its recommendations have led to efficiency gains that appear small individually, but add up to major cost savings when applied across a data center housing tens of thousands of servers.
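PUE is the industry’s standard efficiency metric: total facility energy divided by the energy actually delivered to the IT equipment, with 1.0 as the theoretical ideal. A minimal sketch of the calculation (the figures below are illustrative only, not Google’s actual numbers):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy divided by the
    energy delivered to IT equipment. 1.0 is the theoretical ideal."""
    return total_facility_kwh / it_equipment_kwh

# Illustrative figures, not Google's actual numbers: a facility drawing
# 112 MWh to deliver 100 MWh to its servers has a PUE of 1.12.
print(pue(112.0, 100.0))  # → 1.12
```

Because the overhead term (cooling, power conversion, lighting) is what a model like Google’s can squeeze, even a shift of a few hundredths in PUE translates into large absolute energy savings at this scale.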
Why turn to machine learning and neural networks? The primary reason is the growing complexity of data centers, a challenge for Google, which uses sensors to collect hundreds of millions of data points about its infrastructure and its energy use.
“In a dynamic environment like a data center, it can be difficult for humans to see how all of the variables interact with each other,” said Kava. “We’ve been at this (data center optimization) for a long time. All of the obvious best practices have already been implemented, and you really have to look beyond that.”
Enter Google’s ‘Boy Genius’
Google’s neural network was created by Jim Gao, an engineer whose colleagues have given him the nickname “Boy Genius” for his prowess in analyzing large datasets. Gao had been doing cooling analysis using computational fluid dynamics (CFD), which uses monitoring data to create a 3D model of airflow within a server room.
Gao thought it was possible to create a model that tracks a broader set of variables, including IT load, weather conditions, and the operations of the cooling towers, water pumps and heat exchangers that keep Google’s servers cool.
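The idea described above is a regression problem: feed a neural network operating variables and train it to predict the resulting PUE. A toy sketch of that approach, using a tiny one-hidden-layer network trained on synthetic data — the input names and the target function are illustrative assumptions, not Gao’s actual model:

```python
import math
import random

random.seed(0)

# Synthetic stand-in for the real relationship; inputs scaled to [0, 1].
# Variable names (IT load, outside temperature, pump speed) are drawn
# from the kinds of factors the article mentions, purely for illustration.
def synthetic_pue(it_load, outside_temp, pump_speed):
    return 1.1 + 0.15 * outside_temp - 0.08 * pump_speed + 0.05 * it_load

data = []
for _ in range(300):
    x = [random.random(), random.random(), random.random()]
    data.append((x, synthetic_pue(*x)))

# One hidden layer with tanh activations, trained by plain SGD.
H, LR = 8, 0.1
w1 = [[random.uniform(-0.5, 0.5) for _ in range(3)] for _ in range(H)]
b1 = [0.0] * H
w2 = [random.uniform(-0.5, 0.5) for _ in range(H)]
b2 = 1.1  # start the output bias near a typical PUE value

def forward(x):
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
         for row, b in zip(w1, b1)]
    return sum(w * hi for w, hi in zip(w2, h)) + b2, h

for epoch in range(200):
    for x, y in data:
        y_hat, h = forward(x)
        err = y_hat - y
        # Backpropagate the squared-error gradient through both layers.
        for j in range(H):
            grad_h = err * w2[j] * (1.0 - h[j] ** 2)
            w2[j] -= LR * err * h[j]
            b1[j] -= LR * grad_h
            for i in range(3):
                w1[j][i] -= LR * grad_h * x[i]
        b2 -= LR * err

# After training, the network's prediction tracks the synthetic PUE.
test_x = [0.5, 0.5, 0.5]
pred, _ = forward(test_x)
print(round(pred, 3), round(synthetic_pue(*test_x), 3))
```

The real system differs in scale rather than kind: hundreds of millions of sensor readings and many more variables, but the same basic pattern of learning a mapping from operating conditions to efficiency, which can then be queried for “what if” scenarios.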