Google has found a novel use for artificial intelligence: lowering the power bill for its datacenters.
To do so, Google turned to DeepMind, the artificial-intelligence subsidiary it acquired in early 2014 for $400 million. Based in London, DeepMind develops machine-learning platforms that have become quite adept at playing games such as Go. Google decided to apply that same technology to refining its datacenter operations.
“By applying DeepMind’s machine learning to our own Google datacenters, we’ve managed to reduce the amount of energy we use for cooling by up to 40 percent,” read a new note on the Google DeepMind blog. “The implications are significant for Google’s datacenters, given its potential to greatly improve energy efficiency and reduce emissions overall.”
As any sysadmin or datacenter pro will tell you, datacenters are highly dynamic environments prone to unexpected events such as server failures. It is difficult for anyone (human or machine) to devise an algorithm or formula that will allow a datacenter’s systems to respond automatically to every scenario.
Recognizing those challenges, the DeepMind team trained its A.I. software on datacenter parameters and operations, including temperature and power fluctuations. “Since our objective was to improve datacenter energy efficiency, we trained the neural networks on the average PUE (Power Usage Effectiveness), which is defined as the ratio of total building energy usage to the IT energy usage,” the blog added. “We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the datacenter over the next hour.”
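The PUE metric itself is straightforward to compute from meter readings. As a quick illustration (the readings below are invented for the example, not figures from Google):

```python
def pue(total_building_kwh: float, it_kwh: float) -> float:
    """Power Usage Effectiveness: total building energy divided by
    IT energy. A PUE of 1.0 would mean every watt reaches the
    servers; real facilities run higher because of cooling and
    electrical losses."""
    if it_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_building_kwh / it_kwh

# Hypothetical hourly readings: 100 kWh to IT equipment, 12 kWh to
# cooling, 2 kWh to power-distribution losses and lighting.
print(round(pue(100 + 12 + 2, 100), 2))  # 1.14
```

The closer that ratio gets to 1.0, the less energy the building spends on anything other than computing.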
With DeepMind’s software trained to take actions that minimize PUE without exceeding operating constraints, the team then set their creation loose in a live datacenter: “Our machine learning system was able to consistently achieve a 40 percent reduction in the amount of energy used for cooling, which equates to a 15 percent reduction in overall PUE after accounting for electrical losses and other non-cooling inefficiencies.”
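How far a cooling cut moves overall PUE depends on how large a share of the building’s load cooling represents. A back-of-the-envelope sketch (the load split below is invented for illustration; Google has not published these particular numbers):

```python
def pue_after_cooling_cut(it_kwh: float, cooling_kwh: float,
                          other_overhead_kwh: float,
                          cooling_reduction: float = 0.40) -> tuple[float, float]:
    """Return (PUE before, PUE after) when the cooling load is cut by
    a given fraction, with IT load and other overhead held constant."""
    before = (it_kwh + cooling_kwh + other_overhead_kwh) / it_kwh
    after = (it_kwh + cooling_kwh * (1 - cooling_reduction)
             + other_overhead_kwh) / it_kwh
    return before, after

# Invented split: 100 kWh IT, 12 kWh cooling, 2 kWh other overhead.
before, after = pue_after_cooling_cut(100.0, 12.0, 2.0)
print(round(before, 3), round(after, 3))  # 1.14 1.092
```

With this split, a 40 percent cooling cut trims the non-IT overhead from 0.14 to 0.092 of the IT load; the exact improvement any real facility sees depends on its own load mix.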
The DeepMind team plans to roll out its system to more datacenters. Does that mean other, non-Google datacenters will adopt similar measures to make their systems more energy-efficient? Given how energy savings translate into significant cost savings, the answer is likely “yes,” especially as A.I. platforms become more widespread. And on a broader level, it will be interesting to see how artificial intelligence is applied to making other kinds of infrastructure more efficient.