Google Uses AI To Cut Energy Used To Cool Its Data Centers

Google uses AI to cool data centers, save energy

A data center is a large group of networked computer servers, typically used by organizations for the remote storage, processing or distribution of large amounts of data. Data centers power most of the services, apps and systems we depend on day to day. However, running thousands of hard drives, processors, networking devices and magnetic tapes takes a real toll on the grid and drags down data centers' energy efficiency. To make things worse, all of that equipment also needs a powerful cooling system to keep it running.

However, Google has found a way to ease that problem. For the last few months, Google's artificial intelligence (AI) division, DeepMind, has been running machine-learning algorithms at two of its data centers, helping the search giant reduce the energy used for cooling by 40 percent and improve overall power usage effectiveness (PUE) by 15 percent.

“We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data center – data such as temperatures, power, pump speeds, setpoints, etc. – and using it to train an ensemble of deep neural networks. Since our objective was to improve data center energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data center over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints,” Google explained in a blog post.
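To make that description more concrete, here is a minimal sketch, not Google's actual system, of the first step: training a small ensemble of neural networks on synthetic sensor telemetry to predict PUE, then averaging the ensemble's outputs to score a candidate operating point. The feature names, model sizes and data below are all illustrative assumptions.

```python
# Illustrative sketch only: a small ensemble of neural networks trained to
# predict PUE from synthetic data-center sensor readings. Feature names,
# model sizes and data are assumptions, not Google's/DeepMind's actual setup.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 5000

# Hypothetical sensor telemetry: outside temperature (°C), IT load (kW),
# chilled-water pump speed (%), and cooling setpoint (°C).
X = np.column_stack([
    rng.uniform(5, 35, n),      # outside_temp_c
    rng.uniform(500, 2000, n),  # it_load_kw
    rng.uniform(30, 100, n),    # pump_speed_pct
    rng.uniform(18, 27, n),     # setpoint_c
])

# Synthetic target: PUE = total facility energy / IT energy (always >= 1).
cooling_kw = 0.2 * X[:, 1] * (1 + 0.03 * (X[:, 0] - 20)) * (X[:, 2] / 100)
y = (X[:, 1] + cooling_kw + 50) / X[:, 1] + rng.normal(0, 0.01, n)

# Train an ensemble of small feed-forward networks with different seeds.
ensemble = [
    make_pipeline(
        StandardScaler(),
        MLPRegressor(hidden_layer_sizes=(64, 32), max_iter=1000,
                     random_state=seed),
    ).fit(X, y)
    for seed in range(5)
]

# Predicted PUE for a candidate operating point = mean of the ensemble outputs.
candidate = np.array([[22.0, 1200.0, 65.0, 23.0]])
pue_pred = np.mean([m.predict(candidate)[0] for m in ensemble])
print(f"Predicted PUE from the ensemble: {pue_pred:.3f}")
```

In the real deployment the inputs came from thousands of physical sensors rather than a random-number generator, but the ensemble-averaging idea is the same.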

The approach resulted in a 40 percent reduction in the amount of energy used for cooling, which translated into a 15 percent reduction in overall PUE after accounting for electrical losses and other non-cooling inefficiencies. The results were so impressive that Google plans to deploy the system across all of its data centers by the end of the year.
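For readers unfamiliar with the metric, a back-of-the-envelope calculation shows how a 40 percent cut in cooling energy can translate into roughly 15 percent of the non-IT overhead that PUE measures. The split between cooling and other overhead below is an assumption, since Google has not published its exact breakdown.

```python
# PUE = total facility energy / IT energy (lower is better; 1.0 would be ideal).
# All figures below are illustrative assumptions, not Google's published numbers.
it_energy = 1.000   # IT equipment energy, normalized
cooling = 0.045     # assumed cooling overhead
other = 0.075       # assumed electrical losses and other non-cooling overhead

pue_before = (it_energy + cooling + other) / it_energy        # 1.120
pue_after = (it_energy + 0.6 * cooling + other) / it_energy   # 1.102 with 40% less cooling

print(f"PUE before: {pue_before:.3f}, after: {pue_after:.3f}")
# The PUE overhead (everything above 1.0) falls from 0.120 to 0.102, about 15%.
print(f"Overhead reduction: {(pue_before - pue_after) / (pue_before - 1):.0%}")
```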

The use of AI technology is “a phenomenal step forward” in the effort to cut energy usage in data centers, DeepMind research engineer Rich Evans and Google data center engineer Jim Gao said on Google's blog.

According to Evans and Gao, the energy reduction was achieved by training DeepMind's self-learning algorithms to predict how hot the data centers would get over the following hour. Armed with those predictions, the cooling systems needed to run only as hard as necessary to keep the servers sufficiently cool. Google's data centers run services such as Search, YouTube and Gmail.
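A rough sketch of that control idea, again with hypothetical models and made-up numbers rather than DeepMind's actual controller: scan candidate cooling settings and pick the cheapest one whose predicted next-hour temperature stays under a safety limit.

```python
# Illustrative predictive-control loop (not DeepMind's actual controller):
# choose the lowest-energy cooling setting whose predicted temperature over
# the next hour stays within a safe operating limit.
TEMP_LIMIT_C = 27.0  # assumed maximum allowable server inlet temperature

def predict_max_temp(cooling_pct: float, it_load_kw: float) -> float:
    """Stand-in for a trained model predicting next-hour peak temperature."""
    # Toy physics: more IT load heats things up, more cooling brings it down.
    return 20.0 + 0.008 * it_load_kw - 0.10 * cooling_pct

def cooling_energy_kw(cooling_pct: float) -> float:
    """Toy cost model: fan/pump power grows roughly with the cube of speed."""
    return 300.0 * (cooling_pct / 100.0) ** 3

def choose_setting(it_load_kw: float) -> float:
    # Scan candidate settings from weakest to strongest cooling and return
    # the first (cheapest) one predicted to stay under the temperature limit.
    for cooling_pct in range(20, 101, 5):
        if predict_max_temp(cooling_pct, it_load_kw) <= TEMP_LIMIT_C:
            return float(cooling_pct)
    return 100.0  # fall back to full cooling if nothing satisfies the limit

load = 1200.0  # kW of IT load (made up)
setting = choose_setting(load)
print(f"Chosen cooling setting: {setting}% "
      f"(~{cooling_energy_kw(setting):.0f} kW of cooling power)")
```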

Using a system of neural networks trained on different operating scenarios and parameters within the data centers, DeepMind has built a more adaptive framework for understanding data center dynamics and optimizing efficiency, according to Evans and Gao.

“The implications are significant for Google’s data centers, given its potential to greatly improve energy efficiency and reduce emissions overall,” Evans and Gao said. “This will also help other companies who run on Google’s cloud to improve their own energy efficiency.”

However, the best thing about the system, according to Evans, is that it can be deployed in other data centers and environments with no changes. It could even be applied to other domains, such as the national energy grid or optimizing water usage.

Google claims that its data centers are already among the most energy-efficient in the world. The company says they use only around 50 percent of the energy consumed by most other data centers of comparable size.

“I really think this is just the beginning. There are lots more opportunities to find efficiencies in data centre infrastructure,” said DeepMind’s co-founder, Mustafa Suleyman. “One of the most exciting things is the kind of algorithms we develop are inherently general … that means the same machine learning system should be able to perform well in a wide variety of environments [such as power generation facilities and energy networks].”

Because the algorithm is a natural fit for many industrial facilities, Suleyman explained, the team is already in talks with interested parties outside of Google.

The team said it plans to release a white paper in the near future describing its results and how the system was built and implemented.

