
Why Google is ready to entrust energy management to AI

The technology has recommended strategies that hadn't occurred to its human counterparts.

This article is drawn from the Energy Weekly newsletter from GreenBiz, running Thursdays.

Our collective fascination with the potential of artificial intelligence is real.

Research out this month from McKinsey suggests that by 2030, 70 percent of businesses will have adopted AI in some form or another — the five most common applications center on imaging, language translation, automation, robotics and advanced machine learning.

While we humans have yet to resolve legitimate questions about data security, privacy, decision bias and so forth, early forms of AI are producing concrete progress on a number of fronts, including applications related to energy management.

Early this week, AutoGrid — an AI company that helps companies such as National Grid, NextEra, Xcel, E.ON, Total and China Light and Power (CLP) optimize their power resources — snagged $32 million in late-stage venture funding. It works with seven of the 10 largest North American utilities. Did I mention it's on pace to have up to 5 gigawatts of power on its platform by the end of 2018? The new investors in this round include CLP, Innogy, Orsted and Tenaska.

Closer to home for most GreenBiz readers, however, are the latest details of Google's ongoing experiments with AI in its massive data center operations. The tech giant shared some of those results in a recent blog post penned by Google data center engineer Amanda Gasparik and two members of the DeepMind AI team, Chris Gamble and Jim Gao.

Perhaps the most dramatic takeaway is this one: The automated energy management recommendations suggested by its AI control systems are helping to drive consistent savings of 30 percent. And its performance improves as the system learns. It has become good enough, in fact, that Google is letting it take action without human intervention.
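For the technically curious, here is a rough sketch, in Python and purely for illustration, of what a closed loop like this could look like: telemetry comes in, a learned model proposes a chilled-water setpoint, and a guardrail check limits the change before anything is applied. Every name, function and number below is hypothetical; this is not Google's or DeepMind's actual code.

```python
# Illustrative only: a toy closed-loop cooling controller.
# All names, limits and the heuristic "model" are hypothetical.
from dataclasses import dataclass

@dataclass
class Telemetry:
    outside_temp_c: float       # current outdoor temperature
    it_load_kw: float           # IT load the cooling plant must reject
    supply_water_temp_c: float  # chilled-water supply temperature right now

class CoolingModel:
    """Stand-in for a learned model that proposes a chilled-water setpoint."""
    def recommend_setpoint(self, t: Telemetry) -> float:
        # A real system would use a trained model; this crude heuristic simply
        # proposes colder water when outdoor conditions allow it.
        return max(6.0, min(12.0, t.outside_temp_c * 0.3 + 5.0))

def clamp_to_safe_range(proposed: float, current: float,
                        lo: float = 5.0, hi: float = 14.0,
                        max_step: float = 1.0) -> float:
    """Guardrail: keep the setpoint within bounds and limit the step size."""
    bounded = max(lo, min(hi, proposed))
    return max(current - max_step, min(current + max_step, bounded))

def control_step(model: CoolingModel, t: Telemetry) -> float:
    proposed = model.recommend_setpoint(t)
    safe = clamp_to_safe_range(proposed, t.supply_water_temp_c)
    # apply_setpoint(safe)  # in a real plant this would command the equipment
    return safe

if __name__ == "__main__":
    t = Telemetry(outside_temp_c=4.0, it_load_kw=900.0, supply_water_temp_c=10.0)
    print(control_step(CoolingModel(), t))  # steps the setpoint down toward 9.0
```

The guardrail is an assumption on my part, but it captures the general idea: "without human intervention" tends to mean software acting within narrow, pre-approved bounds, not software free to do whatever it likes.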

"It was amazing to see the AI learn to take advantage of winter conditions and produce colder than normal water, which reduces the energy required for cooling within the data center," noted another Google data center engineer quoted in the post. "Rules don't get better over time, but AI does."

I asked the Google team whether it was surprised by the results. In written responses to my questions, Gao told me that engineers hadn't thought of some of the processes the AI suggested, such as spreading cooling measures over multiple pieces of equipment for a bigger impact, or reducing the water flow that helps pull heat out of the cooling systems. (Conventional logic would suggest increasing the flow.)
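To see how an optimizer can land on counterintuitive settings like these, consider a toy example: score every candidate combination of water flow and number of active chillers with a cost model and pick the cheapest, even when it isn't the setting a rule of thumb would choose. The cost function here is invented for illustration; in a real system those trade-offs would come from a model trained on plant data.

```python
# Illustrative only: exhaustive search over toy cooling configurations.
# The cost model is invented for this sketch; real systems learn it from data.
from itertools import product

def predicted_power_kw(water_flow_lps: float, active_chillers: int) -> float:
    """Hypothetical cost model: pumping power rises with flow, while several
    chillers at partial load can beat one chiller running near its limit."""
    pumping = 0.08 * water_flow_lps ** 1.5
    chiller_load = 300.0 / active_chillers                    # load per chiller
    chiller = active_chillers * (20.0 + 0.002 * chiller_load ** 2)
    return pumping + chiller

candidates = product([40.0, 60.0, 80.0, 100.0],  # water flow (liters/second)
                     [1, 2, 3, 4])               # chillers running

best = min(candidates, key=lambda c: predicted_power_kw(*c))
print(best, round(predicted_power_kw(*best), 1))
```

Run it and the search settles on the lowest water flow with three chillers running, the same flavor of answer Gao described: spread the work across more equipment and pump less water.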

This isn't some theoretical science experiment, by the way. Gao told me that the system is "currently live in multiple Google data centers with plans to roll out further."

When can your data center geeks get their hands on this technology? Not any time soon. "We can see the appeal of this, and we think there's potential for the system to be applied to other industrial systems outside of Google's data centers to help reduce CO2 emissions on an even grander scale," Gao wrote me. "However, our immediate focus is on fine-tuning the system within Google's data centres and rolling it out to more sites."

If you're interested in the role of artificial intelligence in sustainability, check out the agenda for VERGE 18, Oct. 16-18, in Oakland, California.
