
AI is forcing data centers to use more energy. Software can help

Tech giants are vying to stave off a carbon time bomb caused by the huge data centers they are building all over the world.

A technique developed by Google is becoming increasingly important as power-hungry artificial intelligence comes online: software searches for clean electricity in parts of the world with excess sun and wind on the grid, then ramps up data center operations there. That could cut both CO2 emissions and costs.

“There is an urgent need to figure out how to operate data centers in a way that maximizes the use of renewable energy,” said Chris Noble, co-founder and CEO of Cirrus Nexus, a cloud-computing manager that uses data centers run by Google, Microsoft and Amazon.

The climate risks posed by AI-driven computing are far-reaching, and they will only get worse without a major shift from fossil fuel-based electricity to clean power. Nvidia Corp. Chief Executive Officer Jensen Huang has said AI has reached a “tipping point.” He also said the cost of data centers will double within five years to fuel the rise of new software.

Data centers and transmission networks each already account for up to 1.5% of global electricity consumption, according to the International Energy Agency. Together, they are responsible for emitting about as much carbon dioxide annually as Brazil.

Hyperscalers – as the largest data center owners such as Google, Microsoft and Amazon are known – have all set climate goals and are under internal and external pressure to achieve them. These ambitious goals include decarbonizing their operations.

But the rise of AI is already putting those goals at risk. Graphics processors, which have been key to the rise of large language models, consume more power than the central processors used in other forms of computing. Training a single AI model uses more electricity than 100 households consume in a year, the IEA estimates.

“The growth of AI far exceeds the ability to generate the clean electricity needed to power it,” Noble said.

Additionally, AI's energy consumption is volatile, resembling a sawtooth graph more than the smooth line most data center operators are accustomed to. That makes decarbonization harder, not to mention maintaining grid stability.

The growth of AI is being driven by North American companies, keeping computing power – and energy consumption – concentrated there, said Dave Sterlace, account director for global data centers at Hitachi Energy. It's a trend he didn't expect two years ago.

To reduce data centers' carbon emissions, hyperscalers and other data center providers have financed large numbers of solar and wind farms and used credits to offset emissions. (Some of those credits have failed to have a significant impact on emissions.)

But that alone will not be enough, especially as the use of AI grows. That is why operators are turning to a strategy pioneered by Alphabet Inc.'s Google called load shifting. The idea: cut emissions by changing how and where data centers do their work.

Today, most data centers strive to operate in a “steady state,” keeping their energy consumption fairly stable. That leaves them at the mercy of the grid they are connected to and its current mix of natural gas, nuclear and renewable generation, since transmission lines do not connect the regions. To reduce their reliance on dirtier grids, tech giants are looking for ways to shift data center operations around the world on a daily or even hourly basis to absorb excess renewable energy production.
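In code, that kind of carbon-aware placement can be expressed very simply. The sketch below is a minimal, hypothetical illustration: the Region class, the pick_region helper and the sample intensity figures are assumptions made for this example, not anything Google or Cirrus Nexus has published, and a real scheduler would pull live grid-emissions data and weigh latency, capacity and data-residency constraints.

```python
# Minimal sketch of carbon-aware load shifting: route deferrable batch work to
# whichever region currently reports the lowest grid carbon intensity.
# Region names and figures are illustrative placeholders, not real-time data.

from dataclasses import dataclass


@dataclass
class Region:
    name: str
    carbon_intensity_g_per_kwh: float  # grams of CO2 per kWh on the local grid right now
    spare_capacity_kw: float           # headroom available for extra compute load


def pick_region(regions: list[Region], required_kw: float) -> Region:
    """Choose the cleanest region that can absorb the extra load."""
    candidates = [r for r in regions if r.spare_capacity_kw >= required_kw]
    if not candidates:
        raise RuntimeError("no region can absorb the requested load right now")
    return min(candidates, key=lambda r: r.carbon_intensity_g_per_kwh)


if __name__ == "__main__":
    snapshot = [
        Region("netherlands", carbon_intensity_g_per_kwh=120.0, spare_capacity_kw=800.0),
        Region("california", carbon_intensity_g_per_kwh=95.0, spare_capacity_kw=500.0),
        Region("virginia", carbon_intensity_g_per_kwh=340.0, spare_capacity_kw=2000.0),
    ]
    target = pick_region(snapshot, required_kw=400.0)
    print(f"shift batch workload to {target.name}")
```

The capacity check matters as much as the carbon ranking: shifting load only works if the destination grid actually has surplus clean power to soak up.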

Google was the first to try matching its electricity consumption at certain data centers with carbon-free power on an hourly basis, so that its machines run on clean energy 24 hours a day. No one has fully achieved that goal yet. And shifting load around the world could be complicated by countries pushing data sovereignty policies that seek to restrict and protect the flow of data across borders. But what Cirrus Nexus and Google are testing could still be a crucial piece of the puzzle for reducing emissions.

Manhattan-based Cirrus Nexus scours the world's power grids and measures emissions in five-minute increments to find the least polluting computing resources for itself and its customers in industries ranging from pharmaceuticals to accounting. The company had the opportunity to put this search into action last summer.

The Netherlands experienced its sunniest June on record, pushing down the cost of solar power on the grid and making it cheaper and less carbon-intensive to run servers there. Cirrus Nexus then shifted its computing workloads to California as soon as the sun set in the Netherlands, letting it tap solar power just coming online for the day in the Golden State.

By chasing the sun from Europe to the US West Coast and back again, the company cut computing emissions for certain workloads by 34% for itself and its customers, compared with relying solely on servers in either location, according to company data shared with Bloomberg Green. That kind of flexibility brings both advantages and risks.
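The sun-chasing pattern described above amounts to re-evaluating placement on a short cadence and migrating only when the savings justify the move. The sketch below is purely illustrative, assuming a five-minute carbon-intensity feed with made-up numbers; the migration threshold is a hypothetical guard against needless churn, not a documented Cirrus Nexus parameter.

```python
# Illustrative "follow the sun" loop: every five minutes, compare grid carbon
# intensity (gCO2/kWh) across regions and move the workload only when the
# cleaner region beats the current one by a meaningful margin, since each
# migration itself carries cost and operational risk. All values are invented.

MIGRATION_THRESHOLD_G_PER_KWH = 50.0  # only move if savings exceed this margin


def choose_placement(current: str, readings: dict[str, float]) -> str:
    """Return the region the workload should run in after this five-minute check."""
    cleanest = min(readings, key=readings.get)
    if cleanest == current:
        return current
    savings = readings[current] - readings[cleanest]
    return cleanest if savings >= MIGRATION_THRESHOLD_G_PER_KWH else current


if __name__ == "__main__":
    # Simulated five-minute snapshots: Dutch solar fades in the evening while
    # Californian solar ramps up, so the workload hops west during the day.
    snapshots = [
        {"netherlands": 90.0, "california": 400.0},   # midday sun in NL
        {"netherlands": 180.0, "california": 250.0},  # NL sun fading, not worth moving yet
        {"netherlands": 320.0, "california": 110.0},  # CA solar online, migrate
    ]
    placement = "netherlands"
    for i, reading in enumerate(snapshots):
        placement = choose_placement(placement, reading)
        print(f"interval {i}: run in {placement}")
```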

The ability to target carbon-free megawatts can help reduce strain on grids, such as during a heat wave or a cold winter storm. But data centers must coordinate with utilities and grid operators, because large swings in demand can disrupt power systems and raise the likelihood of blackouts. Dominion Energy, which is seeing demand from data centers rise rapidly, is working on a program at its Virginia utility that would use load shifting at data centers to ease strain on the grid during extreme weather.

In recent years, Google and Amazon have been testing load shifting at data centers for their own operations and for customers of their cloud services. (Cirrus Nexus, for example, uses cloud services from Amazon, Microsoft and Google.) In Virginia, Microsoft signed a contract with Constellation Energy Corp. that guarantees more than 90% of the electricity for its data center in the region will come from carbon-free sources. Reaching 100%, however, remains a daunting goal for Microsoft and other hyperscalers.

Google's data centers run on carbon-free energy about 64% of the time, with 13 of its regional sites achieving 85% and seven globally at just over 90%, said Michael Terrell, who leads Google's 24/7 carbon-free energy strategy.

“But if you don’t displace fossil resources, you won’t fully meet your climate goals,” Terrell said.
