As tech giants race to build large-scale data centers around the world, the resulting “carbon time bomb” has become a major concern.
With an increasing number of power-hungry artificial intelligence (AI) systems coming online, a technology pioneered by Google is gaining more attention: using software to locate clean electricity in regions where solar and wind energy are abundant, and then increasing data center computing loads there. This approach can reduce carbon emissions and costs.
Chris Noble, co-founder and CEO of cloud computing company Cirrus Nexus, says it is urgent to figure out how to run data centers in a way that maximizes the use of renewable energy.
The company manages workloads running in data centers owned by Google, Microsoft, and Amazon, with the goal of reducing carbon emissions.
Concerns over AI Energy Consumption
The climate risks posed by the computational demands of artificial intelligence are profound, and they will worsen unless power generation shifts from fossil fuels to clean energy. Jensen Huang, CEO of NVIDIA, says artificial intelligence has reached a “tipping point.”
He also states that the cost of data centers will double within five years.
According to the International Energy Agency, data centers and transmission networks each account for about 1.5% of global electricity consumption. Together they emit roughly as much carbon dioxide each year as Brazil.
Google, Microsoft, and Amazon, the world’s largest data center owners, have set ambitious climate goals, including decarbonizing their operations, and face internal and external pressure to achieve them.
However, the rise of artificial intelligence has made those goals significantly harder to reach. Graphics processing units, which are key to the rise of large language models, consume more power than the central processing units used in other forms of computing. According to the International Energy Agency, training a single artificial intelligence model consumes more electricity in a year than 100 households do.
Noble says, “The growth of artificial intelligence far outpaces humanity’s ability to produce clean energy for it.”
Furthermore, the energy consumption of artificial intelligence is volatile, tracing a sawtooth pattern rather than the smooth line most data center operators are accustomed to. That volatility makes decarbonization harder, to say nothing of keeping the grid stable.
Dave Sterlace, Global Director of Data Center Customers at Hitachi Energy, says the growth of artificial intelligence is being driven by North American companies, which concentrates both computing power and energy use there. This is a trend he didn’t anticipate two years ago.
Strategies for Addressing the Issue
To cut their data centers’ carbon dioxide emissions, the major providers have funded numerous solar and wind power plants and purchased carbon credits to offset what remains.
However, this alone is not enough, especially as the use of artificial intelligence grows. That is why operators are turning to the “load shifting” strategy adopted by Alphabet Inc.’s Google: reducing emissions by changing how and where data centers run their workloads.
Today, most data centers aim to operate in a “steady state,” keeping their energy consumption relatively stable. That leaves them at the mercy of the local grid, whether its power comes from natural gas, nuclear, or renewables, especially given the shortage of transmission lines between regions. To reduce this dependence, tech giants are looking to shift data center operations around the globe on a daily or even hourly basis to soak up surplus renewable energy.
Google is pioneering attempts to match some data centers with zero-carbon electricity hour by hour, so that its machines run on clean energy around the clock. That goal has not yet been fully achieved. Moreover, shifting loads globally may be complicated by the restrictions and data sovereignty policies some countries impose on cross-border data flows.
However, the approach pursued by Cirrus Nexus and Google could still be a crucial piece of the decarbonization puzzle. Cirrus Nexus scans grids worldwide, measuring carbon intensity every five minutes to find the lowest-emission computing resources for itself and its clients. Last summer, the company put this search into practice.
At the time, the Netherlands was experiencing its sunniest June on record, pushing down the cost of solar power on the grid and making it both cheaper and cleaner to run servers there. When the sun set in the Netherlands, Cirrus Nexus shifted its computing load to California to take advantage of the solar energy that had just come online there.
According to the latest data, by tracking the sun from Europe to the US West Coast and back, the company reduced emissions from certain workloads for itself and its clients by 34%. That kind of flexibility brings both benefits and risks.
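The load-shifting idea described above can be sketched in a few lines. The following is a minimal illustration, not Cirrus Nexus’s or Google’s actual system: the region names, intensity figures, and the threshold rule for avoiding needless migrations are all assumptions made for the example.

```python
# Illustrative sketch of carbon-aware load shifting. Grid carbon
# intensities (gCO2/kWh) would come from a live feed in practice;
# the values here are invented.

def pick_greenest_region(intensities):
    """Return the region whose grid currently has the lowest carbon intensity."""
    return min(intensities, key=intensities.get)

def shift_load(current_region, intensities, threshold=0.9):
    """Move the workload only if the best region is meaningfully cleaner
    (below `threshold` times the current region's intensity), so small
    fluctuations don't cause constant migration."""
    best = pick_greenest_region(intensities)
    if intensities[best] < threshold * intensities[current_region]:
        return best
    return current_region

# Snapshot of a sunny afternoon in Europe while it is still night in the US:
snapshot = {"netherlands": 120.0, "california": 310.0, "virginia": 400.0}
print(shift_load("california", snapshot))  # workload moves to the cleaner grid
```

A real scheduler would re-evaluate a snapshot like this every few minutes, as the article describes, and would also weigh migration costs and data-residency rules before moving anything.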
Michael Terrell, head of Google’s 24/7 (round-the-clock) carbon-free energy strategy, says Google’s data centers run on carbon-free energy about 64% of the time, with 13 regional sites achieving 85% and 7 data centers worldwide slightly over 90%.
“But if you’re not replacing fossil assets, then you’re not going to fully achieve your climate goals,” he says.
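The percentages Terrell cites reflect hourly matching: each hour of consumption counts only to the extent that carbon-free generation covers it in that same hour, with no offsetting across hours. A minimal sketch of such a score, using made-up numbers, might look like this:

```python
# Sketch of an hourly carbon-free energy (CFE) score. For each hour,
# only the portion of load actually covered by carbon-free supply counts;
# surplus clean energy in one hour cannot offset a deficit in another.

def cfe_score(consumption_kwh, carbon_free_kwh):
    """Average fraction of hourly load met by carbon-free energy."""
    covered = sum(min(load, clean)
                  for load, clean in zip(consumption_kwh, carbon_free_kwh))
    return covered / sum(consumption_kwh)

# Four illustrative hours: clean supply covers the load around midday
# but falls short in the morning and evening.
load  = [100, 100, 100, 100]
clean = [ 40, 120, 100,  20]
print(f"{cfe_score(load, clean):.0%}")  # 65%
```

Under annual offsetting the example above would count as fully matched, since total clean generation (280 kWh) is not far from total load (400 kWh) and could be topped up with credits; hourly matching exposes the gap, which is why the 24/7 goal is so much harder than buying offsets.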