Right now, generative artificial intelligence is impossible to ignore online. An AI-generated summary may randomly appear at the top of the results whenever you do a Google search. Or you might be prompted to try Meta’s AI tool while browsing Facebook. And that ever-present sparkle emoji continues to haunt my dreams.
This rush to add AI to as many online interactions as possible can be traced back to OpenAI’s boundary-pushing release of ChatGPT late in 2022. Silicon Valley soon became obsessed with generative AI, and nearly two years later, AI tools powered by large language models permeate the online user experience.
One unfortunate side effect of this proliferation is that the computing required to run generative AI systems is far more resource intensive than what powers conventional web services. The result is the arrival of the internet’s hyper-consumption era, a period defined by the spread of a new kind of computing that demands excessive amounts of electricity and water to build as well as to operate.
“In the back end, these algorithms that need to be running for any generative AI model are fundamentally very, very different from the traditional kind of Google Search or email,” says Sajjad Moazeni, a computer engineering researcher at the University of Washington. “For basic services, those were very light in terms of the amount of data that needed to go back and forth between the processors.” In comparison, Moazeni estimates generative AI applications are around 100 to 1,000 times more computationally intensive.
The technology’s energy needs for training and deployment are no longer generative AI’s dirty little secret, as expert after expert last year predicted surges in energy demand at data centers where companies work on AI applications. Almost as if on cue, Google recently stopped considering itself to be carbon neutral, and Microsoft may trample its sustainability goals underfoot in the ongoing race to build the biggest, bestest AI tools.
“The carbon footprint and the energy consumption will be linear to the amount of computation you do, because basically these data centers are being powered proportional to the amount of computation they do,” says Junchen Jiang, a networked systems researcher at the University of Chicago. The bigger the AI model, the more computation is often required, and these frontier models are getting absolutely gigantic.
Even though Google’s total energy consumption doubled from 2019 to 2023, Corina Standiford, a spokesperson for the company, said it would not be fair to state that Google’s energy consumption spiked during the AI race. “Reducing emissions from our suppliers is extremely challenging, which makes up 75 percent of our footprint,” she says in an email. The suppliers Google points to include manufacturers of servers, networking equipment, and other technical infrastructure for its data centers, whose energy-intensive production is required to create the physical hardware behind frontier AI models.
Despite an upward trend in energy needs at data centers, they still account for a small percentage of the energy humans use overall. Fengqi You, an energy systems engineering researcher at Cornell, points to oil refineries, buildings, and transportation as more impactful at the present moment. “Those sectors use much more energy compared to AI data centers right now,” he says. Even so, AI’s energy footprint could continue to grow in the near future, as generative AI tools are integrated into more corners of the internet and adopted by more users online.
Moist Egregious
In addition to high levels of energy usage, the data centers that train and operate generative AI models consume millions of gallons of water.
“The water that is available for people to use is very limited. It’s just the fresh surface water and groundwater. Those data centers, they’re just evaporating water into the air,” says Shaolei Ren, a responsible AI researcher at UC Riverside and coauthor of “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models.”
Though the two may seem similar at first glance, the environmental impact of companies operating giant data centers is not comparable to that of residents who may take multiple bubble baths a week or leave the faucet running while they brush their teeth. “They're different from normal, residential users. When we get the water from the utility, and then we discharge the water back to the sewage immediately, we are just withdrawing water—we're not consuming water,” Ren says. “A data center takes the water from this utility, and they evaporate the water into the sky, into the atmosphere.” He says the water consumed by data centers may not return to the earth’s surface until a year later.
Alistair Speirs, a senior director of Azure global infrastructure at Microsoft, says in an email that AI is contributing to data center growth, and points out that the transition to cloud computing is also a major factor worth considering. “That can make the growth seem quite fast, when much of it is replacing hardware previously operated on-premises,” he says. Speirs says Microsoft is aiming to meet its goal of being carbon negative, water positive, and zero waste by the end of the decade.
Fengqi You, the researcher from Cornell, also emphasizes the importance of continuing the transition to renewable energy sources, though he questions the efficacy of companies relying on carbon offset plans as part of their sustainability efforts. “Offsetting is a temporary solution, which is better than nothing, but it's definitely not an ultimate solution,” he says. Ren feels similarly about water replenishment efforts: better than no action, but still an insufficient measure. He argues that large companies should pay more attention to the water footprint of their supply chains, not just their direct consumption.
Of course, Google and Microsoft are not the only big contenders in the AI race. When contacted via email, Melanie Roe, a spokesperson for Meta, asked for more information about this story but did not respond to further messages. OpenAI did not reply to requests for comment.
Power Players
Technology companies often position AI development not as an environmental bane but as part of the climate solution and as critical to innovation. In an effort to reduce AI’s immediate impact as well as its cost, researchers and developers are exploring ways to lower the energy needed to create AI tools, such as relying on more efficient hardware chips. They are also experimenting with smaller AI models that require less computation.
Going beyond environmental concerns, these data centers have the potential to overwhelm local power grids with their energy needs. “In Washington, there's a Microsoft data center building in Quincy,” says Moazeni. “I know there's a lot of concern that the power they are burning is basically sucking up all of the energy in that area.” Around the world, the server farms that train and operate AI models may compete with local residents and businesses for power—possibly leading to blackouts during peak times.
Bobby Hollis, a vice president of energy at Microsoft, says in an email that the company works with the appropriate authorities and utilities to avoid impacting local services. He claims Microsoft builds supporting infrastructure to avoid any utility service reductions for residents.
Users trying to be conscientious about their energy consumption may find themselves at a loss. Even if you don’t seek out generative AI tools, they can be hard to avoid now that they are included as default features in operating systems, web apps, and everyday software. Whether you’re logging into an online work portal or just using the internet to connect with friends, it’s almost impossible to click around without seeing multiple chatbots offering summaries of information and promising productivity gains.
And although AI already feels ubiquitous, it will continue to creep into more of our online lives. As it does, the upper limits of its energy usage and water consumption remain to be seen.