Researcher Jesse Dodge did some back-of-the-napkin math on the amount of energy AI chatbots use.

“One query to ChatGPT uses approximately as much electricity as could light one light bulb for about 20 minutes,” he says. “So, you can imagine with millions of people using something like that every day, that adds up to a really large amount of electricity.”

He’s a senior research analyst at the Allen Institute for AI and has been studying how artificial intelligence consumes energy. To generate its answers, AI uses far more power than traditional internet activities, such as search queries or cloud storage. According to a report by Goldman Sachs, a ChatGPT query needs nearly 10 times as much electricity as a Google search query.

And as AI gets more sophisticated, it needs more energy. In the U.S., a majority of that energy comes from burning fossil fuels like coal and gas, which are primary drivers of climate change.

Most companies working on AI, including ChatGPT maker OpenAI, don’t disclose their emissions. But, last week, Google released a new sustainability report with a glimpse at this data. Deep within the 86-page report, Google said its greenhouse gas emissions last year were 48% higher than in 2019. It attributed that surge to its data center energy consumption and supply chain emissions.

“As we further integrate AI into our products, reducing emissions may be challenging,” the report reads.

Google declined an interview with NPR.

"Bigger and bigger data centers all the way up to supercomputers"

Google has the goal of reaching net-zero emissions by 2030. Since 2007, the company has said its operations were carbon neutral because of the carbon offsets it bought to match its emissions.

But in its sustainability report, Google wrote that, starting in 2023, it was no longer "maintaining operational carbon neutrality." The company says it's still pushing for its net-zero goal in 2030.

“Google's real motivation here is to build the best AI systems that they can,” Dodge says. “And they're willing to pour a ton of resources into that, including things like training AI systems on bigger and bigger data centers all the way up to supercomputers, which incurs a tremendous amount of electricity consumption and therefore CO2 emissions.”

Microsoft has taken its climate pledge one step further than Google, saying it will be carbon negative by 2030. But it, too, is facing setbacks because of its focus on AI. In its sustainability report released in May, Microsoft said its emissions grew by 29% since 2020 due to the construction of more data centers that are “designed and optimized to support AI workloads.”

“The infrastructure and electricity needed for these technologies create new challenges for meeting sustainability commitments across the tech sector,” the report reads.

A company spokesperson declined to comment further.

AI’s deep thirst for energy

AI requires computing power from thousands of servers housed in data centers, and those data centers need massive amounts of electricity to meet that demand.

Northern Virginia has become a hub for the burgeoning data center industry. By 2030, the data centers in that corner of the state will need enough energy to power the equivalent of 6 million homes, according to The Washington Post.

The thirst for electricity nationwide has become so intense that plans to decommission several coal plants have been delayed, according to another report by The Washington Post.

“There's a whole material infrastructure that needs to be built to support AI,” says Alex Hanna, the director of research at the Distributed AI Research Institute. She worked on Google’s Ethical AI team but left the company in 2022 over the handling of a research paper that highlighted the environmental costs of AI.

Hanna says the data center boom will continue to grow “as long as there are these organizations that are committed to going whole hog on AI.”

Goldman Sachs has researched the expected growth of data centers in the U.S. and estimates they’ll be using 8% of total power in the country by 2030, up from 3% in 2022. Company analysts say “the proliferation of AI technology, and the data centers necessary to feed it” will drive a surge in power demand “the likes of which hasn’t been seen in a generation.”

Currently, there are more than 7,000 data centers worldwide, according to Bloomberg. That’s up from 3,600 in 2015. Bloomberg estimates that, combined, these data centers consume as much electricity per year as the entire country of Italy.

"AI-first" world

All major tech companies are going full throttle on AI. Alphabet CEO Sundar Pichai has dubbed Google an “AI-first” company. Over the last few months, the company released its Gemini chatbot to the world and added its AI Overviews tool to Google Search. Facebook parent Meta has added chatbots to several of its products. And Apple announced a partnership with OpenAI last month to bring AI to its Siri digital assistant.

During their first-quarter earnings calls, all of these companies said they were investing billions of dollars in AI.

Google said it spent $12 billion on capital expenditures just that quarter, which was “driven overwhelmingly” by investments in data centers to fuel its AI endeavors. The company said it expects to keep up that same level of spending throughout the year.

Hanna, the AI researcher, says the environmental costs of artificial intelligence are only going to get worse unless there’s serious intervention.

“There's a lot of people out there that talk about existential risk around AI, about a rogue thing that somehow gets control of nuclear weapons or whatever,” Hanna says. “That's not the real existential risk. We have an existential crisis right now. It's called climate change, and AI is palpably making it worse.”

Editor's note: Google and Microsoft are among NPR's financial supporters.

Copyright 2024 NPR

Transcript

SACHA PFEIFFER, HOST:

Every major tech company is now working on artificial intelligence, but AI uses a ton of energy, and that spikes emissions that contribute to climate change. NPR's tech correspondent, Dara Kerr, reports.

DARA KERR, BYLINE: Here's an amazing stat from a research analyst at the Allen Institute for AI.

JESSE DODGE: One query to ChatGPT uses approximately as much electricity as could light one lightbulb for about 20 minutes.

KERR: That's Jesse Dodge, and he says every single question we ask an AI chatbot is routed to a data center.

DODGE: And then they feed that to their AI system.

KERR: When this process happens, a lot of energy is consumed. AI uses far more electricity from those data centers than traditional internet use, like posting on social media or storing our photos in the cloud. A majority of that electricity involves burning fossil fuels.

DODGE: You can imagine, with millions of people using something like ChatGPT every day, that adds up to a really large amount of electricity.

KERR: This proliferation of people using AI has fueled the growth of data centers. According to Bloomberg, the number of data centers worldwide has nearly doubled in the last 10 years, and when combined, they consume as much electricity per year as the entire country of Italy. Dodge says this surge in energy consumption is growing, especially for tech companies. Google alone says it's planning to spend billions of dollars this year on new data centers.

DODGE: Google's real motivation here is to build the best AI systems that they can, including things like training AI systems on bigger and bigger data centers - all the way up to supercomputers - which incurs a tremendous amount of electricity consumption and, therefore, CO2 emissions.

KERR: These emissions contribute to climate change, and Google notes a stunning revelation deep within its new, 86-page sustainability report - the company's total greenhouse gas emissions increased nearly 50% over the last five years. It says that's in a large part due to its growing AI push.

ALEX HANNA: There's a lot of people out there that talk about existential risk around AI, about a rogue AI thing that somehow gets control of nuclear weapons or whatever. That's not the real existential risk. We have an existential risk right now. It's called climate change, and AI is palpably making it worse.

KERR: Alex Hanna used to work on Google's Ethical AI team. She left the company over the handling of a research paper that highlighted the environmental costs of AI. Hanna now works at the Distributed AI Research Institute. In its report, Google says that as it continues to add more AI into its products, that, quote, "reducing emissions may be challenging." Google declined an interview with NPR. This spike in greenhouse gas emissions is a big change for Google, which has an ambitious climate pledge to reach net-zero emissions by 2030.

(SOUNDBITE OF ARCHIVED RECORDING)

SUNDAR PICHAI: We have until 2030 to chart a sustainable course for our planet or face the worst consequences of climate change.

KERR: That's Google CEO Sundar Pichai speaking in a company video, but Hanna, the AI researcher, says the company's new report is telling.

HANNA: They turned around and said, Well, JK - we actually missed our carbon goals.

KERR: Microsoft has also reported surging emissions because of data centers. It says its greenhouse gas emissions are up nearly 30% since 2020, but the majority of companies working on AI have not disclosed this data, so researchers say it's hard to even know how big the problem really is. Dara Kerr, NPR News.

(SOUNDBITE OF KV'S "RUN")

PFEIFFER: And a note - NPR gets financial support from both Google and Microsoft, although we cover them the same way as we would if they didn't support us.

Transcript provided by NPR, Copyright NPR.
