ChatGPT's Energy Use Lower Than Expected

April 10, 2025

ChatGPT, the chatbot from OpenAI, might not be the energy guzzler we thought it was. But its energy use can vary a lot depending on how it's used and which AI models answer the questions, according to a new study.

Epoch AI, a nonprofit research group, took a crack at figuring out how much juice a typical ChatGPT query uses. You might have heard that ChatGPT needs about 3 watt-hours to answer a single question, which is 10 times more than a Google search. But Epoch thinks that's a bit of an overstatement.

Using OpenAI's latest default model, GPT-4o, as a benchmark, Epoch found that the average ChatGPT query actually uses around 0.3 watt-hours. That's less than what many household appliances need.
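To put that figure in perspective, here's a rough back-of-the-envelope sketch. The 0.3 Wh and 3 Wh figures come from the article; the 100-queries-per-day heavy user and the 10 W LED bulb are illustrative assumptions, not anything Epoch measured.

```python
# Rough scale check for Epoch's 0.3 Wh-per-query estimate.
# Assumed values: 100 queries/day (a hypothetical heavy user)
# and a 10 W LED bulb as the comparison appliance.

EPOCH_WH_PER_QUERY = 0.3   # Epoch AI's estimate for GPT-4o
OLD_WH_PER_QUERY = 3.0     # the widely cited older estimate
LED_BULB_WATTS = 10

queries_per_day = 100

daily_wh = queries_per_day * EPOCH_WH_PER_QUERY
bulb_hours = daily_wh / LED_BULB_WATTS

print(f"{queries_per_day} queries/day is about {daily_wh:.0f} Wh")
print(f"That's a {LED_BULB_WATTS} W LED bulb running {bulb_hours:.1f} hours")
print(f"The older estimate is {OLD_WH_PER_QUERY / EPOCH_WH_PER_QUERY:.0f}x higher")
```

Even at a hundred queries a day, the daily total lands around 30 Wh, which is why Epoch compares it favorably to everyday appliances.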

"The energy use is really not a big deal compared to using normal appliances or heating or cooling your home, or driving a car," Joshua You, the data analyst at Epoch who did the analysis, told TechCrunch.

The energy use of AI, and its impact on the environment, is a hot topic as AI companies are looking to expand their data centers like crazy. Just last week, over 100 organizations signed an open letter urging the AI industry and regulators to make sure new AI data centers don't drain natural resources and force utilities to rely on nonrenewable energy sources.

You told TechCrunch that his analysis was sparked by what he saw as outdated research. He pointed out that the report that came up with the 3 watt-hours estimate assumed OpenAI was using older, less-efficient chips to run its models.

[Chart: Epoch AI's estimate of ChatGPT energy consumption. Image credits: Epoch AI]

"I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really accurately describe the energy that was going to AI today," You said. "Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high."

Sure, Epoch's 0.3 watt-hours figure is just an estimate, too. OpenAI hasn't shared the details needed to make a precise calculation.

The analysis also doesn't take into account the extra energy costs from ChatGPT features like image generation or processing long inputs. You admitted that "long input" ChatGPT queries — like those with long files attached — probably use more electricity upfront than a typical question.

You said he does expect the baseline power consumption of ChatGPT to go up, though.

"[The] AI will get more advanced, training this AI will probably require much more energy, and this future AI may be used much more intensely — handling much more tasks, and more complex tasks, than how people use ChatGPT today," You said.

While there have been some cool breakthroughs in AI efficiency lately, the scale at which AI is being used is expected to drive huge, power-hungry infrastructure growth. In the next two years, AI data centers might need nearly all of California's 2022 power capacity (68 GW), according to a Rand report. By 2030, training a frontier model could need power equivalent to eight nuclear reactors (8 GW), the report predicted.
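A quick sanity check on the Rand figures above. The 68 GW and 8 GW numbers come from the report as cited here; the roughly 1 GW output of a large nuclear reactor is a common rule of thumb assumed for the comparison.

```python
# Back-of-the-envelope on the Rand report's projections.
# Assumed: ~1 GW per large nuclear reactor (rough rule of thumb).

CALIFORNIA_2022_GW = 68    # California's 2022 power capacity, per Rand
FRONTIER_TRAINING_GW = 8   # projected 2030 frontier training run, per Rand
REACTOR_GW = 1             # typical large reactor output, approximate

reactors_needed = FRONTIER_TRAINING_GW / REACTOR_GW
share_of_california = FRONTIER_TRAINING_GW / CALIFORNIA_2022_GW

print(f"One 2030 training run is roughly {reactors_needed:.0f} reactors' worth")
print(f"That's about {share_of_california:.0%} of California's 2022 capacity")
```

The arithmetic is trivial, but it shows why the report frames a single training run in units of nuclear reactors: 8 GW is already more than a tenth of an entire state's capacity.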

ChatGPT alone reaches a ton of people, and that number's only growing, which means its server demands are massive. OpenAI, along with several investment partners, plans to spend billions on new AI data center projects over the next few years.

OpenAI's focus, along with the rest of the AI industry, is shifting to reasoning models. These models can do more tasks but need more computing power to run. Unlike models like GPT-4o, which answer queries almost instantly, reasoning models "think" for seconds to minutes before answering, which uses up more computing power — and thus more energy.

"Reasoning models will increasingly take on tasks that older models can't, and generate more [data] to do so, and both require more data centers," You said.

OpenAI has started to release more power-efficient reasoning models like o3-mini. But it seems unlikely, at least right now, that these efficiency gains will offset the increased power demands from reasoning models' "thinking" process and the growing use of AI around the world.

You suggested that if you're worried about your AI energy footprint, you should use apps like ChatGPT less often, or choose models that use less computing power — if that's possible.

"You could try using smaller AI models like [OpenAI's] GPT-4o-mini," You said, "and use them sparingly in a way that requires processing or generating a ton of data."
