Is ChatGPT Really 10× Worse Than Google? The AI Energy Myth That Won’t Die

By now, the stat feels like settled science: asking ChatGPT a question uses roughly ten times more energy than a Google search. You’ve probably seen it in a climate explainer, a tech newsletter, or a worried op-ed about the carbon cost of AI. It’s crisp, alarming, and eminently shareable. It also rests on a number that has since changed by an order of magnitude — and nobody seems to have sent the correction.

The Statistic That Launched a Thousand Hot Takes

The “10×” figure traces back to a widely cited 2023 estimate, most prominently analyst Alex de Vries’s commentary in the journal Joule, attempting to quantify the energy footprint of large language models. The working number that circulated, roughly 2.9 to 3 watt-hours (Wh) per ChatGPT query, was contrasted against Google search’s approximately 0.3 Wh per query, yielding that satisfyingly round tenfold gap.

The statistic spread fast because it did real narrative work. It gave concrete shape to vague anxieties about AI’s environmental cost. It was specific enough to sound authoritative, dramatic enough to be alarming, and simple enough to fit in a tweet. Tech journalists ran with it. Climate reporters incorporated it into broader AI-and-sustainability pieces. It became the de facto baseline for the conversation.

What received far less attention: the estimate was always partial, speculative, and dependent on hardware configurations that were already shifting at the time of publication.

How a Single Estimate Became Conventional Wisdom

The 3 Wh figure wasn’t fabricated — it reflected genuine modeling of what early ChatGPT inference likely cost, given known GPU energy consumption and rough assumptions about query volume and infrastructure efficiency. The problem was how it traveled. Each retelling stripped away the caveats. Phrases like “estimated,” “approximately,” and “under certain assumptions” fell off in successive rewrites, a familiar game of telephone in which uncertainty launders itself into fact.

Coverage also treated the number as static, as if AI hardware and software optimization were frozen in amber. In reality, inference efficiency was improving rapidly — through better model quantization, more efficient chips, improved batching strategies, and purpose-built AI accelerators replacing general-purpose GPUs.

The media ecosystem that amplified the figure had little incentive to revisit it. A correction — “AI might use only as much energy as Google, actually” — is a far less compelling headline than the original scare. So the 10× claim kept circulating, uncontested, as a kind of zombie statistic.

What More Recent Analysis Actually Shows

A 2025 analysis by Epoch AI arrived at a dramatically different figure: approximately 0.3 Wh per ChatGPT query — a tenfold reduction from the earlier estimate, and functionally equivalent to a Google search. This reflects both genuine efficiency gains in AI inference and more rigorous bottom-up modeling that accounts for current hardware generations and real-world deployment configurations.

The revision matters. If the per-query energy cost has fallen by 90%, that fundamentally changes the individual behavioral calculus — the “should I feel guilty about asking AI this question?” anxiety that the 10× figure was quietly stoking.

But here’s where the story gets more complicated, not less.

Per-Query Is Only Part of the Picture

Focusing on the per-query number, whether it’s 3 Wh or 0.3 Wh, can obscure the larger structural reality of AI’s energy footprint.

Consider what the per-query metric leaves out:

  • Training costs: A single large model training run can consume hundreds of megawatt-hours — an energy cost that gets amortized across queries but never disappears from the ledger.
  • Continuous model updates and fine-tuning: These aren’t one-time events; they recur throughout a model’s lifecycle.
  • Infrastructure overhead: Cooling systems, networking, and idle compute capacity all draw power that never shows up in a “per query” calculation.
  • Aggregate scale: OpenAI reportedly handles over 100 million daily active users. Even at 0.3 Wh per query, that aggregates fast.

The International Energy Agency’s projection that data centers could consume 347 TWh annually by 2028 — substantially driven by AI workloads — didn’t emerge from per-query thinking. It emerged from modeling total system demand. The efficiency gains at the query level are real, but they’re being partially or fully offset by the explosive growth in AI adoption and infrastructure buildout.
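The article’s own figures make the aggregate arithmetic easy to check. Here is a back-of-envelope sketch; note that the queries-per-user number is a hypothetical round figure chosen for illustration, not a reported statistic:

```python
# Rough aggregate-energy estimate from the figures cited above.
WH_PER_QUERY = 0.3          # Epoch AI's 2025 per-query estimate (Wh)
DAILY_USERS = 100_000_000   # "over 100 million daily active users"
QUERIES_PER_USER = 10       # ASSUMPTION: illustrative only, not reported

daily_wh = WH_PER_QUERY * DAILY_USERS * QUERIES_PER_USER
daily_mwh = daily_wh / 1e6                 # watt-hours -> megawatt-hours
annual_twh = daily_wh * 365 / 1e12         # watt-hours -> terawatt-hours

IEA_PROJECTION_TWH = 347    # projected total data-center demand, 2028

print(f"Daily inference energy:  {daily_mwh:.0f} MWh")
print(f"Annual inference energy: {annual_twh:.2f} TWh")
print(f"Share of 347 TWh projection: {annual_twh / IEA_PROJECTION_TWH:.2%}")
```

Even under these generous assumptions, query-serving alone comes to roughly 0.11 TWh a year, a tiny sliver of the projected total. That gap is the point: the per-query number says almost nothing about the training runs, cooling, and infrastructure buildout that dominate system-level demand.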

This is the Jevons paradox in digital form: efficiency gains make each query cheaper, more efficient systems get deployed at far greater scale, and total consumption rises even as unit consumption falls.

The Transparency Problem Nobody Wants to Solve

Underlying all of this is a more fundamental issue: we shouldn’t have to rely on independent researchers and leaked estimates to understand the energy footprint of the world’s most-used AI systems.

None of OpenAI, Google, or Microsoft publishes granular, independently verifiable energy consumption data for its AI products. What exists are voluntary sustainability reports, high-level commitments to carbon neutrality, and carefully worded disclosures that reveal little about actual consumption. The gap between corporate climate commitments and operational transparency is vast.

This opacity has real consequences. It means that public discourse — including the spread of the 10× myth — happens in an information vacuum that companies themselves have created. Journalists can’t correct bad estimates with good data because good data isn’t available. Policymakers can’t regulate what they can’t measure. Researchers work from inference and estimation rather than ground truth.

The AI energy story isn’t just a science communication failure. It’s a transparency failure, and the two reinforce each other.

The Myth That Persists Because It Has To

The “ChatGPT uses 10× more energy than Google” claim is probably wrong, at least by current efficiency standards. But it became dominant not because people were credulous, but because it filled a genuine epistemic void. In the absence of real data, compelling estimates become facts.

The corrected figure — ~0.3 Wh per query — is more accurate but not reassuring in the way the correction might seem to promise. The aggregate numbers are still large and growing. The training costs are still largely invisible. The transparency problem is still unsolved.

The lesson isn’t that AI’s energy footprint is fine, actually. It’s that we’ve been arguing about the wrong number, with no reliable way to know what the right one is — and that’s precisely how the industry prefers it.
