Most articles about AI and the climate answer a question that doesn't matter. Defenders spend thousands of words proving your personal ChatGPT use is negligible, which is true but beside the point. Critics warn that AI will cook the planet, which is not supported by any serious projection. Both camps avoid the actual question: how fast is aggregate AI demand growing, what will power it, and what will it enable?
Feeling guilty over a 0.3 Wh query is about as useful as holding your breath to fight climate change. And an industry that uses about 1.5% of the world's electricity is not, on its own, alarming: air conditioning and industrial motors use far more without drawing the same panic.[1] The real story is somewhere in the middle.
The 2026 snapshot
Global data centers consumed roughly 415 TWh in 2024, about 1.5% of global electricity.[1] AI-specific servers accounted for around 93 TWh in 2025, or about 0.3% of global electricity.[2] For comparison, residential air conditioning uses more than six times that amount, industrial motors use around forty times, and global video streaming sits in roughly the same ballpark as AI at 100–120 TWh.
A short query to a standard (non-reasoning) model like GPT-4o consumes about 0.3 Wh, roughly three seconds of microwave operation.[3][4] If you sent a thousand such queries a day, you'd rack up ~110 kWh per year, which is about 3% of what a typical Spanish household consumes annually.[5]
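To make the arithmetic explicit, here's a minimal back-of-the-envelope sketch in Python, using the ~0.3 Wh per-query figure and the ~3,264 kWh/year Spanish household average cited above; the thousand-query volume is a deliberately heavy hypothetical, not a measurement.

```python
# Back-of-the-envelope: personal LLM query energy vs. a household.
# Assumptions (from the figures above): 0.3 Wh per short non-reasoning
# query; 3,264 kWh/year for a typical Spanish household. The 1,000
# queries/day is a deliberately heavy hypothetical.

WH_PER_QUERY = 0.3
QUERIES_PER_DAY = 1_000
HOUSEHOLD_KWH_PER_YEAR = 3_264

yearly_kwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1_000  # Wh -> kWh
share = yearly_kwh / HOUSEHOLD_KWH_PER_YEAR

print(f"~{yearly_kwh:.0f} kWh/year")              # ~110 kWh/year
print(f"~{share:.1%} of household electricity")   # ~3.4%
```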
So on the narrow point, the "AI is fine" crowd is right. Someone who quits ChatGPT to "help the planet" while still eating beef and driving a combustion car is doing climate theater, not climate action. But this frame captures only today's snapshot of a fast-moving target, and only for the simplest queries on the most efficient models. Three things make it misleading as a summary of where AI's climate profile actually sits.
The first thing: "average query" is a moving target
The 0.3 Wh figure applies to a short, single-turn query on a non-reasoning model. The industry has spent the past eighteen months moving users onto reasoning models (o3, DeepSeek R1, Claude with extended thinking, GPT-5), which require 10 to 100 times more energy per query. Measured benchmarks put o3 at around 33 Wh, GPT-4.5 at around 30 Wh, and Claude 3.7 Sonnet with extended thinking at around 17 Wh.[6] These aren't edge cases. They're becoming the default.
Agentic workflows compound the shift. A single user request to an AI agent ("book me a flight", "refactor this module") can trigger dozens or hundreds of inference calls as the agent plans, searches, verifies, and iterates. The unit of energy cost is no longer a prompt. It's a task, and tasks can be arbitrarily compute-intensive.
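To illustrate how the unit of cost shifts from prompt to task, here's a rough sketch using the per-call figures quoted above; the call count for the agentic case is hypothetical, chosen only to show the scaling.

```python
# Illustrative only: task-level energy as models and workflows change.
# Per-call values are the rough benchmarks quoted above; the call counts
# for the "agentic task" case are hypothetical.

WH_PER_CALL = {
    "non-reasoning (GPT-4o-class)": 0.3,
    "reasoning (o3-class)": 33.0,
}

def task_energy_wh(wh_per_call: float, calls: int) -> float:
    """Energy for one user-facing task that fans out into `calls` inference calls."""
    return wh_per_call * calls

for model, wh in WH_PER_CALL.items():
    for calls in (1, 50):  # a single prompt vs. a hypothetical agentic task
        print(f"{model:30s} x {calls:3d} calls = {task_energy_wh(wh, calls):7.1f} Wh")
```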
The second thing: demand outruns efficiency
Every defender of AI's footprint eventually invokes efficiency, and the gains are real. NVIDIA's Blackwell is 25–50 times more efficient per token than Hopper.[7] Algorithmic efficiency in pre-training triples roughly every year.[8] Quantization, mixture-of-experts, and distillation all deliver genuine improvements.
The problem is that demand has been compounding faster. Even as per-query efficiency improved, ChatGPT went from roughly 1 billion prompts per day in December 2024 to 2.5 billion by July 2025, a 150% increase in just seven months.[9] The IEA's central scenario puts global data center consumption at around 945 TWh by 2030 and 1,200 TWh by 2035; its Lift-Off scenario reaches 1,700 TWh, about 4.4% of global electricity.[1]
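A quick way to see why this outpaces efficiency: annualize the prompt growth above and set it against the roughly 3x-per-year algorithmic gain. Treating that algorithmic figure as a stand-in for per-query serving efficiency is an assumption on my part; the two are not the same thing.

```python
# Annualized growth implied by the figures above. The 3x/year efficiency
# gain is algorithmic (pre-training) efficiency, used here as a loose
# proxy for per-query serving efficiency -- an assumption, not a fact.

prompt_growth_7_months = 2.5e9 / 1.0e9                        # Dec 2024 -> Jul 2025
prompt_growth_per_year = prompt_growth_7_months ** (12 / 7)   # ~4.8x/year
efficiency_gain_per_year = 3.0                                # ~3x/year (assumed proxy)

net_energy_trend = prompt_growth_per_year / efficiency_gain_per_year
print(f"demand: ~{prompt_growth_per_year:.1f}x/yr, efficiency: ~{efficiency_gain_per_year:.0f}x/yr")
print(f"net energy trend: ~{net_energy_trend:.1f}x/yr (>1 means total energy still rises)")
```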
This is Jevons paradox in real time. Meta raised its AI spending by 50% after DeepSeek demonstrated frontier-level capabilities in January 2025 at a fraction of the usual training cost. Microsoft, Google, and Amazon held or increased capex. Satya Nadella invoked Jevons paradox as the DeepSeek news broke: "As AI gets more efficient and accessible, we will see its use skyrocket."[10] Token prices collapsed by over 90% across the industry in 2025, and total inference spending more than doubled.
The streaming video analogy that sometimes gets invoked doesn't rescue this picture. Streaming kept energy flat despite exponential traffic growth because video is cached at the edge, so the marginal cost of one more viewer is close to zero.[11] AI inference can't be cached that way: every query needs fresh GPU computation. And unlike streaming, where humans have finite viewing hours, AI demand has no obvious ceiling because agents can generate queries continuously.
The third thing: the grid mix matters more than the chips
Here's the fact that gets lost in per-query debates: AI's climate impact depends almost entirely on what kind of electricity is feeding the data center, not on how efficient the chips are. A 1,700 TWh AI sector powered by renewables and nuclear is a footnote in the energy transition. The same sector running on gas and coal is a real problem.
In 2024, the carbon intensity of electricity consumed by US data centers was roughly 48% higher than the national grid average: 548 gCO₂e/kWh versus 369 gCO₂e/kWh, largely because data centers have clustered in gas-heavy regions such as Virginia.[12] In the IEA's Lift-Off scenario, fossil fuels supply nearly half of the additional electricity for data centers between 2024 and 2030: natural gas grows 1.5 times faster than in the Base Case, with the United States seeing the largest absolute increase, and coal generation roughly doubles, mostly in China.[13] Google abandoned its net-zero goal in July 2025. Microsoft's emissions have risen roughly 23% since 2020 despite record renewable procurement.[14]
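The leverage of the grid mix is easy to quantify. Here's a rough sketch applying the intensities above to the IEA's 1,700 TWh Lift-Off volume; the "mostly clean firm" intensity is an assumed illustration, not a forecast.

```python
# Same electricity volume, very different emissions depending on the grid.
# 1,700 TWh is the IEA Lift-Off volume cited earlier; intensities are in
# gCO2e/kWh. The "mostly clean firm" value is an assumed illustration.

TWH = 1_700
INTENSITIES = {
    "US data center mix, 2024": 548,
    "US grid average, 2024": 369,
    "mostly clean firm (assumed)": 50,
}

for label, g_per_kwh in INTENSITIES.items():
    mt_co2e = TWH * 1e9 * g_per_kwh / 1e12   # TWh -> kWh, grams -> Mt
    print(f"{label:28s}: ~{mt_co2e:,.0f} Mt CO2e/year")
```

Running the sketch puts the spread at roughly 930 Mt versus under 100 Mt per year for the same workload, which is the whole argument of this section in two lines of arithmetic.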
None of this is locked in. The same IEA report projects renewables meeting half of data center demand growth by 2030, with nuclear (including the first small modular reactors) contributing meaningfully after 2030.[1] Tech companies have committed over $10 billion to nuclear partnerships. Microsoft has signed a deal to restart Three Mile Island. The money is real; the timelines are long. AI is forcing a binary choice: build clean firm generation fast enough to feed new demand, or lock in gas infrastructure that will still be running in 2050. That choice is being made now in permit queues and interconnection agreements, and it has very little to do with whether you send another ChatGPT query today.
The positive case, stated without hype
The IEA estimates that widespread adoption of existing AI applications, including grid optimization, materials science, logistics, precision agriculture, and building efficiency, could cut global energy-related CO₂ emissions by roughly 5% in 2035. That's larger than the emissions from the data centers running those applications, even in the Lift-Off scenario.[15]
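As a rough sanity check on that comparison: 5% of global energy-related CO₂ (around 38 Gt in 2024, an approximate figure I'm supplying here) against the Lift-Off electricity volume at a dirty versus a clean grid intensity. Every input is a round number for illustration, not a projection.

```python
# Rough comparison: claimed AI-enabled savings vs. data center emissions.
# Assumptions: ~38 Gt global energy-related CO2 (approximate 2024 level),
# the IEA's exploratory 5% savings figure, 1,700 TWh of Lift-Off demand,
# and 548 vs. 50 gCO2e/kWh as dirty vs. clean illustrative intensities.

GLOBAL_ENERGY_CO2_GT = 38.0
savings_gt = 0.05 * GLOBAL_ENERGY_CO2_GT               # ~1.9 Gt

def datacenter_gt(twh: float, g_per_kwh: float) -> float:
    return twh * 1e9 * g_per_kwh / 1e15                 # TWh -> kWh, grams -> Gt

print(f"claimed savings:         ~{savings_gt:.1f} Gt CO2")
print(f"1,700 TWh on dirty grid: ~{datacenter_gt(1_700, 548):.1f} Gt CO2")
print(f"1,700 TWh on clean grid: ~{datacenter_gt(1_700, 50):.2f} Gt CO2")
```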
The qualifications are important. The IEA calls this figure an "exploratory analysis" rather than a projection and explicitly warns there is "currently no momentum" ensuring widespread adoption.[15] It assumes aggressive deployment in sectors where uptake has been slow. It assumes rebound effects don't eat the savings, though rebound effects are well-documented: cheaper autonomous trips mean more trips, and cheaper AI-optimized shipping means more stuff shipped.
Genuine examples are emerging. DeepMind's cooling AI reduced Google data center cooling energy by roughly 40%.[16] AlphaFold compressed decades of protein structure research into a few months.[17] GraphCast outperforms traditional weather models at a fraction of the compute cost.[18] These aren't experimental; they're being used in real-world settings.
The honest positive case is this: AI's potential climate benefits probably exceed its energy costs, but only if the benefits actually get deployed, the energy is clean, and rebound effects don't eat the gains. "AI will save the climate" is as lazy as "AI will destroy it."
What individuals should do
Stop feeling guilty about prompts. Your Wh per query is not the lever that matters. You'll do more climate good by eating one less steak, taking one fewer flight, or voting for better energy policy than by boycotting LLMs.
What matters at the individual level is where you direct your attention. Push for clean generation to be deployed fast enough to meet data center demand; grid interconnections, nuclear licensing, transmission lines, and permitting reform are the bottleneck, not GPUs. Advocate for transparency requirements on AI companies' operational emissions. Treat "AI for climate" claims with the same scrutiny you'd apply to any corporate sustainability marketing. Some of it is real; plenty is PR. And use AI where it genuinely makes you more effective at things that matter, including climate work. The question isn't "Is this query free?" (none are) but "Is this query worth it?" (many are).
What policymakers should do
The meaningful levers are systemic: carbon-aware siting rules that require new data center builds to demonstrate grid capacity, water availability, and carbon intensity below thresholds; mandatory energy reporting at the workload level, not just the facility level, because you can't manage what you can't measure; clean firm power built at the pace of demand through nuclear, geothermal, long-duration storage, and accelerated permitting; water accountability in stressed regions; and a firm refusal to let efficiency metrics substitute for absolute emissions reductions. The climate responds to absolute emissions, so that's what policy should target.
The bottom line
AI's current climate footprint is modest. It's comparable to streaming, smaller than air conditioning, and a rounding error next to steel, cement, transportation, or agriculture. Individual AI use is not a meaningful climate decision for most people.
The trajectory is a different story. AI is the fastest-growing source of new electricity demand in advanced economies, one of the few sectors where emissions are rising while most are flat or declining,[1] and it concentrates grid strain in specific regions in ways that will shape what generation gets built this decade. Efficiency gains so far are being reinvested into more capability rather than banked as energy savings.
The most honest thing that can be said is that whether AI turns out to be good, bad, or neutral for the climate depends almost entirely on the electricity mix feeding the data centers. Clean firm power, and the benefits will outweigh the costs. Gas and coal, and we'll have built out fossil infrastructure for a generation to run chatbots and image generators. That choice is being made right now in interconnection queues, permit offices, and public utility commissions, not in your ChatGPT tab.
If you care about this, pay attention to the infrastructure decisions. That's where the outcome actually gets determined.
Footnotes

1. IEA, "Energy and AI", April 2025. https://www.iea.org/reports/energy-and-ai
2. Gartner, "Electricity Demand for Data Centers to Grow 16% in 2025 and Double by 2030", November 2025. https://www.gartner.com/en/newsroom/press-releases/2025-11-17-gartner-says-electricity-demand-for-data-centers-to-grow-16-percent-in-2025-and-double-by-2030
3. Epoch AI, "How much energy does ChatGPT use?", February 2025. https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use
4. Andy Masley, "What's the full "hidden" climate cost of a ChatGPT prompt?", August 2025. https://www.andymasley.com/writing/whats-the-full-hidden-climate-cost/
5. Endesa, "Household energy consumption in Spain (INE) and how to save" (~3,264 kWh/year). https://www.endesa.com/en/blogs/endesa-s-blog/light/energy-consumption-in-spanish-households
6. Jegham et al., "How Hungry is AI?", arXiv:2505.09598, May 2025. https://arxiv.org/abs/2505.09598
7. NVIDIA Developer Blog, "Introducing NVFP4 for Efficient and Accurate Low-Precision Inference", June 2025. https://developer.nvidia.com/blog/introducing-nvfp4-for-efficient-and-accurate-low-precision-inference/
8. Epoch AI, "Trends in Artificial Intelligence", February 2026. https://epoch.ai/trends/#training-runs
9. TechCrunch, "ChatGPT users send 2.5 billion prompts a day", July 2025. https://techcrunch.com/2025/07/21/chatgpt-users-send-2-5-billion-prompts-a-day/
10. NPR Planet Money, "Why the AI world is suddenly obsessed with a 160-year-old economics paradox", February 2025. https://www.npr.org/sections/planet-money/2025/02/04/g-s1-46018/ai-deepseek-economics-jevons-paradox
11. IEA, "The carbon footprint of streaming video: fact-checking the headlines", December 2020. https://www.iea.org/commentaries/the-carbon-footprint-of-streaming-video-fact-checking-the-headlines
12. Gupta et al., "Environmental Burden of United States Data Centers in the Artificial Intelligence Era", arXiv:2411.09786. https://arxiv.org/abs/2411.09786
13. IEA, "Energy and AI: Energy supply for AI", April 2025. https://www.iea.org/reports/energy-and-ai/energy-supply-for-ai
14. MIT Technology Review, "We did the math on AI's energy footprint", May 2025. https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/
15. IEA, "Energy and AI: AI and climate change", April 2025. https://www.iea.org/reports/energy-and-ai/ai-and-climate-change
16. DeepMind, "DeepMind AI Reduces Google Data Centre Cooling Bill by 40%", July 2016. https://deepmind.google/blog/deepmind-ai-reduces-google-data-centre-cooling-bill-by-40/
17. Demis Hassabis, John Jumper, Pushmeet Kohli and Anna Koivuniemi, "AlphaFold: Five years of impact". https://deepmind.google/blog/alphafold-five-years-of-impact/
18. Remi Lam, "GraphCast: AI model for faster and more accurate global weather forecasting", November 2023. https://deepmind.google/blog/graphcast-ai-model-for-faster-and-more-accurate-global-weather-forecasting/