AI's energy problem is real, it's growing, and it's showing up in your electricity bill.
Here is a number you should know.
In 2024, US data centres consumed 183 terawatt-hours of electricity. That is more than 4% of the entire country's electricity consumption — roughly equivalent to the annual electricity demand of the entire nation of Pakistan.
And that was before the AI boom fully hit.
In 2025, electricity demand from data centres soared by 17%, with AI-focused facilities climbing even faster and far outpacing global electricity demand growth of 3%. By 2030, worldwide data centre consumption is projected to more than double, from 448 TWh to 980 TWh. AI-optimised servers alone are set to grow almost fivefold, from 93 TWh to 432 TWh.
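The ratios are worth checking yourself. A quick sketch using only the figures above; the one assumption is treating 2024 as the start of the six-year window:

```python
# Back-of-envelope check on the projections above, using only the
# article's figures. The 2024 baseline year is my assumption about
# where the six-year window starts.

total_2024, total_2030 = 448, 980   # worldwide data centre demand, TWh
ai_2024, ai_2030 = 93, 432          # AI-optimised servers, TWh

years = 6
for label, start, end in [("All data centres", total_2024, total_2030),
                          ("AI-optimised servers", ai_2024, ai_2030)]:
    growth = end / start
    cagr = growth ** (1 / years) - 1
    print(f"{label}: {growth:.1f}x overall (~{cagr:.0%} per year)")

# All data centres: 2.2x overall (~14% per year)
# AI-optimised servers: 4.6x overall (~29% per year)
```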
This is not a background statistic. It is a civilisational choice we are making quietly, one GPU cluster at a time.
What a Single Query Actually Costs
| Action | Energy per query |
|---|---|
| Google Search | ~0.00003 kWh (0.03 Wh) |
| ChatGPT text query | ~0.0003 kWh (0.3 Wh) |
| Image generation | Estimated 10–100x text |
| AI agent (multi-step) | Compounding per step |
A ChatGPT query consumes roughly 0.3–0.34 watt-hours of electricity — about ten times more than a Google search. That sounds small. Multiply it by hundreds of millions of daily users, add image generation, video, AI agents running autonomously in the background — and the numbers become difficult to comprehend.
AI-focused data centres consumed 155 TWh in 2025. All of the world's text queries accounted for around 2% of that. The other 98% went to training, fine-tuning, and infrastructure overhead.
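Here is what that gap looks like in a minimal sketch. The per-query figure is the article's; the daily query volume is a hypothetical round number for one service, not a reported statistic:

```python
# Illustrative scale arithmetic. The 0.34 Wh figure comes from the
# article; the daily query volume is an assumed round number.

wh_per_query = 0.34                 # ChatGPT text query, Wh
queries_per_day = 1_000_000_000     # assumption: 1 billion queries/day

daily_mwh = wh_per_query * queries_per_day / 1e6    # Wh -> MWh
yearly_gwh = daily_mwh * 365 / 1e3                  # MWh -> GWh

print(f"{daily_mwh:,.0f} MWh/day -> {yearly_gwh:,.0f} GWh/year")
# 340 MWh/day -> 124 GWh/year: real energy, but well under 0.1%
# of the 155 TWh (155,000 GWh) that AI data centres drew in 2025.
```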
The query on your screen is the smallest part of the problem.
Who Pays for It
The obvious answer is Big Tech. The less obvious answer is you.
Residential electricity prices jumped 7.1% in 2025, more than double the inflation rate, with increases topping 20% in some states. The AI data centre rush is a significant contributing factor, because hyperscale data centres demand staggering amounts of power. A typical hyperscale facility draws around 100 MW, as much as 100,000 households. Meta's Hyperion project in Louisiana will need at least 5 GW to run.
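That household comparison is easy to cross-check. A rough sketch, assuming an average US household uses about 10,500 kWh per year (the exact figure varies by state):

```python
# Rough cross-check of "100 MW ~ 100,000 households". The household
# figure is an assumption (US averages sit around 10,500 kWh/year).

facility_mw = 100
facility_kwh_per_year = facility_mw * 1_000 * 8_760   # kW x hours/year

household_kwh_per_year = 10_500
households = facility_kwh_per_year / household_kwh_per_year

print(f"~{households:,.0f} households")   # ~83,000: same order of magnitude

# Meta's Hyperion at 5 GW is 50x this facility, i.e. millions of homes.
```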
In Ireland, data centres now consume 22% of the country's total electricity. A small country's grid, quietly colonised by server racks.
Regions such as Virginia and Ireland face real grid vulnerability because AI infrastructure is highly concentrated: over 90% of projected compute capacity sits in North America, Western Europe, and Asia-Pacific. The burden is not evenly distributed. The benefits are even less so.
The Fixes Engineers Are Actually Building
Neuromorphic Chips
Neuromorphic chips abandon the von Neumann architecture entirely — no clock, no fetch-decode-execute cycle, no power consumed when nothing is happening. Intel's Loihi 2 has demonstrated 1,000x energy reduction on certain tasks versus a GPU. Researchers have trained a 1.5-billion-parameter model on neuromorphic hardware, achieving up to a 70,000-fold reduction in energy consumption and a 100-fold speed-up compared with GPUs.
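For intuition, here is a toy leaky integrate-and-fire (LIF) neuron in plain Python: the event-driven unit that neuromorphic chips implement in silicon. This is a conceptual sketch of the computing model, not Loihi 2's actual programming interface:

```python
# Toy leaky integrate-and-fire neuron. Work happens only when an
# input event arrives; silence between events costs nothing, which
# is the core of neuromorphic efficiency.

def lif_neuron(events, threshold=1.0, leak=0.9):
    """events: sparse list of (timestep, input_current), time-ordered."""
    potential, last_t, spikes = 0.0, 0, []
    for t, current in events:
        potential *= leak ** (t - last_t)   # decay over the silent gap
        potential += current                # integrate the new event
        last_t = t
        if potential >= threshold:          # fire and reset
            spikes.append(t)
            potential = 0.0
    return spikes

# Four events across ~1,000 timesteps -> four updates in total,
# where a clocked (von Neumann) design would compute at every tick.
print(lif_neuron([(10, 0.6), (12, 0.6), (700, 0.4), (703, 0.8)]))
# [12, 703]
```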
Photonic Chips
Photonic chips replace electrical signals with light. Because data can stay in the optical domain instead of converting between electronic and optical signals at every hop, and because wavelength division multiplexing lets many channels share a single waveguide, photonic systems can significantly reduce data centre energy consumption. Integrating analog memory into neuromorphic photonic architectures can achieve over 26x power savings compared with conventional SRAM-DAC designs.
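A numerical caricature helps here. The sketch below models a photonic tensor core with WDM: one fixed optical weight matrix, several inputs traversing it simultaneously on different wavelengths. Everything is illustrative; real devices encode the weights in interferometer phases or phase-change cells:

```python
import numpy as np

# Toy WDM model: one optical weight mesh (the matrix W), many input
# vectors passing through it at once, each on its own wavelength.
# The "compute" is light propagation, so the marginal energy for an
# extra wavelength channel is small.

W = np.array([[0.2, 0.7],
              [0.9, 0.1]])           # weights fixed in the optics

wdm_inputs = {                       # wavelength (nm) -> input vector
    1550.0: np.array([1.0, 0.5]),
    1550.8: np.array([0.3, 0.9]),
    1551.6: np.array([0.6, 0.6]),
}

# All channels share the same mesh in the same instant; no channel
# waits for another, and nothing converts back to electronics mid-way.
for wavelength, x in wdm_inputs.items():
    print(f"{wavelength} nm -> {W @ x}")
```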
Nuclear and Renewables
The pipeline of conditional agreements between data centre operators and small modular reactor projects has grown from 25 GW at the end of 2024 to 45 GW, a sign that AI's energy appetite may end up accelerating the commercialisation of entirely new energy technologies.
The Uncomfortable Truth
Power consumption per AI task is declining rapidly — efficiency is improving at a rate unprecedented in energy history. The chips are getting better. The models are getting more efficient.
But more people are using AI. More tasks are being handed to AI agents. The efficiency gains are being consumed faster than they are being made.
This is Jevons' Paradox — the more efficient a technology becomes, the more it gets used, and total consumption rises. It happened with steam engines. With cars. With electricity itself.
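The dynamic is easy to put into numbers. A toy rebound model with made-up rates (30% annual efficiency gain, 50% annual usage growth; neither number is a forecast):

```python
# Toy model of Jevons' Paradox: per-task energy falls every year,
# usage grows faster, total consumption still climbs. Both rates
# below are invented for illustration.

energy_per_task = 1.0    # arbitrary units
tasks = 1.0

for year in range(6):
    print(f"year {year}: per-task {energy_per_task:.2f}, "
          f"total {energy_per_task * tasks:.2f}")
    energy_per_task *= 0.70   # 30% more efficient next year...
    tasks *= 1.50             # ...but 50% more tasks get run

# Total rises ~5% a year, because 0.70 * 1.50 = 1.05 > 1.
# Efficiency gains alone don't cap consumption; demand growth decides.
```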
The question is not whether AI will become more energy efficient. It will. The question is whether the hardware revolution — neuromorphic, photonic, the entire post-silicon stack — can outrun the appetite of a technology the world has decided it cannot live without.
Found this useful? Drop a ❤️ or 🦄 — it helps others find the article.
Questions or pushback on the numbers? Drop them in the comments — I read and reply to everything.
Follow me on Dev.to for more deep dives on AI hardware, semiconductors, and the engineering behind next-gen computing.