Devesh Korde

Every ChatGPT Query Has a Power Bill And You Might Be Paying It

I was reading about Nvidia's GTC announcements last week: new chips, new partnerships, new records. Exciting stuff. Then I came across a number that stopped me cold.

A single AI data center campus can consume more electricity than 100,000 homes.

That's not a typo. Not a projection for 2030. That's happening right now, in 2026, in places like northern Virginia where "Data Center Alley" already eats up 26% of the state's total electricity. And the people living near these campuses? Their power bills have gone up 42% since 2019.
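To put "100,000 homes" in grid terms, here's a quick back-of-envelope in Python. The per-home figure is my assumption, not from the article: the EIA puts average US household consumption at roughly 10,500 kWh per year.

```python
# Back-of-envelope: what does "more electricity than 100,000 homes" mean?
# Assumption: average US household uses ~10,500 kWh/year (EIA ballpark).
HOMES = 100_000
KWH_PER_HOME_PER_YEAR = 10_500  # assumption, not a figure from the article

annual_twh = HOMES * KWH_PER_HOME_PER_YEAR / 1e9          # kWh -> TWh
avg_load_mw = HOMES * KWH_PER_HOME_PER_YEAR / (365 * 24) / 1_000  # avg kW -> MW

print(f"~{annual_twh:.2f} TWh/year, roughly {avg_load_mw:.0f} MW of continuous load")
```

Under those assumptions, a single campus drawing more than 100,000 homes' worth of power is pulling on the order of 100+ MW around the clock, which is why these facilities negotiate directly with utilities.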

Every time you ask ChatGPT a question, every time Copilot autocompletes your code, every time an AI model trains on another trillion tokens, there's a physical cost. Electricity, water, heat. Real resources consumed in real places by real machines.

Nobody talks about this at product launches. But it's becoming the defining tension of the AI era.

The numbers are staggering

Let me throw some data at you because this isn't a vibes argument. It's math.

The International Energy Agency released a report this year projecting that global data center electricity demand will more than double by 2030, reaching around 945 TWh. That's roughly the entire electricity consumption of Japan. AI is the primary driver.

In the United States alone, data centers are on track to account for nearly half of all electricity demand growth between now and 2030. Here's the part that hit me hardest: the US economy is projected to consume more electricity for processing data in 2030 than for manufacturing all energy-intensive goods combined, including aluminium, steel, and chemicals.

We went from "AI is software eating the world" to "AI is eating the power grid." And the grid wasn't built for this.

PJM Interconnection, the largest US grid operator, serving 65 million people across 13 states, is projecting it'll be six gigawatts short of reliability requirements by 2027. For context, that's roughly six nuclear power plants' worth of missing capacity. The grid operator's president said he's never seen the system under this kind of projected strain.
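A quick sanity check on that comparison, assuming a typical US nuclear reactor is rated at roughly 1 GW (my assumption; the article only gives the shortfall and PJM's population):

```python
# Sanity-checking "six gigawatts ~ six nuclear power plants".
SHORTFALL_GW = 6            # PJM's projected reliability gap by 2027 (from the article)
REACTOR_GW = 1.0            # assumption: ~1 GW per typical US reactor
PEOPLE_SERVED = 65_000_000  # PJM's footprint, per the article

reactors_needed = SHORTFALL_GW / REACTOR_GW
watts_short_per_person = SHORTFALL_GW * 1e9 / PEOPLE_SERVED

print(f"~{reactors_needed:.0f} reactors' worth; ~{watts_short_per_person:.0f} W short per person served")
```

Spread across PJM's footprint, the gap works out to on the order of 100 watts per person, a space heater's worth of missing supply for every single customer.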

The uncomfortable truth

We're living through a strange moment. The most advanced technology humans have ever built is constrained by one of the oldest problems in industrial history: where does the power come from?

The AI companies talk about intelligence, reasoning, agents, superintelligence. The reality on the ground is transformers, substations, cooling towers, and utility commissions. It's a retired couple in Ohio whose electricity bill jumped because a data center moved into their county.

I'm not saying we should stop building AI. I use these tools every single day; they make me a better developer. But I think we should be honest about the costs. Not just the API pricing page, but the real, physical, environmental costs that are being distributed across communities that never asked for a server farm.

The AI revolution has a power bill. And right now, we're all splitting it whether we signed up for it or not.

I've written more about this here:
Mind Of Korde
