Damien Gallagher

Posted on • Originally published at buildrlab.com

The US Government Just Admitted It Has No Idea How Much Power AI Uses

The Numbers Don't Add Up — Because Nobody Has the Numbers

On March 25, 2026, the U.S. Energy Information Administration quietly dropped a press release that should have been a much bigger story: they're launching a pilot survey to figure out how much electricity data centers actually use.

Read that again. The official agency responsible for tracking U.S. energy consumption has to run voluntary field studies just to get a baseline on data center power usage. In 2026. When AI is reshaping the global economy and data centers are being built at a pace not seen since the dot-com boom.

This isn't a minor administrative detail. It's a flashing warning light on the dashboard of the most significant infrastructure buildout of the last 20 years.

What the EIA Is Actually Doing

The agency identified 196 companies operating data centers across three regions: Texas, Washington state, and the Northern Virginia/DC corridor — collectively home to the largest concentration of AI compute infrastructure in the world.

Each company will be asked to voluntarily report on energy sources, electricity consumption, site characteristics, server metrics, and cooling systems. The surveys are web-based in Texas and Washington, with in-person interviews in Northern Virginia.

EIA Administrator Tristan Abbey was refreshingly candid about the problem: "A tremendous amount of excellent work goes into our retrospective consumption surveys, but they were conceived decades ago. Going forward, that excellent work will be geared toward faster cycles and finer detail."

Translation: the measurement framework was built for a world that no longer exists. They're catching up.

Why This Matters Right Now

The timing isn't coincidental. Over the past year, a few uncomfortable facts have been piling up:

Global data center electricity consumption is projected to exceed 1,000 TWh by 2026 — roughly equivalent to the entire annual electricity consumption of Germany. The IEA has noted that a single standard AI data center consumes as much electricity as 100,000 households. And a Consumer Reports analysis earlier this year found that data center capacity will nearly double between 2025 and 2028, jumping from 80 to 150 gigawatts.

Congress has also been pushing for mandatory energy disclosure from hyperscalers, driven partly by bipartisan frustration that electricity rates in data center-heavy states are rising for ordinary residents while the companies responsible face no reporting obligations.

The EIA survey is a first step toward changing that — but voluntary surveys have obvious limitations. If you're a hyperscaler, there's no obligation to participate and no penalty for giving incomplete data.

The Real Problem: We're Building the Grid Backwards

Here's what strikes me as a developer and architect who's spent years working with cloud infrastructure: we've been treating energy as an unlimited resource, an operational cost line item, not a physical constraint.

Every Lambda function, every vector embedding, every inference call — they all pull from the same power grid that hospitals and homes depend on. At small scale, that's fine. At the scale AI is operating now? It's a systemic risk that nobody has properly priced in.

Southeast Asian governments are already revisiting nuclear energy plans specifically to power AI data centers. The U.S. is still running voluntary surveys to find out what it's dealing with.

The cloud abstracted away hardware from developers. AI is now abstracting away energy. That second abstraction may not be sustainable.

What Developers and Architects Should Take Away

If you're building AI-heavy systems right now, energy efficiency is no longer just a green credential — it's becoming a resilience and cost issue:

  • Model selection matters: A 70B-parameter model is often used for tasks a 7B model handles well, at roughly 10x the energy cost. Right-size your inference.
  • Batch where possible: Asynchronous batch inference is dramatically more energy-efficient than synchronous real-time calls for non-latency-sensitive workloads.
  • Cache aggressively: Semantic caching for repeated or similar queries can reduce inference calls by 40-70% in real-world applications.
  • Region awareness: Some AWS regions run on higher percentages of renewable energy. For long-running AI workloads, region choice has real carbon implications today, and will likely have cost implications as well.

The Bigger Picture

The EIA survey is a small administrative action with large implications. It signals that regulators are beginning to treat AI infrastructure as critical national infrastructure — something to be measured, monitored, and eventually governed.

For builders in this space, that shift matters. The era of building fast and ignoring the externalities is getting shorter. The questions being asked now — about energy, about liability (see: the Section 230 battles playing out this week in US courts), about security — are the questions that will define which AI companies survive the next phase.

Building for accountability isn't the opposite of building fast. It's how you build things that last.


Sources: EIA Press Release March 25 2026 · IEA data center energy analysis · Consumer Reports data center capacity report · Tech Startups daily briefing March 26 2026
