AI Uses Less Water Than You Think. But It Costs More Than It Should.
A new study from UC Davis is making the rounds on Hacker News right now: AI data centers use significantly less water than the public has been led to believe.
The internet ran with the scary numbers, but the reality is more nuanced: a typical ChatGPT query has roughly the water footprint of charging your phone for 20 minutes.
Good news. The AI water hysteria was overblown.
But here's the thing nobody's talking about: while the water FUD dominated headlines, the real cost story got buried.
The Real AI Resource Story Is Pricing, Not Water
Let's put some real numbers on the table.
ChatGPT Plus: $20/month
That's $240/year. For one person. To use a chat interface.
In raw terms:
- 🇺🇸 USA: about 1-3 hours of minimum wage work, depending on the state
- 🇧🇷 Brazil: ~R$100/month, 2-3 days of an average developer's salary
- 🇳🇬 Nigeria: ~₦32,000/month, 3-5 days of a mid-level developer's salary
- 🇮🇳 India: ~₹1,600/month, a meaningful chunk of monthly income for many developers
- 🇵🇭 Philippines: ~₱1,120/month, close to a day's pay for an entry-level dev
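For anyone who wants to rerun that comparison, here's the arithmetic. The exchange rates are the approximate ones implied by the figures above; real rates will have drifted since writing:

```python
# Approximate exchange rates implied by the figures above; real rates fluctuate.
USD_PRICE = 20.0  # ChatGPT Plus, per month
IMPLIED_RATES = {"BRL": 5.0, "NGN": 1_600.0, "INR": 80.0, "PHP": 56.0}

for currency, rate in IMPLIED_RATES.items():
    print(f"{currency} {USD_PRICE * rate:,.0f} per month")
```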
The water cost of AI is declining due to efficiency gains. The dollar cost of AI access has stayed sticky at $20/month across all markets.
Why AI Costs "Too Much" Has Nothing To Do With Water
The $20/month price point isn't set by infrastructure costs. It's set by market positioning.
Anthropic's Claude API, which serves the same models that power Claude.ai, bills at roughly:
- $3 per million input tokens (Sonnet)
- $15 per million output tokens (Sonnet)
A typical developer query is maybe 500-1,000 tokens in and 500-1,000 tokens out. That's one to two cents per call.
Even at heavy developer usage, 100 substantive queries per day, you're looking at roughly $27-54/month. At a more typical 20-30 queries per day, it's maybe $5-8/month in raw API costs.
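If you want to sanity-check that arithmetic yourself, here's a minimal sketch using the Sonnet rates quoted above. The token counts and query volumes are illustrative assumptions, not measurements:

```python
# Back-of-the-envelope Claude Sonnet API costs, at the rates quoted above.
INPUT_PER_MTOK = 3.00    # USD per million input tokens
OUTPUT_PER_MTOK = 15.00  # USD per million output tokens

def query_cost(tokens_in: int, tokens_out: int) -> float:
    """Dollar cost of a single API call at the Sonnet rates."""
    return tokens_in / 1e6 * INPUT_PER_MTOK + tokens_out / 1e6 * OUTPUT_PER_MTOK

per_query = query_cost(500, 500)    # ~$0.009 per call
heavy = per_query * 100 * 30        # 100 queries/day -> ~$27/month
typical = per_query * 25 * 30       # 25 queries/day  -> ~$6.75/month
print(f"per query ${per_query:.4f}, heavy ${heavy:.2f}/mo, typical ${typical:.2f}/mo")
```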
For most users, then, the bulk of the $20/month is a distribution, brand, and convenience tax. Not infrastructure.
The Efficiency Paradox
Here's what I find genuinely interesting about the water study:
AI systems are getting dramatically more efficient at the infrastructure level. Each query uses less compute, less water, and less energy than it did two years ago.
But end-user prices haven't moved.
In every other tech sector, efficiency gains eventually flow to users. Cloud storage dropped from dollars to cents per GB. Compute dropped from mainframe to smartphone. SMS went from $0.10/text to free.
AI infrastructure is getting cheaper. But consumer AI pricing is showing zero signs of following.
What This Means If You're a Developer
If you're building on AI — whether that's automating workflows, building apps, or just using it as a thinking partner — you have two options:
Pay the convenience tax: $20/month to ChatGPT, $20/month to Claude.ai, and so on, piling up subscriptions until you're spending $60-100/month across multiple tools.
Go closer to the metal: use APIs directly and pay only for what you use, at a fraction of the subscription price (see the sketch below). SimplyLouie wraps the Claude API at $2/month flat, a tenth of the convenience tax.
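Here's roughly what the direct-API route looks like; a minimal sketch assuming the official `anthropic` Python SDK and an `ANTHROPIC_API_KEY` in your environment. The model alias and prompt are placeholders:

```python
import anthropic

client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

# Placeholder model alias and prompt; check Anthropic's docs for current models.
message = client.messages.create(
    model="claude-3-5-sonnet-latest",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Explain the CAP theorem in two sentences."}],
)

# The response reports exact token usage, so every call shows its real cost
# at the Sonnet rates quoted earlier.
cost = (message.usage.input_tokens / 1e6 * 3.00
        + message.usage.output_tokens / 1e6 * 15.00)
print(message.content[0].text)
print(f"This call cost ${cost:.4f}")
```

A short call like that lands around a cent, which is the whole point: the meter only runs when you do.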
The water efficiency story is real and worth celebrating. The pricing efficiency story hasn't happened yet.
The Actual Question Worth Asking
When will AI pricing efficiency follow the pattern of every other compute category?
Cloud storage took about 10 years to commoditize. LLM APIs have existed for about 3 years.
The race to the bottom hasn't started yet. But it will.
In the meantime: the developers who pay the infrastructure price rather than the convenience tax win.
What's your take? Do you think consumer AI pricing will eventually follow compute efficiency gains downward? Or is there something structural about AI that keeps prices sticky?
I'm curious specifically from developers outside the US — does the $20/month price feel proportionate to your local market? Or does it feel like American pricing exported globally?