The AI industry has an electricity problem everyone talks about. It also has a water problem almost nobody does.
A ChatGPT conversation of roughly 10 to 50 queries consumes about 500 milliliters of freshwater, one standard bottle's worth. Training GPT-3 required an estimated 700,000 liters, enough to fill a typical backyard swimming pool several times over. These numbers come from Shaolei Ren, an associate professor at UC Riverside who has spent three years mapping the water footprint the AI industry would prefer you didn't know about.
The water goes to cooling. Data centers generate so much heat that servers would overheat and shut down within minutes without constant temperature regulation. Most facilities use evaporative cooling: running water over heat exchangers and letting it evaporate, carrying the waste heat off into the atmosphere. A mid-sized data center consumes about 300,000 gallons of water per day. A large one drinks 5 million gallons a day, as much as a town of 50,000 people uses for everything: showers, cooking, lawns, drinking.
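The physics explains the scale. Evaporating a kilogram of water absorbs roughly 2.4 megajoules of heat, so the water draw follows directly from the heat load a facility has to reject. A minimal sketch, assuming a 30 MW mid-sized facility that rejects all of its heat evaporatively (both assumptions for illustration, not measured figures):

```python
# Evaporating water absorbs its latent heat of vaporization (~2.43 MJ/kg
# at ambient temperatures), so water draw scales directly with heat load.
# The 30 MW load and all-evaporative rejection are assumptions; real
# facilities mix cooling methods and vary widely.

LATENT_HEAT_J_PER_KG = 2.43e6   # latent heat of vaporization, ambient temp
FACILITY_HEAT_W = 30e6          # assumed 30 MW of heat to reject
LITERS_PER_GALLON = 3.785

kg_per_second = FACILITY_HEAT_W / LATENT_HEAT_J_PER_KG  # ~12.3 kg/s
liters_per_day = kg_per_second * 86_400                 # 1 kg of water ~ 1 liter
gallons_per_day = liters_per_day / LITERS_PER_GALLON

print(f"{gallons_per_day:,.0f} gallons/day")
# ~282,000: right at the 300,000 gal/day mid-sized figure above
```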
By 2028, AI-related data centers in the United States alone are projected to consume 32 billion gallons of water annually. That's enough to supply the indoor water needs of 360,000 households. And the industry is just getting started.
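Both comparisons survive a back-of-envelope check. The benchmarks in the comments below are general US residential averages, assumed here for context rather than taken from any of the cited reports:

```python
# Sanity-checking the two comparisons using the figures above.
# Per-person and per-household benchmarks in the comments are assumed
# general US averages, included only for context.

large_facility_gal_per_day = 5_000_000    # "a large one drinks 5 million gallons"
town_population = 50_000
annual_ai_gal_2028 = 32_000_000_000       # projected US AI data center draw, 2028
households_served = 360_000

print(f"{large_facility_gal_per_day / town_population:.0f} gal/person/day")
# ~100: consistent with typical total US residential use of 80-100 gal/person/day

print(f"{annual_ai_gal_2028 / households_served / 365:.0f} gal/household/day")
# ~244: in the ballpark of an average US household's daily use
```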
Built Where the Water Isn't
Two-thirds of all data centers built or in development since 2022 sit in water-stressed areas: southern Arizona, the Colorado River Basin, Texas. The same regions where farmers are losing water allocations, cities are rationing, and aquifers are dropping measurably year over year.
This isn't accidental. Water-stressed regions tend to offer cheap land, abundant solar energy, and local governments desperate for tax revenue. The data center arrives with promises of jobs and infrastructure investment. The water bill comes later.
In Tucson, Arizona, the city council unanimously rejected a data center proposal after community members packed the hearing. Their argument was straightforward: the city is in a drought, water is rationed, and the proposed facility would have consumed more water than the surrounding neighborhoods combined.
Tucson wasn't alone. More than 100 counties and cities across the United States have passed temporary moratoriums, zoning limits, or new environmental review requirements for data centers since 2023. Communities in Georgia, Oregon, Virginia, and California have all pushed back. The fights are local, uncoordinated, and spreading.
The Transparency Gap
Here's what makes the water problem harder than the energy problem: nobody knows the real numbers.
Data center operators are not required to publicly disclose water consumption in most jurisdictions. Google, Microsoft, Amazon, and Meta publish annual sustainability reports, but the figures aggregate all data center operations without distinguishing AI workloads from email storage or video streaming. If you want to know how much water Claude or Gemini or GPT-5 actually consumed last quarter, no public document will tell you.
States are starting to force the issue. An E&E News investigation found multiple states drafting legislation to end data center water secrecy, requiring operators to file consumption data with public utilities commissions. Environmental groups have urged Congress to pause federal data center approvals until water impact assessments catch up with construction timelines.
The industry's response has been to promise efficiency improvements while breaking ground on the next facility. Microsoft pledged to become "water positive" by 2030 — replenishing more water than it consumes. Google set a similar target. Neither company has published a credible roadmap for achieving it while simultaneously tripling their AI compute capacity.
The Math Nobody Wants to Do
A Ceres report mapped the cumulative water stress from data center clustering. When multiple facilities concentrate in the same watershed — as they do in northern Virginia, central Texas, and the Phoenix metro area — the aggregate draw can shift the water stress classification of an entire region. What was "moderate stress" becomes "high stress." Farmers who had enough water last year don't this year.
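That tipping-point dynamic is simple enough to model. A minimal sketch, assuming stress is classified by the ratio of total withdrawals to renewable supply (the convention behind tiers like WRI's Aqueduct); the thresholds and every figure below are illustrative assumptions, not data from the Ceres report:

```python
# Minimal model of how clustered facilities flip a watershed's stress tier.
# Thresholds follow the common withdrawal-to-supply ratio convention;
# all numbers are illustrative assumptions.

def stress_tier(withdrawals: float, renewable_supply: float) -> str:
    """Classify water stress by the ratio of withdrawals to supply."""
    ratio = withdrawals / renewable_supply
    if ratio < 0.2:
        return "low"
    if ratio < 0.4:
        return "moderate"
    return "high"

SUPPLY = 100.0                   # renewable supply, arbitrary units
baseline = 35.0                  # existing municipal + agricultural withdrawals
data_centers = [3.0, 4.0, 2.5]   # each facility looks small on its own

print(stress_tier(baseline, SUPPLY))                      # moderate
print(stress_tier(baseline + sum(data_centers), SUPPLY))  # high
```

No single facility moves the ratio much on its own; the cluster does, which is exactly the aggregation effect the report describes.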
The projections are worse. Data center water consumption could increase by 870 percent as new facilities come online through the decade. That estimate comes from a period when most AI models ran inference on text. The shift toward multimodal models — processing video, audio, images, and real-time agent workloads — multiplies the compute per query and the cooling load per rack.
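A rough sketch of how that multiplier compounds, with every parameter an assumed placeholder; the per-query figure is loosely anchored to the 10-to-50-milliliter range implied by Ren's bottle estimate:

```python
# How a multimodal shift compounds aggregate water demand.
# Every parameter below is an illustrative assumption.

ml_per_text_query = 30            # assumed, within the 10-50 ml/query range
                                  # implied by Ren's 500 ml per 10-50 queries
queries_per_day = 1_000_000_000   # assumed global daily query volume
multimodal_multiplier = 5         # assumed cooling-load multiple for video,
                                  # audio, and agent workloads vs. plain text

text_liters_per_day = ml_per_text_query * queries_per_day / 1000
print(f"Text-only: {text_liters_per_day / 1e6:.0f} million liters/day")    # 30
print(f"Multimodal: {text_liters_per_day * multimodal_multiplier / 1e6:.0f} "
      f"million liters/day")                                               # 150
```

The absolute numbers are invented; the structure is the point. Any fixed efficiency gain gets eaten by a per-query multiplier that large.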
Shaolei Ren puts it plainly: "Every time you ask an AI chatbot a question, you are also consuming water — without realizing it."
The Quiet Crisis
The AI water crisis doesn't have a Bernie Sanders or a Ron DeSantis championing it on cable news. It doesn't have a catchy metric like "$100 billion in electricity costs." Water is local, invisible in the product, and politically complex in ways that electricity isn't. You can't see a data center drinking from the same aquifer your well draws from.
But the 100 communities that passed moratoriums figured it out anyway. They didn't need a UC Riverside study to notice the creek running lower or the utility rate increasing or the new gray building at the edge of town with no windows and armed guards.
The AI industry built its infrastructure in drought country, promised jobs and taxes, and assumed nobody would count the water. One hundred towns counted. The other thousand haven't yet.
If you work with AI, check out my AI prompt engineering packs on Polar — battle-tested prompts for developers.