Recently, you've probably seen those viral Ghibli-style images created by AI. Even if you didn't generate them yourself, they were everywhere on the internet. While making these, many of us noticed something — the images took forever to load. Maybe you blamed your Wi-Fi. Maybe you thought too many people were using the tool. But what if I told you... it just didn't have enough water to drink?
Sounds crazy, right? What does water have to do with AI or image generation?
Well, it turns out that our beloved AIs — from image generators to GPTs — are thirsty. Not metaphorically. Literally. These systems consume millions of liters of water to stay "healthy" (i.e., cool enough to function). We used to worry about AI taking our jobs. But at this rate, it might take our water first.
Yup. You heard that right.
In this blog, we explore the hidden link between AI and water, shedding light on the environmental cost of our digital experiences. We’ll look at how data centers stay cool, the scale of AI’s water usage, what companies are doing to address it, and how we, as users, can help make AI more sustainable.
🖥️ What Is a Data Center?
Whenever you search something online, your request doesn't just fly around in the air — it lands somewhere physically, in a data center.
A data center is a physical room, building, or facility that houses IT infrastructure — servers, networking equipment, storage systems — for building, running, and delivering applications and services. It also stores and manages the data behind everything you see on your screen.
Inside these centers are thousands of servers — powerful computers stacked in rows like bookshelves. These machines are always on, constantly processing data, running machine learning models, hosting websites, and handling cloud operations.
Think of it as the brain of the internet — and just like any hard-working brain, it gets hot.
According to the International Energy Agency, data centers already consume about 1% of global electricity use and contribute to roughly 0.3% of all global carbon emissions [1]. With the explosion of AI, these numbers are only expected to grow.
🔥 Why AI Makes Things Hot (Literally)
Modern AI models — like ChatGPT or image generators — require massive computational power. Training a model like GPT-3 or running daily queries on image generators involves complex mathematical calculations across thousands of GPUs (Graphics Processing Units).
These GPUs generate an enormous amount of heat, and if not cooled properly, the system can crash or degrade in performance. That's where water quietly enters the scene.
The computational demands of AI are extraordinary. According to a 2022 study in the journal Science, training a large language model can require more than 1,000 MWh of electricity – equivalent to the yearly consumption of 35 average U.S. homes. That energy turns into heat that must be dissipated.
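To get a feel for the scale, here's a quick back-of-the-envelope sketch in Python. The 1,000 MWh figure is the one from the study above; the one-month training duration is an assumption, used only to turn total energy into an average heat load.

```python
# Back-of-the-envelope: turning a training run's electricity into heat.
# The 1,000 MWh figure is from the study cited above; the 30-day duration
# is an ASSUMPTION, used only to estimate an average heat load.

TRAINING_ENERGY_MWH = 1_000
ASSUMED_TRAINING_DAYS = 30

heat_joules = TRAINING_ENERGY_MWH * 3.6e9        # 1 MWh = 3.6 billion joules
seconds = ASSUMED_TRAINING_DAYS * 24 * 3600
avg_heat_load_mw = heat_joules / seconds / 1e6   # nearly all electricity ends up as heat

print(f"Total heat released: {heat_joules / 1e12:.1f} TJ")   # 3.6 TJ
print(f"Average heat load:   {avg_heat_load_mw:.1f} MW")     # ~1.4 MW, around the clock
```

A continuous, megawatt-scale heat load like that is exactly what the cooling systems in the next section have to carry away.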
💧 How Is Water Used in Data Centers?
To prevent servers from overheating, data centers use cooling systems, much like car engines or gaming PCs.
Three common cooling methods are:
1. Air Cooling – The simplest method, using fans to blow air across components. While it doesn't directly use water, the electricity generation powering these systems often does.
2. Evaporative Cooling – Warm air passes over water. As the water evaporates, it cools the air — which is then circulated to cool down the servers. This method is efficient but directly consumes water.
3. Chilled Water Systems – Water is cooled in large chillers, then pumped through pipes that absorb and remove heat from the server racks. This closed-loop system is efficient but still experiences some water loss.
Water is ideal for the job: it can absorb a lot of heat before its temperature rises, thanks to its high specific heat capacity and good thermal conductivity, and evaporative systems get an extra boost from the large amount of energy water absorbs as it turns to vapor.
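To see why evaporative cooling is so thirsty, here's a rough sketch that estimates how much water a given heat load boils off, using water's latent heat of vaporization. The 1 MW heat load is an assumed example, and real facilities reject only part of their heat this way, so treat it as an order-of-magnitude illustration rather than any real data center's number.

```python
# Sketch: litres of water evaporated per hour to carry away a given heat load.
# ASSUMES all heat is removed by evaporation, which overstates real systems,
# but shows the order of magnitude involved.

ASSUMED_HEAT_LOAD_W = 1_000_000     # 1 MW of server heat (illustrative)
LATENT_HEAT_J_PER_KG = 2.26e6       # approx. latent heat of vaporization of water

kg_per_second = ASSUMED_HEAT_LOAD_W / LATENT_HEAT_J_PER_KG
litres_per_hour = kg_per_second * 3600   # 1 kg of water is roughly 1 litre

print(f"~{litres_per_hour:,.0f} litres evaporated per hour")   # roughly 1,600 L/h
```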
A 2021 report from the U.S. Department of Energy found that a typical data center uses 3-5 million gallons of water per megawatt of capacity annually. For context, some of the largest AI data centers now exceed 100 megawatts.
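Scaling that figure up is simple arithmetic; the sketch below just multiplies the quoted range by an assumed 100-megawatt facility.

```python
# Scaling the DOE range quoted above: 3-5 million gallons per MW per year.
GALLONS_PER_MW_YEAR = (3e6, 5e6)   # range from the report cited above
ASSUMED_FACILITY_MW = 100          # size of the larger AI data centers mentioned

low, high = (g * ASSUMED_FACILITY_MW for g in GALLONS_PER_MW_YEAR)
print(f"Roughly {low / 1e6:.0f}-{high / 1e6:.0f} million gallons per year")
# ~300-500 million gallons, i.e. over a billion litres, for a single site
```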
📊 How Much Water Are We Talking About?
A research paper titled "Making AI Less Thirsty" by scholars at UC Riverside and UT Arlington revealed something surprising:
Training GPT-3, the large language model behind ChatGPT, likely consumed ~700,000 liters of clean water.
To put that in perspective:
That's enough to produce 370 BMW cars
Or enough for over 5,000 long showers
Or the daily water needs of about 7,000 people in water-stressed regions
And that's just for training. Once trained, AI models continue consuming water during inference — every time you type a prompt, ask a question, or generate an image.
According to Microsoft's own sustainability reports, their data centers used 4.4 billion gallons of water in 2022 – a 34% increase from the previous year, largely attributed to AI operations.
Multiply that by millions of users, billions of queries per month — and it becomes clear: AI's water footprint is huge.
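To see how quickly those sips add up, here's a sketch with two loudly flagged assumptions: roughly 20 ml of water per query, which is in the ballpark the "Making AI Less Thirsty" paper reports for GPT-3 inference (the real figure varies a lot by location and season), and a purely illustrative volume of one billion queries per month.

```python
# Sketch: scaling per-query water use up to fleet scale.
# Both numbers below are ASSUMPTIONS for illustration (see the note above).

ASSUMED_ML_PER_QUERY = 20                    # rough per-query water cost for inference
ASSUMED_QUERIES_PER_MONTH = 1_000_000_000    # illustrative query volume

litres_per_month = ASSUMED_ML_PER_QUERY * ASSUMED_QUERIES_PER_MONTH / 1000
print(f"~{litres_per_month / 1e6:.0f} million litres per month")   # ~20 million L
```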
🌎 Industry Comparisons: Is AI Really That Thirsty?
How does AI's water consumption compare with that of other industries?
While AI's total water footprint is still smaller than traditional industries, it's the growth rate that's concerning. The water usage of major tech companies has increased by 20-50% annually in recent years, far outpacing other sectors.
Also worth noting: unlike agriculture, where water is often returned to local watersheds, data center cooling frequently results in water that evaporates completely, removing it from the local water cycle.
🌍 Why This Matters
We often think of AI as a "cloud-based" thing — floating above us, digital and intangible. But AI lives in the real world, on real machines, in real buildings — using real electricity and real water.
And here's the twist: many data centers are located in water-stressed areas, where local communities already struggle with droughts or limited clean water access.
Google has data centers in Mesa, Arizona, and The Dalles, Oregon, both regions facing serious water scarcity issues. In 2021, the city of The Dalles went to court to keep Google's water consumption figures secret after a local newspaper requested the records, and residents raised concerns about the company's impact on local resources.
This raises critical questions:
Should companies disclose their water usage more transparently?
Can we design "water-efficient" AI models, just like energy-efficient ones?
Is the convenience of AI worth the hidden environmental cost?
How do we balance technological progress with resource sustainability?
🔧 What's Being Done to Fix It?
Some companies are stepping up:
Google has committed to being "water positive" by 2030, replenishing more water than they consume. They've also implemented AI-driven cooling optimization that reduced water use by 30% in some facilities.
Microsoft is researching underwater data centers (Project Natick), which use the ocean for cooling rather than freshwater resources.
OpenAI and Anthropic have begun publishing environmental impact reports for their models, including water footprints.
Technological solutions are also emerging:
Immersion cooling – Servers are submerged in specialized non-conductive fluids that absorb heat, reducing water needs by up to 95%.
Air-side economization – Using outside air for cooling when climate conditions permit, requiring minimal water.
Waste heat recovery – Capturing the heat from data centers to warm nearby buildings or for industrial processes.
AI optimization – Developing more efficient algorithms that require less computation, directly reducing cooling needs.
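That last point is easy to make concrete. A common rule of thumb is that a dense transformer's forward pass costs roughly 2 × parameters × tokens floating-point operations, so serving a smaller model cuts compute, and therefore heat and cooling, almost proportionally. The model sizes and response length below are illustrative assumptions, not figures from any specific deployment.

```python
# Rule of thumb: forward-pass cost of a dense transformer is roughly
# 2 * (parameters) * (tokens) floating-point operations.
# Model sizes and token count below are illustrative ASSUMPTIONS.

def forward_flops(params: float, tokens: int) -> float:
    return 2 * params * tokens

TOKENS_PER_RESPONSE = 500                           # assumed response length
small = forward_flops(7e9, TOKENS_PER_RESPONSE)     # a ~7B-parameter model
large = forward_flops(175e9, TOKENS_PER_RESPONSE)   # a GPT-3-scale model

print(f"Small model: {small:.2e} FLOPs per response")
print(f"Large model: {large:.2e} FLOPs per response")
print(f"~{large / small:.0f}x more compute, and heat, for the larger model")
```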
🛠️ What Can You Do?
As users, we're not powerless:
1. Limit unnecessary AI use – Do you really need to generate 50 variations of that AI image?
2. Support companies with transparent sustainability practices – Look for published environmental impact reports.
3. Ask questions – Push the companies behind the AI tools you use to disclose their environmental impact.
4. Use lightweight models when possible – Smaller models require less computational power and thus less cooling.
5. Advocate for water rights – Support policies that prioritize community water needs over industrial uses.
🧠 Final Thought
The next time you marvel at an AI-generated masterpiece or get a smart response from ChatGPT, remember — it didn't come from nowhere. It came from massive servers working hard behind the scenes, using energy, data, and a surprising amount of water.
AI is shaping the future. But it's up to us to make sure that future is sustainable — not just smart.
As we continue the AI revolution, we need to ask not just what AI can do for us, but what it's costing our planet. The true intelligence may lie not in creating the most powerful models, but in creating the most efficient ones.
References
1. International Energy Agency. (2023). Data Centres and Data Transmission Networks. https://www.iea.org/reports/data-centres-and-data-transmission-networks
2. Patterson, D., et al. (2022). Carbon Emissions and Large Neural Network Training. Science, 378(6624), 1102-1105.
3. U.S. Department of Energy. (2021). Data Center Water Usage Report. Office of Energy Efficiency & Renewable Energy.
4. Li, P., Yang, J., Islam, M. A., & Ren, S. (2023). Making AI Less Thirsty: Uncovering and Addressing the Secret Water Footprint of AI Models. arXiv preprint arXiv:2304.03271.
5. Microsoft. (2023). Environmental Sustainability Report 2022. https://www.microsoft.com/en-us/corporate-responsibility/sustainability/report
6. Cook, G., & Jardim, E. (2023). Clicking Clean: Who Is Winning the Race to Build a Green Internet? Greenpeace.
7. Corbin, K. (2021). Oregon City Sues to Keep Google Water Use Secret. Data Center Knowledge.
8. Google. (2023). Environmental Report 2023. https://sustainability.google/reports/
9. Microsoft Research. (2022). Project Natick: Underwater Data Centers. https://natick.research.microsoft.com/