<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: SnackIQ</title>
    <description>The latest articles on DEV Community by SnackIQ (@snackiq_app).</description>
    <link>https://dev.to/snackiq_app</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3868537%2F519930fa-d56d-4926-a86c-945b5f8b73a5.png</url>
      <title>DEV Community: SnackIQ</title>
      <link>https://dev.to/snackiq_app</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/snackiq_app"/>
    <language>en</language>
    <item>
      <title>AI Water Consumption Myth Debunked</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Mon, 20 Apr 2026 14:02:55 +0000</pubDate>
      <link>https://dev.to/snackiq_app/ai-water-consumption-myth-debunked-1idm</link>
      <guid>https://dev.to/snackiq_app/ai-water-consumption-myth-debunked-1idm</guid>
      <description>&lt;p&gt;The &lt;a href="https://[snackiq](https://snackiq.app/glossary/snackiq).app/glossary/ai-water-consumption-myth-debunked" rel="noopener noreferrer"&gt;ai water consumption myth debunked&lt;/a&gt; starts here: yes, AI uses water, but probably not in the way you've been told. A widely shared estimate suggests that generating a short conversation with a large AI model consumes roughly the equivalent of a small bottle of water. That number sounds alarming in isolation. But context changes everything. A 2025 explainer from MIT News confirmed that while AI data centres do use significant water for cooling, the figures cited in viral headlines frequently conflate training runs — rare, massive one-off events — with everyday inference, which is far cheaper. Panic spread. Nuance didn't. This article examines both sides of the debate and lands on what the evidence actually shows.&lt;/p&gt;

&lt;h2&gt;The Case For a Real Water Problem&lt;/h2&gt;

&lt;p&gt;The concern isn't invented. It's just frequently misapplied.&lt;/p&gt;

&lt;p&gt;Large AI models require enormous amounts of computing power to train. That computing happens inside data centres, and data centres generate heat — a lot of it. Cooling that heat requires water, either directly through evaporative cooling towers or indirectly through the energy grid, which itself consumes water at power stations. Both pathways are real, and both show up in environmental accounting.&lt;/p&gt;

&lt;p&gt;The numbers attached to &lt;strong&gt;AI training runs&lt;/strong&gt; are genuinely large. Training a frontier model from scratch can consume hundreds of thousands of litres of water. Some researchers have estimated that training a single large language model produces carbon emissions comparable to several transatlantic flights, and the water footprint follows similar logic. These are not trivial figures.&lt;/p&gt;

&lt;p&gt;Geography makes it worse in some cases. Data centres built in water-stressed regions — parts of the American Southwest, for instance — draw on already-scarce local supplies. When Microsoft or Google announces a new facility in a drought-prone area, local water authorities notice. Some communities have pushed back, raising legitimate questions about who bears the environmental cost of AI expansion.&lt;/p&gt;

&lt;p&gt;So the concern has a foundation. &lt;strong&gt;Water is used. Some regions feel the pressure.&lt;/strong&gt; The problem is that the conversation rarely stops there — and when it doesn't, accuracy suffers.&lt;/p&gt;

&lt;h2&gt;The Case Against the Panic&lt;/h2&gt;

&lt;p&gt;Here's the part that gets quietly dropped from most headlines: &lt;strong&gt;training and inference are completely different things&lt;/strong&gt;, and the water costs are not remotely comparable.&lt;/p&gt;

&lt;p&gt;Training a model is a one-time event. You train GPT-4 once. You don't retrain it every time someone asks it to write an email. The vast majority of AI interactions — every search, every chatbot reply, every autocomplete suggestion — happen during inference, which is orders of magnitude less computationally intensive than training. The viral water-per-query figures that spread across social media in 2023 and 2024 often blurred this line entirely.&lt;/p&gt;

&lt;p&gt;MIT's 2025 environmental impact explainer made this distinction clearly. Researchers there noted that inference workloads, while growing fast, consume a fraction of the resources attributed to training — and that efficiency is improving rapidly as chip design advances. The water cost per useful output is falling, not rising, year on year.&lt;/p&gt;

&lt;p&gt;Comparisons also matter. Consider what else uses water:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A standard beef burger requires roughly 2,400 litres of water to produce&lt;/li&gt;
&lt;li&gt;A cotton T-shirt takes approximately 2,700 litres&lt;/li&gt;
&lt;li&gt;A 10-minute shower uses around 80–100 litres&lt;/li&gt;
&lt;li&gt;Running a dishwasher uses roughly 12–15 litres per cycle&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;A single AI query sits well below any of these — yet none of them generate the same alarm. That asymmetry isn't an argument that AI water use is fine. It's an argument that the framing is inconsistent.&lt;/p&gt;

&lt;h2&gt;What the Data Actually Shows&lt;/h2&gt;

&lt;p&gt;When you strip away the rhetoric, a clearer picture emerges — and it's more complicated than either side admits.&lt;/p&gt;

&lt;p&gt;AI data centres do consume meaningful water. Microsoft reported in its 2023 sustainability report that its global water consumption rose significantly year-over-year, a jump it linked in part to AI infrastructure buildout. Google published similar findings. These are real disclosures from real companies, and they matter.&lt;/p&gt;

&lt;p&gt;But the same reports show those companies investing heavily in &lt;strong&gt;water recycling, closed-loop cooling systems, and renewable energy sourcing&lt;/strong&gt;. Microsoft has committed to being water positive by 2030 — meaning it aims to replenish more water than it consumes. Whether those targets get hit is a fair question. That they exist at all signals something different from the narrative of unchecked environmental destruction.&lt;/p&gt;

&lt;p&gt;Researchers who study this field consistently point to a measurement problem. &lt;strong&gt;Water consumption figures vary enormously&lt;/strong&gt; depending on whether you count direct use only, or also include the upstream water embedded in electricity generation. When indirect water is included, every industry's numbers balloon — not just AI's. Coal-fired power plants, for instance, are among the most water-intensive electricity sources on earth.&lt;/p&gt;

&lt;p&gt;Efficiency trends also cut against the worst-case projections. Modern AI chips — from Nvidia's latest GPU generations to custom silicon built by Google and Amazon — deliver dramatically more computation per watt than chips from five years ago. More compute per watt means less heat, which means less cooling, which means less water. The trajectory is improving even as the absolute scale grows.&lt;/p&gt;

&lt;h2&gt;The Nuance Most Coverage Misses&lt;/h2&gt;

&lt;p&gt;The real issue isn't whether AI uses water. It does. The real issue is whether AI's water use is disproportionate to its output — and that's a question almost nobody asks.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Productivity-adjusted comparisons&lt;/strong&gt; change the picture entirely. If an AI system saves a researcher three hours of literature review, prevents a factory from running an unnecessary production cycle, or optimises a city's traffic flow to reduce fuel burn, the water consumed by that inference query looks different against what it replaced. This isn't a licence to ignore environmental costs. It's a reminder that all resource use needs to be weighed against value delivered.&lt;/p&gt;

&lt;p&gt;The geography problem is real and underreported. A data centre in Iceland, cooled by geothermal energy and ambient Arctic air, has a radically different water footprint from an identical facility in Phoenix, Arizona, drawing on the Colorado River basin. Aggregate global figures hide this variation entirely. Where AI infrastructure is built matters as much as how much of it is built.&lt;/p&gt;

&lt;p&gt;There's also a substitution effect worth considering. AI is increasingly used to model climate systems, accelerate drug discovery, improve solar panel efficiency, and reduce agricultural water waste. Research groups at institutions including Stanford and the European Centre for Medium-Range Weather Forecasts have used AI to improve climate prediction accuracy at a fraction of the computational cost of traditional models. &lt;strong&gt;The water AI uses may, in some cases, help conserve far more water elsewhere.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;None of this makes the data centre water question unimportant. It makes it more complex than a single scary number can capture.&lt;/p&gt;

&lt;h2&gt;The Honest Answer on AI and Water&lt;/h2&gt;

&lt;p&gt;AI's water footprint is real, measurable, and growing in absolute terms. It is not, by most evidence-based assessments, the civilisation-threatening drain that viral content suggests.&lt;/p&gt;

&lt;p&gt;The honest verdict: the problem is real but routinely exaggerated, and the exaggeration happens because &lt;strong&gt;training costs get presented as per-query costs&lt;/strong&gt;, because indirect water use gets bundled in inconsistently, and because AI makes a compelling villain in an era of climate anxiety.&lt;/p&gt;

&lt;p&gt;The more useful questions are narrower ones. Are data centres being built in water-stressed regions without adequate mitigation plans? Sometimes, yes — and that deserves scrutiny. Are tech companies publishing transparent, auditable water data? Inconsistently — and pressure for better disclosure is legitimate. Is the industry improving its efficiency faster than it's growing? The evidence suggests mostly yes, though the race is tight.&lt;/p&gt;

&lt;p&gt;What you can dismiss with confidence is the claim that every AI query is secretly draining rivers dry. The per-query water cost of a typical inference task is measurable in millilitres, not gallons. The cumulative scale matters and deserves ongoing attention. But scale and catastrophe are not the same thing — and the gap between those two words is where most of the misinformation about AI water consumption lives.&lt;/p&gt;

&lt;p&gt;AI does use water. That's not a myth — it's documented, disclosed, and worth watching. What is a myth is the idea that every interaction is quietly draining the planet's reserves. The honest picture is messier: a fast-improving efficiency curve, a real geography problem, and an accounting methodology that varies wildly depending on who's doing the counting. Follow the specifics, not the headlines. The question worth asking isn't 'does AI use water?' It's 'compared to what, and measured how?'&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/ai-water-consumption-myth-debunked" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aiwaterconsumptionmy</category>
      <category>howdoesaiusewater</category>
      <category>doesaiwastewater</category>
      <category>aienvironmentalimpac</category>
    </item>
    <item>
      <title>How AI Quietly Destroys the Environment</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Sun, 19 Apr 2026 08:02:37 +0000</pubDate>
      <link>https://dev.to/snackiq_app/how-ai-quietly-destroys-the-environment-3eaj</link>
      <guid>https://dev.to/snackiq_app/how-ai-quietly-destroys-the-environment-3eaj</guid>
      <description>&lt;p&gt;Every time you use an AI chatbot, something burns. Training a single large AI model can emit as much carbon dioxide as five average American cars do over their entire lifetimes — a figure researchers at the University of Massachusetts Amherst calculated when studying the energy cost of natural language processing models. That's before you factor in the ongoing electricity demand of running these systems at scale, 24 hours a day, across thousands of servers worldwide. AI's environmental impact is real, it's growing, and it's hiding in plain sight. This article breaks down exactly how artificial intelligence harms the environment — from carbon emissions and water consumption to hardware waste — and what the numbers actually mean.&lt;/p&gt;

&lt;h2&gt;Why Training an AI Model Burns So Much Carbon&lt;/h2&gt;

&lt;p&gt;Most people imagine AI as something that lives in the cloud, weightless and clean. The reality is the opposite. &lt;strong&gt;Training a large language model&lt;/strong&gt; requires weeks or months of continuous computation across thousands of specialised processors called GPUs and TPUs, all drawing power from electrical grids that are still heavily fossil-fuel dependent in many parts of the world.&lt;/p&gt;

&lt;p&gt;Researchers at the University of Massachusetts Amherst published a landmark analysis showing that training a single large transformer-based NLP model could produce over 280 tonnes of CO₂ equivalent — roughly five times the lifetime emissions of an average American car, including its manufacture. More recent models are larger still, and while companies have improved efficiency, the sheer scale of newer systems offsets many of those gains.&lt;/p&gt;
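
&lt;p&gt;As a quick sanity check, that comparison reduces to one division. The sketch below uses rounded figures in line with that analysis; treat both inputs as approximate, not authoritative:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Rounded figures in line with the UMass Amherst analysis; approximate.
training_tonnes_co2 = 284   # large transformer NLP model, headline figure
car_lifetime_tonnes = 57    # avg. US car over its lifetime, incl. manufacture

print(f"{training_tonnes_co2 / car_lifetime_tonnes:.1f} car-lifetimes")  # ~5.0
&lt;/code&gt;&lt;/pre&gt;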

&lt;p&gt;&lt;strong&gt;The location of data centres matters enormously.&lt;/strong&gt; A model trained using coal-heavy electricity in parts of the American Midwest carries a vastly different carbon cost than one trained using Iceland's geothermal grid. Companies rarely disclose which grid powers which training run, making independent verification difficult.&lt;/p&gt;

&lt;p&gt;What complicates the picture further is that training is only the beginning. Once deployed, models run inference — generating responses to user queries — continuously. Inference at scale across billions of daily requests adds up to a carbon load that can dwarf the original training cost over time. Google has estimated that AI-related workloads already account for a meaningful share of its total electricity consumption, a figure that continues to climb as AI features expand across its product suite.&lt;/p&gt;

&lt;h2&gt;The Electricity Demand Nobody Talks About&lt;/h2&gt;

&lt;p&gt;Data centres already consumed roughly 1–2% of global electricity before the AI boom. That baseline is now accelerating sharply. The International Energy Agency projected in 2024 that data centre electricity demand could double by 2026, driven primarily by AI workloads. To put that in perspective, the additional demand coming online in the next few years could exceed the total electricity consumption of several mid-sized European countries.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The hardware itself is part of the problem.&lt;/strong&gt; Modern AI chips — Nvidia's H100 GPUs being the most prominent example — are extraordinarily power-hungry. A single H100 draws around 700 watts under full load. A large training cluster might contain tens of thousands of them. Running such a cluster for months consumes electricity on a scale comparable to a small town.&lt;/p&gt;
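
&lt;p&gt;A back-of-envelope sketch makes that scale concrete. Every input below is an illustrative assumption, not a vendor figure:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Back-of-envelope training-cluster energy estimate. All inputs assumed.
gpu_power_watts = 700        # approximate H100 draw under full load
gpu_count = 20_000           # assumed size of a large training cluster
training_days = 90           # assumed duration of one run

cluster_power_mw = gpu_power_watts * gpu_count / 1e6
energy_mwh = cluster_power_mw * 24 * training_days

print(f"Continuous draw: {cluster_power_mw:.0f} MW")     # ~14 MW
print(f"Energy over the run: {energy_mwh:,.0f} MWh")     # ~30,000 MWh
# GPU power only -- CPUs, networking, and cooling overhead come on top.
&lt;/code&gt;&lt;/pre&gt;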

&lt;p&gt;Cooling compounds the demand. For every unit of electricity a server uses for computation, substantial additional energy is needed to prevent it from overheating. The metric used to measure this is called &lt;strong&gt;Power Usage Effectiveness (PUE)&lt;/strong&gt; — a ratio of total facility power to IT equipment power. Even the most efficient hyperscale data centres run PUEs around 1.1–1.2, meaning an extra 10–20% of energy on top of what the servers themselves draw goes to cooling and power distribution rather than computation.&lt;/p&gt;
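
&lt;p&gt;The PUE bookkeeping itself is simple. This sketch, with assumed illustrative numbers, shows how the ratio converts IT load into total facility demand:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Power Usage Effectiveness = total facility power / IT equipment power.
it_power_mw = 14.0     # assumed server + GPU + networking load
pue = 1.2              # efficient hyperscale figure

total_facility_mw = it_power_mw * pue
overhead_mw = total_facility_mw - it_power_mw

print(f"Total facility draw: {total_facility_mw:.1f} MW")              # 16.8
print(f"Overhead (cooling, power conversion): {overhead_mw:.1f} MW")   # 2.8
# At PUE 1.2, overhead is 20% of IT power, i.e. about 17% of the total.
&lt;/code&gt;&lt;/pre&gt;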

&lt;p&gt;Renewable energy investments by major tech firms are real but often involve accounting manoeuvres — buying renewable energy certificates that don't necessarily match actual consumption in real time. Critics argue this creates a gap between stated sustainability goals and physical reality.&lt;/p&gt;

&lt;h2&gt;Fresh Water Evaporates Into Every AI Response&lt;/h2&gt;

&lt;p&gt;Beyond electricity, AI data centres consume staggering volumes of &lt;strong&gt;freshwater for cooling&lt;/strong&gt;. When air cooling alone can't handle the thermal load, many facilities use evaporative cooling towers that essentially turn water into vapour to dissipate heat — that water doesn't come back.&lt;/p&gt;

&lt;p&gt;Researchers studying the water footprint of AI estimated that training GPT-3 consumed roughly 700,000 litres of freshwater. A single conversation with a modern AI assistant — a back-and-forth of perhaps 20–50 messages — can require around half a litre of water to cool the servers processing it. Multiply that by hundreds of millions of daily users and the aggregate becomes significant.&lt;/p&gt;

&lt;p&gt;The geography of data centres creates acute local pressures. When a massive facility draws from a municipal water supply or a regional aquifer in an already water-stressed area — parts of Arizona, the American Southwest, or arid regions of Europe — the impact on local communities and ecosystems can be direct and measurable. Several municipalities have pushed back against data centre construction specifically because of water competition.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Water stress is unevenly distributed.&lt;/strong&gt; A data centre in rainy Scandinavia consuming local water presents a very different ecological risk than one drawing from the Colorado River Basin, which has been in prolonged drought. The same AI request carries a different environmental weight depending on where the server handling it happens to be.&lt;/p&gt;

&lt;h2&gt;The Hardware Waste Piling Up Silently&lt;/h2&gt;

&lt;p&gt;AI accelerates a problem that already plagued consumer electronics: hardware obsolescence. The AI chip market moves at an extraordinary pace. Nvidia's GPU generations — A100, H100, B100 — succeed each other rapidly, each offering performance gains that make the previous generation economically uncompetitive for cutting-edge AI workloads. That creates pressure to retire hardware that, in computing terms, is barely old.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Electronic waste (e-waste)&lt;/strong&gt; is one of the fastest-growing waste streams globally. The United Nations Environment Programme has tracked the annual volume of e-waste rising steadily into the tens of millions of tonnes per year. AI hardware — GPUs, custom silicon, networking equipment — is part of that stream. These components contain valuable rare earth elements and toxic materials including lead, cadmium, and mercury.&lt;/p&gt;

&lt;p&gt;Mining the rare materials required for AI chips in the first place carries its own environmental cost. Elements like cobalt, tantalum, and lithium are extracted under conditions that often involve significant land disruption, water contamination, and carbon-intensive processing. The supply chain upstream of a gleaming data centre is anything but clean.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cobalt&lt;/strong&gt; — critical for battery systems powering backup power supplies in data centres, often mined in the Democratic Republic of Congo&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tantalum&lt;/strong&gt; — used in capacitors across electronic components, with significant mining concentrated in politically unstable regions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Rare earth elements&lt;/strong&gt; — used in magnets and sensors, with over 60% of global supply concentrated in China&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The circular economy around AI hardware is still immature. Refurbishment and recycling programmes exist, but they operate at a fraction of the scale needed to offset the volume of hardware being cycled out.&lt;/p&gt;

&lt;h2&gt;Does AI Ever Help the Environment?&lt;/h2&gt;

&lt;p&gt;The environmental impact of AI isn't purely a ledger of harm. Proponents — including researchers at DeepMind and various climate technology organisations — argue convincingly that AI can accelerate solutions to the very problems it contributes to.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Google DeepMind's AlphaFold&lt;/strong&gt; protein-structure prediction tool has transformed biological research, potentially accelerating the development of drugs and materials in ways that would have taken decades otherwise. DeepMind also applied AI to optimise the cooling systems in Google's own data centres, reportedly reducing cooling energy use by around 40% — a genuine efficiency gain achieved through reinforcement learning.&lt;/p&gt;

&lt;p&gt;AI tools are being used to improve the accuracy of weather and climate modelling, optimise electricity grid management to reduce waste, and identify sites suitable for renewable energy development. In agriculture, AI-driven precision farming systems can reduce water and fertiliser use significantly compared to conventional approaches.&lt;/p&gt;

&lt;p&gt;But the critical question is whether these benefits outpace the environmental cost of developing and running the AI systems that generate them. Right now, &lt;strong&gt;no reliable framework exists&lt;/strong&gt; for calculating this net effect across the industry. Companies are not required to disclose the full lifecycle environmental impact of their AI systems, which means the public debate operates largely without the data it needs.&lt;/p&gt;

&lt;p&gt;The honest answer is that AI can be an environmental asset or liability depending almost entirely on what it's used for, how the electricity powering it is generated, and whether efficiency improvements keep pace with the relentless expansion of scale.&lt;/p&gt;

&lt;p&gt;AI's environmental cost isn't theoretical. It's measured in tonnes of CO₂, millions of litres of freshwater, and mountains of discarded hardware — all invisible to the person typing a prompt. That invisibility is the core problem. When the cost of a technology is hidden from the people using it, market pressure to reduce that cost disappears. The question isn't whether to use AI — it's whether the industry building it will be held to account for what it actually costs the planet to run.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/how-ai-quietly-destroys-the-environment" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>howdoesaiaffecttheen</category>
      <category>aienvironmentalimpac</category>
      <category>aicarbonfootprint</category>
      <category>artificialintelligen</category>
    </item>
    <item>
      <title>AI Water Use vs Other Industries</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Fri, 17 Apr 2026 14:01:59 +0000</pubDate>
      <link>https://dev.to/snackiq_app/ai-water-use-vs-other-industries-5cnn</link>
      <guid>https://dev.to/snackiq_app/ai-water-use-vs-other-industries-5cnn</guid>
      <description>&lt;p&gt;&lt;a href="https://[snackiq](https://snackiq.app/glossary/snackiq).app/glossary/ai-water-usage-compared-to-other-industries" rel="noopener noreferrer"&gt;AI water usage compared to other industries&lt;/a&gt; tells a very different story than the headlines suggest. Yes, a large data centre can consume up to 5 million gallons of water per day — roughly equivalent to the daily needs of a town of 50,000 people, according to the Environmental and Energy Study Institute. That sounds alarming. But the global cattle industry alone uses approximately 4,555 billion litres of water annually, compared to roughly 18.2 billion litres attributed to AI systems like ChatGPT — nearly 250 times more for a single agricultural sector. The real picture requires comparison, not isolation. Put AI's water use next to steel, paper, agriculture, and semiconductor manufacturing, and a more nuanced truth emerges: AI isn't the water villain it's been painted as.&lt;/p&gt;

&lt;h2&gt;How much water does AI actually use?&lt;/h2&gt;

&lt;p&gt;The water demand of AI comes almost entirely from one source: &lt;strong&gt;data centre cooling&lt;/strong&gt;. Servers generate enormous heat. To stop them from failing, facilities pump water through cooling towers, chillers, and increasingly, direct-to-chip systems. When water evaporates in those towers, it's gone — that's the consumption that ends up in the headlines.&lt;/p&gt;

&lt;p&gt;The figure most cited is from research by Shaolei Ren at the University of California, Riverside, who estimated that training GPT-3 consumed around 700,000 litres of freshwater. A single conversation with ChatGPT — roughly 20 to 50 questions — uses approximately 500 millilitres, about the volume of a standard water bottle. Scaled to millions of daily users, that adds up fast.&lt;/p&gt;
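
&lt;p&gt;Those headline figures imply a per-query number you can reproduce in a few lines. The sketch below simply replays the rounded arithmetic; the paper's actual methodology is far more involved:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Replaying the rounded UC Riverside arithmetic -- no new measurements here.
conversation_ml = 500            # ~500 ml per short conversation
queries_per_conversation = 25

per_query_ml = conversation_ml / queries_per_conversation
print(f"Implied water per query: {per_query_ml:.0f} ml")        # 20 ml

training_litres = 700_000        # one GPT-3 training run
conversations = training_litres / (conversation_ml / 1000)
print(f"One training run = {conversations:,.0f} conversations") # 1.4 million
# Training is a one-off cost; inference is a recurring cost in millilitres.
&lt;/code&gt;&lt;/pre&gt;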

&lt;p&gt;Globally, data centres are estimated to account for around 0.2% of total freshwater withdrawals worldwide. That's not nothing. But it's a useful number to hold in your head before we start comparing.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;AI / Data Centre Activity&lt;/th&gt;
      &lt;th&gt;Estimated Water Use&lt;/th&gt;
      &lt;th&gt;Context&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;Training GPT-3 (one training run)&lt;/td&gt;
      &lt;td&gt;~700,000 litres&lt;/td&gt;
      &lt;td&gt;UC Riverside estimate&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;One ChatGPT conversation (~25 queries)&lt;/td&gt;
      &lt;td&gt;~500 ml&lt;/td&gt;
      &lt;td&gt;Equivalent to one water bottle&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Large data centre (daily)&lt;/td&gt;
      &lt;td&gt;Up to 19 million litres&lt;/td&gt;
      &lt;td&gt;Per EESI; serves 10,000–50,000 people equivalent&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Global data centres (annual)&lt;/td&gt;
      &lt;td&gt;~18.2 billion litres (AI systems)&lt;/td&gt;
      &lt;td&gt;Bryant Research estimate, 2025&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The numbers are real. But they only mean something when stacked against what else we consume water for.&lt;/p&gt;

&lt;h2&gt;How does AI compare to agriculture and food production?&lt;/h2&gt;

&lt;p&gt;Agriculture is the single largest consumer of freshwater on the planet — responsible for roughly &lt;strong&gt;70% of all global freshwater withdrawals&lt;/strong&gt;, according to the UN Food and Agriculture Organization. That's not a close race. It's a different category entirely.&lt;/p&gt;

&lt;p&gt;Dairy production alone consumes around 4,555 billion litres of water annually. Beef is even thirstier — producing one kilogram of beef requires approximately 15,400 litres of water when you account for feed crops, drinking water, and processing. A single cotton T-shirt takes around 2,700 litres to produce.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;Industry / Activity&lt;/th&gt;
      &lt;th&gt;Annual Water Use&lt;/th&gt;
      &lt;th&gt;Ratio vs AI Systems&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;AI systems (ChatGPT etc.)&lt;/td&gt;
      &lt;td&gt;~18.2 billion litres&lt;/td&gt;
      &lt;td&gt;Baseline (1×)&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Global dairy production&lt;/td&gt;
      &lt;td&gt;~4,555 billion litres&lt;/td&gt;
      &lt;td&gt;~250× more&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Global beef production&lt;/td&gt;
      &lt;td&gt;Trillions of litres&lt;/td&gt;
      &lt;td&gt;Far exceeds dairy&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Global cotton farming&lt;/td&gt;
      &lt;td&gt;~200 billion litres (est.)&lt;/td&gt;
      &lt;td&gt;~11× more&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Global rice production&lt;/td&gt;
      &lt;td&gt;~1,000 billion litres&lt;/td&gt;
      &lt;td&gt;~55× more&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The comparison isn't meant to dismiss AI's footprint. It's meant to calibrate the conversation. If the goal is water conservation, redirecting even 5% of global beef consumption would save multiples of everything AI data centres use in a year. That's not a political statement — it's arithmetic.&lt;/p&gt;
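
&lt;p&gt;That arithmetic is easy to verify with the figures in the table above, using the dairy number as a conservative stand-in since the beef total is larger still:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Checking the claim with the article's own figures (dairy as a lower bound).
ai_water_bn_litres = 18.2          # annual estimate attributed to AI systems
dairy_water_bn_litres = 4_555.0    # annual global dairy estimate

saved = 0.05 * dairy_water_bn_litres
print(f"5% of dairy: {saved:.0f} billion litres")                        # ~228
print(f"Multiple of AI's annual use: {saved / ai_water_bn_litres:.0f}x") # ~13x
&lt;/code&gt;&lt;/pre&gt;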

&lt;p&gt;None of this means AI gets a free pass. But &lt;strong&gt;framing AI as a primary water threat&lt;/strong&gt; while ignoring agriculture misdirects both public concern and policy energy.&lt;/p&gt;

&lt;h2&gt;Where does heavy industry actually stand?&lt;/h2&gt;

&lt;p&gt;Before AI entered the discourse, industries like steel, paper, and semiconductor manufacturing were the quiet giants of industrial water use. They still are.&lt;/p&gt;

&lt;p&gt;Steel production requires enormous amounts of water for cooling, descaling, and processing. Producing one tonne of steel can require anywhere from 25,000 to 45,000 litres depending on the process and the facility's recycling efficiency. Global steel output exceeds 1.9 billion tonnes per year — the maths there is staggering.&lt;/p&gt;

&lt;p&gt;The paper and pulp industry is similarly water-intensive. Producing one tonne of paper can require 10,000 litres or more. Semiconductor fabrication — the industry that makes the chips AI runs on — uses ultrapure water in huge volumes. A single chip fabrication plant can consume millions of litres per day in the purification and etching processes alone.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;Industry&lt;/th&gt;
      &lt;th&gt;Water Use per Unit of Output&lt;/th&gt;
      &lt;th&gt;Global Scale Note&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;Steel production&lt;/td&gt;
      &lt;td&gt;25,000–45,000 litres per tonne&lt;/td&gt;
      &lt;td&gt;1.9 billion tonnes/year globally&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Paper &amp;amp; pulp&lt;/td&gt;
      &lt;td&gt;~10,000 litres per tonne&lt;/td&gt;
      &lt;td&gt;400+ million tonnes/year globally&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Semiconductor fabs&lt;/td&gt;
      &lt;td&gt;Millions of litres per day per plant&lt;/td&gt;
      &lt;td&gt;Ultrapure water required for chip etching&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Thermal power generation&lt;/td&gt;
      &lt;td&gt;~1,500 litres per MWh (cooling)&lt;/td&gt;
      &lt;td&gt;Powers most data centres indirectly&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;AI data centres&lt;/td&gt;
      &lt;td&gt;~500 ml per 25-query session&lt;/td&gt;
      &lt;td&gt;Growing, but currently minor at macro scale&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Thermal power generation deserves special mention. &lt;strong&gt;Every time a data centre draws electricity from a coal or gas plant&lt;/strong&gt;, that power station uses water too — often more than the data centre itself. The indirect water cost of AI is real, and it's tied directly to how clean the electricity grid powering those servers actually is. Countries running data centres on hydropower or renewables have a fundamentally different footprint from those still reliant on fossil-fuelled generation.&lt;/p&gt;

&lt;h2&gt;Is AI's water footprint actually growing?&lt;/h2&gt;

&lt;p&gt;Honestly? Yes. And that's the part that deserves honest scrutiny.&lt;/p&gt;

&lt;p&gt;The AI boom is driving a &lt;strong&gt;rapid expansion of data centre infrastructure&lt;/strong&gt; globally. Microsoft, Google, Amazon, and Meta have all announced multi-billion dollar data centre buildouts in 2024 and 2025. The International Energy Agency projects that data centre electricity consumption could more than double by 2030, and water use scales alongside power demand.&lt;/p&gt;

&lt;p&gt;New AI workloads — particularly the large generative models used in tools like ChatGPT, Gemini, and Claude — are significantly more compute-intensive than traditional search or streaming. Inference (running the model to answer your question) is less demanding than training, but it happens billions of times per day. The aggregate matters.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;Factor&lt;/th&gt;
      &lt;th&gt;Current Status&lt;/th&gt;
      &lt;th&gt;Trend&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;Data centre water intensity&lt;/td&gt;
      &lt;td&gt;Improving with newer cooling tech&lt;/td&gt;
      &lt;td&gt;Efficiency rising, but scale growing faster&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Total data centre count&lt;/td&gt;
      &lt;td&gt;8,000+ globally (est.)&lt;/td&gt;
      &lt;td&gt;Expanding rapidly through 2030&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;AI model size (parameters)&lt;/td&gt;
      &lt;td&gt;Hundreds of billions in frontier models&lt;/td&gt;
      &lt;td&gt;Continuing to grow&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Cooling innovation (immersion/direct-chip)&lt;/td&gt;
      &lt;td&gt;Early adoption phase&lt;/td&gt;
      &lt;td&gt;Could cut water use 90%+ vs evaporative cooling&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Grid carbon intensity&lt;/td&gt;
      &lt;td&gt;Varies widely by region&lt;/td&gt;
      &lt;td&gt;Slowly decarbonising in most markets&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The technology to dramatically reduce data centre water use already exists. Direct-to-chip liquid cooling and full immersion cooling — where servers sit in tanks of non-conductive fluid — can cut water consumption by up to 90% compared to traditional evaporative cooling towers. Some newer hyperscale facilities are targeting a &lt;strong&gt;Water Usage Effectiveness (WUE) rating of near zero&lt;/strong&gt; for direct water consumption, relying entirely on closed-loop systems.&lt;/p&gt;
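
&lt;p&gt;Water Usage Effectiveness makes that comparison quantifiable: litres of water consumed per kilowatt-hour of IT energy. A sketch with assumed, illustrative WUE values shows where the 90%+ figure comes from:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# WUE = litres of water consumed per kWh of IT energy. All inputs assumed.
def annual_water_litres(it_load_mw, wue_l_per_kwh):
    it_kwh_per_year = it_load_mw * 1000 * 24 * 365
    return it_kwh_per_year * wue_l_per_kwh

evaporative = annual_water_litres(it_load_mw=20, wue_l_per_kwh=1.8)
closed_loop = annual_water_litres(it_load_mw=20, wue_l_per_kwh=0.1)

print(f"Evaporative towers: {evaporative / 1e6:.0f} million litres/year")  # ~315
print(f"Closed-loop:        {closed_loop / 1e6:.0f} million litres/year")  # ~18
print(f"Reduction: {1 - closed_loop / evaporative:.0%}")                   # ~94%
&lt;/code&gt;&lt;/pre&gt;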

&lt;p&gt;The trajectory matters as much as the current number. AI's water footprint is growing — but so is the industry's ability to shrink the water cost per query.&lt;/p&gt;

&lt;h2&gt;What does context-adjusted responsibility actually look like?&lt;/h2&gt;

&lt;p&gt;The goal of this comparison isn't to let AI off the hook — it's to assign responsibility proportionally.&lt;/p&gt;

&lt;p&gt;Water stress is a genuine global crisis. Roughly &lt;strong&gt;2 billion people currently live in water-stressed regions&lt;/strong&gt;, according to the United Nations. When a new data centre opens near a community already facing aquifer depletion, that's a real conflict worth scrutinising. Local impact and global percentage share are different things. A data centre drawing from an already-strained river basin is a problem regardless of what the aggregate global numbers show.&lt;/p&gt;

&lt;p&gt;But the policy response should match the scale of the problem. If a government is debating water regulation and focuses primarily on tech campuses while ignoring irrigated monoculture farms using orders of magnitude more water — that's a misallocation of regulatory attention.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
  &lt;thead&gt;
    &lt;tr&gt;
      &lt;th&gt;Sector&lt;/th&gt;
      &lt;th&gt;Share of Global Freshwater Withdrawals (est.)&lt;/th&gt;
      &lt;th&gt;Primary Use&lt;/th&gt;
    &lt;/tr&gt;
  &lt;/thead&gt;
  &lt;tbody&gt;
    &lt;tr&gt;
      &lt;td&gt;Agriculture&lt;/td&gt;
      &lt;td&gt;~70%&lt;/td&gt;
      &lt;td&gt;Irrigation for crops and livestock feed&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Industry (steel, paper, chemicals)&lt;/td&gt;
      &lt;td&gt;~20%&lt;/td&gt;
      &lt;td&gt;Cooling, processing, manufacturing&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Municipal / domestic use&lt;/td&gt;
      &lt;td&gt;~10%&lt;/td&gt;
      &lt;td&gt;Drinking, sanitation, household&lt;/td&gt;
    &lt;/tr&gt;
    &lt;tr&gt;
      &lt;td&gt;Data centres (all, not just AI)&lt;/td&gt;
      &lt;td&gt;~0.2%&lt;/td&gt;
      &lt;td&gt;Server cooling&lt;/td&gt;
    &lt;/tr&gt;
  &lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;The honest answer is that &lt;strong&gt;AI's water footprint is a legitimate and growing concern&lt;/strong&gt; — but it's currently a minor contributor within a much larger industrial water system. The industries consuming 99.8% of the rest deserve proportionally more scrutiny, more regulation, and more investment in efficiency.&lt;/p&gt;

&lt;p&gt;Holding AI accountable while ignoring the rest is bad policy dressed up as environmentalism.&lt;/p&gt;

&lt;p&gt;AI's water use is real, measurable, and growing. That deserves honest scrutiny — especially when new data centres are built near already-stressed water supplies. But perspective is not the same as dismissal. Agriculture uses 70% of global freshwater. Steel and paper dwarf AI's consumption. The semiconductor fabs that build AI chips use more water per facility than most data centres. The conversation about AI and water needs better numbers, not just louder headlines. Context is how we get to solutions.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/ai-water-use-vs-other-industries" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aiwaterusagecompared</category>
      <category>howdoesaiusewater</category>
      <category>howdoesaiaffecttheen</category>
      <category>datacenterwaterconsu</category>
    </item>
    <item>
      <title>How AI Actually Thinks Without a Brain</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Thu, 16 Apr 2026 14:02:05 +0000</pubDate>
      <link>https://dev.to/snackiq_app/how-ai-actually-thinks-without-a-brain-128l</link>
      <guid>https://dev.to/snackiq_app/how-ai-actually-thinks-without-a-brain-128l</guid>
      <description>&lt;p&gt;AI doesn't think. Not in any sense you'd recognise. When ChatGPT answers your question or Midjourney paints a landscape, no mind is at work — no curiosity, no comprehension, no intent. What's happening instead is a form of extraordinarily sophisticated pattern-matching, operating at a scale the human brain genuinely cannot visualise. A 2023 report from Stanford University's Institute for Human-Centered AI estimated that leading AI models are now trained on datasets exceeding one trillion tokens — roughly equivalent to millions of books. The result is a system that produces outputs so fluent, so contextually apt, that 'thinking' feels like the only word that fits. It isn't. Understanding the actual mechanism changes how you see every AI tool you use.&lt;/p&gt;

&lt;h2&gt;What is AI actually doing when it responds?&lt;/h2&gt;

&lt;p&gt;The honest answer is uncomfortable for most people: AI is guessing. Very, very good guessing — but guessing nonetheless.&lt;/p&gt;

&lt;p&gt;Every time a language model generates a word, it's running a statistical calculation: given everything that came before, what token (word fragment, punctuation, character) is most likely to come next? It does this thousands of times per second, chaining predictions together until a coherent response emerges. There's no lookup table, no fact database being consulted, no logic engine reasoning through the problem. Just probability, cascading.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;This is called autoregressive generation&lt;/strong&gt; — each output becomes part of the input for the next prediction. It's why AI can maintain the thread of a long conversation, and also why it can confidently state something completely wrong. It's not checking truth. It's matching patterns that look like truth.&lt;/p&gt;
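
&lt;p&gt;Stripped to its skeleton, the loop is a few lines. In this sketch, &lt;code&gt;model&lt;/code&gt; and &lt;code&gt;sample&lt;/code&gt; are stand-ins for the real network and decoding strategy:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Minimal autoregressive generation sketch. 'model' is assumed to return a
# probability for every vocabulary token given the context so far.
def generate(model, prompt_tokens, max_new_tokens, sample):
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        probs = model(tokens)        # P(next token | everything so far)
        next_token = sample(probs)   # e.g. argmax, or temperature sampling
        tokens.append(next_token)    # each output feeds the next prediction
    return tokens
# Note what is absent: no fact database, no truth check -- only probabilities.
&lt;/code&gt;&lt;/pre&gt;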

&lt;p&gt;The underlying architecture — the &lt;strong&gt;transformer model&lt;/strong&gt;, introduced by Google researchers in a 2017 paper titled 'Attention Is All You Need' — uses a mechanism called self-attention to weigh how relevant every word in a sentence is to every other word. This lets the model capture long-range context that earlier systems missed entirely. A sentence like 'The trophy didn't fit in the suitcase because it was too big' requires knowing which 'it' refers to. Transformers handle this. Earlier models didn't.&lt;/p&gt;
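
&lt;p&gt;The self-attention operation at the heart of that paper reduces to a few matrix multiplications. The NumPy sketch below uses toy shapes and random weights purely for illustration:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """Scaled dot-product self-attention over a sequence of embeddings X."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])  # relevance of each token to each other
    weights = softmax(scores, axis=-1)       # each row sums to 1
    return weights @ V                       # context-weighted mix of values

# Toy usage: 6 tokens, 8-dimensional embeddings, random projections.
rng = np.random.default_rng(0)
X = rng.normal(size=(6, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out = self_attention(X, Wq, Wk, Wv)          # shape (6, 8)
&lt;/code&gt;&lt;/pre&gt;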

&lt;p&gt;So when AI seems to understand nuance, it's because it has seen millions of examples of humans navigating that nuance, and it has learned the statistical signature of what a good response looks like.&lt;/p&gt;

&lt;h2&gt;Why does training on data produce something that feels like knowledge?&lt;/h2&gt;

&lt;p&gt;This is the part that trips most people up. Training feels like studying, but it's closer to osmosis at industrial scale.&lt;/p&gt;

&lt;p&gt;During training, a model is shown an enormous corpus of text — books, web pages, code, scientific papers, Reddit threads, legal documents. For each piece of text, it's repeatedly asked to predict missing words. When it gets it wrong, the error is fed back through the network via a process called &lt;strong&gt;backpropagation&lt;/strong&gt;, nudging billions of numerical parameters (called weights) slightly in the direction of a better prediction. Do this enough times across enough data, and the model's internal representations begin to encode something that functions like conceptual knowledge.&lt;/p&gt;
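
&lt;p&gt;In outline, a single training step looks like the sketch below. It is schematic, written with PyTorch-style calls; real pipelines add batching, mixed precision, and massive parallelism:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;import torch.nn.functional as F

def training_step(model, optimizer, tokens):
    """One next-token step; 'model' maps token ids to vocabulary logits."""
    inputs, targets = tokens[:, :-1], tokens[:, 1:]   # predict each next token
    logits = model(inputs)                            # (batch, seq, vocab)
    loss = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                           targets.reshape(-1))
    optimizer.zero_grad()
    loss.backward()    # backpropagation: the error flows back through the net
    optimizer.step()   # nudge billions of weights toward a better prediction
    return loss.item()
&lt;/code&gt;&lt;/pre&gt;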

&lt;p&gt;GPT-4, for instance, has been reported to contain roughly 1.8 trillion parameters across its architecture — each one a small numerical dial that was adjusted during training. Nobody programmed any of those dials manually. They were shaped entirely by exposure to human language.&lt;/p&gt;

&lt;p&gt;The result is eerie. Ask a well-trained model about the French Revolution, and it will give you a historically coherent answer — not because anyone explained the French Revolution to it, but because patterns about the French Revolution are baked into its weights from thousands of overlapping sources. It has, in a functional sense, &lt;strong&gt;compressed human knowledge into a statistical shape&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;But compression isn't understanding. The model has no mental model of Paris, no concept of injustice, no grasp of what 'revolution' feels like from the inside. It knows the word's neighbourhood — what words typically surround it, what contexts it appears in — and nothing more.&lt;/p&gt;

&lt;h2&gt;How does AI handle reasoning and logic?&lt;/h2&gt;

&lt;p&gt;Here's where the illusion gets most convincing — and most fragile.&lt;/p&gt;

&lt;p&gt;Ask an AI to solve a maths problem, and it often gets it right. Ask it to reason through a multi-step logic puzzle, and it can follow the chain. This feels like genuine reasoning. Researchers at institutions including MIT and DeepMind have studied whether large language models perform something structurally similar to logical inference, or whether they're retrieving cached patterns that look like reasoning.&lt;/p&gt;

&lt;p&gt;The current evidence suggests it's mostly the latter — with a twist. Google researchers who developed and popularised a technique called &lt;strong&gt;chain-of-thought prompting&lt;/strong&gt; around 2022 found that if you ask a model to 'think step by step', its accuracy on reasoning tasks improves dramatically. Why? Because generating intermediate steps forces the model to produce text that looks like working-out, and working-out text in training data is reliably followed by correct answers. The model is, in effect, pattern-matching its way to the right answer by mimicking the structure of human reasoning.&lt;/p&gt;
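
&lt;p&gt;The technique is nothing more than a change to the prompt text. Both strings below are hypothetical illustrations, not any vendor's API:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Chain-of-thought is a prompting change, not a model change. Illustrative.
direct_prompt = (
    "Q: A shop has 23 apples, sells 9, then buys 12 more. How many now?\n"
    "A:"
)

cot_prompt = (
    "Q: A shop has 23 apples, sells 9, then buys 12 more. How many now?\n"
    "A: Let's think step by step."
)
# With the second prompt the model generates working-out text first, and in
# training data, working-out text is reliably followed by correct answers.
&lt;/code&gt;&lt;/pre&gt;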

&lt;p&gt;This works surprisingly well — until it doesn't. AI models are notoriously brittle on novel logical structures they haven't encountered in training. Change a classic logic puzzle by a single word, and performance can collapse. A human who genuinely understands logic adjusts. The model, lacking real comprehension, often fails.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AI excels at problems that resemble training data closely&lt;/li&gt;
&lt;li&gt;AI struggles with genuinely novel reasoning structures&lt;/li&gt;
&lt;li&gt;Chain-of-thought prompting improves performance by mimicking human working&lt;/li&gt;
&lt;li&gt;Failures are often confident and coherent-sounding — which makes them dangerous&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is why researchers distinguish between &lt;strong&gt;in-distribution&lt;/strong&gt; performance (things similar to training data) and &lt;strong&gt;out-of-distribution&lt;/strong&gt; performance (genuinely new problems). The gap between the two reveals exactly how much of AI's apparent reasoning is real.&lt;/p&gt;

&lt;h2&gt;Does AI ever actually learn in real time?&lt;/h2&gt;

&lt;p&gt;Most people assume AI gets smarter the more you talk to it. Within a single conversation, that's partly true. Across conversations, for most deployed systems, it's false.&lt;/p&gt;

&lt;p&gt;Standard large language models have a fixed &lt;strong&gt;context window&lt;/strong&gt; — a maximum amount of text they can hold in 'working memory' during a session. Recent GPT-4 variants handle roughly 128,000 tokens, which is substantial. But once the conversation ends, nothing is retained. The weights don't update. The model you talk to tomorrow is identical to the one you talked to today, no matter how much you taught it.&lt;/p&gt;
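
&lt;p&gt;The practical consequence is a hard token budget: history that doesn't fit is simply dropped. A minimal sketch, where &lt;code&gt;count_tokens&lt;/code&gt; stands in for a real tokenizer and the budget is illustrative:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Minimal context-window trimming sketch. Budget and tokenizer are stand-ins.
CONTEXT_BUDGET = 128_000

def trim_history(messages, count_tokens, budget=CONTEXT_BUDGET):
    kept, used = [], 0
    for message in reversed(messages):   # most recent messages first
        cost = count_tokens(message)
        if used + cost &gt; budget:
            break                        # older messages fall out of 'memory'
        kept.append(message)
        used += cost
    return list(reversed(kept))
&lt;/code&gt;&lt;/pre&gt;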

&lt;p&gt;Updating a model's weights — actual learning — requires a new round of training, which is extraordinarily expensive. Training GPT-3 was estimated to cost over $4 million in compute alone, according to independent analyses of the energy and cost footprints of large-model training. Retraining happens in major version releases, not in response to individual conversations.&lt;/p&gt;

&lt;p&gt;There are emerging exceptions. Techniques like &lt;strong&gt;retrieval-augmented generation (RAG)&lt;/strong&gt; allow models to query external databases in real time, giving them access to current information without retraining. Fine-tuning lets organisations adapt a base model to their specific domain on a smaller dataset. And a growing area of research called 'continual learning' is trying to solve the problem of catastrophic forgetting — the tendency of neural networks to lose old knowledge when learning new things.&lt;/p&gt;
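
&lt;p&gt;In outline, RAG wraps the frozen model in a retrieval step. In this schematic sketch, &lt;code&gt;search&lt;/code&gt; and &lt;code&gt;model&lt;/code&gt; are stand-ins for a real vector index and LLM call:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Schematic RAG loop: the weights never change; freshness comes from
# what gets retrieved and pasted into the prompt.
def answer_with_rag(question, search, model, k=3):
    documents = search(question, top_k=k)    # query an external index
    context = "\n\n".join(documents)
    prompt = (
        "Answer using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {question}\nAnswer:"
    )
    return model(prompt)                     # inference only, no retraining
&lt;/code&gt;&lt;/pre&gt;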

&lt;p&gt;But for now, the AI you're using is mostly a very sophisticated snapshot of human language as it existed up to a particular training cutoff date. It's not watching the world. It's remembering it.&lt;/p&gt;

&lt;h2&gt;Why does AI sound so confident when it's wrong?&lt;/h2&gt;

&lt;p&gt;This might be the most practically important thing to understand about AI — and most users never grasp it.&lt;/p&gt;

&lt;p&gt;Confidence in human speech is a signal. When someone speaks hesitatingly, we infer uncertainty. When they speak fluently and directly, we infer knowledge. AI has learned this signal perfectly. It has ingested millions of examples of authoritative text — textbooks, journalism, expert commentary — and it replicates the cadence and tone of authority with no mechanism to flag when that authority is hollow.&lt;/p&gt;

&lt;p&gt;Researchers call incorrect but confident AI outputs &lt;strong&gt;hallucinations&lt;/strong&gt;. The term is somewhat misleading — the model isn't 'seeing' things that aren't there. It's generating text that is statistically consistent with authoritative-sounding text, regardless of whether the underlying facts are real. It invents citations, dates, names, and statistics with the same fluency it uses for things that are true.&lt;/p&gt;

&lt;p&gt;In a widely reported 2023 incident in legal practice, ChatGPT fabricated case citations that looked entirely plausible — correct format, plausible court names, believable dates — but referred to cases that simply didn't exist. Lawyers who didn't verify the citations submitted them in real court filings.&lt;/p&gt;

&lt;p&gt;The model wasn't lying. It had no intent. It was doing exactly what it was designed to do: produce the most statistically likely next token. A fake citation that looks real is, from a pure pattern-matching perspective, more likely than an admission of ignorance — because training data contains far more confident expert text than humble uncertainty.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The lesson isn't that AI is useless&lt;/strong&gt; — it's that calibrating your trust requires understanding the mechanism. AI is a tool that mirrors human knowledge at scale, not a source of truth.&lt;/p&gt;

&lt;p&gt;The gap between how AI sounds and how AI works is one of the most consequential mismatches in modern technology. It produces outputs that feel like understanding, wisdom, even intuition — while running on something closer to a very deep statistical reflex. That's not a reason to dismiss it. Pattern-matching at planetary scale is genuinely, remarkably useful. But it changes the question you should ask every time you use it: not 'is this right?' — the model can't tell you — but 'does this match what I already know, and is it worth verifying?' That shift in posture turns AI from an oracle into what it actually is: an extraordinarily powerful tool.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/how-ai-actually-thinks-without-a-brain" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>howdoesaiwork</category>
      <category>howdoesaithink</category>
      <category>howaimakesdecisions</category>
      <category>artificialintelligen</category>
    </item>
    <item>
      <title>The AI Water Myth Actually Debunked</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Wed, 15 Apr 2026 08:02:45 +0000</pubDate>
      <link>https://dev.to/snackiq_app/the-ai-water-myth-actually-debunked-4go2</link>
      <guid>https://dev.to/snackiq_app/the-ai-water-myth-actually-debunked-4go2</guid>
      <description>&lt;p&gt;The &lt;a href="https://[snackiq](https://snackiq.app/glossary/snackiq).app/glossary/ai-water-consumption-myth-debunked" rel="noopener noreferrer"&gt;ai water consumption myth debunked&lt;/a&gt; starts here: AI does use water — but the viral claim that a single ChatGPT conversation "drinks" a bottle of water is almost certainly wrong, or at least wildly misleading. That figure comes from a 2023 paper by researchers at UC Riverside that estimated water consumption across entire data centre cooling systems, then divided by query volume in ways critics argue conflate very different processes. AI's water footprint is real, measurable, and worth scrutinising. But lifecycle analysis — the method scientists use to track resource use from production to disposal — is ferociously complicated, and most headlines skip the complications entirely. Understanding what's actually happening inside a data centre, and how that compares to other industries, changes the picture completely.&lt;/p&gt;

&lt;h2&gt;Where does the 'AI drinks a water bottle per query' claim actually come from?&lt;/h2&gt;

&lt;p&gt;The statistic that shocked millions originated from a preprint paper by researchers Pengfei Li, Jianyi Yang, and colleagues at UC Riverside, published in 2023. Their core argument was that training large language models and running inference queries requires enormous amounts of &lt;strong&gt;cooling water&lt;/strong&gt; in data centres — and when you add up the gallons evaporated to keep servers at safe temperatures, the per-query figure sounds alarming.&lt;/p&gt;

&lt;p&gt;The number itself — roughly 500ml per short conversation — was not fabricated. But it was calculated using a specific methodology that several engineers and environmental scientists have since challenged. The figure averaged water consumption across entire data centre campuses, including water used for cooling unrelated workloads like cloud storage, video streaming, and enterprise software. Attributing all of that to the AI query running at that moment is a bit like blaming one passenger for all the fuel a plane burns.&lt;/p&gt;
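
&lt;p&gt;A toy calculation shows how much work that attribution choice does. Every number below is invented for illustration:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Same facility data, two system boundaries, very different per-query figures.
# All numbers invented for illustration.
campus_water_l_per_day = 2_000_000
ai_share_of_workload = 0.15        # assume AI is 15% of the campus's compute
ai_queries_per_day = 50_000_000

blame_everything = campus_water_l_per_day / ai_queries_per_day
blame_ai_share = blame_everything * ai_share_of_workload

print(f"Whole campus attributed to AI: {blame_everything * 1000:.0f} ml/query")  # 40
print(f"Only AI's share attributed:    {blame_ai_share * 1000:.0f} ml/query")    # 6
# Identical data; the system boundary alone moves the answer ~7x.
&lt;/code&gt;&lt;/pre&gt;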

&lt;p&gt;Hank Green, a science communicator with millions of subscribers, made exactly this point in a widely watched 2025 video. His argument: lifecycle and resource-use analyses are genuinely hard, and journalists — including well-meaning ones — routinely misapply their conclusions. A number that sounds precise often has a confidence interval wide enough to drive a truck through.&lt;/p&gt;

&lt;p&gt;None of this means the researchers were wrong to raise the issue. &lt;strong&gt;Water stress near data centre hubs is a documented problem&lt;/strong&gt;, particularly in arid US states like Arizona and Nevada where major facilities cluster. The question is whether the per-query framing communicates a useful truth or a misleading one.&lt;/p&gt;

&lt;h2&gt;How do data centres actually use water?&lt;/h2&gt;

&lt;p&gt;Most large data centres use one of two cooling strategies, and understanding the difference matters enormously for any honest accounting.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Air-cooled systems&lt;/strong&gt; use chillers and fans to move heat away from servers. They consume electricity but relatively little water. &lt;strong&gt;Evaporative cooling towers&lt;/strong&gt; — the dominant approach in older and larger facilities — work like industrial-scale swamp coolers: water evaporates, carrying heat with it. This is where most of the reported water consumption comes from.&lt;/p&gt;

&lt;p&gt;The key variable is where that water goes. Evaporated water is lost to the atmosphere — it doesn't return to the local water table, which matters in drought-prone regions. But it also isn't contaminated or destroyed. Whether this counts as a serious environmental harm depends entirely on local hydrology, which varies by facility.&lt;/p&gt;

&lt;p&gt;Here's what the narrative often misses:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Many of the newest hyperscale data centres — including facilities built by Google and Microsoft in recent years — use &lt;strong&gt;closed-loop cooling systems&lt;/strong&gt; that dramatically reduce evaporative water loss.&lt;/li&gt;
&lt;li&gt;Microsoft has publicly committed to being water positive by 2030, meaning it aims to replenish more water than it consumes globally.&lt;/li&gt;
&lt;li&gt;Google reported in its 2023 environmental report that it reduced its water usage intensity — litres per megawatt-hour — compared to previous years, even as AI workloads grew.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The infrastructure is not static. The data centres being built today look very different from the ones the 2023 UC Riverside paper was modelling.&lt;/p&gt;

&lt;h2&gt;Is AI's water footprint actually large compared to other industries?&lt;/h2&gt;

&lt;p&gt;This is where the myth really starts to unravel — or at least get complicated in ways that should make anyone pause before sharing a viral infographic.&lt;/p&gt;

&lt;p&gt;Agriculture accounts for roughly &lt;strong&gt;70% of all global freshwater withdrawals&lt;/strong&gt;, according to the Food and Agriculture Organization of the United Nations. A single kilogram of beef requires approximately 15,000 litres of water to produce, when you account for feed crops, drinking water, and processing. A cotton T-shirt takes around 2,700 litres. These aren't contested estimates — they come from widely replicated lifecycle analyses conducted over decades.&lt;/p&gt;

&lt;p&gt;The entire global data centre sector, including everything from Netflix servers to banking infrastructure to AI, accounts for less than 1% of global water withdrawals by most credible estimates. That's not nothing — location and timing matter, and a data centre drawing heavily from a stressed aquifer in Phoenix causes more harm than the same facility in water-rich Norway. But the proportional scale matters when we're allocating public attention and policy energy.&lt;/p&gt;

&lt;p&gt;Researchers who study &lt;strong&gt;technology's environmental footprint&lt;/strong&gt; generally argue that electricity sourcing is a far more important variable than water. A data centre running on coal-heavy grid power causes significantly more environmental damage than one running on hydroelectric or wind power — and that trade-off barely registers in the water-centric discourse.&lt;/p&gt;

&lt;p&gt;The framing of AI as an environmental villain also tends to ignore the counterfactual. AI tools used in climate modelling, grid optimisation, and materials science research may offset significant resource consumption elsewhere. That's genuinely hard to quantify, but so is the water figure — and critics rarely apply the same scepticism to both.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why do the headlines get this so consistently wrong?
&lt;/h2&gt;

&lt;p&gt;Lifecycle analysis is, in the words of engineers who use it professionally, &lt;strong&gt;inherently political&lt;/strong&gt;. Not in a partisan sense — in the sense that every methodological choice embeds assumptions, and those assumptions determine the answer you get before you've done any arithmetic.&lt;/p&gt;

&lt;p&gt;Do you count the water used to manufacture the server chips? The water embedded in the concrete of the data centre building? The water footprint of the electricity generation? Each decision can shift your final number by an order of magnitude. The UC Riverside researchers made reasonable choices for their purposes. But those choices aren't the only defensible ones.&lt;/p&gt;

&lt;p&gt;News cycles reward alarming, concrete-sounding numbers. "AI uses a water bottle per query" fits in a tweet and triggers a clear emotional response. "AI's water consumption varies between 50ml and 2,000ml per query depending on facility type, location, cooling technology, grid mix, and how you define the system boundary" does not get shared.&lt;/p&gt;
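
&lt;p&gt;That sensitivity is easy to demonstrate with a toy model. The Python sketch below is illustrative only: the per-query figures for each cooling type and the grid-side water costs are assumed values, chosen to show how facility and boundary choices alone can swing the answer across the 50ml-to-2,000ml range.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Toy sensitivity analysis: how "water per AI query" moves with
# facility type and system boundary. All inputs are illustrative
# assumptions, not measured values.
SCENARIOS = {
    # name: (on-site litres/query, off-site litres/query via grid)
    "closed-loop cooling, clean grid": (0.05, 0.02),
    "evaporative cooling, clean grid": (0.40, 0.02),
    "evaporative cooling, thermal-heavy grid": (0.40, 1.60),
}

for name, (onsite, offsite) in SCENARIOS.items():
    narrow = onsite * 1000            # boundary: direct use only
    wide = (onsite + offsite) * 1000  # boundary: grid water included
    print(f"{name}: {narrow:.0f} ml direct, {wide:.0f} ml all-in")
&lt;/code&gt;&lt;/pre&gt;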

&lt;p&gt;There's also a deeper asymmetry: &lt;strong&gt;tech companies are easy targets&lt;/strong&gt; for environmental criticism, and often deserve it. But the specific criticism has to be accurate to be useful. Misdirected outrage can crowd out pressure on the actual levers — like pushing for renewables-backed data centres, investing in liquid cooling research, or zoning AI infrastructure away from water-stressed regions.&lt;/p&gt;

&lt;p&gt;Science communicator Hank Green put it bluntly in his 2025 analysis: the story of AI and water is interesting and worth telling, but the way it's currently being told teaches people to distrust nuance rather than demand it.&lt;/p&gt;

&lt;h2&gt;
  
  
  What does the evidence actually say we should worry about?
&lt;/h2&gt;

&lt;p&gt;The honest answer is: specific harms in specific places, not a global civilisational water crisis driven by chatbots.&lt;/p&gt;

&lt;p&gt;The &lt;strong&gt;most credible concern&lt;/strong&gt; is geographic concentration. When a single hyperscale campus draws millions of gallons per day from a local aquifer that's already under pressure — as has happened in parts of Arizona and the American Southwest — that's a real, documentable harm to local communities and ecosystems. Residents in Mesa, Arizona raised legitimate concerns when data centres expanded rapidly in their region during the early 2020s, coinciding with Lake Mead hitting historic lows.&lt;/p&gt;

&lt;p&gt;This is a genuine problem. It is also a solvable one through zoning policy, water pricing reform, and the accelerating shift to advanced cooling technologies — not through individual users feeling guilty about asking an AI to draft an email.&lt;/p&gt;

&lt;p&gt;Research consistently suggests that the three highest-impact levers for reducing AI's environmental footprint are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Decarbonising the electricity grid that powers data centres&lt;/li&gt;
&lt;li&gt;Deploying next-generation cooling technologies, including direct liquid cooling of chips&lt;/li&gt;
&lt;li&gt;Improving model efficiency — smaller, faster models that do the same job with fewer computations&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All three are actively being pursued by researchers and, to varying degrees, by the industry itself. &lt;strong&gt;Model efficiency&lt;/strong&gt; in particular has improved dramatically over the last five years: newer AI architectures achieve comparable performance to earlier models at a fraction of the compute cost, which reduces both energy and water consumption proportionally.&lt;/p&gt;

&lt;p&gt;The story of AI and water is real. It's just not the story most people have heard.&lt;/p&gt;

&lt;p&gt;Debunking the AI water consumption myth doesn't mean AI has no environmental footprint — it absolutely does. But a number stripped of its methodology, its geographic context, and its comparison class isn't information. It's a vibe. The real issues are local water stress near specific facilities, electricity sourcing, and model efficiency. Those are fixable problems with measurable solutions. Panic about a bottle of water per ChatGPT message is not. Demanding precision from the people measuring AI's impact is the same skill as demanding precision from AI itself — and it matters just as much.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/the-ai-water-myth-actually-debunked" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aiwaterconsumptionmy</category>
      <category>howdoesaiusewater</category>
      <category>doesaireallyusethatm</category>
      <category>aienvironmentalimpac</category>
    </item>
    <item>
      <title>Why AI Secretly Drinks More Water Than You</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Mon, 13 Apr 2026 14:03:04 +0000</pubDate>
      <link>https://dev.to/snackiq_app/why-ai-secretly-drinks-more-water-than-you-2070</link>
      <guid>https://dev.to/snackiq_app/why-ai-secretly-drinks-more-water-than-you-2070</guid>
      <description>&lt;p&gt;&lt;a href="https://[snackiq](https://snackiq.app/glossary/snackiq).app/glossary/ai-water-consumption-environmental-impact" rel="noopener noreferrer"&gt;AI water consumption environmental impact&lt;/a&gt; is far larger than most people realise — and it's not a minor footnote. Research published by University of California Riverside and UT Arlington estimated that a conversation of roughly 20 to 50 questions with ChatGPT consumes around 500 millilitres of water — roughly one standard drinking bottle. That's not a typo. Every time you ask an AI to draft an email, debug code, or summarise a document, a data centre somewhere is gulping freshwater to keep its servers from overheating. The Environmental and Energy Study Institute has reported that large data centres can consume up to 5 million gallons of water per day — equivalent to the daily water needs of a town of 10,000 to 50,000 people. AI is making this problem dramatically worse.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where Does All That Water Actually Go?
&lt;/h2&gt;

&lt;p&gt;The answer isn't mysterious — it's physics. Computers generate heat. Enormous amounts of it. And heat, if left unchecked in a building packed with thousands of servers running 24 hours a day, will cause those servers to fail within minutes.&lt;/p&gt;

&lt;p&gt;The dominant solution, used by the majority of the world's data centres, is &lt;strong&gt;evaporative cooling&lt;/strong&gt;. This works exactly like sweating does on a human body. Water is circulated through cooling towers, where it absorbs heat from the facility and then evaporates into the atmosphere. That evaporation dissipates the heat — but the water is gone. It doesn't go back into a reservoir or a treatment plant. It vanishes into the air.&lt;/p&gt;

&lt;p&gt;A useful metric here is something engineers call &lt;strong&gt;Water Usage Effectiveness (WUE)&lt;/strong&gt; — the litres of water consumed per kilowatt-hour of energy used. The lower the number, the more efficient the facility. Microsoft reported a WUE of around 0.49 litres per kWh for its global operations in recent years. Sounds small. Multiply that by the billions of kWh consumed annually across global AI infrastructure, and the number becomes staggering.&lt;/p&gt;
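
&lt;p&gt;A minimal sketch of that multiplication, using the reported ~0.49 L/kWh and an assumed annual energy figure (published estimates of AI's total electricity use vary widely, so treat that input as a placeholder):&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Scaling a per-kWh water figure to fleet level. The WUE is the
# reported figure; the annual energy number is an assumed
# placeholder, since published estimates vary widely.
WUE_L_PER_KWH = 0.49
ASSUMED_ANNUAL_KWH = 100e9   # assumption: 100 billion kWh of AI compute

litres = WUE_L_PER_KWH * ASSUMED_ANNUAL_KWH
pools = litres / 2.5e6       # an Olympic pool holds ~2.5 million litres
print(f"{litres:.2e} litres per year, about {pools:,.0f} Olympic pools")
&lt;/code&gt;&lt;/pre&gt;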

&lt;p&gt;There's also a less-discussed layer: &lt;strong&gt;indirect water consumption&lt;/strong&gt;. Generating electricity — from gas-fired power plants, coal stations, even nuclear reactors — requires enormous amounts of water for steam generation and cooling. So every unit of electricity a data centre pulls from the grid carries a hidden water cost before it even reaches the server rack. Researchers often call this the indirect, or off-site, water footprint, as distinct from direct on-site consumption. When you account for both, AI's real water footprint roughly doubles.&lt;/p&gt;
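
&lt;p&gt;A rough way to see how the indirect layer changes the total. The grid-side water intensities below are assumed figures for illustration; the point is that on a thermal-heavy grid the combined number lands at roughly double the direct one, matching the claim above.&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Direct (on-site cooling) vs indirect (power-plant) water per kWh.
# Grid-side intensities are assumed figures; real values vary
# widely with the local fuel mix.
ONSITE_WUE = 0.49        # L/kWh, reported on-site figure
GRID_WATER = {           # assumed off-site L/kWh by grid type
    "renewables-heavy grid": 0.05,
    "thermal-heavy grid": 0.55,
}

for grid, offsite in GRID_WATER.items():
    total = ONSITE_WUE + offsite
    print(f"{grid}: {total:.2f} L/kWh, "
          f"{total / ONSITE_WUE:.1f}x the direct figure alone")
&lt;/code&gt;&lt;/pre&gt;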

&lt;h2&gt;
  
  
  Training an AI Model Is a Water Emergency in Slow Motion
&lt;/h2&gt;

&lt;p&gt;Most people interact with AI at the inference stage — typing a prompt, getting a response. That's the 500ml-per-conversation figure. But the water consumed during &lt;strong&gt;model training&lt;/strong&gt; dwarfs inference costs by a factor that's difficult to wrap your head around.&lt;/p&gt;

&lt;p&gt;Training a large language model means running trillions of calculations across thousands of specialised chips — GPUs and TPUs — for weeks or months at a time. These chips run hot. Continuously. Researchers at the University of Massachusetts Amherst found that training a single large AI model can emit as much carbon dioxide as five cars over their entire lifetimes. The water footprint follows a similar logic: training runs are concentrated, intense, and relentless in their demand for cooling.&lt;/p&gt;

&lt;p&gt;GPT-3, which OpenAI released in 2020, is believed to have been trained in data centres in Iowa. Research has suggested that training that model alone consumed hundreds of thousands of litres of water. GPT-4 is considerably larger, and while OpenAI has not released detailed training figures, independent estimates suggest the water consumption scaled proportionally.&lt;/p&gt;

&lt;p&gt;The key distinction is this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Training&lt;/strong&gt; is a one-time event per model version, but it's extraordinarily intensive — weeks of maximum-load computation.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fine-tuning&lt;/strong&gt; (adapting a base model for specific tasks) is repeated constantly across the industry.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Inference&lt;/strong&gt; (answering your actual queries) happens billions of times per day globally.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Inference looks cheap per query. But aggregate it across ChatGPT's hundreds of millions of users, Google Gemini, Microsoft Copilot, Meta's AI tools, and dozens of enterprise platforms — and over any given month, the cumulative inference water bill can rival the cost of a full training run.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Location Determines Everything About AI's Water Impact
&lt;/h2&gt;

&lt;p&gt;A data centre in Iceland and a data centre in Arizona are both running the same AI models. Their water footprints are not remotely comparable.&lt;/p&gt;

&lt;p&gt;Iceland's data centres can use geothermal and hydroelectric energy — low-carbon, low water-intensity power sources — and cool facilities using the naturally cold ambient air. In some cases, they barely need active water cooling at all. &lt;strong&gt;Arizona data centres&lt;/strong&gt;, by contrast, operate in one of the most water-stressed regions on the planet, drawing on aquifers and river systems already under severe pressure from agriculture, population growth, and climate-driven drought.&lt;/p&gt;

&lt;p&gt;The Colorado River basin — which supplies water to much of the American Southwest — has been in crisis for years, with Lake Mead and Lake Powell dropping to historically low levels. And yet, data centre construction in states like Arizona, Nevada, and Texas has accelerated sharply through the early 2020s, precisely because land is cheap and permitting has historically been lenient.&lt;/p&gt;

&lt;p&gt;The Lincoln Institute of Land Policy reported in 2025 that communities near major data centre clusters are increasingly raising concerns about local water depletion, with some municipalities reporting that a single large facility can consume more water than all local residents combined.&lt;/p&gt;

&lt;p&gt;MIT researchers and environmental policy analysts have noted that the opacity of tech companies' water reporting makes independent oversight nearly impossible. Most companies disclose only direct water use, not the far larger indirect footprint from their electricity supply chains. When regulators and communities can't see the full picture, &lt;strong&gt;local water stress goes unaccounted for&lt;/strong&gt; until the damage is done.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Myth That Efficiency Gains Will Save Us
&lt;/h2&gt;

&lt;p&gt;Here's the convenient story the tech industry likes to tell: yes, AI uses a lot of water today, but engineering is improving, cooling technologies are getting better, and efficiency gains will solve the problem. It sounds reassuring. The data doesn't support it.&lt;/p&gt;

&lt;p&gt;This dynamic has a name: &lt;strong&gt;Jevons paradox&lt;/strong&gt;, named after 19th-century economist William Stanley Jevons, who observed that improvements in coal-engine efficiency during Britain's Industrial Revolution didn't reduce coal consumption — they increased it, because efficiency made the technology cheaper and more widely adopted. The same logic applies to AI. Every efficiency gain that makes large language models cheaper to run also makes them more commercially viable to deploy at greater scale.&lt;/p&gt;

&lt;p&gt;The numbers bear this out. Data centre water efficiency — measured by WUE — has genuinely improved over the past decade. Hyperscale facilities built today are meaningfully better than those built in 2010. And yet total global data centre water consumption has risen year after year, because the volume of AI computation is growing faster than any efficiency improvement can offset.&lt;/p&gt;
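
&lt;p&gt;The arithmetic of the paradox is worth seeing explicitly. In the sketch below, both annual rates are invented round numbers, chosen only to show per-unit intensity falling while the total still climbs:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Jevons paradox in one loop: per-unit water intensity falls every
# year, but workload grows faster, so total consumption still rises.
# Both annual rates are invented round numbers for illustration.
intensity = 1.0   # litres per unit of AI work (normalised)
workload = 1.0    # units of AI work (normalised)

for year in range(1, 6):
    intensity *= 0.85   # assumed 15% annual efficiency gain
    workload *= 1.40    # assumed 40% annual workload growth
    total = intensity * workload
    print(f"year {year}: intensity {intensity:.2f}, "
          f"total water {total:.2f}x baseline")
&lt;/code&gt;&lt;/pre&gt;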

&lt;p&gt;Novel cooling technologies do offer real promise. &lt;strong&gt;Direct-to-chip liquid cooling&lt;/strong&gt; circulates chilled liquid directly to the processor, dramatically reducing the need for room-level air conditioning and evaporative cooling towers. &lt;strong&gt;Immersion cooling&lt;/strong&gt; — submerging servers in non-conductive dielectric fluid — is even more efficient, potentially eliminating most evaporative water use entirely. The Environmental and Energy Study Institute has highlighted both approaches as meaningful paths forward.&lt;/p&gt;

&lt;p&gt;But retrofitting existing facilities is expensive and slow. Most data centres being built today still rely on conventional evaporative cooling, meaning the infrastructure being locked in now will draw on local water supplies for 20 to 30 years.&lt;/p&gt;

&lt;h2&gt;
  
  
  What This Actually Means For the Water You Drink
&lt;/h2&gt;

&lt;p&gt;This isn't an abstract planetary problem. It plays out at the level of specific rivers, aquifers, and municipal water supplies — and the communities that depend on them.&lt;/p&gt;

&lt;p&gt;Only about 3% of Earth's water is freshwater, and only 0.5% of all water on Earth is both accessible and safe for human consumption, according to the Environmental and Energy Study Institute. That fraction is already under pressure from agriculture, population growth, and climate change. Data centres now compete directly with residential water systems for access to that narrow resource.&lt;/p&gt;

&lt;p&gt;In Prince William County, Virginia — just south of the dense data centre corridor around Ashburn sometimes called "Data Centre Alley" — residents and local officials have raised concerns about the pace of development outstripping water infrastructure. Similar friction has emerged in communities across the Netherlands, Chile, and Uruguay, where major cloud providers have drawn criticism for water usage during drought periods.&lt;/p&gt;

&lt;p&gt;For the individual reader, the practical implication is uncomfortable: &lt;strong&gt;every AI tool you use has a physical resource cost that doesn't show up on any bill you receive&lt;/strong&gt;. The water cost is externalised onto local communities, many of which have no say in the data centre siting decisions that affect their supplies.&lt;/p&gt;

&lt;p&gt;Asking AI systems to perform lower-intensity tasks — using simpler search functions rather than generative AI where appropriate, batching queries rather than running continuous sessions — does reduce demand at the margin. But the bigger lever is policy: water reporting requirements, environmental impact assessments for new facilities, and incentives to locate AI infrastructure in regions with renewable cooling capacity.&lt;/p&gt;

&lt;p&gt;The technology isn't going away. The question is whether the communities that bear its water costs will have any voice in how that cost is managed.&lt;/p&gt;

&lt;p&gt;AI's water footprint is invisible by design. It happens in server rooms in desert states, in cooling towers that evaporate freshwater into the sky, in power plants that consume water before the electricity even reaches a data centre. The efficiency story the industry tells is real but incomplete — Jevons paradox means better technology and higher consumption can, and often do, happen simultaneously. The more useful frame isn't 'is AI sustainable?' but 'who pays the water bill?' Right now, the answer is mostly communities that never voted for a data centre in their backyard.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/why-ai-secretly-drinks-more-water-than-you" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>aiwaterconsumptionen</category>
      <category>howdoesaiusewater</category>
      <category>howdoesaiwastewater</category>
      <category>howdoesaiaffecttheen</category>
    </item>
    <item>
      <title>Why Does Science Use Latin? History Explained</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Thu, 09 Apr 2026 14:02:43 +0000</pubDate>
      <link>https://dev.to/snackiq_app/why-does-science-use-latin-history-explained-18f3</link>
      <guid>https://dev.to/snackiq_app/why-does-science-use-latin-history-explained-18f3</guid>
      <description>&lt;p&gt;Science uses Latin because it was the universal language of educated Europeans for over a thousand years — and the naming systems built during that era were too useful to abandon. When scholars in 16th-century Europe wanted to share discoveries across dozens of languages and borders, Latin was the one tongue every university-trained mind could read. Today, more than 250,000 species carry Latin or Latinised scientific names. Medical students still memorise Latin roots. Anatomists use terms coined by Renaissance physicians. None of this happened by accident. It is the accumulated weight of institutional momentum — centuries of scientists building on each other's work in a shared language, creating a system so deeply embedded that replacing it would cost more than keeping it.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why did scholars choose Latin in the first place?
&lt;/h2&gt;

&lt;p&gt;Latin's grip on science did not begin with biology or chemistry. It started with the fall of the Western Roman Empire in 476 CE and the survival of one institution that kept using Rome's language: the Catholic Church.&lt;/p&gt;

&lt;p&gt;Monasteries across Europe became the custodians of written knowledge through the early Middle Ages. Monks copied manuscripts, ran schools, and corresponded with each other — all in Latin. When universities began appearing in the 11th and 12th centuries, starting with Bologna in 1088 and Oxford around 1096, they were deeply entwined with Church culture. Latin was simply the medium of learning. A student in Paris could read a text written in Cologne without translation. A physician in Lisbon could follow an anatomical treatise published in Padua. That frictionless exchange was enormously valuable.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Latin was nobody's native language&lt;/strong&gt;, which paradoxically made it perfect for international scholarship. It belonged to no single nation, carried no political allegiance, and had a fixed written form that did not shift with spoken dialects. Scholars in the 13th century were reading the same Latin grammar as scholars five centuries earlier.&lt;/p&gt;

&lt;p&gt;The language also carried the weight of classical authority. Works by Aristotle, Galen, and Pliny the Elder — the intellectual giants of ancient science — existed in Latin translation. To engage seriously with natural philosophy meant engaging with Latin. The choice was less a decision than a default, one that accumulated force with every generation.&lt;/p&gt;

&lt;h2&gt;
  
  
  How did Linnaeus turn Latin into the global standard for species names?
&lt;/h2&gt;

&lt;p&gt;By the 1700s, naming living things had become a chaotic mess. Naturalists used long descriptive phrases in Latin, sometimes running to a dozen words, to distinguish one plant or animal from another. A single species might carry a different name in every country, and even among Latin-using scholars, the descriptions varied wildly.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Carl Linnaeus&lt;/strong&gt;, a Swedish botanist born in 1707, fixed this with elegant simplicity. In his 1753 work &lt;em&gt;Species Plantarum&lt;/em&gt; and the 1758 tenth edition of &lt;em&gt;Systema Naturae&lt;/em&gt;, he introduced what we now call &lt;strong&gt;binomial nomenclature&lt;/strong&gt; — a two-word naming system in which every organism receives a genus name and a species name, both in Latin or Latinised form. Humans became &lt;em&gt;Homo sapiens&lt;/em&gt;. The domestic cat became &lt;em&gt;Felis catus&lt;/em&gt;. The common daisy became &lt;em&gt;Bellis perennis&lt;/em&gt;.&lt;/p&gt;

&lt;p&gt;The genius was the compression. Two words replaced paragraphs. A naturalist anywhere on Earth could pick up Linnaeus's catalogue and know exactly which organism was being discussed, regardless of what local people called it. The house sparrow might be &lt;em&gt;moineau domestique&lt;/em&gt; in French, &lt;em&gt;Haussperling&lt;/em&gt; in German, or &lt;em&gt;gorrión común&lt;/em&gt; in Spanish — but it is &lt;em&gt;Passer domesticus&lt;/em&gt; everywhere science is practised.&lt;/p&gt;

&lt;p&gt;Linnaeus did not invent the use of Latin in science. He systematised it. His framework was so effective that the International Code of Nomenclature for algae, fungi, and plants — the official rulebook governing species names today — still requires names to be in Latin or treated as Latin. More than 270 years after &lt;em&gt;Species Plantarum&lt;/em&gt;, his system remains intact.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;em&gt;Homo sapiens&lt;/em&gt; — named by Linnaeus in 1758&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Felis catus&lt;/em&gt; — domestic cat, Linnaeus 1758&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Tyrannosaurus rex&lt;/em&gt; — named in 1905, following Linnaean rules&lt;/li&gt;
&lt;li&gt;
&lt;em&gt;Bellis perennis&lt;/em&gt; — common daisy, Linnaeus 1753&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Why do medicine and anatomy still use Latin today?
&lt;/h2&gt;

&lt;p&gt;Walk into any anatomy lecture and you will hear words that would not sound strange in a 16th-century dissection theatre. &lt;em&gt;Femur. Patella. Cerebellum. Anterior. Posterior.&lt;/em&gt; Medical Latin is not nostalgia — it is infrastructure.&lt;/p&gt;

&lt;p&gt;The anatomical tradition was formalised during the Renaissance, largely at the University of Padua, where Andreas Vesalius published &lt;em&gt;De humani corporis fabrica&lt;/em&gt; in 1543. This illustrated masterpiece, organised into seven books, named virtually every structure in the human body in Latin. Vesalius was working in a tradition that ran through Galen, the 2nd-century Greek physician whose works had been translated into Latin and dominated European medicine for over a thousand years.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Medical terminology built from Latin and Greek roots&lt;/strong&gt; has a structural advantage beyond history: it is modular. Once you know that &lt;em&gt;cardio&lt;/em&gt; means heart, &lt;em&gt;vascular&lt;/em&gt; means relating to vessels, and &lt;em&gt;-itis&lt;/em&gt; means inflammation, you can decode &lt;em&gt;cardiovascular&lt;/em&gt;, &lt;em&gt;carditis&lt;/em&gt;, &lt;em&gt;pericarditis&lt;/em&gt;, and dozens of other terms without being told what each one means. Studies of medical education suggest that students who systematically learn Latin and Greek roots learn new clinical terminology significantly faster than those who treat each term as an isolated word to memorise.&lt;/p&gt;
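
&lt;p&gt;Developers will recognise this as compositional decoding, and it is simple enough to sketch in Python. The glossary below is a tiny illustrative sample, nothing like a real clinical lexicon:&lt;/p&gt;

&lt;pre&gt;&lt;code&gt;# Toy decoder for medical terms built from Latin and Greek roots.
# The glossary is a tiny illustrative sample, not a real lexicon.
ROOTS = {
    "peri": "around", "card": "heart", "vascul": "vessel",
    "itis": "inflammation", "ectomy": "surgical removal",
    "plasty": "surgical reconstruction",
}

def decode(term):
    """Greedily match the longest known root at each position."""
    glosses = []
    rest = term.lower()
    while rest:
        for root in sorted(ROOTS, key=len, reverse=True):
            if rest.startswith(root):
                glosses.append(ROOTS[root])
                rest = rest[len(root):]
                break
        else:
            rest = rest[1:]  # skip linking vowels and unknown letters
    return " + ".join(glosses)

print(decode("pericarditis"))    # around + heart + inflammation
print(decode("cardiovascular"))  # heart + vessel
&lt;/code&gt;&lt;/pre&gt;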

&lt;p&gt;The same logic applies to pharmacology. Drug names are often built from Latin and Greek components that describe mechanism or target. Understanding the roots is not just historical trivia — it is a practical decoding tool. A physician reading a foreign colleague's notes, or a researcher parsing a clinical trial from another country, benefits from a shared terminological backbone that has been accumulating precision for five centuries.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why don't we just switch to English?
&lt;/h2&gt;

&lt;p&gt;This question gets asked regularly, and the answer is more interesting than a simple defence of tradition.&lt;/p&gt;

&lt;p&gt;First, there is the stability problem. &lt;strong&gt;English changes constantly.&lt;/strong&gt; Words shift meaning, fall out of use, or acquire new connotations within a generation. Latin, as a dead language, is frozen. &lt;em&gt;Quercus robur&lt;/em&gt; meant English oak in 1753 and it means English oak today. If species were named in living languages, names would require constant updating as those languages evolved — or worse, the same name would mean different things in different eras.&lt;/p&gt;

&lt;p&gt;Second, there is the neutrality problem. English is the native language of roughly 400 million people and the second language of perhaps a billion more — but that still leaves billions of people, including countless scientists, students, and researchers, for whom it is neither. Choosing English as the universal scientific language would effectively privilege English-speaking researchers and create barriers for everyone else. Latin, belonging to no modern nation, carries none of that political weight. No country can claim Latin as its own.&lt;/p&gt;

&lt;p&gt;Third, there is the sheer size of the existing catalogue. Researchers at Dalhousie University have estimated that roughly 8.7 million species exist on Earth, of which scientists have so far formally described and named only around 1.2 million. Every new species gets a Latin name, slotting into a system that already contains over a million cross-referenced entries. Switching languages would require renaming everything — an undertaking so enormous it would introduce confusion on a catastrophic scale.&lt;/p&gt;

&lt;p&gt;That said, science is not monolithic. Physics and chemistry largely abandoned Latin-based naming for symbols and numbers. Genetic nomenclature uses letter-number codes. Different fields have found different solutions, but biology and medicine — the disciplines with the most entities to name — have held closest to Latin.&lt;/p&gt;

&lt;h2&gt;
  
  
  Are there any problems with keeping Latin in science?
&lt;/h2&gt;

&lt;p&gt;The system is not without critics, and some of those criticisms are worth taking seriously.&lt;/p&gt;

&lt;p&gt;One persistent concern is accessibility. Latin names can feel like a wall between the public and scientific knowledge. When a journalist reports on a newly discovered deep-sea creature, the Latin binomial often gets buried or omitted entirely because editors know general readers find it alienating. This creates a gap: the official name that scientists use and the colloquial name the public knows may refer to different things, causing confusion in conservation, policy, and journalism.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Common names are notoriously unreliable.&lt;/strong&gt; The creature called a "sea louse" in Ireland, a "slater" in Australia, and a "pill bug" in the United States is the same woodlouse — &lt;em&gt;Armadillidium vulgare&lt;/em&gt; — and those are just the English variants. In a world with 7,000 spoken languages, the problem multiplies exponentially. Here Latin's neutrality becomes a genuine practical advantage, not just a historical artefact.&lt;/p&gt;

&lt;p&gt;There is also an ongoing debate about colonial legacy. Many species were named by European naturalists during the 18th and 19th centuries, often after European patrons or explorers while ignoring indigenous names that local communities had used for generations. Some taxonomists now argue that newly named species should incorporate indigenous language names or acknowledge traditional ecological knowledge in their formal descriptions. The International Code of Nomenclature allows some flexibility here — names need to be &lt;em&gt;treated as&lt;/em&gt; Latin, but they do not need to be etymologically Latin. A species can be named after a person, a place, or a word from any language, as long as it follows Latin grammatical rules in its written form.&lt;/p&gt;

&lt;p&gt;The debate is live and unresolved. But the consensus, at least for now, is that the costs of abandoning the system far outweigh the costs of improving it from within.&lt;/p&gt;

&lt;h2&gt;
  
  
  What does Latin in science actually sound like today?
&lt;/h2&gt;

&lt;p&gt;Most working scientists encounter Latin not as a language they read fluently but as a set of conventions they absorb over time. A biologist learns the rules of binomial nomenclature without necessarily studying Latin grammar. A medical student learns that &lt;em&gt;-ectomy&lt;/em&gt; means removal and &lt;em&gt;-plasty&lt;/em&gt; means reconstruction without memorising Cicero.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;The living practice of scientific Latin&lt;/strong&gt; is really a shared vocabulary of roots, suffixes, and naming conventions rather than a spoken or written language in the classical sense. When palaeontologists name a new dinosaur, they follow Linnaean rules: a genus name, a species epithet, both italicised. When an anatomist describes the &lt;em&gt;musculus orbicularis oculi&lt;/em&gt; — the circular muscle around the eye — they are using Latin grammar, but they are not speaking Latin. They are using a technical register that happens to be Latin-derived.&lt;/p&gt;

&lt;p&gt;Some fields have pushed back creatively. Since the 1980s, taxonomists have named species after celebrities, fictional characters, and internet memes — but always in Latinised form. There is a wasp named &lt;em&gt;Ampulex dementor&lt;/em&gt;, after Harry Potter's Dementors. A spider named &lt;em&gt;Aptostichus stephencolberti&lt;/em&gt; after the American comedian. A slime-mould beetle named &lt;em&gt;Agathidium bushi&lt;/em&gt; after George W. Bush. The names follow the rules. The spirit is anything but ancient.&lt;/p&gt;

&lt;p&gt;This is perhaps the most honest picture of why science still uses Latin. It is not reverence for Rome. It is the accumulated weight of a system that works — stable, neutral, universal, and just flexible enough to absorb the present without losing the past.&lt;/p&gt;

&lt;p&gt;Latin stuck in science not because anyone decided to keep it, but because no one could afford to throw it away. The naming systems built in the 16th, 17th, and 18th centuries were too interlocked, too widely adopted, too practically useful to dismantle. Dead languages, it turns out, make excellent foundations — they do not shift beneath you. Every time you read &lt;em&gt;Homo sapiens&lt;/em&gt; or &lt;em&gt;penicillin&lt;/em&gt; or &lt;em&gt;anterior cruciate ligament&lt;/em&gt;, you are reading a message written in a tongue no one speaks anymore, and understanding it perfectly.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/why-does-science-use-latin-history-explained" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>whydoesscienceuselat</category>
      <category>whydoscientistsusela</category>
      <category>latininsciencehistor</category>
      <category>scientificlatinexpla</category>
    </item>
    <item>
      <title>Why Does Science Change Over Time?</title>
      <dc:creator>SnackIQ</dc:creator>
      <pubDate>Thu, 09 Apr 2026 09:03:22 +0000</pubDate>
      <link>https://dev.to/snackiq_app/why-does-science-change-over-time-3n3p</link>
      <guid>https://dev.to/snackiq_app/why-does-science-change-over-time-3n3p</guid>
      <description>&lt;p&gt;Science changes over time because that's exactly what it's designed to do. Unlike dogma, science is a self-correcting process — built to revise, update, and occasionally overturn itself when better evidence appears. The University of California, Berkeley's Understanding Science project puts it plainly: accepted theories can be modified or discarded as new evidence and perspective emerges. That isn't a weakness. It's the mechanism. Every time a scientific idea shifts, it means the system worked. From Newtonian physics holding firm for two centuries before Einstein rewrote the rulebook, to entire fields of nutrition science reversing course on dietary fat, the story of science is a story of controlled, evidence-driven change. Understanding why it happens makes you a smarter reader of headlines — and a lot harder to fool.&lt;/p&gt;

&lt;h2&gt;
  
  
  What does it actually mean for science to 'change'?
&lt;/h2&gt;

&lt;p&gt;Most people imagine scientific change as scientists simply being wrong, then being right. The reality is messier — and more interesting.&lt;/p&gt;

&lt;p&gt;Scientific change operates at several levels simultaneously. Individual &lt;strong&gt;facts&lt;/strong&gt; get refined as measurement tools improve. A figure like the age of the universe has shifted repeatedly — from early 20th-century estimates of a few billion years to today's widely accepted figure of around 13.8 billion years, as telescopes and theoretical models improved together. That's not instability. That's precision getting sharper.&lt;/p&gt;

&lt;p&gt;Then there are &lt;strong&gt;theories&lt;/strong&gt; — the big explanatory frameworks. These change more slowly and more dramatically when they do. The Internet Encyclopedia of Philosophy describes this as the core question of the philosophy of science: how gradual or rapid is scientific change, and how radical is it when it comes? The answer varies by field. Geology inches forward. Particle physics lurches.&lt;/p&gt;

&lt;p&gt;Finally, there are paradigm shifts — the wholesale replacement of one explanatory framework with another. The philosopher Thomas Kuhn introduced this concept in his 1962 book &lt;em&gt;The Structure of Scientific Revolutions&lt;/em&gt;, one of the most cited academic books ever written. Kuhn argued that science doesn't progress smoothly. It accumulates anomalies until the old framework can no longer hold them, then snaps into a new one.&lt;/p&gt;

&lt;p&gt;All three types of change — factual refinement, theoretical revision, and paradigm shift — are happening simultaneously, in different fields, at different speeds. Science isn't one thing changing. It's thousands of ongoing conversations, each at a different stage.&lt;/p&gt;

&lt;h2&gt;
  
  
  Why do well-established theories ever get overturned?
&lt;/h2&gt;

&lt;p&gt;The short answer: because reality doesn't care how elegant your theory is.&lt;/p&gt;

&lt;p&gt;Even the most successful scientific theories carry an expiry date — not because scientists are careless, but because every theory is built on the observational tools and conceptual vocabulary available at the time. Isaac Newton's laws of motion worked brilliantly for over 200 years. Engineers still use them to build bridges. But they broke down at very high speeds and very small scales. Einstein's special relativity in 1905 and general relativity in 1915 didn't prove Newton wrong so much as reveal the boundary conditions of where he was right.&lt;/p&gt;

&lt;p&gt;This is a crucial point the University of California, Berkeley emphasises: scientists are likely to accept a new or modified theory if it &lt;strong&gt;explains everything the old theory did, and more&lt;/strong&gt;. New theories don't erase their predecessors — they contain them as special cases. Einstein's equations reduce to Newton's at everyday speeds. That's not contradiction. That's refinement.&lt;/p&gt;

&lt;p&gt;Anomalies drive the process. When experimental results consistently fail to match theoretical predictions, something has to give. In the early 20th century, the behaviour of light posed an anomaly that classical physics couldn't resolve. That single, stubborn problem eventually unravelled centuries of certainty and gave us quantum mechanics.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Newtonian mechanics&lt;/strong&gt; — held for ~220 years, then bounded by relativity&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The geocentric model&lt;/strong&gt; — dominant for over 1,000 years before Copernicus and Galileo&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Phlogiston theory&lt;/strong&gt; — the 18th-century explanation for combustion, replaced by oxygen chemistry&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;The static universe&lt;/strong&gt; — assumed by Einstein himself until Hubble's observations of cosmic expansion in 1929&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;None of these shifts happened overnight. They took years, sometimes decades — because scientists are appropriately conservative. Extraordinary claims require extraordinary evidence.&lt;/p&gt;

&lt;h2&gt;
  
  
  How does new evidence actually enter science?
&lt;/h2&gt;

&lt;p&gt;Evidence doesn't just appear. It has to be produced, tested, challenged, and replicated before it reshapes anything.&lt;/p&gt;

&lt;p&gt;The engine of scientific change is &lt;strong&gt;peer review and replication&lt;/strong&gt;. When a researcher publishes a finding, other scientists attempt to reproduce it. If they can't, the finding stays provisional. If they can — repeatedly, across different labs and contexts — it gets incorporated into the body of knowledge. This is slow by design. The friction is a feature, not a bug.&lt;/p&gt;

&lt;p&gt;New tools accelerate the process dramatically. The invention of the microscope in the 17th century didn't just reveal bacteria — it restructured medicine, biology, and our entire conception of disease. The development of functional MRI in the 1990s opened up neuroscience in ways that would have been impossible a generation earlier. Better instruments don't just answer old questions; they generate entirely new ones.&lt;/p&gt;

&lt;p&gt;Technology also changes what counts as evidence. For most of human history, we had no way to observe the deep interior of stars. Now, neutrino detectors buried kilometres underground can capture particles emitted by stellar cores. Each technological leap expands the evidential frontier — and the frontier is where theories go to be stress-tested.&lt;/p&gt;

&lt;p&gt;Social factors matter too, though this is uncomfortable to acknowledge. The philosopher and historian of science Thomas Kuhn observed that paradigm shifts often face fierce resistance from established researchers who have invested careers in the old framework. Change in science is rational, but it isn't emotionally neutral. &lt;strong&gt;Careers, funding, and institutional prestige&lt;/strong&gt; are all tied to prevailing theories. This is why the physicist Max Planck once quipped — with some bitterness — that science advances one funeral at a time.&lt;/p&gt;

&lt;h2&gt;
  
  
  Does science changing mean you can't trust it?
&lt;/h2&gt;

&lt;p&gt;This is the question that trips most people up — and it's the one most often exploited by bad-faith actors.&lt;/p&gt;

&lt;p&gt;The logic goes: 'Scientists said X was fine, now they say it's harmful. Scientists keep changing their minds. Therefore science can't be trusted.' It sounds reasonable. It's wrong.&lt;/p&gt;

&lt;p&gt;Consider the alternative: a system of knowledge that &lt;strong&gt;never&lt;/strong&gt; changed, no matter what the evidence showed. That's not trustworthiness. That's dogma. The fact that science updates is precisely why it's more reliable than any fixed belief system. A map that gets corrected when roads change is more useful than one that's been frozen since 1950.&lt;/p&gt;

&lt;p&gt;The Berkeley Understanding Science project makes an important distinction here: accepted scientific ideas are well-supported and reliable, but they could be revised if the evidence warrants it. 'Reliable' and 'final' are not the same thing. Vaccine science is reliable. Evolutionary biology is reliable. The specific mechanisms of both are still being actively refined — and that refinement is part of what makes them work.&lt;/p&gt;

&lt;p&gt;The confusion often stems from conflating &lt;strong&gt;frontier science&lt;/strong&gt; with &lt;strong&gt;settled science&lt;/strong&gt;. A study on the effects of a newly identified compound is not the same as the germ theory of disease. Both are 'science', but they operate at very different levels of evidential support. News coverage rarely makes this distinction. Readers who do will be far less likely to misread normal scientific progress as institutional failure.&lt;/p&gt;

&lt;p&gt;Healthy scepticism about a specific new finding is rational. Concluding that science itself is untrustworthy because findings evolve is a category error — and one that has real costs when it shapes medical decisions or policy.&lt;/p&gt;

&lt;h2&gt;
  
  
  What can the history of science teach us about the future?
&lt;/h2&gt;

&lt;p&gt;Every era of science believed it was close to the finish line. Every era was wrong.&lt;/p&gt;

&lt;p&gt;At the end of the 19th century, some physicists reportedly believed that the major work was done — that all that remained was to measure known quantities to more decimal places. Then came X-rays, radioactivity, the photoelectric effect, and quantum mechanics. The 20th century shattered every confident assumption the 19th had made.&lt;/p&gt;

&lt;p&gt;This should make us both humble and excited. Humble, because the scientific consensus of today almost certainly contains errors that future generations will identify. Excited, because those errors mean there are discoveries still to make. &lt;strong&gt;Every anomaly in current science is a potential doorway.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The philosopher of science Karl Popper argued that what makes science genuinely scientific is falsifiability — the capacity to be proven wrong. A theory that cannot be tested is, in Popper's view, not scientific at all. A theory that survives rigorous attempts to falsify it becomes more reliable with each test it passes. This is why decades of failed attempts to disprove natural selection have made it stronger, not shakier.&lt;/p&gt;

&lt;p&gt;The Internet Encyclopedia of Philosophy notes that one of the most important insights from studying scientific change is that no single answer applies to all sciences. Physics changes differently from medicine, which changes differently from ecology. The shape of change is always local — tied to specific communities, tools, and questions at specific moments in history.&lt;/p&gt;

&lt;p&gt;What unifies all of it is the commitment to following evidence, even when it's inconvenient. That's the thread that runs from Galileo's telescope to the Large Hadron Collider. Science changes not despite that commitment, but because of it.&lt;/p&gt;

&lt;p&gt;Science changes over time because the universe doesn't hand out final answers. It hands out data, anomalies, and the occasional result that breaks everything you thought you knew. That's not a crisis — it's the job description. The theories that survive aren't the ones that were never questioned. They're the ones that kept answering questions, decade after decade, under pressure from every instrument and intellect thrown at them. Change is how science earns its credibility. Not in spite of it.&lt;/p&gt;




&lt;p&gt;&lt;em&gt;Originally published on &lt;a href="https://snackiq.app/blog/why-does-science-change-over-time" rel="noopener noreferrer"&gt;SnackIQ&lt;/a&gt;&lt;/em&gt;&lt;/p&gt;

</description>
      <category>whydoessciencechange</category>
      <category>howdoesscientifickno</category>
      <category>scienceexplainedsimp</category>
    </item>
  </channel>
</rss>
