51% of all internet traffic in 2024 was not human. That's not a prediction. That's the number from Thales' 2025 Imperva Bad Bot Report. For the first time in a decade, bots outnumber people online. Not by a small margin. They are the majority.
I noticed this before I ever read that report. I was scrolling through Google Maps reviews for a client in Częstochowa and something felt wrong. The language was too smooth. Three different profiles used the same phrasing. The accounts were two weeks old. No photos, no history, no other activity. Classic bot patterns. But then I looked at their competitors' websites and the content had that same hollow quality. Clean sentences that said nothing. Paragraphs that existed to fill space.
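For the record, the "classic bot patterns" I mean are mechanical enough to write down. Here's a toy sketch in Python; the field names and thresholds are purely my own illustration, not any platform's actual detection logic:

```python
from dataclasses import dataclass

@dataclass
class ReviewProfile:
    account_age_days: int
    photo_count: int
    review_count: int  # total reviews across the whole account
    text: str

def bot_suspicion_score(profile: ReviewProfile, seen_texts: set[str]) -> int:
    """Count how many classic bot signals a review profile trips.
    Thresholds are illustrative, not a real detection system."""
    score = 0
    if profile.account_age_days < 30:   # brand-new account
        score += 1
    if profile.photo_count == 0:        # no photos anywhere
        score += 1
    if profile.review_count <= 1:       # no other activity
        score += 1
    if profile.text in seen_texts:      # same phrasing as another profile
        score += 2
    return score

# The pattern from those Google Maps reviews: two-week-old account,
# no photos, no history, copy-pasted phrasing.
seen = {"Great service, highly recommend!"}
suspect = ReviewProfile(14, 0, 1, "Great service, highly recommend!")
print(bot_suspicion_score(suspect, seen))  # 5
```

A profile that trips several of these at once isn't proof of anything on its own. Three profiles tripping all of them with identical phrasing is a different story.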
That's when it clicked. The internet is filling up with content that nobody wrote.
What the Dead Internet Theory Got Right
The dead internet theory started as a conspiracy theory around 2021. The original claim was extreme: governments and corporations had secretly replaced most online activity with bots to manufacture consensus. That part was never provable.
But the core observation turned out to be accurate in ways nobody expected.
Bots didn't just stay in the shadows. They became the majority: automated traffic hit 51% of the total, and bad bots alone account for 37%. Reddit co-founder Alexis Ohanian said it plainly: "so much of the internet is dead." The founder of one of the biggest human platforms on the internet is telling you it's over.
A recent academic survey on arXiv mapped out how artificial interactions are reshaping social media entirely. It's not fringe anymore. Researchers are studying this as a structural shift in how the internet functions.
And that's only half the story.
AI Slop Became the Word of the Year. Think About That.
In late 2025, "slop" was named Word of the Year by Macquarie Dictionary, Merriam-Webster, and the American Dialect Society. All three. Independently. The word describes exactly what you think: low effort, AI generated content that floods every platform and adds nothing.
In March 2026, a game called Your AI Slop Bores Me went viral on Hacker News. The concept is brilliant. Instead of trying to detect AI, players pretend to be AI and generate the most generic, soulless content they can. The joke lands because everyone recognizes it instantly. We've all been on the receiving end. Blog posts answering questions nobody asked. About pages describing a company as "passionate about delivering innovative solutions." LinkedIn posts that exist only to exist.
The satire works because the reality is already mainstream. People have even started responding to suspected AI content with "ai;dr", a spin on "tl;dr" that means "I'm not reading this, it's obviously AI." That's not a niche joke. That's a cultural shift.
How big is the flood? Ahrefs analyzed 900,000 newly published web pages and found that 74.2% contained AI generated content. Not traffic. Content. Three out of four new pages on the internet were not written by a person.
A University of Florida study from March 2026 confirmed what creators already knew: AI slop hurts both consumers and the people making real content. TechCrunch covered the same problem from the creator economy angle. And in January 2026, YouTube ran its largest mass termination of AI driven channels in the platform's history.
The signal is clear. The internet is drowning in generated noise. And people are starting to fight back.
The Myth That Google Doesn't Care
There's a popular misconception floating around: "Google doesn't penalize AI content." That's technically true. And completely misleading.
Google's official position is that they evaluate content quality, not how it was produced. Fair enough. But here's what actually happened in practice.
In February 2026, Google rolled out a core algorithm update that caused massive ranking volatility across industries. The pattern was consistent: content demonstrating real, first person experience moved up. Generic content moved down. The E-E-A-T framework (Experience, Expertise, Authoritativeness, Trustworthiness) is no longer a suggestion. It's a requirement for ranking.
The emphasis is on that first E. Experience. Google wants to know: did you actually do this thing you're writing about?
A study by Rankability found that 83% of top ranking results use human generated content. Meanwhile, Google traffic to publishers dropped 33% globally between November 2024 and November 2025. The sites that survived that drop? The ones with original, experience based content. Google doesn't want to serve summaries of summaries of summaries. It wants sources. It wants the person who actually did the work.
For local businesses, this is even more direct. When a nail salon in Częstochowa gets asked "how did you find us?" and the answer is "Google," that means Google trusted that website. It trusted it because it had real photos, real reviews, real language, real local signals. A bot can't write about what parking looks like on Śląska Street at 7pm on a Tuesday. That level of specificity is what wins.
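Some of those local signals can also be made machine-readable. One common way is schema.org structured data embedded as JSON-LD. A minimal sketch, built as a plain Python dict; every concrete detail here (the salon name, address, hours) is invented for illustration, and markup is a supplement to real content, never a substitute for it:

```python
import json

# Illustrative LocalBusiness structured data using the schema.org
# vocabulary. All business details below are invented for the example.
local_business = {
    "@context": "https://schema.org",
    "@type": "HairSalon",
    "name": "Salon Ewa",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "Aleja NMP 10",
        "addressLocality": "Częstochowa",
        "addressCountry": "PL",
    },
    "telephone": "+48 34 000 00 00",
    "openingHours": "Tu-Sa 09:00-19:00",
}

# Embedded in the page head as a JSON-LD script tag:
snippet = (
    '<script type="application/ld+json">'
    + json.dumps(local_business, ensure_ascii=False)
    + "</script>"
)
print(snippet)
```

The markup only works when it describes something real. A generated site can emit this tag too; it can't back it up with matching photos, reviews, and language.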
What a Generated Website Sounds Like vs. a Real One
Let me show you the difference.
Generated About page:
"We are a passionate team dedicated to providing high quality services. With years of experience and a commitment to excellence, we deliver innovative solutions tailored to your needs."
A real About page, built from a 20 minute conversation:
"Ewa has been cutting hair on Aleja NMP since 2003. She started with one chair in a room above a flower shop. Twenty years later, her clients still call to book because they don't trust online forms. That's fine. She picks up every time."
The first one could describe any business in any country. The second one could only be Ewa's salon. That's the difference between content that fills a page and content that builds trust. Google knows the difference. Your customers definitely know the difference.
I build websites for Polish small businesses. When I write copy for a client, I don't generate it. I ask questions. What do your regulars say about you? What do you do differently from the place two streets over? What's the neighborhood like? Those answers don't exist in any training dataset. They exist in a conversation.
Why This Matters More in Poland Than Anywhere Else
I've spent months researching why Polish SMBs don't have websites. Most of them run their businesses through Booksy, Instagram, and Google Business profiles. They're not behind. They've optimized for the platforms available to them.
But here's the thing. When 90% of new business websites are built with AI tools, they all sound the same. Same structure, same vocabulary, same promises. Built fast and it shows. Meanwhile, a local business in Częstochowa has something inherently unique: the owner's name, the neighborhood, the services locals actually ask for, photos from inside the shop.
That specificity is a competitive moat right now. It can't be generated. It has to be gathered.
The businesses that don't have websites yet have an unexpected advantage. They haven't been poisoned by template language. When they finally get a site, it can be built from scratch with real content. No legacy AI slop to clean up. No generic copy to replace. Just their story, told for the first time, properly.
If you're wondering what that investment looks like, I wrote a detailed breakdown of real website costs in Poland.
The Window Is Open Right Now
The AI slop flood is a problem. But it's also a gap.
When the majority of new websites sound like they were assembled by a machine in 30 seconds, the ones clearly made by a real person for a real business stand out immediately. Customers notice. Search engines notice. Google's February 2026 update proved this with data.
I use AI in my workflow constantly. I built an entire automation system to find businesses that need websites. But there's a difference between using AI to help build something human and using AI to replace the human entirely. The first produces a great website. The second produces noise.
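To make that concrete: the core of that lead-finding automation, stripped of everything else, is a filter over business records. This is a simplified sketch, not the actual system, and the field names are hypothetical; source the records however you like (a places dataset, a directory export):

```python
# Core idea of the lead-finding automation, reduced to its simplest
# form: keep businesses with an active public profile but no website.
# Field names are hypothetical placeholders for this sketch.

def needs_website(business: dict) -> bool:
    has_presence = business.get("review_count", 0) > 0
    has_site = bool(business.get("website"))
    return has_presence and not has_site

businesses = [
    {"name": "Piekarnia Złoty Kłos", "review_count": 47, "website": ""},
    {"name": "Salon Ewa", "review_count": 120, "website": "https://example.pl"},
    {"name": "Nowy Zakład", "review_count": 0, "website": ""},
]

leads = [b["name"] for b in businesses if needs_website(b)]
print(leads)  # ['Piekarnia Złoty Kłos']
```

The automation finds the businesses. The website each one ends up with still comes from a conversation, not a prompt.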
The businesses I work with don't need to win the internet. They need to win their city. A bakery in Gliwice doesn't need to rank globally. It needs the person two kilometers away searching for a birthday cake to find it, trust it, and call.
That's a solvable problem. But you need a real website to solve it. One that sounds like a person wrote it, because a person did. One that Google trusts because it has the SEO fundamentals done right. One that your customers recognize as yours the moment they land on it.
The internet got loud. The businesses that win now are the ones that sound like someone actually wrote their website. Because someone did.
Questions? Reach out. I reply within 24 hours.
Also available in Polish.
Top comments (1)
Really solid piece, Max. The Ewa salon example nails it — that kind of specificity is impossible to generate from a training set.
I run a programmatic SEO site with 100k+ pages across 12 languages, and I use a local LLM (Llama 3) to generate the initial analysis for each page. So I'm literally the person this article is arguing against — and I mostly agree with you.
The distinction you make between "using AI to help build something human" vs. "using AI to replace the human entirely" is the whole game. My pages that perform best aren't the ones where the LLM wrote everything. They're the ones where the generated content is a scaffold that gets layered with real data, unique financial metrics, and structured analysis that you can't get from generic prompting.
The 74.2% stat from Ahrefs is wild but tracks with what I'm seeing in Google Search Console. Pages that are just LLM output with no unique data layer get crawled and rejected. Google literally crawls them and says "nah." Meanwhile the pages with genuine analytical depth — even if AI-assisted — get indexed and start ranking.
Your point about E-E-A-T and that first "E" for Experience is where I think the real opportunity lives. The flood of AI slop is actually making it easier to stand out if you have real expertise to inject into the content. The bar for "unique" has never been lower.
Curious about your automation system for finding businesses that need websites — that's a smart pipeline approach.