AI Overviews don't create clicks—they reduce them. Two large studies, from Ahrefs and Amsive, have put numbers to what many site owners already feel: organic traffic from informational queries is dropping, and AI Overviews are a major factor. The data also exposes a contradiction: Google has spent years telling publishers not to create thin, derivative content—yet AI Overviews do exactly that at scale, pulling from multiple sources and serving a repackaged "answer" that often satisfies the user without a click. This post summarises the findings, explains why the technical layer matters for any response, and outlines a practical compensation strategy.
What the studies show
Ahrefs (300,000 keywords)
Ahrefs analysed 300,000 keywords and found a 99.2% overlap between informational intent and AI Overview presence: almost every query that triggers an AI Overview is informational. Their key result: a 34.5% drop in CTR for position 1 when an AI Overview is present, comparing March 2024 (pre-rollout) with March 2025 (post-rollout). A later update found the impact deepening, with clicks to the top result dropping by around 58% in some contexts. The informational queries that used to drive discovery and consideration are the ones taking the hit.
Amsive (700,000 keywords)
Amsive studied 700,000 keywords across 10 sites in five industries (finance, education, SaaS, healthcare, pets). Their findings:
- -15.49% average CTR drop across all keywords that trigger AI Overviews
- -37.04% when AI Overviews and featured snippets appear together—the worst combination
- -19.98% for non-branded keywords
- -27.04% for keywords ranking outside the top 3
The exception: branded queries. Only 4.79% of branded keywords trigger AI Overviews, but when they do, CTR increases by +18.68%. AI Overviews disproportionately harm non-branded, informational queries while branded visibility holds up or improves.
The pattern
Informational clicks used to build brand awareness, shape perception, and spark consideration before purchase. That is where businesses are losing visibility and revenue. Google Search Console does not yet report AI Overview data as a search appearance item (though that may change), so site owners are left piecing together graphs and anecdotes to see the effect. Every audit tells the same story: AI Overviews do not send more clicks.
The contradiction
Google’s spam and helpful content policies stress that publishers must create original, valuable, experience-driven content—not thin summaries or regurgitations. They explicitly discourage content that is “stitched together” or “rephrased” without adding real insight.
AI Overviews pull content from multiple sources, repackage it, and serve it back as a “new” answer. Citations may appear, but the visible answer is a Google-generated digest, not the original publisher’s work. That is exactly the kind of derivative use Google has long discouraged. The stance for publishers and the behaviour of AI Overviews are at odds.
The measurement gap
Google Search Console does not report AI Overviews as a search appearance item. You can see clicks, impressions, and position for traditional organic results, but not how many queries triggered an AI Overview or whether your page was cited in one. AI mode data may arrive in future, but right now site owners rely on third-party studies (like Ahrefs and Amsive) and manual spot-checks to infer impact. The result: "crocodile graphs" in GSC—traffic lines that taper off without a clear cause. If your compensatory strategy depends on knowing which keywords are affected, the gap makes it harder to prioritise and attribute. Until GSC surfaces AI Overview data, treat performance and crawlability as what you can control; they affect both traditional search and any future AI-mediated traffic.
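One workaround while GSC stays silent: export query-level clicks and impressions for a period before and after the AI Overview rollout, then compare CTR query by query. The sketch below is a minimal illustration of that comparison, assuming you have already exported the two periods (the data shapes and query strings are hypothetical); queries with the steepest CTR decline at stable impressions are your candidates for AI Overview impact.

```python
def ctr_delta(before, after):
    """Compare CTR between two reporting periods for the same queries.

    before / after: dicts mapping query -> (clicks, impressions),
    e.g. as exported from Search Console. Returns query -> CTR change
    (negative means the query lost click-through rate).
    """
    deltas = {}
    for query in before.keys() & after.keys():
        b_clicks, b_impr = before[query]
        a_clicks, a_impr = after[query]
        if b_impr == 0 or a_impr == 0:
            continue  # no CTR to compare
        deltas[query] = a_clicks / a_impr - b_clicks / b_impr
    return deltas


# Hypothetical data: impressions held steady, clicks fell.
march_2024 = {"what is crawl budget": (50, 1000)}
march_2025 = {"what is crawl budget": (30, 1000)}
drops = ctr_delta(march_2024, march_2025)  # -0.02, i.e. -2 points of CTR
```

Sorting the result ascending surfaces the informational queries hit hardest, which is where the studies above predict AI Overviews will show up.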
Compensatory strategies
You need more high-quality content (not just text, but video, tools, and other formats) to offset the lost clicks. For publishers, that is challenging. For SEOs, the shift is toward compensatory strategies: maximise quality content, cull what is dead, and distribute new content across social channels (YouTube, TikTok, Instagram Shorts). The industry has been hit; the response is to adapt, not to assume the old playbook still works.
But compensatory content only works if it is discoverable and crawlable. That’s where the technical layer comes in.
Why the technical layer matters
When you double down on content, video, and tools to compensate for lost clicks, you are adding more assets that need to be crawled, indexed, and ranked. Google’s crawlers—and the AI systems that feed AI Overviews—still need to fetch your pages efficiently. Core Web Vitals (LCP, INP, CLS) affect crawl budget, user experience, and ranking signals. Slow, unstable, or interaction-heavy pages are harder to process and less likely to surface in both traditional search and AI-mediated answers.
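Google publishes concrete thresholds for these metrics: "good" means LCP at or under 2.5 s, INP at or under 200 ms, and CLS at or under 0.1, with "poor" starting above 4 s, 500 ms, and 0.25 respectively. A small helper makes those boundaries explicit when triaging pages:

```python
# Google's published Core Web Vitals thresholds (per web.dev):
# "good" at or below the first value, "poor" above the second,
# "needs improvement" in between.
THRESHOLDS = {
    "lcp": (2500, 4000),  # milliseconds
    "inp": (200, 500),    # milliseconds
    "cls": (0.1, 0.25),   # unitless layout-shift score
}

def rate(metric, value):
    """Classify a Core Web Vitals measurement against Google's bands."""
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "good"
    if value <= poor:
        return "needs improvement"
    return "poor"
```

For example, `rate("lcp", 2100)` is "good", while `rate("cls", 0.3)` lands in "poor" and flags a layout-stability problem worth fixing before scaling content.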
Our earlier post on AI crawlers explains why fast, crawlable pages matter for GPTBot and similar crawlers. The same logic applies here: if your compensatory content is slow, blocks rendering, or triggers layout shifts, it will underperform. The technical layer is not optional—it is the foundation any content strategy sits on.
Common failures we see: LCP images lazy-loaded above the fold, layout shifts from images without dimensions, third-party scripts blocking the main thread and inflating INP. Each of these delays how quickly crawlers and users can access your content. Our post on how Core Web Vitals affect SEO rankings ties these metrics directly to discoverability: slow or unstable pages are at a disadvantage whether the user reaches you via traditional search or via an AI system that may cite you.
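Two of those failures are easy to catch with a static check of the rendered HTML. The sketch below uses Python's standard-library HTML parser and a crude heuristic (the first few images are treated as "likely above the fold", which is an assumption, not a measurement); a real audit would use Lighthouse or lab tooling instead.

```python
from html.parser import HTMLParser

class ImgAudit(HTMLParser):
    """Flags lazy-loaded images early in the document (a rough proxy
    for above-the-fold) and images without explicit width/height
    (a common cause of layout shift)."""

    def __init__(self, above_fold_count=3):
        super().__init__()
        self.seen = 0
        self.above_fold_count = above_fold_count
        self.issues = []

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        self.seen += 1
        a = dict(attrs)
        src = a.get("src", "(no src)")
        if self.seen <= self.above_fold_count and a.get("loading") == "lazy":
            self.issues.append(f"{src}: lazy-loaded but likely above the fold")
        if "width" not in a or "height" not in a:
            self.issues.append(f"{src}: missing width/height (CLS risk)")

def audit_images(html):
    parser = ImgAudit()
    parser.feed(html)
    return parser.issues


# A hero image that is lazy-loaded and has no dimensions trips both checks.
issues = audit_images(
    '<img src="hero.jpg" loading="lazy">'
    '<img src="logo.png" width="120" height="40">'
)
```

Running this over templates in CI catches the regression before it reaches production, rather than after the next crawl.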
What to do
- Audit priority content. Run PageSpeed Insights or an automated monitoring setup on the pages you are betting on for compensatory traffic. Fix LCP, INP, and CLS issues before scaling content volume.
- Set performance budgets. Define thresholds (e.g. LCP ≤ 2.5s, CLS ≤ 0.1) and alert when new deployments or content changes push metrics out of range. Catching regressions early keeps your compensatory assets in shape.
- Treat performance as part of the strategy. Content volume alone will not offset lost clicks if the pages are slow or unstable. Build performance into your planning, not as an afterthought.
- Monitor continuously. A new plugin, a content update, or a third-party script can introduce regressions. Manual PageSpeed runs catch issues only when you remember to run them. Automated monitoring and performance budgets alert you when LCP, INP, or CLS drift out of range—before the next crawl or the next deploy pushes your compensatory content further down the stack.
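The audit-and-budget steps above can be wired together against the PageSpeed Insights API (v5), which exposes CrUX field data per URL. The sketch below is a minimal version under stated assumptions: the `BUDGETS` values mirror Google's "good" thresholds, the response field names follow the v5 `loadingExperience` shape, and error handling, API keys, and alerting are omitted.

```python
import json
import urllib.request

# Budgets matching Google's "good" thresholds; tighten them if you
# want alerts before a metric actually degrades.
BUDGETS = {"lcp_ms": 2500, "inp_ms": 200, "cls": 0.1}

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_field_metrics(url):
    """Pull CrUX field data for a URL from the PageSpeed Insights API.

    Field names per the v5 response; CLS is reported as a percentile
    scaled by 100, so we divide to get the familiar unitless score.
    """
    with urllib.request.urlopen(f"{PSI}?url={url}") as resp:
        data = json.load(resp)
    m = data["loadingExperience"]["metrics"]
    return {
        "lcp_ms": m["LARGEST_CONTENTFUL_PAINT_MS"]["percentile"],
        "inp_ms": m["INTERACTION_TO_NEXT_PAINT"]["percentile"],
        "cls": m["CUMULATIVE_LAYOUT_SHIFT_SCORE"]["percentile"] / 100,
    }

def over_budget(metrics, budgets=BUDGETS):
    """Return only the metrics that exceed their budget."""
    return {k: v for k, v in metrics.items() if v > budgets[k]}


# With a slow LCP but healthy INP and CLS, only LCP is reported.
violations = over_budget({"lcp_ms": 3100, "inp_ms": 150, "cls": 0.05})
```

Run this on a schedule (or in CI after deploys) and alert on a non-empty `violations` dict; that is the automated version of the manual PageSpeed runs described above.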
Summary
The Ahrefs and Amsive data confirms that AI Overviews are reducing clicks on informational, non-branded queries. The contradiction with Google’s own content policies is real. Compensatory strategies—more quality content, video, tools, and distribution—make sense, but they depend on a solid technical base. Fast, crawlable pages with good Core Web Vitals give your content the best chance to be discovered, indexed, and cited. Performance is not a side quest; it is part of the response.
FAQ
Do AI Overviews always reduce clicks?
No. Branded queries that trigger AI Overviews saw a CTR increase of +18.68% in the Amsive study. The harm is concentrated in non-branded, informational queries, especially when AI Overviews appear with featured snippets.
Why doesn't Google Search Console show AI Overview data?
At the time of writing, GSC does not report AI Overviews as a search appearance type. That may change. For now, site owners rely on third-party data and manual tracking to assess impact.
Does improving Core Web Vitals help with AI Overviews?
Core Web Vitals affect crawlability, indexing, and user experience—factors that influence how content is discovered and used. Fast, stable pages are easier for crawlers and users to process. Performance does not guarantee citation in AI Overviews, but slow or broken pages are at a disadvantage.
What is a compensatory content strategy?
Creating more high-quality content (text, video, tools) and distributing it across channels to offset clicks lost to AI Overviews. The strategy only works if that content is technically sound: crawlable, fast, and meeting Core Web Vitals thresholds.
Should I stop optimising for informational keywords?
No. Informational queries still drive awareness and consideration. The shift is toward compensatory tactics: more quality content, better distribution, and a technical foundation that lets that content perform. The goal is to maximise the value of the clicks you do get and to create assets that can be cited when AI systems do surface them.