Yusufhan Sacak

Why Google Refuses to Index Your Next.js Site

You deploy your site.

It loads fast.

Lighthouse looks great.

And yet…

Google refuses to index it.

Search Console throws cryptic messages at you:

  • Page with redirect
  • Discovered – currently not indexed
  • Alternate page with proper canonical tag

No clear explanation. No clear fix.

If you’re building with Next.js on Vercel, this is far more common than you think.

Let’s break down why this happens — and how to fix it.


The uncomfortable truth

Google doesn’t index websites.

Google indexes URLs.

And modern Next.js apps are very good at accidentally breaking URL consistency.

Most indexing issues aren’t “SEO problems”.

They’re infrastructure and routing problems.


1. Redirects that look harmless (but aren’t)

One of the most common issues:


/about → 308 → /


From a developer’s perspective, this seems fine.

From Google’s perspective, it’s a red flag.

When Google sees:

  • a URL
  • that always redirects
  • to another URL
  • without a strong reason

…it often decides:

“I won’t index this. It’s a duplicate or soft-canonical.”

Why does this happen in Next.js?

  • Trailing slash mismatches
  • redirects() in next.config.js
  • Middleware redirects
  • App Router defaults using 308 Permanent Redirect

308 is permanent. Google takes it very seriously.

If /about should exist → serve it as 200.

If it shouldn’t → remove internal links pointing to it.
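
Most of these 308s come from configuration, not page code. A minimal sketch of the two usual suspects in next.config.ts, assuming a recent Next.js version (the redirect paths are placeholders; the same shape works in next.config.js):

// next.config.ts
import type { NextConfig } from 'next'

const nextConfig: NextConfig = {
  // Pick ONE trailing-slash policy and keep every internal link
  // consistent with it; a mismatch means crawlers hit a 308 first.
  trailingSlash: false,

  async redirects() {
    return [
      {
        source: '/old-about',
        destination: '/about',
        // permanent: true emits a 308; false emits a 307
        permanent: true,
      },
    ]
  },
}

export default nextConfig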


2. “Discovered – currently not indexed” isn’t a crawl problem

This one scares people.

It sounds like Google is still working on it.

In reality, it often means:

“We saw this URL. We decided it’s not worth indexing.”

Common causes:

  • No sitemap
  • URL only reachable through redirects
  • Weak canonical signals
  • Conflicting metadata

Search Console doesn’t tell you which one.


3. Missing sitemap = Google is guessing

Yes, Google can crawl without a sitemap.

But for modern SPA / SSR hybrids, that’s a gamble.

Without a sitemap:

  • Google relies on internal links
  • Redirected pages may never be considered canonical
  • Crawling frequency drops

In Next.js App Router, the fix is trivial — but often forgotten.
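
A minimal sketch of that fix using the App Router file convention, which Next.js serves as /sitemap.xml (the domain and routes are placeholders):

// app/sitemap.ts
import type { MetadataRoute } from 'next'

export default function sitemap(): MetadataRoute.Sitemap {
  const baseUrl = 'https://yoursite.com' // your one canonical origin

  return [
    { url: baseUrl, lastModified: new Date() },
    { url: `${baseUrl}/about`, lastModified: new Date() },
    // For dynamic routes, fetch your slugs and map them to entries here.
  ]
}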

No sitemap → lower trust → slower or no indexing.


4. robots.txt isn’t optional anymore

A surprising number of Vercel deployments ship with no robots.txt.

Google then has:

  • no crawl hints
  • no sitemap reference
  • no explicit allow/disallow rules

This doesn’t block indexing —

but it removes a strong trust signal.

In competitive SERPs, that matters.
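
The App Router makes this a one-file fix as well. A minimal sketch, served automatically as /robots.txt (the sitemap URL is a placeholder):

// app/robots.ts
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [{ userAgent: '*', allow: '/' }],
    sitemap: 'https://yoursite.com/sitemap.xml',
  }
}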


5. Canonicals you didn’t mean to create

Next.js generates metadata beautifully — until it doesn’t.

Problems I see constantly:

  • Canonical points to /
  • Canonical mismatches between HTTP / HTTPS
  • Canonical missing entirely on subpages

Google trusts canonicals more than it trusts internal links.

If your canonical is wrong, Google will obey it — even if it kills indexing.
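
A sketch of the pattern that avoids all three, assuming the App Router metadata API (the domain and routes are placeholders):

// app/layout.tsx
import type { Metadata } from 'next'

export const metadata: Metadata = {
  // One absolute base, so every relative canonical resolves to the
  // same https origin instead of drifting between variants.
  metadataBase: new URL('https://yoursite.com'),
}

// Each subpage then exports its own self-referencing canonical,
// e.g. in app/about/page.tsx (commented so this block stays one module):
//
// export const metadata: Metadata = {
//   alternates: { canonical: '/about' },
// }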


6. Vercel + Next.js = hidden crawler behavior

Vercel is fantastic, but it introduces quirks:

  • Edge middleware redirects
  • Platform-level HTTPS enforcement
  • Automatic handling of www vs non-www

These are invisible in your code — but very visible to crawlers.

If you don’t observe the raw HTTP behavior, you won’t see the issue.
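
One low-tech way to observe it: request the page the way a crawler does, without following redirects. A minimal sketch using the fetch built into Node 18+ (the URL is a placeholder):

// check-url.ts
async function checkUrl(url: string): Promise<void> {
  // redirect: 'manual' surfaces the redirect instead of following it
  const res = await fetch(url, { redirect: 'manual' })

  console.log(res.status)                  // e.g. 308 where you expected 200
  console.log(res.headers.get('location')) // where crawlers get sent
}

checkUrl('https://yoursite.com/about').catch(console.error)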


Why Search Console doesn’t help much

Google Search Console reports symptoms, not causes.

It tells you:

  • something is wrong

It does not tell you:

  • which redirect
  • which header
  • which canonical
  • which URL pattern triggered it

That gap is where most developers get stuck.


The fix: audit your site like a crawler

You need to think like Googlebot:

  • What status code do I get?
  • Do I get redirected?
  • Is there a canonical?
  • Is this URL in the sitemap?
  • Is robots.txt guiding me?

That’s exactly why I built vercel-seo-audit.

npm i vercel-seo-audit
vercel-seo-audit https://yoursite.com

Open source CLI

👉 https://github.com/JosephDoUrden/vercel-seo-audit

It doesn’t guess.
It shows you what Google actually sees — and what to fix.



Final thought

If Google isn’t indexing your Next.js site, it’s rarely because of:

  • keywords
  • content quality
  • backlinks (at first)

It’s almost always because:

your URLs are lying to Google.

Fix the signals.
Make your routing boring.
And Google will follow.

If you’re shipping on Vercel and fighting indexing issues, you’re not alone — and you’re not crazy.

You’re just dealing with modern web complexity.
