Alamin Sarker
How to Rank Next.js Apps in Google: Full SEO Guide for 2026

I spent an entire afternoon wondering why my freshly deployed Next.js app wasn't showing up in Google Search — not even for its own brand name. Turns out, I was server-rendering the shell but client-rendering all the meaningful content. Google saw a blank <div id="__next"></div> and moved on. The fix was restructuring a single page.tsx and adding 6 lines of metadata config. This guide covers everything I wish I'd known: metadata, dynamic OG tags, structured data, and automated auditing — all with copy-paste-ready code.

Why Next.js SEO Is a Different Beast

Traditional SEO wisdom assumes the server delivers complete HTML. Next.js gives you four rendering strategies — SSG, SSR, ISR, and CSR — and only some of them are Google-friendly by default.

Google's crawler can execute JavaScript, but it processes pages in a two-wave system. The first wave indexes raw HTML. The second wave (days later) renders JavaScript. If your <title>, meta description, and canonical tags live inside a useEffect, you're invisible until wave two — and sometimes wave two never comes for low-priority pages.
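To make the two-wave distinction concrete, here is a small sketch (not Google's actual pipeline) of what a wave-one crawl can and cannot see: it reads the raw HTML string and never executes JavaScript, so a client-rendered shell yields nothing indexable.

```typescript
// Sketch: a wave-one "crawler" only sees the raw HTML response; no JS runs.
function extractHeadTags(html: string): { title: string | null; description: string | null } {
  const titleMatch = html.match(/<title>([^<]*)<\/title>/i);
  const descMatch = html.match(/<meta\s+name="description"\s+content="([^"]*)"/i);
  return {
    title: titleMatch ? titleMatch[1] : null,
    description: descMatch ? descMatch[1] : null,
  };
}

// A client-rendered shell: the crawler finds nothing useful on wave one.
const csrShell = '<html><head></head><body><div id="__next"></div></body></html>';

// A server-rendered response: metadata is already in the initial HTML.
const ssrPage =
  '<html><head><title>My Post</title>' +
  '<meta name="description" content="A post worth indexing."></head>' +
  '<body><article>...</article></body></html>';

console.log(extractHeadTags(csrShell)); // { title: null, description: null }
console.log(extractHeadTags(ssrPage)); // { title: 'My Post', description: 'A post worth indexing.' }
```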

The baseline rule: anything Google needs to understand your page must be in the initial HTML response.

Next.js makes this straightforward with the App Router's Metadata API. Here's the correct pattern:

// app/blog/[slug]/page.tsx
import { Metadata } from 'next';

type Props = {
  // params is a Promise in Next.js 15+, so await it before use
  params: Promise<{ slug: string }>;
};

// This runs on the SERVER — output lands in the initial HTML
export async function generateMetadata({ params }: Props): Promise<Metadata> {
  const { slug } = await params;
  const post = await fetchPost(slug); // your data fetcher

  return {
    title: post.title,
    description: post.excerpt,
    alternates: {
      canonical: `https://your-site.com/blog/${slug}`,
    },
    openGraph: {
      title: post.title,
      description: post.excerpt,
      url: `https://your-site.com/blog/${slug}`,
      type: 'article',
      publishedTime: post.publishedAt,
      images: [{ url: post.ogImage, width: 1200, height: 630 }],
    },
  };
}

export default async function BlogPost({ params }: Props) {
  const { slug } = await params;
  const post = await fetchPost(slug);
  return <article dangerouslySetInnerHTML={{ __html: post.content }} />;
}
}

Result: Every blog post gets a unique <title>, description, canonical URL, and OG image — all in the initial HTML, all crawlable on wave one.

Structured Data: The Part Most Developers Skip

Rich results (star ratings, FAQs, breadcrumbs in SERPs) come from structured data. Most Next.js tutorials stop at meta tags and never touch this. That's leaving SERP real estate on the table.

Add a JSON-LD component — no third-party library needed; it's a dozen lines of code:

// components/JsonLd.tsx
type JsonLdProps = {
  data: Record<string, unknown>;
};

export function JsonLd({ data }: JsonLdProps) {
  // Escape "<" so user-supplied strings can't close the script tag early
  const json = JSON.stringify(data).replace(/</g, '\\u003c');
  return (
    <script
      type="application/ld+json"
      dangerouslySetInnerHTML={{ __html: json }}
    />
  );
}

Then use it in your page:

// app/blog/[slug]/page.tsx
import { JsonLd } from '@/components/JsonLd';

export default async function BlogPost({ params }: Props) {
  const post = await fetchPost(params.slug);

  const articleSchema = {
    '@context': 'https://schema.org',
    '@type': 'Article',
    headline: post.title,
    description: post.excerpt,
    author: {
      '@type': 'Person',
      name: post.authorName,
      url: `https://your-site.com/authors/${post.authorSlug}`,
    },
    datePublished: post.publishedAt,
    dateModified: post.updatedAt,
    image: post.ogImage,
    publisher: {
      '@type': 'Organization',
      name: 'Your Site',
      logo: { '@type': 'ImageObject', url: 'https://your-site.com/logo.png' },
    },
  };

  return (
    <>
      <JsonLd data={articleSchema} />
      <article dangerouslySetInnerHTML={{ __html: post.content }} />
    </>
  );
}

Test it with Google's Rich Results Test. If your schema is valid, you'll see the page become eligible for enhanced SERP features within a few index cycles.
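If you also want a pre-flight check in code, here is a minimal sketch that pulls JSON-LD blocks out of a rendered HTML string and verifies they parse. It assumes the exact `<script type="application/ld+json">` markup the component above emits; a real audit would also check required Article fields.

```typescript
// Sketch: extract and sanity-check JSON-LD blocks from rendered HTML.
function extractJsonLd(html: string): Record<string, unknown>[] {
  const blocks: Record<string, unknown>[] = [];
  const re = /<script type="application\/ld\+json">([\s\S]*?)<\/script>/g;
  let match: RegExpExecArray | null;
  while ((match = re.exec(html)) !== null) {
    // JSON.parse throws on malformed schema, which is exactly what we want to catch pre-deploy
    blocks.push(JSON.parse(match[1]));
  }
  return blocks;
}

const html =
  '<html><head><script type="application/ld+json">' +
  '{"@context":"https://schema.org","@type":"Article","headline":"My Post"}' +
  '</script></head><body></body></html>';

const schemas = extractJsonLd(html);
console.log(schemas.length); // 1
console.log(schemas[0]['@type']); // 'Article'
```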

Dynamic Sitemaps and Robots.txt (The Right Way)

A static sitemap.xml file is a maintenance nightmare. Every time you add a page, you forget to update it. Next.js has built-in support for dynamic sitemaps — use it:

// app/sitemap.ts
import { MetadataRoute } from 'next';

export default async function sitemap(): Promise<MetadataRoute.Sitemap> {
  const posts = await fetchAllPosts(); // your CMS/DB call

  const blogUrls = posts.map(post => ({
    url: `https://your-site.com/blog/${post.slug}`,
    lastModified: new Date(post.updatedAt),
    changeFrequency: 'weekly' as const,
    priority: 0.8,
  }));

  return [
    {
      url: 'https://your-site.com',
      lastModified: new Date(),
      changeFrequency: 'daily',
      priority: 1,
    },
    {
      url: 'https://your-site.com/blog',
      lastModified: new Date(),
      changeFrequency: 'daily',
      priority: 0.9,
    },
    ...blogUrls,
  ];
}

Pair it with a dynamic robots.ts:

// app/robots.ts
import { MetadataRoute } from 'next';

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [
      {
        userAgent: '*',
        allow: '/',
        disallow: ['/api/', '/admin/'],
        // Don't disallow /_next/: Googlebot needs those JS/CSS assets to render your pages
      },
    ],
    sitemap: 'https://your-site.com/sitemap.xml',
  };
}

Next.js serves these at /sitemap.xml and /robots.txt automatically — no extra configuration needed.
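Since one malformed entry can silently degrade the whole sitemap, a quick sanity check over the generated entries is cheap insurance. This is a hypothetical helper, not part of Next.js:

```typescript
// Hypothetical sanity check for sitemap entries before they ship.
type SitemapEntry = {
  url: string;
  lastModified?: Date;
  changeFrequency?: string;
  priority?: number;
};

function validateSitemapEntries(entries: SitemapEntry[]): string[] {
  const errors: string[] = [];
  for (const entry of entries) {
    try {
      const parsed = new URL(entry.url); // throws on relative/invalid URLs
      if (parsed.protocol !== 'https:') {
        errors.push(`${entry.url}: sitemap URLs should be https`);
      }
    } catch {
      errors.push(`${entry.url}: not a valid absolute URL`);
    }
    if (entry.priority !== undefined && (entry.priority < 0 || entry.priority > 1)) {
      errors.push(`${entry.url}: priority must be between 0.0 and 1.0`);
    }
  }
  return errors;
}

console.log(validateSitemapEntries([
  { url: 'https://your-site.com/blog', priority: 0.9 },
  { url: '/blog/relative-path', priority: 2 }, // fails both checks
]));
```

Run it against the output of your `sitemap()` function in a pre-deploy script and fail the build if the returned array is non-empty.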

Automating SEO Audits Before You Deploy

The three sections above fix known problems. But what about the ones you don't know about yet — a page where someone accidentally removed the canonical, or a dynamic route where the OG image URL is malformed?

This is where automated auditing earns its keep. I wired @power-seo/analytics into my pre-deploy script to catch regressions before they hit production:

npm install @power-seo/analytics --save-dev

Then create the audit script:

// scripts/seo-audit.mjs
import { auditPage } from '@power-seo/analytics';

const criticalPages = [
  'https://staging.your-site.com/',
  'https://staging.your-site.com/blog',
  'https://staging.your-site.com/blog/your-latest-post',
  'https://staging.your-site.com/pricing',
];

const results = await Promise.all(
  criticalPages.map(url =>
    auditPage(url, {
      checkCanonical: true,
      checkStructuredData: true,
      checkOpenGraph: true,
    })
  )
);

const failures = results.filter(r => r.score < 85);

if (failures.length > 0) {
  console.error('\n❌ SEO audit failed:\n');
  failures.forEach(f => {
    console.error(`  ${f.url} — Score: ${f.score}/100`);
    f.issues.forEach(issue => console.error(`    ✗ ${issue.message}`));
  });
  process.exit(1);
}

console.log('All pages passed SEO audit');

Add it to your package.json:

{
  "scripts": {
    "build": "next build",
    "seo:audit": "node scripts/seo-audit.mjs",
    "predeploy": "npm run build && npm run seo:audit"
  }
}

Now a broken canonical or missing OG tag fails the deploy before it ever reaches production. I've written up the full configuration options and CI integration at Next.js SEO if you want to go deeper.

What I Learned

  • Rendering strategy is an SEO decision, not just a performance one. If a page needs to rank, it needs SSG or SSR. CSR pages are invisible on wave one.
  • The Next.js Metadata API is not optional. <head> tags managed by useEffect or third-party libraries inside client components will not be in the initial HTML consistently.
  • Structured data is underused by developers and rewarded by Google. Twenty lines of JSON-LD can unlock rich results that improve CTR significantly.
  • Treat SEO like you treat type safety — catch regressions automatically, not manually after the fact.

Let's Talk

Why is SEO analysis for JavaScript websites different from traditional SEO? Is it the two-wave crawling, the hydration timing, the client-side routing — or something else that's bitten you specifically?

Drop your war story in the comments. I'm especially curious whether anyone has hit the wave-two delay in production and how long it actually took Google to render their JS-heavy pages.
