Disclosure: Power SEO Meta is built by our team at CyberCraft Bangladesh. This comparison reflects our hands-on experience with both libraries — we've tried to be fair, but you should know where we're coming from.
Three weeks after deploying a 60-page product catalog, a client's marketing team messaged me: "Why are our product images not showing when we share links on LinkedIn?"
I opened Google Search Console. Twenty-three out of sixty product pages had max-image-preview missing from their robots directives. Six draft pages were indexed when they should have been noindex. Zero build errors. Zero runtime warnings. Everything looked perfect in the code.
That was the moment I stopped trusting raw metadata strings and started thinking seriously about how App Router changed the SEO game — and which tools have actually caught up.
## Why Next.js App Router broke the old metadata approach
Before Next.js 13's App Router, the standard move was <NextSeo> — drop it into your component, pass props, done. It works beautifully on the Pages Router. Genuinely great DX.
But <NextSeo> is a client-side React component — it injects tags at runtime through next/head, which doesn't work inside React Server Components.
The only way to use it on App Router is to add 'use client' to your page — which ships unnecessary JavaScript to the browser for something that should be pure server logic. Or you skip it entirely and write raw Metadata objects by hand.
That's the architectural mismatch. App Router introduced generateMetadata() — a server function that returns a Metadata object at request time, with zero client JS. next-seo has no equivalent.
So if you're on App Router and typing import { NextSeo } from 'next-seo', you're either working around the framework or ignoring it entirely.
## The metadata drift problem (and how to fix it)
Here's the scenario I see constantly: a product catalog with 50+ pages, each needing its own title, description, canonical URL, and Open Graph image. Without a shared utility, the robots logic drifts. One page has max-image-preview:large, another doesn't, a third has a missing canonical.
This is how I structure it now using @power-seo/meta:
```typescript
// lib/seo/product-metadata.ts
import { createMetadata } from '@power-seo/meta';
import type { Metadata } from 'next';

interface Product {
  name: string;
  description: string;
  slug: string;
  imageUrl: string;
  inStock: boolean;
}

// One function = one source of truth for all 50 product pages.
// Change robots logic here → every page updates immediately.
export function buildProductMetadata(product: Product): Metadata {
  return createMetadata({
    title: `${product.name} | Shop`,
    description: product.description.slice(0, 160),
    canonical: `https://shop.example.com/products/${product.slug}`,
    openGraph: {
      type: 'website',
      title: `${product.name} | Shop`,
      description: product.description.slice(0, 160),
      images: [{ url: product.imageUrl, width: 1200, height: 630, alt: product.name }],
    },
    robots: {
      index: product.inStock, // out-of-stock pages become noindex automatically
      follow: true,
      maxImagePreview: 'large', // typed union: 'none' | 'standard' | 'large'
    },
  });
}
```
```tsx
// app/products/[slug]/page.tsx
import type { Metadata } from 'next';
import { getProduct } from '@/lib/products';
import { buildProductMetadata } from '@/lib/seo/product-metadata';

export async function generateMetadata(
  { params }: { params: Promise<{ slug: string }> }
): Promise<Metadata> {
  const { slug } = await params;
  const product = await getProduct(slug);
  return buildProductMetadata(product); // one line per page file
}

export default function ProductPage() {
  return <div>{/* product content */}</div>;
}
```
One shared utility. One line per page file. Updating robots logic across all 50 product pages means editing one function. I've shipped this pattern on three client projects and the metadata drift issues haven't come back.
Compare this to the next-seo equivalent on App Router — there isn't one. Each page writes its own Metadata object from scratch: 20+ repeated lines with openGraph.title, openGraph.description, and the full robots object rebuilt every time. That's where the drift comes from.
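To make the drift concrete, here's a sketch of what those hand-written objects look like side by side. The `Metadata` type below is a simplified stand-in for Next.js's real type, and the page objects are invented for illustration — the point is that nothing flags the second page's missing directive:

```typescript
// Simplified stand-in for Next.js's Metadata type (illustration only).
type Metadata = {
  title: string;
  description: string;
  alternates?: { canonical?: string };
  openGraph?: { type?: string; title?: string; description?: string };
  robots?: { index: boolean; follow: boolean; 'max-image-preview'?: string };
};

// Page A: hand-written, complete.
const pageA: Metadata = {
  title: 'Widget A | Shop',
  description: 'A fine widget.',
  alternates: { canonical: 'https://shop.example.com/products/widget-a' },
  openGraph: { type: 'website', title: 'Widget A | Shop', description: 'A fine widget.' },
  robots: { index: true, follow: true, 'max-image-preview': 'large' },
};

// Page B: copy-pasted months later — the robots directive quietly dropped.
const pageB: Metadata = {
  title: 'Widget B | Shop',
  description: 'Another widget.',
  alternates: { canonical: 'https://shop.example.com/products/widget-b' },
  openGraph: { type: 'website', title: 'Widget B | Shop', description: 'Another widget.' },
  robots: { index: true, follow: true }, // max-image-preview missing — no error, no warning
};
```

Both objects type-check. Only one of them does what the SEO team expects.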
## The silent robots directive bug TypeScript can catch
Here's the bug from the client situation I described above. This is a real robots string I found in production:
```tsx
// next-seo approach on Pages Router
<NextSeo
  additionalMetaTags={[{
    name: 'robots',
    // TypeScript cannot validate this string.
    // A typo here is a silent SEO failure.
    content: 'index, follow, max-image-preveiw:large, max-snippet:160',
    //                       ^^^^^^^^ "preveiw" — two letters transposed
  }]}
/>
```
Spot it? preveiw instead of preview. The build passes. The page renders. Google silently ignores the directive. You find out three weeks later in Search Console.
There's a second issue too: next-seo's noindex prop emits <meta name="robots" content="noindex">. Then the additionalMetaTags robots entry emits another <meta name="robots"> tag. Google's behavior with duplicate robots tags is not guaranteed. I've seen this cause real indexing inconsistencies on two separate client projects.
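To see why that's a problem, here's a deliberately simplified model of the emission behavior — this is not next-seo's actual source, just the shape of what ends up in the document head when both the `noindex` prop and a robots entry in `additionalMetaTags` are set:

```typescript
// Simplified model (assumption, not library code): the noindex prop and an
// additionalMetaTags robots entry each emit their own <meta name="robots"> tag.
interface SeoProps {
  noindex?: boolean;
  additionalMetaTags?: { name: string; content: string }[];
}

function emittedRobotsTags(props: SeoProps): string[] {
  const tags: string[] = [];
  if (props.noindex) tags.push('<meta name="robots" content="noindex">');
  for (const t of props.additionalMetaTags ?? []) {
    if (t.name === 'robots') tags.push(`<meta name="robots" content="${t.content}">`);
  }
  return tags;
}

const tags = emittedRobotsTags({
  noindex: true,
  additionalMetaTags: [{ name: 'robots', content: 'index, follow, max-image-preview:large' }],
});
// Two conflicting robots tags on the same page — one says noindex, one says index.
console.log(tags.length); // 2
```

Which of the two Google honors is undefined, which is exactly the inconsistency I saw in Search Console.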
Here's the typed approach:
```typescript
// app/products/[slug]/page.tsx
import { createMetadata } from '@power-seo/meta';
import type { Metadata } from 'next';
import { getProduct } from '@/lib/products';

export async function generateMetadata(
  { params }: { params: Promise<{ slug: string }> }
): Promise<Metadata> {
  const { slug } = await params;
  const product = await getProduct(slug);
  return createMetadata({
    title: product.title,
    description: product.description,
    canonical: `https://example.com/products/${slug}`,
    robots: {
      index: !product.isDraft, // boolean, not a string
      follow: true,
      maxImagePreview: 'large', // TypeScript error if you write 'Large'
      maxSnippet: 160,
      unavailableAfter: product.expiresAt, // auto-serialised to unavailable_after
    },
  });
}
```
Two specific things worth calling out:

- If you write `maxImagePreview: 'Large'` (capital L), TypeScript catches it at compile time. It never reaches production.
- `unavailableAfter` is automatically serialised to the correct `unavailable_after` snake_case format. Build that string manually and a missing underscore silently makes Google ignore the directive entirely.
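For intuition, here's roughly what a typed robots builder does under the hood — an assumption-laden sketch, not @power-seo/meta's actual source. The union type rejects typos at compile time, and the serialiser owns the snake_case spelling in exactly one place:

```typescript
// Sketch of a typed robots serialiser (illustrative, not the library's code).
type RobotsOptions = {
  index: boolean;
  follow: boolean;
  maxImagePreview?: 'none' | 'standard' | 'large'; // typos are compile errors
  maxSnippet?: number;
  unavailableAfter?: string;
};

function serializeRobots(opts: RobotsOptions): string {
  const parts = [opts.index ? 'index' : 'noindex', opts.follow ? 'follow' : 'nofollow'];
  if (opts.maxImagePreview) parts.push(`max-image-preview:${opts.maxImagePreview}`);
  if (opts.maxSnippet !== undefined) parts.push(`max-snippet:${opts.maxSnippet}`);
  // The snake_case spelling lives here and only here — no per-page typo possible.
  if (opts.unavailableAfter) parts.push(`unavailable_after: ${opts.unavailableAfter}`);
  return parts.join(', ');
}

serializeRobots({ index: true, follow: true, maxImagePreview: 'large', maxSnippet: 160 });
// → 'index, follow, max-image-preview:large, max-snippet:160'
```

Compare that to hand-writing the string on fifty pages: the serialiser can't transpose "preveiw", and neither can anyone on the team.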
## One gotcha that catches everyone during migration
If you're switching from next-seo to @power-seo/meta, there's one thing that will catch you:
Open Graph title and description do not auto-fallback.
In next-seo, openGraph inherits title and description from top-level props automatically. In @power-seo/meta, you must set openGraph.title and openGraph.description explicitly. Miss this and your OG tags will be empty. It's the most common mistake I've seen in migration.
```typescript
// Wrong — OG tags will be empty
return createMetadata({
  title: post.title,
  description: post.excerpt,
  openGraph: { type: 'article' }, // missing title and description
});
```

```typescript
// Correct — set them explicitly
return createMetadata({
  title: post.title,
  description: post.excerpt,
  openGraph: {
    type: 'article',
    title: post.title, // required — no auto-fallback
    description: post.excerpt, // required — no auto-fallback
  },
});
```
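If your codebase migrates many pages at once, one way to make the fallback impossible to forget is a tiny wrapper applied before the library call. This helper is hypothetical — it is not part of @power-seo/meta — and works on plain input objects, so it's easy to unit-test:

```typescript
// Hypothetical migration helper (not part of @power-seo/meta): copy top-level
// title/description into openGraph unless they were set explicitly.
type OgInput = {
  title: string;
  description: string;
  openGraph?: { type?: string; title?: string; description?: string };
};

function withOgFallback(input: OgInput): OgInput {
  return {
    ...input,
    openGraph: {
      ...input.openGraph,
      title: input.openGraph?.title ?? input.title,
      description: input.openGraph?.description ?? input.description,
    },
  };
}

// Usage: createMetadata(withOgFallback({ title, description, openGraph: { type: 'article' } }))
const meta = withOgFallback({ title: 'Post', description: 'Excerpt', openGraph: { type: 'article' } });
// meta.openGraph.title === 'Post', meta.openGraph.description === 'Excerpt'
```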
## What I learned
- App Router and client components are fundamentally mismatched for metadata. A server function returning a `Metadata` object works with the framework. A component that injects tags at runtime works around it.
- Raw string props in SEO libraries are a liability. The typo you can't see is the bug you won't find until it's in Search Console three weeks later.
- Metadata drift is a consistency problem that grows slowly and gets noticed late. Shared utility functions that return metadata are the fix — one place to update, zero pages to hunt down.
- Community size matters. next-seo's 7,500+ stars and 800K weekly downloads mean your problem has probably been solved somewhere. `@power-seo/meta` doesn't have that yet. For Pages Router projects, that community advantage is real and worth respecting.
If you want to explore the typed approach, the open-source repo is here: Power SEO
## FAQ
### Can I use next-seo on Next.js App Router?
You can, but only by marking your page 'use client' — which ships JavaScript to the browser for something that should be pure server logic. For App Router, generateMetadata() with a typed utility is the correct approach.
### What is generateMetadata() in Next.js?
It's a server function introduced in Next.js 13 App Router that returns a Metadata object at request time. It runs on the server only, adds zero client JavaScript, and is the recommended way to handle dynamic metadata in App Router projects.
### Why do duplicate robots meta tags cause SEO problems?
When two <meta name="robots"> tags appear on the same page, Google's behavior is not guaranteed. It may honor one, the other, or neither. The noindex + additionalMetaTags pattern in next-seo can produce this conflict on Pages Router.
### Is @power-seo/meta production-ready?
The library is small (~4.2 KB) and focused on a narrow problem. The GitHub star count (~180 as of April 2026) is low compared to next-seo, which means less community documentation. For Pages Router projects, next-seo remains the safer choice. For App Router projects, the architectural fit is better.
### How do I handle JSON-LD schema with @power-seo/meta?
JSON-LD is handled by a separate package: @power-seo/schema. It's a deliberate split — metadata and schema are different concerns. next-seo bundles both together.
## Still deciding?
The short version: if you're on Pages Router and have no migration planned, next-seo is still excellent — use it. If you're on App Router and writing 'use client' just to get meta tags working, that's the sign to reconsider.
What's your current setup? Are you on App Router using generateMetadata(), or still on Pages Router? And if you've hit the duplicate robots tag issue I described — curious whether you caught it before or after it showed up in Search Console. Drop a comment below.