Disclosure: Power SEO Sitemap is built by our team at CyberCraft Bangladesh. This article reflects our hands-on experience — where next-sitemap is the better choice, I've said so directly.
I spent two weeks staring at "Discovered — currently not indexed" in Google Search Console. The pages were live. Lighthouse passed. The sitemap file existed and looked fine when I opened it in the browser.
Then I actually counted the URLs.
My app had 47 routes. The sitemap had 31. Sixteen pages had completely vanished — no error, no build warning, nothing. Just gone.
Every missing page was inside an App Router route group.
## The bug: next-sitemap and App Router route groups
next-sitemap is the most downloaded Next.js sitemap package — around 416,000 weekly downloads. It's been the standard for years, and for Pages Router projects it still works well.
But its architecture is a postbuild script that crawls your built .next/ output and writes static XML files into public/. That model was designed before App Router existed.
GitHub issue #700, open since August 2023, documents the problem: pages inside route groups like (marketing), (shop), or (auth) don't appear in the generated sitemap. Route groups are a standard App Router pattern — you use parentheses to organize layouts without affecting the actual URL.
The crawler sees the folder name with parentheses, strips them to construct the URL (correct behavior), but loses track of the file in the process. End result: those pages silently disappear from your sitemap.
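To make the failure mode concrete, here's a hypothetical sketch of the mapping a filesystem crawler has to get right (the function and its name are mine for illustration, not next-sitemap's actual code): parenthesized route-group segments must be dropped from the URL while the underlying file is still tracked.

```typescript
// Hypothetical sketch — not next-sitemap's actual code.
// App Router drops parenthesized route-group segments when building the URL,
// so a crawler inferring URLs from the filesystem must do the same *and*
// still keep track of the underlying page file.
function filePathToUrl(filePath: string): string {
  const segments = filePath
    .split('/')
    .filter((seg) => !(seg.startsWith('(') && seg.endsWith(')'))) // strip route groups
    .filter((seg) => !/^page\.(tsx|ts|jsx|js)$/.test(seg)); // leaf file isn't part of the URL
  return '/' + segments.join('/');
}

console.log(filePathToUrl('(marketing)/pricing/page.tsx')); // "/pricing"
console.log(filePathToUrl('about/page.tsx')); // "/about"
```

Stripping the parentheses is the easy half; the bug is that next-sitemap loses the file on the way, so the URL never makes it into the output at all.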
Here's a quick check to see if you're affected:
```shell
# Count the <loc> entries in your generated sitemap
grep -c "<loc>" public/sitemap.xml

# If the number is lower than your actual page count
# and you're using route groups — you've hit this bug
```
No error. No warning. You find out weeks later in Search Console.
## Why it's hard to fix
The root issue is architectural. next-sitemap uses Node.js fs and path APIs to crawl your built output at build time. It doesn't use Next.js internals to resolve routes — it infers them from the filesystem. Route group resolution is tightly coupled to how Next.js handles it internally, and next-sitemap doesn't have access to that.
The last published version of next-sitemap is 4.2.3, released in mid-2023, and the package has seen little activity since. App Router shipped as stable in Next.js 13.4 around the same time. That timing explains the bug: the crawler's architecture predates the conventions it now has to parse.
## The fix: generate the sitemap in a Route Handler
The cleanest solution is to stop using a build-time crawler entirely. Instead, generate the sitemap dynamically via a Next.js Route Handler and build your URL list directly from your database or CMS.
Route group folder names become completely irrelevant because you're never inferring URLs from the filesystem. You're constructing them explicitly.
Here's a minimal version:
```typescript
// app/sitemap.xml/route.ts
export async function GET() {
  const hostname = 'https://yoursite.com';

  const xml = `<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>${hostname}/</loc>
    <changefreq>daily</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>${hostname}/about</loc>
    <changefreq>monthly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>`;

  return new Response(xml, {
    headers: { 'Content-Type': 'application/xml' },
  });
}
```
This works for small static sites. For larger projects — dynamic routes, product catalogs, image extensions — you'll want a library to handle XML escaping, URL validation, and type safety.
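XML escaping alone is easy to get wrong by hand. As a sketch of the minimum you'd need (the `escapeXml` helper is my own illustration, not from any package): a single unescaped `&` in a query string makes the whole file invalid XML.

```typescript
// Minimal XML escaping — the kind of detail a hand-rolled sitemap gets wrong.
// An unescaped "&" in a URL (e.g. "?page=2&sort=asc") invalidates the file.
function escapeXml(value: string): string {
  return value
    .replace(/&/g, '&amp;') // must run first, or it re-escapes the others
    .replace(/</g, '&lt;')
    .replace(/>/g, '&gt;')
    .replace(/"/g, '&quot;')
    .replace(/'/g, '&apos;');
}

console.log(escapeXml('https://yoursite.com/products?page=2&sort=asc'));
// https://yoursite.com/products?page=2&amp;sort=asc
```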
## Adding dynamic URLs and image sitemaps
For a product catalog, I switched to @power-seo/sitemap. Zero Node.js-specific dependencies (so it runs on edge runtimes), TypeScript-first, and supports image, video, and news sitemap extensions.
```typescript
// app/sitemap.xml/route.ts
import { generateSitemap, validateSitemapUrl } from '@power-seo/sitemap';
import { db } from '@/lib/db';

export const dynamic = 'force-dynamic';

export async function GET() {
  const products = await db.product.findMany({
    select: { slug: true, updatedAt: true, imageUrl: true, name: true },
  });

  const urls = [
    { loc: '/', changefreq: 'daily' as const, priority: 1.0 },
    { loc: '/products', changefreq: 'daily' as const, priority: 0.9 },
    ...products.map((p) => ({
      loc: `/products/${p.slug}`,
      lastmod: p.updatedAt.toISOString(),
      changefreq: 'weekly' as const,
      priority: 0.8,
      images: [{ loc: p.imageUrl, caption: p.name, title: p.name }],
    })),
  ];

  const invalid = urls.filter((url) => !validateSitemapUrl(url).valid);
  if (invalid.length > 0) {
    console.warn(`${invalid.length} invalid sitemap URLs detected`);
  }

  const xml = generateSitemap({ hostname: 'https://yoursite.com', urls });

  return new Response(xml, {
    headers: {
      'Content-Type': 'application/xml',
      'Cache-Control': 'public, max-age=3600, stale-while-revalidate=86400',
    },
  });
}
```
A few things worth noting:
**The image extensions matter.** Adding `<image:image>` tags is how Google Images picks up your product photos alongside regular search results. next-sitemap doesn't support this in `additionalPaths`.

**Runtime data means a fresh sitemap.** A product added at 2pm appears the next time Google crawls /sitemap.xml — no rebuild required. With next-sitemap, the sitemap is frozen at build time.

**`validateSitemapUrl()` catches bad data before you serve it.** One malformed URL can cause Google to reject the entire file.
## Handling large catalogs
The sitemap spec caps a single file at 50,000 URLs. For larger sites, `splitSitemap()` handles chunking and index generation automatically:
```typescript
import { splitSitemap } from '@power-seo/sitemap';
import { writeFileSync, mkdirSync } from 'fs';

async function main() {
  const allUrls = await fetchAllUrls(); // your full URL list

  const { index, sitemaps } = splitSitemap(
    { hostname: 'https://yoursite.com', urls: allUrls },
    '/sitemaps/chunk-{index}.xml'
  );

  mkdirSync('./public/sitemaps', { recursive: true });
  writeFileSync('./public/sitemap.xml', index);
  sitemaps.forEach(({ filename, xml }) => {
    writeFileSync(`./public${filename}`, xml);
  });
}

main();
```
For memory-constrained environments, `streamSitemap()` yields XML chunks one `<url>` at a time instead of building the full string in memory. On a 50,000-URL catalog, memory stays around 8MB instead of peaking at ~45MB. (Tested on Node.js 20, M2 MacBook Pro.)
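The streaming idea itself is simple. Here's a generic sketch of the pattern (my own code illustrating the technique, not the library's implementation): an async generator yields the sitemap piece by piece, so only one chunk lives in memory at a time.

```typescript
// Generic sketch of streaming sitemap generation — illustrative only, not
// @power-seo/sitemap's implementation. One <url> chunk is in memory at a time.
type SitemapUrl = { loc: string; lastmod?: string };

async function* sitemapChunks(
  hostname: string,
  urls: Iterable<SitemapUrl> | AsyncIterable<SitemapUrl>
): AsyncGenerator<string> {
  yield '<?xml version="1.0" encoding="UTF-8"?>\n<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n';
  for await (const u of urls) {
    const lastmod = u.lastmod ? `<lastmod>${u.lastmod}</lastmod>` : '';
    yield `  <url><loc>${hostname}${u.loc}</loc>${lastmod}</url>\n`;
  }
  yield '</urlset>\n';
}

// Collect the chunks; a Route Handler could instead enqueue each one
// into a ReadableStream and return it as the Response body.
async function main() {
  let xml = '';
  for await (const chunk of sitemapChunks('https://yoursite.com', [
    { loc: '/' },
    { loc: '/about' },
  ])) {
    xml += chunk;
  }
  console.log(xml);
}
main();
```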
## Replacing robots.txt
One thing next-sitemap does that @power-seo/sitemap intentionally doesn't: robots.txt generation. The clean replacement is Next.js's native `app/robots.ts` convention:
```typescript
// app/robots.ts
import type { MetadataRoute } from 'next';

export default function robots(): MetadataRoute.Robots {
  return {
    rules: { userAgent: '*', allow: '/' },
    sitemap: 'https://yoursite.com/sitemap.xml',
  };
}
```
Eight lines. No extra package. Native Next.js.
## When to stick with next-sitemap
I want to be fair here. next-sitemap is still the right choice if:
- You're on Pages Router with no App Router migration planned
- The site is mostly static and content rarely changes
- You want robots.txt handled automatically without any extra code
- Your team needs the simplest possible onboarding
The 416,000 weekly downloads and years of Stack Overflow answers represent a real community safety net. That matters when you need answers fast.
The route group bug only affects App Router projects. If you're on Pages Router, it doesn't apply.
## What I actually learned from this
**Silent failures are the most dangerous kind.** A build error stops you immediately. A silent omission doesn't. Always verify your sitemap URL count against your actual route count — not just that the file exists.

**Build-time sitemap generation ages poorly for content that changes frequently.** Products, blog posts, user profiles — these need a sitemap that reflects the current state, not the state at the last deployment.

**Route groups are too central to modern Next.js to work around.** If your sitemap tool can't handle them, it's not compatible with how App Router projects are structured.
The source for Power SEO Sitemap is at GitHub Power SEO Repo if you want to look at the implementation.
Is your sitemap actually showing all your pages? If you're on App Router with route groups, it's worth a quick check right now: run `grep -c "<loc>" public/sitemap.xml` after your next build and compare the count to your actual route count. The numbers might surprise you.
Drop your findings in the comments. And if anyone's found a working workaround for the next-sitemap route group bug that doesn't require maintaining a URL list manually — I'd genuinely like to know.