Two weeks ago I shared how I built a 10,000+ page SEO site with Next.js and PostgreSQL. Today I'm back with the results.
TL;DR: 33,467 impressions, 70 clicks, pages ranking on page 1 of Google — all within 14 days of deploying to Vercel. No paid ads. No backlinks. Just programmatic SEO done right.
Here's exactly what worked, what didn't, and what I'd do differently.
## The Numbers (first 14 days)
| Metric | Value |
|---|---|
| Total impressions | 33,467 |
| Total clicks | 70 |
| Average CTR | 0.21% |
| Average position | 23.5 |
| Indexed pages | 10,000+ |
| Countries reached | 50+ |
| Peak daily clicks | 22 (day 12) |
The growth curve looked like this:
- Day 1: 0 impressions
- Day 2: 2 impressions
- Day 3: 132 impressions
- Day 5: 705 impressions
- Day 10: 4,507 impressions
- Day 11: 12,129 impressions (something clicked)
That jump from 4.5K to 12K impressions in one day? That's when Google finished crawling the bulk of my sitemap. More on that below.
## The Stack (Quick Recap)
- Next.js 16 (App Router) on Vercel
- PostgreSQL (Neon) + Prisma ORM
- ISR with 24-hour revalidation
- 300+ entities in the database, each generating 8+ pages
Every entity generates a main review page, a pricing breakdown, feature details, a compliance analysis, and an alternatives page. On top of that come head-to-head comparison pages, category ranking pages, country-specific pages, glossary terms, and educational guides.
Total: 10,000+ unique URLs, all from one database.
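The fan-out from entities to routes can be sketched like this — a simplified illustration, not my exact code (the sub-page list and param shape are assumptions):

```typescript
// Sketch: fan one entity out into its sub-page routes.
// SUB_PAGES is illustrative; "" stands for the main review page.
const SUB_PAGES = ["", "pricing", "features", "compliance", "alternatives"] as const;

type RouteParam = { slug: string; subpage: string };

export function entityRoutes(slugs: string[]): RouteParam[] {
  return slugs.flatMap((slug) =>
    SUB_PAGES.map((subpage) => ({ slug, subpage }))
  );
}

// In the App Router, something like this would feed generateStaticParams():
// export async function generateStaticParams() {
//   const products = await prisma.product.findMany({ select: { slug: true } });
//   return entityRoutes(products.map((p) => p.slug));
// }
```

With 300+ entities and five routes each, this single mapping already accounts for 1,500+ URLs before comparisons and rankings are added.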
## What Actually Moved the Needle

### 1. Sitemap + IndexNow = Fast Crawling
The single most impactful thing I did was submit a comprehensive sitemap on day 1 and ping all IndexNow endpoints.
```ts
// Sitemap generated dynamically from the DB
const [products, rankings, comparisons, articles, glossary] =
  await Promise.all([
    prisma.product.findMany({ where: { isActive: true } }),
    prisma.ranking.findMany(),
    prisma.comparison.findMany(),
    prisma.article.findMany({ where: { publishedAt: { not: null } } }),
    // ... more queries
  ])
```
I submitted all 10K URLs to Bing, Yandex, Naver, and Seznam via IndexNow on launch day. Google doesn't participate in IndexNow, but the participating engines share submitted URLs with each other, and the day-1 sitemap submission covered Google.
```ts
const INDEXNOW_ENDPOINTS = [
  { name: "Bing", url: "https://www.bing.com/indexnow" },
  { name: "Yandex", url: "https://yandex.com/indexnow" },
  { name: "Naver", url: "https://searchadvisor.naver.com/indexnow" },
  { name: "Seznam", url: "https://search.seznam.cz/indexnow" },
]

// Submit in batches of up to 10,000 URLs (the IndexNow per-request limit)
const payload = {
  host: DOMAIN,
  key: INDEXNOW_KEY,
  keyLocation: `https://${DOMAIN}/${INDEXNOW_KEY}.txt`,
  urlList: urls,
}

for (const endpoint of INDEXNOW_ENDPOINTS) {
  await fetch(endpoint.url, {
    method: "POST",
    headers: { "Content-Type": "application/json; charset=utf-8" },
    body: JSON.stringify(payload),
  })
}
```
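Since IndexNow caps a single submission at 10,000 URLs, a small chunking helper keeps each POST under the limit. A minimal sketch (`chunk` is my own helper, not part of any API):

```typescript
// Split a URL list into batches of at most `size` elements,
// so each IndexNow submission stays within the 10,000-URL limit.
export function chunk<T>(items: T[], size: number): T[][] {
  const batches: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

// Usage: submit each batch as its own payload
// for (const batch of chunk(allUrls, 10_000)) {
//   await submitToIndexNow(batch);
// }
```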
Result: Google started crawling within hours. By day 3, hundreds of pages were indexed.
### 2. Structured Data on Every Page
Every page type has its own schema.org markup:
- Review pages: `Review` + `AggregateRating` + `FAQPage`
- Rankings: `ItemList` + `Article`
- Comparisons: `Article` + `FAQPage`
- Data pages: `Dataset` + `Table`
- All pages: `BreadcrumbList`
```ts
const reviewSchema = {
  "@context": "https://schema.org",
  "@type": "Review",
  itemReviewed: {
    "@type": "Product",
    name: product.name,
  },
  reviewRating: {
    "@type": "Rating",
    ratingValue: product.rating,
    bestRating: 5,
  },
  author: { "@type": "Organization", name: "MySite" },
}
```
Google Search Console flagged some issues early on (duplicate FAQPage schemas, missing breadcrumb items) — I fixed them within the first week. Monitor GSC daily in the early days.
### 3. Programmatic but Not Thin
This is where most programmatic SEO projects fail. They generate thousands of pages with template text and zero unique value.
Every page on my site has:
- Real data — actual pricing, features, compliance status from the DB
- Computed comparisons — "pricing is 40% lower than the industry average"
- Dynamic FAQs — 15 questions per entity, each using real data points
- Contextual internal links — related rankings, comparisons, glossary terms
```ts
// FAQ example — uses actual data, not filler text
{
  question: `What are ${product.name}'s fees?`,
  answer:
    `${product.name} charges ` +
    `${pricing.base ? `a base fee of $${pricing.base}` : "no base fee"}` +
    `${pricing.premium ? ` plus $${pricing.premium} premium` : ", with no additional premium"}.`,
}
```
The key insight: every page must answer a question that no other page on the internet answers in exactly that way. A pricing page with actual numbers is valuable. The same template with placeholder text is spam.
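Lines like "pricing is 40% lower than the industry average" come from a simple computation over the category data. Roughly like this (a sketch; field and function names are illustrative):

```typescript
// Compare a product's fee to the category average and produce
// copy like "40% lower than the industry average".
export function vsAverage(fee: number, allFees: number[]): string {
  const avg = allFees.reduce((sum, f) => sum + f, 0) / allFees.length;
  const diff = Math.round(((avg - fee) / avg) * 100);
  if (diff === 0) return "in line with the industry average";
  return diff > 0
    ? `${diff}% lower than the industry average`
    : `${-diff}% higher than the industry average`;
}
```

Because the sentence is computed from live data, it stays accurate when the DB updates and differs on every page — exactly the opposite of template filler.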
### 4. URL Architecture Matters
My URL structure creates natural topical clusters:
```text
/product/acme                → main review
/product/acme/pricing        → pricing breakdown
/product/acme/features       → feature details
/product/acme/compliance     → regulatory status
/product/acme/alternatives   → competitors
/compare/acme-vs-globex      → head-to-head
/rankings/best-in-category   → category ranking
```
Each cluster interlinks heavily. The main review links to all sub-pages. Sub-pages link back and to related rankings. Rankings link to individual product pages.
Google understands this as topical authority. When one page in a cluster ranks, it lifts the others.
## What the Data Tells Me

### Query Types That Ranked Fastest
| Query pattern | Position | Page Type |
|---|---|---|
| "Brand A vs Brand B" | 2–8 | Compare pages |
| "Brand + pricing/compliance" | 5–12 | Sub-pages |
| Country-specific ("best X in Estonia") | 6–9 | Country pages |
| Niche feature rankings | 8–16 | Ranking pages |
Comparison pages ranked fastest. Long-tail, low competition, high intent. If you're building a comparison site, these are your quick wins.
### Traffic Split
| Device | Clicks | Impressions |
|---|---|---|
| Desktop | 39 (56%) | 28,254 (84%) |
| Mobile | 29 (41%) | 5,160 (15%) |
Desktop dominates impressions for B2B/comparison queries. But mobile CTR is about 4x higher (0.56% vs 0.14%) — mobile users who find you are more likely to click.
Traffic came from 50+ countries, with US leading (12K impressions) followed by UK, Southeast Asia, and India.
## Mistakes I Made

### 1. Blocking /_next/ in robots.txt
I added /_next/ to my robots.txt disallow list, thinking it was just build artifacts. Wrong. Google needs access to JS and CSS bundles to render pages properly. This may have slowed initial indexing.
Fix: Remove /_next/ from disallow. Only block truly private routes like /api/ and redirect handlers.
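The corrected robots.txt looks roughly like this (a sketch; swap in your own domain and private routes):

```text
User-agent: *
Disallow: /api/
# Note: /_next/ is NOT disallowed — Google needs the JS/CSS bundles
# to render pages properly.

Sitemap: https://yourdomain.com/sitemap.xml
```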
### 2. Duplicate Structured Data
My FAQSection React component rendered its own JSON-LD schema internally. Several pages also rendered FAQ schema inline in the server component. Result: duplicate FAQPage warnings across thousands of pages in GSC.
```tsx
// The component had renderSchema={true} by default
<FAQSection faqs={faqs} />

// Pages that already had inline schema needed:
<FAQSection faqs={faqs} renderSchema={false} />
```
Lesson: When you have a component that renders structured data, make sure the parent page doesn't render the same schema. Sounds obvious — but at scale, it's easy to miss.
### 3. Breadcrumb Without URL
My breadcrumb component allowed intermediate crumbs without an href. Google requires the item (URL) field for every breadcrumb element except the last one.
```tsx
// Bug: "Category" has no href but isn't the last item
crumbs={[
  { label: "Home", href: "/" },
  { label: "Glossary", href: "/glossary" },
  { label: "Category" },      // missing href!
  { label: "Current Page" },  // last item — OK without href
]}
```
This caused "Missing field item" errors across hundreds of pages.
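My fix was to make the schema generator enforce the rule itself: every crumb except the last must carry an `item` URL, and crumbs that can't are dropped from the markup. A sketch (names and types are illustrative):

```typescript
type Crumb = { label: string; href?: string };

type ListItem = {
  "@type": "ListItem";
  position: number;
  name: string;
  item?: string;
};

// Build a BreadcrumbList schema. Google requires `item` (the URL) on
// every element except the last, so intermediate crumbs without an
// href are dropped rather than emitted invalid.
export function breadcrumbSchema(crumbs: Crumb[], origin: string) {
  const valid = crumbs.filter(
    (c, i) => c.href !== undefined || i === crumbs.length - 1
  );
  const itemListElement: ListItem[] = valid.map((c, i) => ({
    "@type": "ListItem",
    position: i + 1,
    name: c.label,
    ...(c.href ? { item: origin + c.href } : {}),
  }));
  return {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement,
  };
}
```

With the buggy crumbs from above, "Category" is silently omitted instead of producing a "Missing field item" error.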
### 4. Not Monitoring GSC from Day 1
I waited a week before checking Google Search Console. By then I'd accumulated structured data errors across thousands of pages. Set up GSC monitoring before you launch.
## The Technical Bits

### ISR Configuration
Every page uses Incremental Static Regeneration with a 24-hour window:
```ts
export const revalidate = 86400 // 24h
```
This means pages are static (fast, cacheable) but refresh daily with fresh data. Google sees fast load times AND fresh content.
### Meta Description Templates That Work
Generic descriptions kill CTR. I optimized every template to include specific data:
```ts
// Before (generic)
`Compare ${a.name} and ${b.name} on pricing, features and more.`

// After (specific, with ratings)
`${a.name} vs ${b.name} — which is better in ${year}? ` +
  `Pricing, features, compliance compared side by side. ` +
  `Scores: ${a.name} ${ratingA}/5 vs ${b.name} ${ratingB}/5.`
```
The "after" version includes the year (freshness signal), a question (matches search intent), and ratings (rich snippet potential).
### Internal Linking at Scale
Every page has contextual internal links generated from the database:
```ts
// Sub-pages link to relevant category rankings
const RELATED_RANKINGS = {
  pricing: [
    { href: "/rankings/lowest-cost", label: "Lowest Cost" },
    { href: "/rankings/best-free-tier", label: "Best Free Tier" },
  ],
  features: [
    { href: "/rankings/most-features", label: "Most Features" },
    { href: "/rankings/best-api", label: "Best API" },
  ],
}
```
Plus glossary term auto-linking in long-form content, related comparisons, and "explore more" sections. Every internal link is contextual — never random.
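Glossary auto-linking is essentially a first-occurrence replacement over the rendered text. A simplified sketch (the real version would also need to escape regex metacharacters in terms and skip matches inside existing anchors or headings):

```typescript
// Link the first occurrence of each glossary term in an HTML string.
// Simplified: assumes terms contain no regex metacharacters and the
// input has no existing <a> tags that could be double-linked.
export function autoLink(
  html: string,
  terms: { term: string; slug: string }[]
): string {
  let out = html;
  for (const { term, slug } of terms) {
    const re = new RegExp(`\\b(${term})\\b`, "i");
    out = out.replace(re, `<a href="/glossary/${slug}">$1</a>`);
  }
  return out;
}
```

Linking only the first occurrence keeps pages readable; a wall of repeated links to the same glossary entry helps neither users nor crawlers.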
## What's Next
At 2 weeks in, the site is in Google's "sandbox" — new domains get limited trust. Based on what I've seen from others doing programmatic SEO:
- Month 1–2: Impressions grow as more pages get indexed
- Month 2–3: Positions start improving as Google builds trust
- Month 3–6: Organic traffic hockey stick (if content is genuinely useful)
I'm focusing on:
- Building backlinks through original datasets and data visualizations
- Monitoring which page types rank fastest and doubling down
- Fixing any new GSC issues within 24 hours
- Optimizing meta descriptions for pages that have impressions but low CTR
## Key Takeaways
Programmatic SEO works if every page has unique value. Template text across 10K pages = spam. Real data across 10K pages = authority.
Submit your sitemap + IndexNow on day 1. Don't wait for Google to discover you. Hit all four IndexNow endpoints.
Structured data is not optional. Review, FAQ, Dataset, Breadcrumb — use them all. And test with the GSC Rich Results tool.
Comparison pages rank fastest for new sites. Long-tail, low competition, high conversion intent.
Monitor GSC daily. Structured data errors, crawl issues, and indexing problems compound across thousands of pages. One bug = thousands of affected URLs.
33K impressions in 2 weeks is just the beginning. The real game is turning impressions into clicks (better meta descriptions) and clicks into trust (time on site, return visits).
I'll post a month-3 update with traffic numbers once the sandbox period ends. If you're building something similar, I'm happy to answer questions in the comments.
Built with Next.js, PostgreSQL, and Prisma. Live at brokerrank.net/data/average-spreads.