DEV Community

vs7ironman
How I Got 15,000 Pages Indexed in 3 Weeks with Programmatic SEO (Next.js + Supabase)

I built SoftwareDuel — an independent software
comparison site for SMB buyers — and got 15,000 pages indexed by Google in
under 3 weeks without writing a single page manually.

Here's exactly how it works.

The idea

Most software review sites rank tools based on who pays for placement.
I wanted to build something independent — honest side-by-side comparisons
across HR, payroll, CRM, ATS, project management, and accounting tools.

The problem: building thousands of comparison pages manually would take years.

The solution: programmatic SEO.

What is programmatic SEO?

Programmatic SEO means using structured data to automatically generate
pages at scale. Instead of writing each page, you define a template and
let the data do the work.

The formula:

  • 1 tool in the database = 1 review page + ~20 comparison pages + 1 alternatives page
  • 120 tools × ~22 pages = 2,600+ same-category pages
  • Plus Google indexes cross-category pages = 15,000+ total indexed

The tech stack

  • Next.js 16 — dynamic routes handle everything
  • Supabase (Postgres) — stores all tool data
  • Vercel — free tier hosting
  • Cloudflare — DNS

Total monthly cost: $0 on free tiers.

How the database works

Each tool in Supabase has these fields:

CREATE TABLE tools (
  id uuid PRIMARY KEY DEFAULT gen_random_uuid(),
  name text NOT NULL,
  slug text UNIQUE NOT NULL, -- used for URLs and lookups
  category text NOT NULL,
  description text,
  pros text,
  cons text,
  best_for text,
  features text,
  pricing_model text,
  website_url text,
  affiliate_url text,
  updated_at timestamp DEFAULT now()
);

How Next.js generates the pages

Three dynamic routes handle everything:

/tools/[slug] — individual tool review
/compare/[slug] — where slug is tool-a-vs-tool-b
/alternatives/[slug] — alternatives to a specific tool

The compare page splits the slug on -vs- to get both tool slugs,
then fetches both from Supabase:

// "quickbooks-vs-xero" → ["quickbooks", "xero"]; rejoin the tail with
// '-vs-' so a tool slug containing the separator isn't mangled
const parts = slug.split('-vs-')
const slugA = parts[0]
const slugB = parts.slice(1).join('-vs-')

const [{ data: toolA }, { data: toolB }] = await Promise.all([
  supabase.from('tools').select('*').eq('slug', slugA).single(),
  supabase.from('tools').select('*').eq('slug', slugB).single(),
])
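The same pair-building logic can also feed static generation. Here's a minimal sketch of a helper that produces every same-category `a-vs-b` slug; `buildCompareSlugs` is a hypothetical name, not something from the actual codebase.

```typescript
type ToolRow = { slug: string; category: string };

// Hypothetical helper: given tool rows, build every ordered
// same-category "a-vs-b" slug for the /compare/[slug] route.
function buildCompareSlugs(tools: ToolRow[]): string[] {
  const slugs: string[] = [];
  for (const a of tools) {
    for (const b of tools) {
      if (a.slug !== b.slug && a.category === b.category) {
        slugs.push(`${a.slug}-vs-${b.slug}`);
      }
    }
  }
  return slugs;
}

// In app/compare/[slug]/page.tsx this could feed generateStaticParams:
// export async function generateStaticParams() {
//   const { data } = await supabase.from('tools').select('slug, category');
//   return buildCompareSlugs(data ?? []).map((slug) => ({ slug }));
// }
```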

The sitemap

The sitemap dynamically generates all URLs from the database:

export default async function sitemap() {
  const { data: tools } = await supabase
    .from('tools')
    .select('slug, category, updated_at')

  if (!tools) return []

  // Ordered pairs: both a-vs-b and b-vs-a get a page,
  // but only within the same category
  const comparePages = []
  for (let i = 0; i < tools.length; i++) {
    for (let j = 0; j < tools.length; j++) {
      if (i !== j && tools[i].category === tools[j].category) {
        comparePages.push({
          url: `https://www.softwareduel.com/compare/${tools[i].slug}-vs-${tools[j].slug}`,
          lastModified: new Date(tools[i].updated_at),
        })
      }
    }
  }
  // ... tool pages and alternatives pages
}

Key detail: same-category comparisons only. Cross-category comparisons
(Gusto vs Salesforce) are low quality and hurt crawl budget.

SEO optimizations that made a difference

Schema markup on every page:

  • FAQPage schema on tool, compare, and alternatives pages
  • BreadcrumbList schema on all pages
  • SoftwareApplication schema with dateModified on tool pages
  • ItemList schema on alternatives pages
  • Article schema on blog posts
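As one concrete example of the list above, FAQPage markup is just a JSON-LD object rendered into a `<script type="application/ld+json">` tag. This is a minimal sketch; `buildFaqSchema` and the `Faq` shape are hypothetical names, not the site's actual code.

```typescript
type Faq = { question: string; answer: string };

// Hypothetical helper: turn Q&A rows from the database into a
// schema.org FAQPage object, ready to serialize into the page head.
function buildFaqSchema(faqs: Faq[]) {
  return {
    "@context": "https://schema.org",
    "@type": "FAQPage",
    mainEntity: faqs.map((f) => ({
      "@type": "Question",
      name: f.question,
      acceptedAnswer: { "@type": "Answer", text: f.answer },
    })),
  };
}
```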

Self-referencing canonical tags on every dynamic page, pointing to that page's exact URL.

Metadata generated dynamically from database fields — unique title
and description for every page.
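A sketch of what that per-page metadata might look like as a pure function (the object shape matches what a Next.js `generateMetadata()` would return). The title template and `toolMetadata` name are illustrative assumptions, not the site's actual code.

```typescript
type Tool = { name: string; slug: string; description: string };

// Hypothetical helper: build unique metadata from database fields,
// including a self-referencing canonical URL for the tool page.
function toolMetadata(tool: Tool) {
  return {
    title: `${tool.name} Review: Pricing, Pros & Cons`,
    description: tool.description.slice(0, 155), // keep within snippet length
    alternates: {
      canonical: `https://www.softwareduel.com/tools/${tool.slug}`,
    },
  };
}
```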

Results after 3 weeks

  • 15,000+ pages indexed by Google
  • 12,800+ impressions in first 15 days
  • Average position: 33-35
  • 58 linking websites acquired organically
  • 11 affiliate programs approved

What I learned

1. Crawl budget matters more than I expected
Google doesn't index everything at once. It allocates crawl budget based
on site speed and quality signals. One slow deployment day caused a 5-day
crawl pause. Keep response times under 200ms.
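One way to keep responses that fast is Incremental Static Regeneration, so Googlebot hits a cached static page instead of a live database query. A sketch, assuming the Next.js app router (the article doesn't say whether this is what SoftwareDuel does):

```typescript
// app/compare/[slug]/page.tsx
// ISR: serve the cached page and regenerate it in the background
// at most once per day, so crawler requests stay fast and cheap.
export const revalidate = 86400; // seconds
```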

2. Same-category comparisons only
I initially generated all possible comparisons (tool A vs every other tool).
Google saw thousands of low-quality cross-category pages and deprioritized
the site. Restricting to same-category only immediately improved quality signals.

3. Schema pays off fast
Adding FAQ schema to 80+ pages got them appearing in Google's Enhancements
report within days. Review snippets and breadcrumbs followed. These increase
CTR when you do start ranking.

4. Affiliate programs are harder than expected
Many high-traffic tools (Jira, Trello, Salesforce, Asana, Monday.com) don't
have affiliate programs. Plan your tool list around monetizable tools first.

5. The Google crawl pause is real
At the 3-week mark Google paused crawling for 5 days. I initially panicked
but it recovered fully. This appears to be a standard new-site evaluation
period. Don't make major changes during it.

What's next

Still early days — impressions are growing but clicks are still low.
The next milestone is consistent affiliate conversions as rankings mature.

If you're building something similar I'm happy to answer questions in
the comments.


SoftwareDuel is live at softwareduel.com
