Adam Grenier
The SEO Fix Your Vibe‑Coded Vite/React Site Needs (Without Next.js)

If you’ve ever vibe‑coded your way to a “pretty good” SaaS or landing page, shipped it, and then checked Google Search Console… you’ve probably seen it:

Discovered – currently not indexed

I was reminded of that wall with SmilePlease, an AI tool that generates professional school portraits from everyday photos. I built it the way a lot of modern apps get built:

  • Vite + React
  • Supabase on the backend
  • Deployed to Railway in an afternoon

The site was fast. The UX felt great. But Googlebot? Googlebot basically pretended it didn’t exist.

Out of 36 known URLs, exactly one page was indexed.

This post walks through how to fix that by adding build‑time prerendering to a Vite/React SPA—without rewriting everything in Next.js or Remix. If you’ve vibe‑coded a Vite app and are only now realizing SEO matters, this is for you.


Why Google Ghosted My SPA

A classic Vite/React SPA ships an almost‑empty index.html that looks something like this:

<body>
  <div id="root"></div>
  <script type="module" src="/src/main.tsx"></script>
</body>

All your real content only exists after the JavaScript bundle runs, mounts React, fetches data, and renders the app.

Google can render JavaScript, but it doesn’t do that up front for every URL:

  • First, it fetches the HTML and parses whatever it finds.
  • Then, if the page looks “worth it,” it gets scheduled for a second pass where Google executes your JS and sees the full DOM.

If your domain is brand new and has basically no authority (no backlinks, no history), those JS‑heavy pages sit in a low‑priority queue. Some of them never get rendered at all.

In my case, I made things worse with a robots.txt own‑goal:

User-agent: *
Disallow: /assets/

Vite was outputting all of my JS and CSS into /assets/. By disallowing that folder, I was literally telling Google:

“Please don’t download the code you’d need to see what’s on my site.”

So I had:

  • HTML files with no meaningful content
  • JavaScript that Googlebot wasn’t allowed to fetch
  • A brand new domain with zero authority

Perfect recipe for “Discovered – currently not indexed.”


The Standard Advice (That I Didn’t Want to Take)

The common advice for “React SPA SEO” is straightforward:

“Just migrate to Next.js or Remix and use SSR.”

Which is good advice if:

  • You’re early in the project, or
  • You want the framework features and are ready to refactor routing, data fetching, and deployment.

I wasn’t there:

  • The app worked.
  • The stack already had some baked‑in complexity.
  • I didn’t want to rebuild the routing layer or re‑thread my Supabase client through SSR entry points.

I wanted something that:

  • Kept my Vite/React SPA as‑is
  • Required minimal code changes
  • Gave Google fully rendered HTML at crawl time

So instead of switching frameworks, I went with build‑time prerendering.


The Strategy: Build‑Time Prerendering

The idea is simple:

  1. Build your SPA as usual with Vite (the go‑to for most current vibe‑coding tools like Lovable, Gemini Studio, Replit, Bolt, etc.).
  2. Spin up a tiny web server that serves the built dist folder.
  3. Use a headless browser (Puppeteer) to visit each important route.
  4. Grab the fully rendered HTML from the page.
  5. Save that HTML back into dist as static .html files.

When Googlebot (or any non‑JS client) requests /pricing, your host serves the prerendered static HTML. Real browsers still download your JS bundle and hydrate the page back into a full SPA.

You get:

  • Static‑site behavior for crawlers and no‑JS users
  • SPA behavior for everyone else
  • No framework rewrite

Here’s how I wired it up.


Step 1: Make React Hydrate Instead of Re‑Render

If you dump fully rendered HTML into index.html but keep doing a normal createRoot().render(<App />), React will:

  • Throw away the HTML you prerendered
  • Rebuild the DOM from scratch on the client

We want React to hydrate the existing DOM instead.

In your Vite entry file (often main.tsx or main.jsx), swap the mount logic to prefer hydration when the root already has content:

import { createRoot, hydrateRoot } from 'react-dom/client';
import App from './App';

const rootElement = document.getElementById('root');

if (rootElement) {
  // If the root element has children, the page was prerendered
  if (rootElement.hasChildNodes()) {
    hydrateRoot(rootElement, <App />);
  } else {
    // Standard SPA mounting for non-prerendered pages
    createRoot(rootElement).render(<App />);
  }
}

This keeps the prerendered HTML intact and layers React’s event system on top of it.


Step 2: Write a Puppeteer Prerender Script

Next, we’ll write a Node script that:

  • Starts an Express server to serve dist
  • Uses Puppeteer to open each route
  • Waits for network requests to finish
  • Saves the final HTML to disk

Create something like scripts/prerender.ts:

import puppeteer from 'puppeteer';
import express from 'express';
import { mkdirSync, writeFileSync } from 'fs';
import { join, dirname } from 'path';

const DIST_DIR = join(process.cwd(), 'dist');
const PORT = 3005;

// The routes you care about for SEO.
// Keep this focused: home, pricing, features, blog posts, etc.
const ROUTES = ['/', '/pricing', '/about', '/blog'];

async function prerender() {
  // 1. Start a local server that serves the built assets
  const app = express();

  app.use(express.static(DIST_DIR));

  // SPA fallback: any unknown route serves index.html
  app.use((_, res) => {
    res.sendFile(join(DIST_DIR, 'index.html'));
  });

  const server = app.listen(PORT, () => {
    console.log(`[prerender] Server listening on http://localhost:${PORT}`);
  });

  try {
    // 2. Launch headless Chrome
    const browser = await puppeteer.launch({ headless: true });
    const page = await browser.newPage();

    // Optional: set a consistent viewport / user agent
    await page.setViewport({ width: 1280, height: 720 });

    // 3. Visit each route and save its HTML
    for (const route of ROUTES) {
      const url = `http://localhost:${PORT}${route}`;
      console.log(`[prerender] Visiting ${url}`);

      // Wait until network is idle to ensure data has loaded
      await page.goto(url, { waitUntil: 'networkidle0' });

      const html = await page.content();

      const isRoot = route === '/';
      const filePath = isRoot
        ? join(DIST_DIR, 'index.html')
        : join(DIST_DIR, route, 'index.html');

      mkdirSync(dirname(filePath), { recursive: true });
      writeFileSync(filePath, html);

      console.log(`[prerender] Wrote ${filePath}`);
    }

    await browser.close();
  } finally {
    server.close();
  }
}

prerender().catch((err) => {
  console.error('[prerender] Error during prerender:', err);
  process.exit(1);
});

A few practical notes:

  • Keep ROUTES small and focused on your SEO‑critical pages; every route adds build time.
  • If your app relies on auth, feature flags, or environment‑specific data, you can inject cookies or query params before calling page.goto.
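For that second case, one low-effort option is to tag prerender requests with a query flag so your app can skip auth redirects, analytics, or cookie banners while being crawled locally. This is a sketch of a hypothetical helper (the `?prerender=1` convention is an assumption — your app would need to check for it):

```typescript
// Hypothetical helper: append a query flag so the app knows it's being
// prerendered. Assumes your client code looks for `?prerender=1` and
// disables auth redirects / analytics when it's present.
function prerenderUrl(base: string, route: string): string {
  const url = new URL(route, base);
  url.searchParams.set('prerender', '1');
  return url.toString();
}

// In the loop above, you'd then call:
// await page.goto(prerenderUrl(`http://localhost:${PORT}`, route), { waitUntil: 'networkidle0' });
```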

Step 3: Wire It Into Your Build

Now we want this script to run automatically whenever we build for production.

In package.json:

{
  "scripts": {
    "build": "vite build && tsx scripts/prerender.ts",
    "dev": "vite",
    "preview": "vite preview"
  }
}

Key points:

  • vite build runs first and outputs your SPA into dist.
  • tsx scripts/prerender.ts runs next, starts the local server, prerenders routes, and overwrites/creates HTML files in dist.
  • On Railway, set your build command to pnpm run build in the project settings.

If you’re deploying somewhere else (Vercel, Netlify, Render, plain CI + S3), the idea is the same: make sure this script runs in your build step, not at request time.
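Whatever host serves dist, the key mapping is simple: a request for /pricing should resolve to dist/pricing/index.html, and any route you didn’t prerender should fall back to the SPA shell. A minimal sketch of that resolution logic (a hypothetical helper, not tied to any particular host — express.static with a fallback handler does the same thing):

```typescript
import { existsSync } from 'fs';
import { join } from 'path';

// Hypothetical resolver: map an incoming URL path to its prerendered file,
// falling back to the SPA shell (dist/index.html) when no static HTML exists.
function resolvePrerendered(distDir: string, urlPath: string): string {
  const candidate = urlPath === '/'
    ? join(distDir, 'index.html')
    : join(distDir, urlPath, 'index.html');
  return existsSync(candidate) ? candidate : join(distDir, 'index.html');
}
```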


Step 4: Fix Your robots.txt (If You Messed It Up Like I Did)

Finally, make sure you’re not accidentally blocking the very assets Google needs to render your app.

A safe starting point is something like:

User-agent: *
Allow: /

Sitemap: https://your-domain.com/sitemap.xml

If you need to block specific URLs or paths, target those explicitly, but don’t broad‑brush disallow your JS/CSS folders.
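The robots.txt above also points at a sitemap. If you don’t have one yet, you can generate it from the same ROUTES list the prerender script uses, as part of the same build step. A sketch, where BASE_URL and the route list are placeholders you’d replace with your own:

```typescript
import { mkdirSync, writeFileSync } from 'fs';
import { join } from 'path';

// Placeholders - swap in your real domain and SEO-critical routes.
const BASE_URL = 'https://your-domain.com';
const ROUTES = ['/', '/pricing', '/about', '/blog'];

// Build a minimal sitemap.xml from a list of routes.
function buildSitemap(baseUrl: string, routes: string[]): string {
  const urls = routes
    .map((route) => `  <url><loc>${baseUrl}${route === '/' ? '' : route}</loc></url>`)
    .join('\n');
  return [
    '<?xml version="1.0" encoding="UTF-8"?>',
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">',
    urls,
    '</urlset>',
    '',
  ].join('\n');
}

// Write it next to the prerendered HTML so it deploys with the site.
mkdirSync(join(process.cwd(), 'dist'), { recursive: true });
writeFileSync(join(process.cwd(), 'dist', 'sitemap.xml'), buildSitemap(BASE_URL, ROUTES));
```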


What Changed After Prerendering

After:

  • Removing the /assets/ block from robots.txt
  • Adding the prerender step to the Vite build
  • Letting Google recrawl

I went from “one lonely indexed page” to seeing my key marketing URLs actually appear in the index. New URLs stopped getting stuck in “Discovered – currently not indexed” purgatory.

Just as importantly, this worked without:

  • Migrating to Next.js/Remix
  • Rewriting routing
  • Changing my hosting setup

For Google and non‑JS clients, SmilePlease behaves like a static site. For users with JS, it behaves like the vibe‑coded SPA I originally shipped.


When This Approach Makes Sense

Build‑time prerendering is a great fit if:

  • You already have a Vite/React SPA in production.
  • You’re seeing indexing issues or thin‑content warnings.
  • You can afford slightly slower builds in exchange for better SEO.
  • You don’t want to overhaul your stack just to get server‑rendered HTML.

It’s less ideal if:

  • You have a ton of highly dynamic, user‑specific pages.
  • Your content changes on every request.
  • You need real‑time personalized HTML.

In those cases, SSR or a hybrid framework may be the better long‑term path.


TL;DR for Vibe Coders

If you:

  • Clicked a few buttons in an AI builder
  • Ended up with a Vite/React app on Railway
  • Are now staring at “Discovered – currently not indexed” in Search Console

You don’t have to rewrite everything in Next.js to fix it.

You can:

  1. Make React hydrate pre‑rendered HTML instead of re‑rendering.
  2. Add a Puppeteer prerender step after vite build.
  3. Focus that prerender on your key marketing routes.
  4. Make sure robots.txt isn’t blocking your JS/CSS.

And suddenly, your vibe‑coded Vite site behaves like a real, crawlable website in the eyes of Google—without killing the vibes that got you shipping in the first place.


If you end up wiring this into your own Vite app, I’d love to hear which platform you built on (v0, Bolt, Lovable, hand‑rolled, etc.) and whether Google started indexing your pages after the change.
