You've built a clean React app. Routing works. Components are snappy. Users love it. Then you check Google Search Console and find that Google has indexed... three pages. Or zero. Or it shows your homepage with the title "React App" and no meta description.
This isn't a bug. It's a fundamental architectural mismatch between how SPAs work and how most crawlers behave.
The Problem: Crawlers Don't Execute JavaScript (Reliably)
When a browser visits your React app, it receives something like this:
<html>
<head><title>My App</title></head>
<body><div id="root"></div><script src="/bundle.js"></script></body>
</html>
The browser downloads that JS bundle, executes it, and React renders your full UI. Users see a fully interactive page within seconds.
But here's what Google's crawler often sees:
<html>
<head><title>My App</title></head>
<body><div id="root"></div></body>
</html>
An empty shell. No content. No links. No metadata. As far as Google is concerned, this page has nothing on it.
Google can execute JavaScript — but it doesn't always, and there's a significant crawl budget cost. Googlebot processes JavaScript in a second wave that can be delayed by days or weeks. And other crawlers (Bing, ChatGPT's GPTBot, Perplexity's PerplexityBot, LinkedIn, Twitter/X) largely can't or don't run JS at all.
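You can sanity-check this yourself: fetch a page without executing any JavaScript and see whether anything meaningful is left. Here is a rough sketch; the `looksLikeEmptyShell` helper and its regex heuristic are illustrative, not taken from any particular tool:

```typescript
// Rough heuristic: does this HTML contain visible body content,
// or is it just an empty SPA mount point?
function looksLikeEmptyShell(html: string): boolean {
  // Grab everything between <body> and </body>
  const match = html.match(/<body[^>]*>([\s\S]*)<\/body>/i);
  if (!match) return true;
  // Strip scripts and tags, keeping only visible text
  const text = match[1]
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, "")
    .trim();
  return text.length === 0;
}

const spaShell = `<html><head><title>My App</title></head>
<body><div id="root"></div></body></html>`;

const rendered = `<html><body><main><h1>Pricing</h1>
<p>Plans start at $9/month.</p></main></body></html>`;

console.log(looksLikeEmptyShell(spaShell));  // true: nothing for a crawler to read
console.log(looksLikeEmptyShell(rendered));  // false: real content
```

Run this against your production HTML (the response body, before any JS runs) and you will see exactly what a non-rendering crawler has to work with.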
The Impact on Real Indexing
Here's what this looks like in practice:
| What users see | What crawlers see |
|---|---|
| Full product page with pricing, features, CTAs | <div id="root"></div> |
| Blog post with 1,200 words of content | Empty body |
| Dynamic meta title + OG image | React App, no OG tags |
| Internal links across 50 pages | No discoverable links at all |
If you're building an app where SEO matters — landing pages, content sites, e-commerce, SaaS marketing sites — this is a serious problem.
The Solutions (And Their Tradeoffs)
Option 1: Server-Side Rendering (SSR)
Frameworks like Next.js, Nuxt, and Angular Universal render pages on the server so HTML is fully populated before it reaches the client.
Pros: Gold standard. Real HTML, always fresh, great Core Web Vitals.
Cons: Most teams aren't starting from scratch. If you have an existing CRA app, a Vite app, or a Lovable.dev project, migrating to Next.js means weeks of refactoring, potential library incompatibilities, and introducing a Node.js server into your infrastructure.
Option 2: Static Site Generation (SSG)
Pre-build every page as static HTML at deploy time.
Pros: Fast, crawlable, no server needed.
Cons: Only works if your content is known at build time. Dynamic routes, user-generated content, and real-time data make this impractical.
Option 3: Prerendering
Serve pre-rendered HTML snapshots to crawlers while serving your normal SPA to humans. Best of both worlds — no code changes, your app stays as-is.
This is the approach used by services like Prerender.io and Rendertron. The downside has historically been cost (Prerender.io starts at $99/month) and latency (centralized rendering servers add overhead).
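Under the hood, every prerendering setup hinges on one decision per request: is this a bot or a human? The usual approach is user-agent matching. Here is a minimal sketch; the pattern list is an illustrative subset, not an exhaustive or authoritative one:

```typescript
// Known crawler user-agent substrings (illustrative subset, not exhaustive)
const BOT_PATTERNS = [
  "googlebot",
  "bingbot",
  "gptbot",          // ChatGPT's crawler
  "perplexitybot",
  "linkedinbot",
  "twitterbot",
  "facebookexternalhit",
];

function isBot(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return BOT_PATTERNS.some((pattern) => ua.includes(pattern));
}

console.log(isBot("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")); // true
console.log(isBot("Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36"));       // false
```

Real services maintain much longer, frequently updated lists, and some also verify crawler IPs, since user agents are trivially spoofable.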
Enter Edge-Based Prerendering
A newer approach runs prerendering at the network edge — on CDN nodes physically close to both the crawler and your users. This eliminates the latency penalty and distributes the rendering load globally.
PreRender24 takes this approach, running entirely on Cloudflare's edge network:
- You add a DNS record pointing to PreRender24 (takes about 5 minutes)
- PreRender24 sits in front of your app, detecting incoming requests
- Human visitors are passed straight through to your origin — zero overhead
- Bot traffic (Google, Bing, ChatGPT, Perplexity, social crawlers) gets a pre-rendered HTML snapshot from the nearest edge node in under 250ms
- Cache is automatically refreshed so your content stays current
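The routing logic in the steps above can be sketched as a Cloudflare Worker. To be clear, this is a hypothetical illustration of the pattern, not PreRender24's actual implementation; the origin URL, the bot regex, and the inline snapshot are all stand-ins:

```typescript
// Hypothetical edge-worker sketch of the bot-vs-human split described above.
// ORIGIN and the inline snapshot are placeholders, not a real API.
const ORIGIN = "https://your-app.example.com";

function isBot(userAgent: string): boolean {
  return /googlebot|bingbot|gptbot|perplexitybot|linkedinbot|twitterbot/i.test(userAgent);
}

// Decide where a request goes: straight to origin, or to a cached snapshot
function route(userAgent: string): "origin" | "snapshot" {
  return isBot(userAgent) ? "snapshot" : "origin";
}

// Cloudflare Workers module-style handler
const worker = {
  async fetch(request: Request): Promise<Response> {
    const ua = request.headers.get("user-agent") ?? "";
    if (route(ua) === "origin") {
      // Human visitors pass straight through to the origin
      return fetch(new URL(new URL(request.url).pathname, ORIGIN));
    }
    // Bots get a pre-rendered HTML snapshot served from the edge cache
    // (snapshot storage and refresh are omitted in this sketch)
    return new Response("<html><!-- pre-rendered snapshot --></html>", {
      headers: { "content-type": "text/html" },
    });
  },
};

console.log(route("Mozilla/5.0 (compatible; Googlebot/2.1)"));   // "snapshot"
console.log(route("Mozilla/5.0 (Windows NT 10.0; Win64; x64)")); // "origin"
```

The key property is that the human path is a plain pass-through, so regular visitors never pay a rendering penalty.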
Here's what a crawler sees after prerendering:
<html>
<head>
<title>Your Actual Page Title | YourApp</title>
<meta name="description" content="Your real meta description">
<meta property="og:image" content="/real-image.png">
</head>
<body>
<nav>
<a href="/features">Features</a>
<a href="/pricing">Pricing</a>
<a href="/blog">Blog</a>
</nav>
<main>
<h1>Your Real H1 Heading</h1>
<p>All of your actual content, fully rendered...</p>
</main>
</body>
</html>
That's what Google actually needs. Full content, real links, proper metadata.
How to Set It Up
The DNS-based setup means no code changes whatsoever. Your React app, your build process, your deployment — nothing changes. You're just inserting a transparent layer between crawlers and your origin.
For Cloudflare-managed domains, it's essentially:
- Sign up for PreRender24
- Update your CNAME record to point to PreRender24's edge
- Add your origin URL in the dashboard
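In zone-file terms, the CNAME step might look something like this; the target hostname below is a made-up placeholder, since the real value comes from the PreRender24 dashboard:

```text
; Hypothetical zone entry: the CNAME target is a placeholder
www    IN    CNAME    edge.prerender24.example.
```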
Within 24–48 hours, you'll typically see crawlers re-visiting pages they previously couldn't parse.
Who This Is For
This approach works well if:
- You have an existing SPA and don't want to refactor to SSR
- You're building with Vite, Create React App, or tools like Lovable.dev that output pure client-side apps
- You need SEO but also need dynamic data or complex client-side interactivity
- You want AI crawlers (ChatGPT, Perplexity) to see and surface your content
It's less appropriate if you're building with Next.js or Nuxt already — you have SSR built in.
Takeaway
The gap between "works for users" and "works for crawlers" is a real and costly problem for SPA developers. SSR is the ideal long-term answer, but it's not always realistic. Edge-based prerendering is the pragmatic bridge — and with the DNS-only setup, the time-to-fix can genuinely be under 10 minutes.
If you've been staring at an underwhelming Search Console dashboard on a SPA you're proud of, this might be worth a look.
Have you dealt with SPA SEO issues? What approach did you end up taking? Would love to hear what worked (or didn't) in the comments.