Prashant A

Posted on • Originally published at conversionprobe.com

I Built a React SPA and Tried to Rank on Google. Here Are My Real Numbers After 12 Weeks.

If you've built a single-page app and wondered whether Google will ever find it, I have 12 weeks of real data for you.

I launched a SaaS tool in February 2026. React + Vite + TypeScript frontend, Express backend, deployed on Render. No server-side rendering. No Next.js. Just a client-rendered SPA that I needed Google to crawl and index.

This is what happened.

The numbers

806 impressions. 1 click. Twelve weeks.

That one click was probably me checking if my own site showed up.

Average position moved from about 57 (page 6 of Google) to 47 (page 5). Not exactly a success story. But I learned a lot about what it takes to get a pure SPA indexed.

The technical setup

Google can render JavaScript, but not reliably enough to count on for indexing a React SPA. I hit this wall in week 2, when pages were getting crawled but showing up with blank content in Google's index.
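You can see this symptom for yourself: fetch a route the way a crawler would and check whether the body is anything beyond the empty app shell. Here's a rough heuristic, assuming Vite's default `<div id="root">` mount point (the function name and URL are illustrative, not from my actual code):

```javascript
// Sketch: classify fetched HTML as prerendered content vs. the empty
// SPA shell that Google would otherwise index as a blank page.
// Assumes the default Vite mount point, <div id="root"></div>.
function looksPrerendered(html) {
  // An un-prerendered Vite SPA ships an empty root div; real content
  // means something is rendered inside it.
  return !/<div id="root">\s*<\/div>/.test(html);
}

// Usage (Node 18+, which ships a global fetch):
// const res = await fetch('https://example.com/guides/landing-page-psychology', {
//   headers: { 'User-Agent': 'Googlebot/2.1 (+http://www.google.com/bot.html)' },
// });
// console.log(looksPrerendered(await res.text()));
```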

The fix was prerendering. At build time, a script visits each route with Puppeteer, waits for the page to finish rendering, and saves the output as static HTML. The deployment platform serves those HTML files to crawlers (matched by user-agent), and regular users get the normal SPA experience.

The prerender config looks like this:

```javascript
// prerender.mjs — runs at build time
const routes = [
  { route: '/', outputPath: 'index.html' },
  { route: '/guides/landing-page-psychology', outputPath: 'guides/landing-page-psychology/index.html' },
  // ... 15 more routes
];
```
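Fleshed out, the whole script is a short Puppeteer loop. This is a sketch of the approach under a couple of assumptions (a local preview server on port 4173, output written under `dist/`), not my exact file:

```javascript
// prerender.mjs sketch — reconstruction of the approach, not the exact file.
// Assumes `vite preview` is serving the built SPA on localhost:4173.
import fs from 'node:fs';
import path from 'node:path';

const routes = [
  { route: '/', outputPath: 'index.html' },
  { route: '/guides/landing-page-psychology', outputPath: 'guides/landing-page-psychology/index.html' },
];

// Where a route's static snapshot lands on disk.
function destFor(outputPath, distDir = 'dist') {
  return path.join(distDir, outputPath);
}

async function prerender() {
  // Imported lazily so the module loads even where Puppeteer isn't installed.
  const puppeteer = (await import('puppeteer')).default;
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  for (const { route, outputPath } of routes) {
    // networkidle0 waits until the SPA has finished fetching and rendering.
    await page.goto(`http://localhost:4173${route}`, { waitUntil: 'networkidle0' });
    const html = await page.content(); // fully rendered DOM, serialized
    const dest = destFor(outputPath);
    fs.mkdirSync(path.dirname(dest), { recursive: true });
    fs.writeFileSync(dest, html);
  }
  await browser.close();
}

// Run as the last build step, e.g. `vite build && node prerender.mjs`:
if (process.argv[1]?.endsWith('prerender.mjs')) {
  prerender().catch((err) => { console.error(err); process.exit(1); });
}
```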

Each route gets its own index.html in a subdirectory. The hosting config maps clean URLs to these files:

```yaml
# render.yaml rewrite rules
- source: /guides/landing-page-psychology
  destination: /guides/landing-page-psychology/index.html
```
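The crawler match itself boils down to a User-Agent test. On Render that decision lives in the platform config, but the idea is simple enough to sketch as a hypothetical Express-side check (the function name and regex are mine, not the real setup):

```javascript
// Hypothetical sketch: decide from the User-Agent header whether to serve
// the prerendered HTML snapshot instead of the live SPA shell.
const CRAWLER_RE = /bot|crawler|spider|slurp|bingpreview/i;

function isCrawler(userAgent = '') {
  return CRAWLER_RE.test(userAgent);
}

// In an Express app this could gate a static-file rewrite, roughly:
// app.use((req, res, next) => {
//   if (isCrawler(req.get('user-agent'))) {
//     // serve the snapshot from the prerendered output directory
//   } else {
//     next(); // regular users get the normal SPA
//   }
// });
```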

Three files have to stay in sync: the sitemap, the prerender script, and the rewrite rules. Every time I add a page, all three need updating. I've shipped broken pages twice by forgetting one. There's no automation for it yet and it's the most annoying part of the setup.
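One way to close that gap would be to generate the sitemap and rewrite rules from the same routes array the prerender script uses, so there's a single source of truth. A rough sketch of what that could look like (the domain and route entries here are placeholders, not my real config):

```javascript
// Hypothetical sketch: derive sitemap.xml and render.yaml rewrite rules
// from one routes array, so the three files can't drift apart.
const SITE = 'https://example.com'; // placeholder domain
const routes = [
  { route: '/', outputPath: 'index.html' },
  { route: '/guides/landing-page-psychology', outputPath: 'guides/landing-page-psychology/index.html' },
];

function sitemapXml(routes) {
  const urls = routes
    .map(({ route }) => `  <url><loc>${SITE}${route}</loc></url>`)
    .join('\n');
  return `<?xml version="1.0" encoding="UTF-8"?>\n` +
    `<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n${urls}\n</urlset>\n`;
}

function rewriteRules(routes) {
  // Skip '/', which most hosts already resolve to /index.html.
  return routes
    .filter(({ route }) => route !== '/')
    .map(({ route, outputPath }) => `- source: ${route}\n  destination: /${outputPath}`)
    .join('\n');
}
```

Running both generators as part of the build step would mean a new page only ever gets added in one place.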

What Google actually did with my pages

Once prerendering was working, pages started showing up in Google within a few days of being submitted through Search Console.

The timeline looked like this:

Weeks 1-2: zero to two impressions per day. Google barely knew the site existed.

Weeks 3-4: jumped to 1-13 per day once the prerendered pages got indexed.

Weeks 5-7: spiked to 18-40 per day. Content was getting picked up.

Weeks 8-12: settled back to 0-16 per day.

That spike in weeks 5-7 would look great in a screenshot if I cropped the chart. "40 impressions per day in 5 weeks!" But zoom out and the line is mostly flat. I think a lot of those growth posts do exactly this: screenshot the spike, skip the surrounding weeks.

The queries people searched

Google was matching my pages to real search queries:

  • "cro consultant" (conversion rate optimization): 289 impressions, 0 clicks (position 57)
  • "landing page conversion rate": 76 impressions, 0 clicks (position 68)

Position 57 is page 6. Nobody looks at page 6. The impressions prove Google knows my pages exist and considers them relevant. The position proves the domain doesn't have enough trust yet to compete.

What I'd do differently

I would use a meta framework from the start. Prerendering works, but it's a workaround. Every new page needs manual config in three places. A framework like Astro, Next.js, or Remix would handle this automatically. I didn't use one because I started this as a tool, not a content site. By the time search ranking mattered, I had too much code to move over. If your SPA needs to rank on Google, build the rendering pipeline before you need it.

I would submit to Search Console on day one. I waited three weeks before setting up Google Search Console. That's three weeks of Google not knowing the site existed. It's free and takes ten minutes.

I would pick search terms nobody else is targeting. The small, obscure terms worked. 8 of my pages are in Google's top 10 for the specific terms I was aiming at:

Page                   Position
Comparison page 1      3.0
Comparison page 2      4.7
Guide on persuasion    5.8
Guide on urgency       6.2
Audit study            7.1
Interactive quiz       7.2
Above-the-fold guide   8.0
Psychology guide       8.4

The catch: almost nobody searches for these terms. I picked them on purpose, as a test. If the writing can rank at all, I know the content is fine and the problem is domain age and backlinks, not content quality. That test passed.

What I learned about SPAs and search

The React-versus-SEO debate is real, but it's not about whether Google can render JavaScript. It's about whether you're willing to do the extra work to make sure it does.

With prerendering, my SPA gets indexed and ranks just fine. Without it, the pages showed up in Google's index with blank content. It either works or it doesn't. There's no middle ground.

If you're building a client-rendered SPA and want Google traffic, your options are:

  1. Prerender at build time (what I did). Works, but manual. Good if you have a small number of routes.
  2. Server-side render with Next.js, Remix, or Astro. Better long-term, more setup upfront.
  3. Hope Google renders your JS correctly. It sometimes does. I wouldn't bet on it.

After 12 weeks, I have 806 impressions and 8 pages in the top 10. The SPA part isn't holding me back. The thing slowing me down now is domain age and backlinks.

If you're building something similar and your Google numbers look like mine after a few months: that's normal. The prerendering works. The indexing works. The waiting is the hard part.


About the tool: I built ConversionProbe to audit landing pages the way a first-time visitor would see them. Paste a URL, get a report in 60 seconds. The SEO data in this post is from running it on my own site.
