JSVisible

Posted on • Originally published at jsvisible.com

I Scanned 5 Popular JavaScript Sites for SEO Issues — Here's What I Found

Everyone assumes big tech companies have flawless SEO. I decided to test that. I scanned five well-known JavaScript-heavy sites — react.dev, vercel.com, stripe.com/docs, linear.app, and shopify.com — running 19 SEO health checks across 10 pages each.

Every single site had issues. Some had a lot of them.

The Scores

  • react.dev — 74/100 (Best of the group)
  • vercel.com — 71/100 (Solid but sloppy on details)
  • stripe.com/docs — 60/100 (Surprising for Stripe)
  • linear.app — 57/100 (Heavy SPA showing its seams)
  • shopify.com — 39/100 (The biggest surprise)

react.dev — 74/100

React's docs scored highest. Strong internal linking (3.5 avg links/page, zero orphan pages), all pages indexable, proper canonicals and OG tags.

Weak spots: zero structured data on every page, meta descriptions too short on 9/10 pages, and a JavaScript file that failed to load on /versions. The irony of React's docs having a broken JS file is hard to ignore.

vercel.com — 71/100

The Next.js creators get SSR right — all pages indexable, good content depth, OG tags everywhere. But the details slip: missing meta description on /abuse, missing canonical on /academy, missing H1 on a subpage, and 9 orphan pages out of 10 scanned.

Internal linking was surprisingly weak — 0.0 average links per page in the scanned set.

stripe.com/docs — 60/100

I expected Stripe to ace this. They nail the fundamentals — titles, meta descriptions, H1s, canonicals, OG tags, proper heading hierarchy. Zero orphan pages.

But: zero structured data on all 10 pages (huge missed opportunity for docs), 7/10 pages missing image alt text, 8/10 pages loading over 3 seconds, and API requests failed on every single page. That last one means some content may not be loading for crawlers.

linear.app — 57/100

Beautiful product, but SEO tells a different story. Zero structured data, every meta description too short, 8/10 titles too short, all 10 pages slow to load, 4 orphan pages.

The internal link average was just 0.5/page — the SPA architecture isn't generating proper crawlable links between pages. JS console errors on 2 pages, failed API requests on 2 more.

shopify.com — 39/100

The lowest score and the biggest company. The crawler landed on their Dutch locale pages, revealing issues you'd miss checking only the English site.

8/10 pages orphaned. Failed API requests on 7/10 pages. Missing H1s on 2 pages. No structured data on 8/10 pages. Even Shopify has SEO gaps.

Key Patterns Across All 5 Sites

Structured data is universally neglected. 4 out of 5 sites had zero Schema.org markup on every page scanned. This is free real estate for rich snippets that everyone leaves on the table.
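Adding that markup is a small lift. A minimal sketch of what the missing Schema.org markup looks like — the field values below are placeholders, not taken from any of the scanned sites:

```javascript
// Build a minimal TechArticle JSON-LD block and the <script> tag that
// embeds it in a page's <head>. All values here are example placeholders.
const articleSchema = {
  "@context": "https://schema.org",
  "@type": "TechArticle",
  headline: "Getting Started with Our API",
  datePublished: "2024-01-15",
  author: { "@type": "Organization", name: "Example Docs" },
};

// Serialize into the tag search engines look for.
const scriptTag =
  `<script type="application/ld+json">${JSON.stringify(articleSchema)}</script>`;

console.log(scriptTag);
```

Dropping one of these per page type (article, product, FAQ) is what makes pages eligible for rich results.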

Meta descriptions are an afterthought. Short, generic, or missing. These directly affect click-through rates from search results.

Image alt text is consistently missing. Every site had pages with images lacking alt text. Easy 2-minute fix per image with high accessibility and SEO impact.

Internal linking is weak on SPAs. Linear and Shopify had most pages orphaned. Server-rendered sites (Stripe, React) did better because links exist in the initial HTML.

Page speed is a universal problem. Most pages across all sites took over 3 seconds to load. JavaScript-heavy sites consistently struggle here.

AI crawlers see even less. These scores reflect what Googlebot sees after JS rendering. AI crawlers from ChatGPT and Perplexity don't render JS at all — they only see raw HTML. Content that exists only after client-side rendering is effectively invisible to AI search.
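The "view source" test is easy to script. A rough sketch, assuming Node 18+ for the built-in `fetch`; `phraseInRawHtml` and `checkPage` are illustrative names, not part of any real tool:

```javascript
// Does a key phrase appear in the server-sent HTML, i.e. what a
// non-rendering crawler would see?
function phraseInRawHtml(html, phrase) {
  // Strip <script> bodies so a phrase buried inside a JS bundle
  // doesn't count as visible content.
  const withoutScripts = html.replace(/<script[\s\S]*?<\/script>/gi, "");
  return withoutScripts.toLowerCase().includes(phrase.toLowerCase());
}

// Fetch without executing any JavaScript — the same view an AI crawler gets.
async function checkPage(url, phrase) {
  const res = await fetch(url);
  const html = await res.text();
  return phraseInRawHtml(html, phrase);
}
```

If `checkPage` returns `false` for your page's main heading, that content is coming from client-side rendering.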

What You Can Do

  1. Check your page source. Right-click → View Page Source. If your content isn't in the raw HTML, crawlers aren't seeing it.
  2. Add structured data. 20-30 minutes per page type, huge ROI for rich snippets.
  3. Write proper meta descriptions. 150-160 characters, unique per page.
  4. Add alt text to every image. 2 minutes per image.
  5. Fix internal linking. Every important page needs 2-3 internal links. Use real `<a href>` tags, not JS click handlers.
  6. Use SSR for important pages. Critical now that AI crawlers don't render JavaScript.
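The internal-linking check from step 5 can be sketched in a few lines. This uses a regex as a shortcut for illustration — a real crawler would use an HTML parser — and `internalLinkCount` is a hypothetical helper, not an actual JSVisible function:

```javascript
// Count crawlable internal links: real <a href> tags in the raw HTML that
// point to the same origin (root-relative or absolute same-origin URLs).
// JS click handlers produce no <a href>, so they are invisible here.
function internalLinkCount(html, origin) {
  const hrefs = [...html.matchAll(/<a\s[^>]*href=["']([^"'#]+)["']/gi)]
    .map((m) => m[1]);
  return hrefs.filter(
    (href) => href.startsWith("/") || href.startsWith(origin)
  ).length;
}
```

Pages scoring below 2-3 here are candidates for the orphan-page problem that hit Linear and Shopify above.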

Note: these are 10-page scans — a snapshot, not a full audit. But the patterns are consistent and verifiable by viewing page source on any of these sites.


I built JSVisible to automate these checks — it renders pages as both a user and Googlebot and compares the results. Free tier available if you want to try it on your own site.
