<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: JSVisible</title>
    <description>The latest articles on DEV Community by JSVisible (@jsvisible).</description>
    <link>https://dev.to/jsvisible</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3802187%2F2cc84c5c-badd-4fe8-a728-3d4cdc39add8.png</url>
      <title>DEV Community: JSVisible</title>
      <link>https://dev.to/jsvisible</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jsvisible"/>
    <language>en</language>
    <item>
      <title>I Scanned 5 Popular JavaScript Sites for SEO Issues — Here's What I Found</title>
      <dc:creator>JSVisible</dc:creator>
      <pubDate>Sun, 29 Mar 2026 22:07:53 +0000</pubDate>
      <link>https://dev.to/jsvisible/i-scanned-5-popular-javascript-sites-for-seo-issues-heres-what-i-found-22ob</link>
      <guid>https://dev.to/jsvisible/i-scanned-5-popular-javascript-sites-for-seo-issues-heres-what-i-found-22ob</guid>
      <description>&lt;p&gt;Everyone assumes big tech companies have flawless SEO. I decided to test that. I scanned five well-known JavaScript-heavy sites — react.dev, vercel.com, stripe.com/docs, linear.app, and shopify.com — running 19 SEO health checks across 10 pages each.&lt;/p&gt;

&lt;p&gt;Every single site had issues. Some had a lot of them.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Scores
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;react.dev — 74/100&lt;/strong&gt; (Best of the group)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;vercel.com — 71/100&lt;/strong&gt; (Solid but sloppy on details)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;stripe.com/docs — 60/100&lt;/strong&gt; (Surprising for Stripe)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;linear.app — 57/100&lt;/strong&gt; (Heavy SPA showing its seams)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;shopify.com — 39/100&lt;/strong&gt; (The biggest surprise)&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  react.dev — 74/100
&lt;/h2&gt;

&lt;p&gt;React's docs scored highest. Strong internal linking (3.5 avg links/page, zero orphan pages), all pages indexable, proper canonicals and OG tags.&lt;/p&gt;

&lt;p&gt;Weak spots: zero structured data on every page, meta descriptions too short on 9/10 pages, and a JavaScript file that failed to load on /versions. The irony of React's docs having a broken JS file is hard to ignore.&lt;/p&gt;

&lt;h2&gt;
  
  
  vercel.com — 71/100
&lt;/h2&gt;

&lt;p&gt;The Next.js creators get SSR right — all pages indexable, good content depth, OG tags everywhere. But the details slip: missing meta description on /abuse, missing canonical on /academy, missing H1 on a subpage, and 9 orphan pages out of 10 scanned.&lt;/p&gt;

&lt;p&gt;Internal linking was surprisingly weak — 0.0 average links per page in the scanned set.&lt;/p&gt;

&lt;h2&gt;
  
  
  stripe.com/docs — 60/100
&lt;/h2&gt;

&lt;p&gt;I expected Stripe to ace this. They nail the fundamentals — titles, meta descriptions, H1s, canonicals, OG tags, proper heading hierarchy. Zero orphan pages.&lt;/p&gt;

&lt;p&gt;But: zero structured data on all 10 pages (huge missed opportunity for docs), 7/10 pages missing image alt text, 8/10 pages loading over 3 seconds, and API requests failed on every single page. That last one means some content may not be loading for crawlers.&lt;/p&gt;

&lt;h2&gt;
  
  
  linear.app — 57/100
&lt;/h2&gt;

&lt;p&gt;Beautiful product, but SEO tells a different story. Zero structured data, every meta description too short, 8/10 titles too short, all 10 pages slow to load, 4 orphan pages.&lt;/p&gt;

&lt;p&gt;The internal link average was just 0.5/page — the SPA architecture isn't generating proper crawlable links between pages. JS console errors on 2 pages, failed API requests on 2 more.&lt;/p&gt;

&lt;h2&gt;
  
  
  shopify.com — 39/100
&lt;/h2&gt;

&lt;p&gt;The lowest score and the biggest company. The crawler landed on their Dutch locale pages, revealing issues you'd miss checking only the English site.&lt;/p&gt;

&lt;p&gt;8/10 pages orphaned. Failed API requests on 7/10 pages. Missing H1s on 2 pages. No structured data on 8/10 pages. Even Shopify has SEO gaps.&lt;/p&gt;

&lt;h2&gt;
  
  
  Key Patterns Across All 5 Sites
&lt;/h2&gt;

&lt;p&gt;&lt;strong&gt;Structured data is universally neglected.&lt;/strong&gt; 4 out of 5 sites had zero Schema.org markup on every page scanned. This is free real estate for rich snippets that everyone leaves on the table.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Meta descriptions are an afterthought.&lt;/strong&gt; Short, generic, or missing. These directly affect click-through rates from search results.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Image alt text is consistently missing.&lt;/strong&gt; Every site had pages with images lacking alt text. Easy 2-minute fix per image with high accessibility and SEO impact.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Internal linking is weak on SPAs.&lt;/strong&gt; Linear and Shopify had most pages orphaned. Server-rendered sites (Stripe, React) did better because links exist in the initial HTML.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Page speed is a universal problem.&lt;/strong&gt; Most pages across all sites took over 3 seconds to load. JavaScript-heavy sites consistently struggle here.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AI crawlers see even less.&lt;/strong&gt; These scores reflect what Googlebot sees after JS rendering. AI crawlers from ChatGPT and Perplexity don't render JS at all — they only see raw HTML. Sites using client-side rendering are invisible to AI search.&lt;/p&gt;

&lt;h2&gt;
  
  
  What You Can Do
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Check your page source.&lt;/strong&gt; Right-click → View Page Source. If your content isn't in the raw HTML, crawlers aren't seeing it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Add structured data.&lt;/strong&gt; 20-30 minutes per page type, huge ROI for rich snippets.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Write proper meta descriptions.&lt;/strong&gt; 150-160 characters, unique per page.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Add alt text to every image.&lt;/strong&gt; 2 minutes per image.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Fix internal linking.&lt;/strong&gt; Every important page needs 2-3 internal links. Use real &lt;code&gt;&amp;lt;a href&amp;gt;&lt;/code&gt; tags, not JS click handlers.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Use SSR for important pages.&lt;/strong&gt; Critical now that AI crawlers don't render JavaScript.&lt;/li&gt;
&lt;/ol&gt;
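
&lt;p&gt;For point 2, a minimal sketch of what that looks like in practice. All field values below are placeholders, and TechArticle is just one of several Schema.org types that fit developer content:&lt;/p&gt;

```javascript
// Build Article structured data as a plain object, then serialize it.
// Every field value here is a placeholder: swap in your real page data.
const articleSchema = {
  "@context": "https://schema.org",
  "@type": "TechArticle",
  headline: "How We Cut Our Bundle Size in Half",
  datePublished: "2026-03-01",
  author: { "@type": "Person", name: "Jane Doe" },
};

// Serialize the object; embed the result in your page head inside a
// script tag with type "application/ld+json" (tag markup omitted here).
const jsonLd = JSON.stringify(articleSchema);
```

&lt;p&gt;Embed the serialized JSON in your page head inside a script tag with type &lt;code&gt;application/ld+json&lt;/code&gt;, then validate it with Google's Rich Results Test.&lt;/p&gt;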

&lt;p&gt;&lt;em&gt;Note: these are 10-page scans — a snapshot, not a full audit. But the patterns are consistent and verifiable by viewing page source on any of these sites.&lt;/em&gt;&lt;/p&gt;




&lt;p&gt;&lt;em&gt;I built &lt;a href="https://jsvisible.com" rel="noopener noreferrer"&gt;JSVisible&lt;/a&gt; to automate these checks — it renders pages as both a user and Googlebot and compares the results. Free tier available if you want to try it on your own site.&lt;/em&gt;&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>seo</category>
      <category>react</category>
      <category>webdev</category>
    </item>
    <item>
      <title>Why AI Crawlers Can't See Your React App</title>
      <dc:creator>JSVisible</dc:creator>
      <pubDate>Wed, 04 Mar 2026 18:47:11 +0000</pubDate>
      <link>https://dev.to/jsvisible/why-ai-crawlers-cant-see-your-react-app-137n</link>
      <guid>https://dev.to/jsvisible/why-ai-crawlers-cant-see-your-react-app-137n</guid>
      <description>&lt;p&gt;Google has been rendering JavaScript since 2019. It's not perfect, but it works. So if you're building with React, Vue, or Next.js, you might assume search engines can see your content just fine.&lt;/p&gt;

&lt;p&gt;That assumption is about to cost you traffic.&lt;/p&gt;

&lt;p&gt;Because Google isn't the only crawler that matters anymore. AI-powered search engines are growing fast — ChatGPT, Perplexity, Google's AI Overviews, Microsoft Copilot. They're changing how people find information online. And almost none of their crawlers render JavaScript.&lt;/p&gt;

&lt;h2&gt;
  
  
  How AI Crawlers Actually Work
&lt;/h2&gt;

&lt;p&gt;When GPTBot (OpenAI's crawler), PerplexityBot, or ClaudeBot visit your site, they don't fire up a headless browser. They don't execute your JavaScript. They don't wait for your React components to mount or your API calls to resolve.&lt;/p&gt;

&lt;p&gt;They send a simple HTTP request, download whatever HTML your server returns, and move on.&lt;/p&gt;

&lt;p&gt;If your server returns a fully rendered page with all your content in the HTML, great — the AI crawler sees everything. If your server returns an empty shell with a script tag that loads your app, the AI crawler sees nothing.&lt;/p&gt;

&lt;p&gt;There's no rendering queue. There's no second pass. What your server sends in that first response is all they'll ever see.&lt;/p&gt;
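
&lt;p&gt;The whole process fits in a few lines. A rough sketch in Node 18+ (which ships a global fetch); the user-agent string is illustrative rather than OpenAI's exact token, and the injectable fetchImpl parameter exists only to make the sketch easy to test:&lt;/p&gt;

```javascript
// What a non-rendering crawler effectively does: one GET request,
// read the body, move on. No browser, no JavaScript execution.
// Assumptions: Node 18+ built-in fetch; "GPTBot" is an illustrative
// user-agent value, not OpenAI's exact current string.
async function fetchLikeAICrawler(url, fetchImpl = fetch) {
  const res = await fetchImpl(url, {
    headers: { "User-Agent": "GPTBot" },
  });
  // Whatever this string contains is everything the crawler will
  // ever know about the page. There is no second pass.
  return res.text();
}
```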

&lt;h2&gt;
  
  
  Why This Is Different from Google
&lt;/h2&gt;

&lt;p&gt;Google's approach to JavaScript has three steps. First, Googlebot fetches your raw HTML and extracts whatever it can. Second, it puts your page in a rendering queue. Third, Google's Web Rendering Service executes your JavaScript in a headless Chromium browser and indexes the final result.&lt;/p&gt;

&lt;p&gt;It's not instant — that rendering queue can take hours or days — but eventually Google sees your fully rendered page.&lt;/p&gt;

&lt;p&gt;AI crawlers skip steps two and three entirely. They only do step one. Fetch the HTML, read it, done.&lt;/p&gt;

&lt;p&gt;This means a client-side rendered React app that Google eventually indexes correctly might be completely invisible to every AI crawler on the internet.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Traffic Impact Is Real
&lt;/h2&gt;

&lt;p&gt;AI-powered search is growing fast. Perplexity reports handling millions of queries per day, ChatGPT's hundreds of millions of users increasingly reach the web through its browsing and search features, and Google's own AI Overviews now appear on a significant share of search results pages.&lt;/p&gt;

&lt;p&gt;When these tools answer questions, they pull from content their crawlers have indexed. If your content isn't in their index because their crawlers couldn't see it, you won't be cited. You won't be recommended. You won't get that traffic.&lt;/p&gt;

&lt;p&gt;This isn't a future problem. It's happening right now.&lt;/p&gt;

&lt;h2&gt;
  
  
  Which Frameworks Are Affected
&lt;/h2&gt;

&lt;p&gt;Not all JavaScript apps have this problem. It depends on how your framework renders pages.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Client-side rendering&lt;/strong&gt; is the most affected. This is the default for Create React App and vanilla single-page applications. The server sends an empty HTML shell, and the browser builds the page entirely with JavaScript. AI crawlers see nothing but an empty div and a script tag.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Server-side rendering&lt;/strong&gt; is safe. Frameworks like Next.js (with getServerSideProps or Server Components), Nuxt.js, and SvelteKit can render your pages on the server before sending them to the browser. The AI crawler gets a fully rendered HTML page on the first request.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Static site generation&lt;/strong&gt; is also safe. If you pre-build your pages at deploy time (Next.js static export, Astro, Gatsby), the HTML files already contain all your content. No JavaScript execution needed.&lt;/p&gt;

&lt;p&gt;The tricky case is &lt;strong&gt;hybrid rendering&lt;/strong&gt;. Many Next.js and Nuxt apps use a mix — some pages are server-rendered, others are client-rendered. You might have SSR on your landing page but client-side rendering on your blog or product pages. This creates a situation where some pages are visible to AI crawlers and others aren't, which is hard to diagnose without testing each page individually.&lt;/p&gt;
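
&lt;p&gt;That makes auditing a hybrid app a per-route job: fetch each route's raw HTML and check whether its key content is actually there. A hedged sketch (Node 18+; the route shape and helper name are made up for illustration):&lt;/p&gt;

```javascript
// For each route, fetch the raw HTML (no rendering) and flag routes
// where an expected phrase is missing, i.e. routes that are likely
// client-rendered. fetchImpl is injectable purely for testability.
async function findClientRenderedRoutes(baseUrl, routes, fetchImpl = fetch) {
  const flagged = [];
  for (const route of routes) {
    const res = await fetchImpl(baseUrl + route.path);
    const html = await res.text();
    if (!html.includes(route.mustContain)) {
      flagged.push(route.path);
    }
  }
  return flagged;
}
```

&lt;p&gt;Run it with entries like &lt;code&gt;{ path: "/pricing", mustContain: "Pricing" }&lt;/code&gt; for every route you care about, and the flagged list tells you exactly which pages need SSR attention.&lt;/p&gt;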

&lt;h2&gt;
  
  
  How to Check If You're Affected
&lt;/h2&gt;

&lt;p&gt;The fastest test takes 30 seconds. Open your site in Chrome, right-click anywhere, and select "View Page Source" — not "Inspect Element." View Page Source shows you the raw HTML your server sends before any JavaScript executes. This is exactly what AI crawlers see.&lt;/p&gt;

&lt;p&gt;If your main content, headings, meta descriptions, and navigation links are all in that source HTML, you're fine. If you see mostly empty divs, script tags, and a loading spinner, AI crawlers are seeing the same empty page.&lt;/p&gt;

&lt;p&gt;For a more thorough check, you can compare how your page looks to a regular user versus how it looks to different crawlers. Tools like &lt;a href="https://jsvisible.com" rel="noopener noreferrer"&gt;JSVisible&lt;/a&gt; render your pages from multiple perspectives — user browser, Googlebot, and raw HTML — so you can see exactly what each crawler sees and where the gaps are.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to Fix It
&lt;/h2&gt;

&lt;p&gt;If you're on a framework that supports server-side rendering, the fix is straightforward. Switch your critical pages from client-side to server-side rendering. In Next.js, this means using Server Components (the default in the App Router) or getServerSideProps in the Pages Router. In Nuxt, it means using universal rendering mode.&lt;/p&gt;

&lt;p&gt;You don't have to server-render everything. Focus on the pages that matter for discovery — your homepage, landing pages, blog posts, product pages, and any page you want search engines and AI tools to find.&lt;/p&gt;

&lt;p&gt;If you can't switch to SSR, prioritize getting critical SEO elements into your server-rendered HTML at minimum. Your page title, meta description, canonical URL, and H1 heading should all be in the initial HTML response, not injected by JavaScript after the page loads. Many frameworks have head management utilities that handle this even in client-rendered apps.&lt;/p&gt;
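
&lt;p&gt;In Next.js's App Router, for example, the Metadata API is the head-management route: the fields you export are serialized into the initial HTML response. A sketch with placeholder values (declared with const here so the snippet stands alone; in a real page it would be an exported metadata object):&lt;/p&gt;

```javascript
// In a real Next.js App Router page this would be:
//   export const metadata = { ... }
// The field names follow Next's documented Metadata API; the values
// are placeholders for illustration.
const metadata = {
  title: "Pricing | Acme",
  description:
    "Compare Acme plans and pricing. Start free, upgrade when your " +
    "team grows, and keep full access to the API on every tier.",
  alternates: { canonical: "https://acme.example/pricing" },
};
```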

&lt;p&gt;For internal links, always use proper anchor tags with href attributes instead of JavaScript click handlers. AI crawlers follow links by reading href values from the HTML. If your navigation uses onClick handlers or client-side routing without real anchor tags, crawlers can't discover your other pages.&lt;/p&gt;
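
&lt;p&gt;You can approximate that discovery step yourself by pulling every href value out of the raw HTML. Links wired up only through onClick handlers carry no href, so they simply never show up. The helper below is a rough illustration, not a real HTML parser:&lt;/p&gt;

```javascript
// Collect every href value present in the raw HTML string, the same
// signal a non-rendering crawler uses to discover pages. A regex is
// only a rough approximation of real link extraction, but it makes
// the point: no href attribute, no discovery.
function extractHrefs(rawHtml) {
  const hrefs = [];
  const pattern = /href="([^"]*)"/g;
  let match;
  while ((match = pattern.exec(rawHtml)) !== null) {
    hrefs.push(match[1]);
  }
  return hrefs;
}
```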

&lt;h2&gt;
  
  
  The Bottom Line
&lt;/h2&gt;

&lt;p&gt;The web runs on JavaScript, but AI crawlers don't. Google's ability to render JavaScript gave developers a false sense of security. The new generation of AI-powered search tools has exposed a gap that was always there — the gap between what a browser sees and what a simple HTTP request returns.&lt;/p&gt;

&lt;p&gt;If your content depends on JavaScript to appear, you're visible to Google (eventually) but invisible to the fastest-growing discovery channel on the internet. The fix isn't complicated. Server-side render your important pages, put critical SEO elements in your initial HTML, and use real links.&lt;/p&gt;

&lt;p&gt;The first step is knowing whether you have a problem. Check your page source. If it's empty, so is your presence in AI search.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>seo</category>
      <category>react</category>
      <category>nextjs</category>
    </item>
    <item>
      <title>What Google Actually Sees on Your JavaScript Site (And Why It Might Surprise You)</title>
      <dc:creator>JSVisible</dc:creator>
      <pubDate>Mon, 02 Mar 2026 18:30:35 +0000</pubDate>
      <link>https://dev.to/jsvisible/what-google-actually-sees-on-your-javascript-site-and-why-it-might-surprise-you-j5c</link>
      <guid>https://dev.to/jsvisible/what-google-actually-sees-on-your-javascript-site-and-why-it-might-surprise-you-j5c</guid>
      <description>&lt;p&gt;You spent months building your React app. The design is polished, the content is solid, and your pages load fast in the browser. But here's the thing — what you see in Chrome is not necessarily what Google sees when it crawls your site.&lt;/p&gt;

&lt;p&gt;If your site relies on JavaScript to render content, there's a gap between the user experience and the search engine experience. Sometimes it's small. Sometimes your entire page content is invisible to Google. And in 2026, the problem just got worse — because AI crawlers like ChatGPT and Perplexity can't render JavaScript at all.&lt;/p&gt;

&lt;p&gt;This post explains what's actually happening under the hood, how to check if your site is affected, and what to do about it.&lt;/p&gt;

&lt;h2&gt;
  
  
  How Google processes JavaScript sites
&lt;/h2&gt;

&lt;p&gt;When Googlebot visits a URL, it doesn't work like your browser. It processes pages in two phases. First, it fetches the raw HTML — the source code before any JavaScript runs. At this point, it extracts links and basic content from whatever is in that initial HTML response. Second, it puts the page into a rendering queue. Eventually, Google's Web Rendering Service fires up a headless Chromium browser, executes your JavaScript, and captures the final rendered page.&lt;/p&gt;

&lt;p&gt;The key word there is "eventually." That rendering queue isn't instant. It can take seconds, hours, or even days depending on Google's resource availability and how many pages it needs to process. During that gap, Google is working with whatever was in your raw HTML — which for many JavaScript apps is essentially an empty shell.&lt;/p&gt;

&lt;p&gt;If you're using a framework like React with client-side rendering, your initial HTML might look something like this:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight html"&gt;&lt;code&gt;&lt;span class="nt"&gt;&amp;lt;div&lt;/span&gt; &lt;span class="na"&gt;id=&lt;/span&gt;&lt;span class="s"&gt;"root"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/div&amp;gt;&lt;/span&gt;
&lt;span class="nt"&gt;&amp;lt;script &lt;/span&gt;&lt;span class="na"&gt;src=&lt;/span&gt;&lt;span class="s"&gt;"/static/js/bundle.js"&lt;/span&gt;&lt;span class="nt"&gt;&amp;gt;&amp;lt;/script&amp;gt;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That's what Google sees on the first pass. No content, no meta tags, no internal links — just an empty div and a script reference. Everything meaningful only appears after JavaScript executes in the second phase.&lt;/p&gt;

&lt;h2&gt;
  
  
  Where things go wrong
&lt;/h2&gt;

&lt;p&gt;The rendering gap creates several common problems that are surprisingly hard to detect because everything looks perfect in your browser.&lt;/p&gt;

&lt;p&gt;Meta tags are the most frequent casualty. If your page title and meta description are set by JavaScript after page load, Google might index your fallback text instead. Ever searched for your site and seen "React App" as the title or "Loading..." as the description? That's this problem in action.&lt;/p&gt;

&lt;p&gt;Internal links are another blind spot. Single-page applications often use JavaScript click handlers for navigation instead of proper anchor tags with href attributes. Google can't follow onClick handlers during the initial HTML crawl — it needs real links to discover your pages. If your navigation is JavaScript-driven, entire sections of your site might not get crawled at all.&lt;/p&gt;

&lt;p&gt;Dynamic content that loads from APIs is particularly vulnerable. Product listings, blog posts, user reviews — anything fetched from an API after page load might not exist in Google's initial view. And even after rendering, if the API call is slow or fails, that content stays invisible.&lt;/p&gt;
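
&lt;p&gt;One way to make that gap concrete: pick the phrases you care about and check which of them exist in the rendered page but not in the raw server response. The helper below is hypothetical, not part of any tool's API:&lt;/p&gt;

```javascript
// Given the raw server HTML and the fully rendered HTML, list the
// phrases that only exist after JavaScript runs, i.e. content that a
// first-pass or non-rendering crawler cannot see.
function phrasesMissingFromRaw(rawHtml, renderedHtml, phrases) {
  return phrases
    .filter((p) => renderedHtml.includes(p))   // actually on the page
    .filter((p) => !rawHtml.includes(p));      // but absent server-side
}
```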

&lt;h2&gt;
  
  
  The AI crawler problem
&lt;/h2&gt;

&lt;p&gt;Here's what makes this urgent in 2026: it's not just about Google anymore.&lt;/p&gt;

&lt;p&gt;AI crawlers — GPTBot (OpenAI), ClaudeBot (Anthropic), PerplexityBot, and others — are now indexing the web to power AI search results, chatbot answers, and content recommendations. Unlike Google, these crawlers don't render JavaScript at all. They fetch your raw HTML and that's it. No rendering queue, no second pass, no JavaScript execution.&lt;/p&gt;

&lt;p&gt;If your content only exists after JavaScript runs, you're completely invisible to AI-powered search. And with AI search growing rapidly, that's an increasingly large chunk of how people discover content online.&lt;/p&gt;

&lt;p&gt;Server-side rendering isn't just a Google optimization anymore. It's becoming a requirement for visibility across an entire ecosystem of crawlers and AI tools.&lt;/p&gt;

&lt;h2&gt;
  
  
  How to check what Google actually sees
&lt;/h2&gt;

&lt;p&gt;The simplest test is to view your page source (not Inspect Element — actual page source). Right-click your page in Chrome and select "View Page Source." This shows you the raw HTML before JavaScript runs. If your main content, titles, and navigation links aren't there, Google is relying on JavaScript rendering to see them.&lt;/p&gt;

&lt;p&gt;You can also disable JavaScript in Chrome DevTools to simulate the first-pass experience. Open DevTools, press Ctrl+Shift+P (or Cmd+Shift+P on Mac), type "Disable JavaScript," and reload the page. If your content disappears, that's exactly what crawlers see before rendering.&lt;/p&gt;

&lt;p&gt;Google Search Console's URL Inspection tool gives you the rendered version, which is useful for confirming what Google eventually sees after the rendering queue. But it doesn't tell you about the delay or show you the first-pass experience.&lt;/p&gt;

&lt;p&gt;For a more thorough check, tools that compare user rendering vs. Googlebot rendering side by side can reveal gaps you'd never catch manually — especially across dozens or hundreds of pages. This is exactly the kind of analysis JSVisible was built for. It renders each page as both a regular user and as Googlebot, captures screenshots from both perspectives, and flags differences automatically across 35+ SEO checks.&lt;/p&gt;

&lt;h2&gt;
  
  
  What to do about it
&lt;/h2&gt;

&lt;p&gt;If your site has JavaScript rendering issues, the fix depends on your framework and how much you can change.&lt;/p&gt;

&lt;p&gt;The gold standard is server-side rendering or static site generation. Frameworks like Next.js, Nuxt, and SvelteKit make this relatively straightforward. With SSR, your server sends fully rendered HTML on every request — no waiting for JavaScript to execute. Google gets your complete content on the first pass, and AI crawlers see everything too.&lt;/p&gt;

&lt;p&gt;If you can't move to SSR, prioritize getting your critical SEO elements into the initial HTML. At minimum, your page title, meta description, canonical URL, and H1 heading should be in the server-rendered HTML — not injected by JavaScript. Many frameworks offer head management utilities that can handle this even in client-side rendered apps.&lt;/p&gt;

&lt;p&gt;For internal links, always use proper anchor tags with href attributes instead of JavaScript click handlers. This ensures Google discovers your links during the HTML crawling phase, not just after rendering.&lt;/p&gt;

&lt;p&gt;Finally, test regularly. Your site changes over time as you add features and update content. A page that renders correctly for Google today might break after the next deployment. Automated scanning that checks rendering differences on a schedule catches regressions before they hurt your rankings.&lt;/p&gt;

&lt;h2&gt;
  
  
  The bottom line
&lt;/h2&gt;

&lt;p&gt;The web has moved to JavaScript-heavy applications, but crawlers haven't fully caught up — and AI crawlers haven't caught up at all. The gap between what your users see and what search engines see is where SEO problems hide. The first step to fixing it is knowing the gap exists.&lt;/p&gt;

&lt;p&gt;If you want to see exactly what Google sees on your JavaScript site, try a free scan at &lt;a href="https://jsvisible.com" rel="noopener noreferrer"&gt;jsvisible.com&lt;/a&gt;. It takes 30 seconds and might surprise you.&lt;/p&gt;

</description>
      <category>javascript</category>
      <category>webdev</category>
      <category>react</category>
      <category>seo</category>
    </item>
  </channel>
</rss>
