Hugo Campañoli

Posted on • Originally published at campa.dev

Why Astro 6's 0kb JS is the Ultimate Enterprise SEO Solution

Crawl budget is the scarcest resource in enterprise SEO. Google defines it as "the number of URLs Googlebot can and wants to crawl", and that number has a hard efficiency ceiling.

Sites with 100k+ URLs that rely on JS-heavy frameworks (Next.js, Nuxt) often burn critical crawl capacity in Google's Web Rendering Service (WRS), the infrastructure that executes JavaScript. Its processing queue can delay rendering by days or even weeks.

Astro 6 solves this by design: pure static HTML, a 0kb JS runtime by default, and an islands architecture that preserves rendering fidelity.
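To make the islands idea concrete, here is a minimal sketch of a hypothetical Astro page (the file and component names are invented for illustration): the page ships as static HTML with zero JS by default, and only the component carrying a `client:*` directive hydrates in the browser.

```astro
---
// src/pages/product.astro (hypothetical). Everything here renders to
// static HTML at build time; no JS is shipped unless explicitly requested.
import ProductDetails from '../components/ProductDetails.astro'; // static, 0kb JS
import AddToCart from '../components/AddToCart.jsx';             // interactive island
---
<ProductDetails />
<!-- Only this island hydrates, and only once it scrolls into view: -->
<AddToCart client:visible />
```

Everything outside the island reaches Googlebot (and non-rendering AI bots) as plain HTML during the crawl pass.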


The Cost of JavaScript Rendering

Googlebot goes through two passes to index your content:

  1. Crawling: it downloads the HTML.
  2. Rendering: it executes the JS in the WRS.

The second pass is an asynchronous queue. If your content is hidden behind JS, Googlebot won't "see" it until its turn comes up in the queue. This creates "Zombie Pages": URLs that are discovered but have incomplete content in the index.
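The two passes above can be sketched with a toy model (illustrative only; the real WRS queue is asynchronous and far more complex, and the page objects below are invented):

```javascript
// Toy model of Googlebot's two-pass indexing.
function crawl(page) {
  // Pass 1: index whatever text is in the raw HTML, tags stripped.
  return { url: page.url, indexedText: page.rawHtml.replace(/<[^>]+>/g, "").trim() };
}

function render(page) {
  // Pass 2 (queued for days or weeks in reality): JS runs, content appears.
  return { url: page.url, indexedText: page.contentAfterJs };
}

// A client-rendered page: the raw HTML is an empty app shell.
const csrPage = {
  url: "/product/42",
  rawHtml: '<div id="app"></div>',
  contentAfterJs: "Acme Widget - $19.99",
};

// A static (Astro-style) page: content is already in the HTML.
const staticPage = {
  url: "/product/42",
  rawHtml: "<h1>Acme Widget - $19.99</h1>",
  contentAfterJs: "Acme Widget - $19.99",
};

// Until the render queue catches up, the CSR page is a "zombie":
console.log(crawl(csrPage).indexedText);    // ""
console.log(crawl(staticPage).indexedText); // "Acme Widget - $19.99"
```

The static page is fully indexed in pass 1; the CSR page sits in the index with empty content until pass 2 finally runs.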

The 5 Infrastructure Gates

Jason Barnard describes the pipeline as 5 sequential gates:
Discovery → Selection → Crawling → Rendering → Indexing.

With Astro 6, the rendering gate is practically bypassed. The content is already in the HTML that Googlebot downloads during the Crawling phase.

| Feature | Legacy (SSR/CSR) | Astro 6 (SSG/Islands) |
| --- | --- | --- |
| JS Runtime | 150-400kb (typical) | 0kb (default) |
| Infra Gates | 5 Gates (Discovery to Index) | 4 Gates (Skips Rendering) |
| Render Fidelity | Variable (Depends on WRS) | 100% (HTML is the truth) |
| AI Bot Visibility | Partial | Complete |

The AI Factor: Bots don't execute JS

There is an angle most people ignore: AI bots don't execute JavaScript.

Major search engines (Google, Bing) can render JavaScript. But Perplexity, smaller AI agents, and training crawlers mostly work from the initial HTML response. If your content depends on client-side rendering, then to these bots it doesn't exist.
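A crude way to reason about this: a bot that never executes JS only ever sees the raw HTML response. Here is a sketch of such a visibility check (not a real crawler; the function name and HTML strings are made up for illustration):

```javascript
// Simulate what a non-rendering bot can "see" in a raw HTML response:
// drop <script> bodies and markup, then look for the expected content.
function visibleToNoJsBot(rawHtml, phrase) {
  const text = rawHtml
    .replace(/<script[\s\S]*?<\/script>/gi, "")
    .replace(/<[^>]+>/g, " ");
  return text.includes(phrase);
}

const csrShell = '<div id="root"></div><script>renderApp()</script>';
const staticHtml = "<main><h1>Enterprise SEO Guide</h1></main>";

console.log(visibleToNoJsBot(csrShell, "Enterprise SEO Guide"));   // false
console.log(visibleToNoJsBot(staticHtml, "Enterprise SEO Guide")); // true
```

The CSR shell contains nothing but an empty mount point, so a no-JS bot finds nothing to index; the static page exposes its content directly.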


Use Cases: Astro 6 vs Next.js

It's not about which framework is "better," but which is right for the job.

✅ Use Astro 6 for (Content is King):

  • E-commerce with large catalogs (100k+ products).
  • Directories and Marketplaces with static content.
  • Technical documentation portals.
  • Enterprise landing pages at scale.

❌ Use Next.js for (Logic is King):

  • Real-time data dashboards.
  • Apps with complex route-based authentication.
  • Collaborative tools (editors, whiteboards).
  • SaaS platforms with heavy client-side business logic.

Conclusion: Ockham's Razor for SEO

The simplest solution is usually the right one. If Googlebot isn't indexing all your URLs, the answer isn't more servers or more complex SSR. The answer is less JavaScript.

In 2026, with AI bots becoming a major source of digital visibility, static HTML is no longer just a performance optimization — it's a multi-bot visibility strategy.

