Reducing Request Counts in Static Astro Sites: A Guide for Heavy Websites

Hello, I'm Maneshwar. I'm working on FreeDevTools, currently building *one place for all dev tools, cheat codes, and TLDRs*: a free, open-source hub where developers can quickly find and use tools without the hassle of searching all over the internet.

If you run a highly SEO-optimized static site built with Astro, chances are you’ve noticed one problem: the sheer number of HTTP requests per page.

Even if your site is blazing fast, bots and crawlers can multiply requests, causing CDN overages, bloated analytics, and higher hosting costs.

In this guide, we’ll explore why this happens, how to analyze it, and practical techniques to reduce request counts drastically without compromising SEO or user experience.

Why Request Counts Explode

A single static page can generate 30–50 requests for:

  • HTML/Document
  • CSS files
  • JavaScript bundles
  • Fonts
  • Images
  • SVG icons
  • Miscellaneous files (favicon, manifest, etc.)

For listicle-heavy pages or icon-heavy pages, this can exceed 100 requests per page.

Now imagine a highly-crawled site:

  • 130,000 pages
  • Multiple SEO crawlers (Googlebot, Bingbot, AhrefsBot, etc.)
  • Each crawler fetching every asset individually

This can easily exceed 10 million requests in a week (130,000 pages × ~40 assets per page × a couple of full crawls ≈ 10 million), even if actual users number only in the thousands.

Step 1: Analyze Your Request Pattern

Before reducing requests, identify which assets are causing the explosion:

  1. Use the browser dev tools Network tab to count per-page requests (curl -I is handy for spot-checking individual asset headers).
  2. Track list pages separately — these often trigger many SVG/image requests.
  3. Check bot traffic via analytics and logs to see which crawlers fetch your pages most aggressively.
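
For item 3, a short Node.js script over your access logs makes the breakdown obvious. A hypothetical sketch, assuming a combined-format log at ./access.log; the bot list is illustrative:

// count-bots.mjs — tally requests per crawler from an access log (sketch)
import { readFileSync } from 'node:fs';

const lines = readFileSync('./access.log', 'utf-8').split('\n');
const bots = ['Googlebot', 'Bingbot', 'AhrefsBot', 'SemrushBot', 'MJ12bot'];
const counts = Object.fromEntries(bots.map((bot) => [bot, 0]));

for (const line of lines) {
  for (const bot of bots) {
    if (line.includes(bot)) counts[bot]++;
  }
}

console.table(counts); // run with: node count-bots.mjs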

Typical request breakdown (average requests per page):

  • HTML / Docs: 2
  • CSS: 3
  • JS: 10–15
  • Images: 5–10
  • Fonts: 1
  • SVGs: 20–100 (on listicle pages)
  • Misc: 2

Observation: SVGs and JS/CSS bundles are the biggest contributors.

Step 2: Reduce Requests in Astro

1. Inline or Sprite SVGs

  • Problem: Each SVG file is a separate request. A page with 100 icons = 100 requests.
  • Solution:

    • Inline SVGs directly in HTML.
    <svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 24 24">
      <path d="..." />
    </svg>
    
    • Use SVG sprite sheets to combine multiple icons into a single file:
    <svg><use href="/icons.svg#icon-id"></use></svg>
    

Impact: Cuts dozens to hundreds of requests on icon-heavy pages.
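
If you prefer keeping each icon as a separate source file, you can still avoid per-icon requests by reading the file in an Astro component's frontmatter so it gets inlined at build time. A minimal sketch (the Icon.astro component and the src/icons/ directory are assumptions, not part of any existing setup):

---
// src/components/Icon.astro (hypothetical) — reads the SVG once at build time
// and inlines it, so the rendered page makes no extra icon request.
import { readFile } from 'node:fs/promises';

const { name } = Astro.props;                                  // e.g. <Icon name="github" />
const svg = await readFile(`src/icons/${name}.svg`, 'utf-8');  // assumed icon directory
---
<Fragment set:html={svg} />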

2. Merge JS and CSS Bundles

Astro + Vite often split JS and CSS into many small files, each of which is a separate request.

  • Problem: 10–15 JS files + 2–3 CSS files per page means 15–20 requests just for code, before a single image loads.
  • Solution:

    • Merge JS/CSS during build using Vite Rollup options:
    // astro.config.mjs
    import { defineConfig } from 'astro/config';
    
    export default defineConfig({
      vite: {
        build: {
          rollupOptions: {
            output: { manualChunks: undefined } // disable extra chunk splitting so fewer, larger bundles are emitted
          }
        }
      }
    });
    
  • Optional: inline critical CSS to avoid separate CSS requests, e.g. via Astro's build.inlineStylesheets option (sketch below).
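
Astro has a built-in build.inlineStylesheets option for this; a minimal sketch:

// astro.config.mjs — inline stylesheets instead of emitting separate CSS files
import { defineConfig } from 'astro/config';

export default defineConfig({
  build: {
    inlineStylesheets: 'always' // 'auto' inlines only small stylesheets
  }
});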

Impact: Cuts 10–20 requests per page.

3. Lazy-Load Non-Critical Assets

  • Only load images, icons, or widgets when visible using:
  <img src="/image.png" loading="lazy" alt="..."/>
  • Use IntersectionObserver for client-side components (see the sketch after this list).
  • Effect: for crawlers and users who never scroll, these requests are skipped entirely.
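
A minimal IntersectionObserver sketch for deferring a heavy widget (the #comments placeholder and /widgets/comments.js path are hypothetical):

// Load a heavy widget's script only once its placeholder scrolls into view.
const placeholder = document.querySelector('#comments');   // assumed placeholder element
const observer = new IntersectionObserver((entries, obs) => {
  if (entries.some((entry) => entry.isIntersecting)) {
    const script = document.createElement('script');
    script.src = '/widgets/comments.js';                    // assumed widget bundle
    document.body.appendChild(script);
    obs.disconnect();                                       // load once, then stop observing
  }
});
observer.observe(placeholder);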

4. Serve Bot-Light Pages

  • Problem: SEO bots crawl every asset, including unnecessary icons and JS.
  • Solution: Detect bots via User-Agent and serve minimal HTML:

    • Only critical CSS + essential HTML
    • Skip SVGs and optional JS
// ua = the request's User-Agent header, read at your edge/server layer
if (ua.includes("Googlebot") || ua.includes("Bingbot")) {
  // Serve a minimal HTML snapshot instead of the full asset-heavy page
}

Impact: Reduces bot requests per page by 50–80%.
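
On a purely static deployment the User-Agent check has to happen at the CDN or edge layer. A hypothetical Cloudflare-Worker-style sketch (the /bot-light/ snapshot path is an assumed convention, not something Astro generates for you):

// Hypothetical edge worker: route known crawlers to pre-rendered lightweight
// snapshots while regular visitors get the full page.
export default {
  async fetch(request) {
    const ua = request.headers.get('user-agent') || '';
    const isBot = /Googlebot|Bingbot|AhrefsBot|SemrushBot/i.test(ua);

    const url = new URL(request.url);
    if (isBot) {
      url.pathname = `/bot-light${url.pathname}`; // assumed snapshot location
    }
    return fetch(new Request(url.toString(), request));
  }
};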

5. Reduce Page Explosion

  • Merge similar content pages (tags, subcategories) to reduce total pages crawled.
  • Use canonical tags for duplicate content.
  • Limit sitemap.xml to your most valuable pages; low-value pages can still be crawled via internal links but won't trigger aggressive discovery (see the sketch below).
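
If you use the @astrojs/sitemap integration, its filter option is one way to keep low-value pages out of the sitemap. A sketch, assuming tag and subcategory pages live under /tags/:

// astro.config.mjs
import { defineConfig } from 'astro/config';
import sitemap from '@astrojs/sitemap';

export default defineConfig({
  site: 'https://example.com', // required by @astrojs/sitemap
  integrations: [
    sitemap({
      // assumed convention: low-value tag/subcategory pages live under /tags/
      filter: (page) => !page.includes('/tags/')
    })
  ]
});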

Step 3: Bot-Friendly Optimizations

Even without blocking bots, you can control their request footprint:

  • robots.txt crawl-delay for aggressive scrapers (Ahrefs, MJ12, Semrush); see the example after this list.
  • Exclude low-value crawlers from hitting list pages unnecessarily.
  • Pre-render static snapshots to reduce JS execution requests for bots.
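
A robots.txt example for the crawl-delay point (Googlebot ignores Crawl-delay, but Bing and most SEO crawlers such as AhrefsBot, SemrushBot, and MJ12bot honor it; the 10-second value is just an illustration):

# robots.txt — slow down aggressive SEO crawlers (values are illustrative)
User-agent: AhrefsBot
Crawl-delay: 10

User-agent: SemrushBot
Crawl-delay: 10

User-agent: MJ12bot
Crawl-delay: 10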

Key idea: Give bots the minimal HTML they need, not the full interactive page.

Step 4: Quick Wins

Expected request reduction per action:

  • Inline SVGs: 50–90% on icon-heavy pages
  • Merge JS/CSS: 10–20 fewer requests per page
  • Bot-light HTML: 50–80% per bot page load
  • Lazy-load assets: 5–50%, depending on content
  • Reduce page count / canonicalize: long-term reduction

Step 5: Implementation Tips

  • Use Vite plugins like vite-plugin-svg-icons to generate SVG sprites automatically at build time.
  • Configure Astro to avoid unnecessary client hydration (client:load, client:visible) on components that can stay static; see the example after this list.
  • Precompute frequently used components/icons at build-time.
  • Audit list pages and prioritize merging assets and inline icons.
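
For the hydration point above, a quick illustration of the difference (Card and SearchBox are hypothetical components):

<!-- Rendered to plain HTML at build time: ships no JavaScript -->
<Card title="JSON Formatter" />

<!-- Hydrates only when it scrolls into view, deferring its JS request -->
<SearchBox client:visible />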

Conclusion

High request counts in static Astro sites are primarily caused by:

  1. Asset-heavy pages (SVGs, JS, CSS)
  2. Crawlers fetching every page and asset
  3. Splintered bundles and per-page requests

By combining Astro-level optimizations (inline SVGs, merge JS/CSS, lazy-load assets) and bot-focused techniques (bot-light HTML, sitemap control, crawl-delay), you can reduce CDN request counts by 70–90%, drastically cutting costs without harming SEO or UX.

FreeDevTools

I've been building FreeDevTools.

A collection of UI/UX-focused tools crafted to simplify workflows, save time, and reduce the friction of searching for tools and materials.

Feedback and contributions are welcome!

It’s online, open-source, and ready for anyone to use.

👉 Check it out: FreeDevTools
⭐ Star it on GitHub: freedevtools
