Andrés Clúa
Speculation Rules API: Make Your Pages Load Before the User Clicks

Imagine your website could predict where the user is going and have that page ready before they click. That's exactly what the Speculation Rules API does.

It's not magic. It's not a framework. It's a native browser API. And it's stupidly easy to implement.


The Problem

When a user clicks a link, the browser has to:

  1. Request the HTML from the server
  2. Download CSS, JS, images
  3. Parse everything
  4. Render the page

That takes time. Sometimes a little, sometimes a lot. But it always feels slower than it should.

The Solution: Cheat (the good kind)

The Speculation Rules API tells the browser: "Hey, the user will probably go to this page. Get it ready now."

And the browser does it. In the background. Without the user knowing.

When they finally click, the page shows up instantly, with close to zero perceived wait time.


Prefetch vs Prerender

There are two levels:

Prefetch: Downloads only the HTML of the page. Like downloading the blueprints of a house but not building it.

Prerender: Downloads EVERYTHING and renders the full page in the background. Like building the entire house and having it ready when you arrive.

Prerender is more aggressive and uses more resources, but the experience is instant.
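Both can live in the same rule set, so you can prerender the one or two pages that matter most and just prefetch the rest. A minimal sketch (the URLs here are placeholders):

```html
<script type="speculationrules">
{
  "prerender": [{ "urls": ["/pricing"] }],
  "prefetch":  [{ "urls": ["/blog", "/docs"] }]
}
</script>
```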


How to Use It

It's a <script> tag with type="speculationrules" and JSON inside. No NPM, no imports, no config files.

Option 1: Specific URLs

If you know exactly which pages you want to pre-load:

<script type="speculationrules">
{
  "prerender": [
    {
      "urls": ["/about", "/work", "/contact"]
    }
  ]
}
</script>

This tells the browser: "Prerender /about, /work and /contact immediately."

Option 2: Automatic Rules (document rules)

This is where it gets interesting. Instead of listing URLs by hand, you tell the browser to decide based on the links it finds on the page:

<script type="speculationrules">
{
  "prefetch": [
    {
      "source": "document",
      "where": {
        "and": [
          { "href_matches": "/*" },
          { "not": { "href_matches": "*.pdf" } },
          { "not": { "selector_matches": ".no-prefetch" } }
        ]
      },
      "eagerness": "conservative"
    }
  ]
}
</script>

Translation: "Prefetch all internal links on the page, except PDFs and links with the .no-prefetch class, but only when the user starts clicking."


Eagerness: How Anxious Do You Want It to Be?

Controls when the browser starts pre-loading:

  • immediate: speculates right away, as soon as the rule is seen
  • eager: currently the same as immediate
  • moderate: waits for 200ms of hover over the link
  • conservative: waits for the click to start (mousedown/touchstart)

conservative is the safest to start with. Only pre-loads when the user is already clicking, so you don't waste resources. My recommendation if you're unsure.

moderate is the sweet spot. 200ms of hover is enough to have the page ready by the time the click lands.

immediate is for when you're certain the user will go there. Use it with specific URLs, not document rules (or you'll prerender everything).
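Eagerness can also be set per rule, so you can mix strategies in a single script tag. A sketch (the URLs and the pattern are made up for illustration): the checkout page is prerendered immediately, while product links are only prerendered on hover.

```html
<script type="speculationrules">
{
  "prerender": [
    { "urls": ["/checkout"], "eagerness": "immediate" },
    {
      "source": "document",
      "where": { "href_matches": "/products/*" },
      "eagerness": "moderate"
    }
  ]
}
</script>
```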


Useful Filters

By URL pattern

{ "href_matches": "/work/*" }

Only links starting with /work/.

By CSS selector

{ "selector_matches": ".prerender-this" }

Only links with that class.

Exclude pages

{ "not": { "href_matches": "/logout" } }

Important: Always exclude routes with side effects. If you prerender /logout, the user gets logged out without clicking. Not kidding.

Combine conditions

{
  "and": [
    { "href_matches": "/*" },
    { "not": { "href_matches": "/api/*" } },
    { "not": { "selector_matches": "a[rel~='nofollow']" } }
  ]
}

Add It Dynamically with JS

If you need to add rules after the page loads:

const rules = {
  prerender: [
    {
      urls: ["/next-page"]
    }
  ]
};

const script = document.createElement("script");
script.type = "speculationrules";
script.textContent = JSON.stringify(rules);
document.body.append(script);

Useful if you want to prerender the "next page" based on some user data.
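A slightly more reusable sketch: a helper that replaces any previously injected rules instead of piling up script tags. The helper name and the `id` convention are my own, not part of the API, and `doc` is a parameter only so the logic can be exercised outside a browser:

```javascript
// Inject (or replace) a dynamically generated speculation-rules script.
function injectSpeculationRules(doc, rules, id = "dynamic-speculation-rules") {
  // Remove a previously injected rule set, if any
  const existing = doc.getElementById(id);
  if (existing) existing.remove();

  const script = doc.createElement("script");
  script.type = "speculationrules";
  script.id = id;
  script.textContent = JSON.stringify(rules);
  doc.body.append(script);
  return script;
}
```

In a real page you'd call it as `injectSpeculationRules(document, { prerender: [{ urls: ["/next-page"] }] })`.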


Check Browser Support

if (
  HTMLScriptElement.supports &&
  HTMLScriptElement.supports("speculationrules")
) {
  console.log("Speculation Rules supported");
}

Detect If a Page Was Prerendered

if (document.prerendering) {
  console.log("This page is being prerendered");
}

document.addEventListener("prerenderingchange", () => {
  console.log("User just navigated to this prerendered page");
});

Useful if you want to defer analytics or other actions until the user actually sees the page.
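The deferral pattern can be wrapped in a small helper. This is a sketch, not an official API; `doc` defaults to the real `document` and is a parameter only to make the logic testable:

```javascript
// Run `callback` once the page is actually visible to the user:
// immediately on a normal navigation, or on activation of a prerendered page.
function runWhenActivated(callback, doc = document) {
  if (doc.prerendering) {
    // Still prerendering: wait for the user to actually navigate here
    doc.addEventListener("prerenderingchange", () => callback(), { once: true });
  } else {
    // Normal navigation (or prerendering unsupported): run right away
    callback();
  }
}
```

Typical use: `runWhenActivated(() => sendPageview())`, where `sendPageview` is whatever your analytics tool calls a pageview.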


Things to Keep in Mind

Browser Limits

Chrome caps how many pages you can pre-load at once:

  • immediate/eager: up to 50 prefetches, 10 prerenders
  • moderate/conservative: up to 2 of each

Don't go crazy prerendering 100 pages. The browser will just ignore the extras.

Resource Consumption

Prerender uses bandwidth, CPU and battery. Chrome automatically disables it if:

  • The device is in power saver mode
  • Battery is low

Content Can Go Stale

If you prerender a page and the user takes 5 minutes to click, the content might have changed. For pages with real-time data, use prefetch instead of prerender.

Extensions

Some browser extensions, uBlock Origin among them, can disable preloading entirely. Keep that in mind when measuring impact.

Deferred APIs

Some APIs (Geolocation, Notifications, Storage) are delayed until the page is actually activated. They won't fire during prerender.


Debug in DevTools

  1. Open Chrome DevTools
  2. Go to Application > Background Services > Speculative Loads
  3. Reload the page
  4. You'll see which pages are being prerendered/prefetched and any errors

Browser Support

  • Chrome: Yes (since 2024)
  • Edge: Yes
  • Firefox: No
  • Safari: No

For Firefox and Safari, the <script type="speculationrules"> tag is simply ignored. It doesn't break anything. Pure progressive enhancement.


Now Here's the Plot Twist

Everything above is cool. But if you hardcode your speculation rules and forget about them, you're leaving performance on the table.

The real power of this API is that it's just JSON. And JSON can be generated. Dynamically. From data.

What data? Your analytics.

Think about it. Your analytics already know:

  • Which pages users visit most
  • What the most common navigation paths are
  • Which links get the most clicks on each page
  • How those patterns change over time

So instead of guessing which pages to prerender, you can know.

The Loop

Here's the workflow:

Week 1: You deploy speculation rules based on gut feeling. Prerender /about and /work because they seem important.

Week 2: You check analytics. Turns out 73% of homepage visitors go to /work first, then /work/project-x. Nobody clicks /about from the homepage. Now you know what to actually prerender.

Week 3: Traffic patterns shifted. A blog post went viral and now /play is getting 5x the traffic. Your speculation rules should reflect that.

This isn't a "set it and forget it" feature. It's a feedback loop.

How to Build It

The simplest version: a script that runs weekly (cron job, CI pipeline, whatever) that:

  1. Pulls your top navigation paths from Google Analytics, Plausible, or whatever you use
  2. Generates a JSON with the speculation rules
  3. Deploys it as a static file or injects it at build time

// build-speculation-rules.js
// Run this weekly via CI/cron

async function generateRules() {
  // 1. Fetch top navigation paths from your analytics
  const topPaths = await getTopPathsFromAnalytics();
  // e.g., [{ from: "/", to: "/work", percentage: 73 }, ...]

  // 2. Build rules per page
  const rulesPerPage = {};

  for (const path of topPaths) {
    if (!rulesPerPage[path.from]) {
      rulesPerPage[path.from] = [];
    }

    rulesPerPage[path.from].push({
      url: path.to,
      eagerness: path.percentage > 60 ? "moderate" : "conservative"
    });
  }

  // 3. Return the rules (the build step writes them to a static JSON file)
  return rulesPerPage;
}

Then in your template/layout:

// speculationData is the JSON produced by the weekly build step
const currentPageRules = speculationData[currentPath] || [];

const rules = {
  prerender: currentPageRules
    .filter(r => r.eagerness === "moderate")
    .map(r => ({ urls: [r.url], eagerness: "moderate" })),
  prefetch: currentPageRules
    .filter(r => r.eagerness === "conservative")
    .map(r => ({ urls: [r.url], eagerness: "conservative" }))
};

The Eagerness Trick

Here's where it gets smart. Use analytics to decide eagerness too:

  • More than 60% of users navigate there? Use moderate (speculate on hover)
  • Between 20% and 60%? Use conservative (speculate on mousedown)
  • Less than 20%? Don't bother

You're not wasting resources prerendering pages nobody visits. And you're aggressively prerendering the ones everybody does.
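The thresholds above boil down to a one-liner you can drop into the build script. The 60%/20% cutoffs are the ones from this article, not magic numbers; tune them against your own traffic:

```javascript
// Map a navigation probability (0–100) to an eagerness level, or null
// for "don't speculate on this link at all".
function eagernessFor(percentage) {
  if (percentage > 60) return "moderate";      // very likely: speculate on hover
  if (percentage >= 20) return "conservative"; // plausible: speculate on mousedown
  return null;                                 // unlikely: skip it
}
```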

Measure the Impact

Once you have this loop running, track:

  • LCP (Largest Contentful Paint) for pages that were prerendered vs not
  • Navigation timing using the Performance API
  • Hit rate: how often a prerendered page was actually visited
const navEntry = performance.getEntriesByType("navigation")[0];

if (navEntry.activationStart > 0) {
  // This page was prerendered!
  console.log("Prerender saved:", navEntry.activationStart, "ms");
}

Feed that data back into the loop. Drop pages with low hit rates. Promote pages with high navigation probability. Every week your speculation rules get smarter.
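The hit-rate part is simple arithmetic once you log both sides. A sketch, assuming you collect two arrays of URL strings yourself (the function and data shape are my own, not part of any API):

```javascript
// Of the URLs we speculated on, what fraction did users actually visit?
function speculationHitRate(speculatedUrls, visitedUrls) {
  if (speculatedUrls.length === 0) return 0;
  const visited = new Set(visitedUrls);
  const hits = speculatedUrls.filter((url) => visited.has(url)).length;
  return hits / speculatedUrls.length;
}
```

A low hit rate means you're speculating on pages nobody opens: drop those rules or lower their eagerness in the next weekly build.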


Summary

  1. Add a <script type="speculationrules"> to your HTML
  2. Define which pages to pre-load and with what eagerness
  3. Your pages load instantly
  4. No libraries, no frameworks, no weird stuff
  5. Browsers that don't support it just ignore it
  6. Connect it to your analytics and update weekly -- that's where the real value is

Static speculation rules are a quick win. Analytics-driven speculation rules are a compounding advantage. Every week your site gets faster because it gets smarter about what to pre-load.

It's free, it's easy, and it makes a real difference. There's no reason not to use it.


If this helped, drop a like and follow for more web performance content.
