Rohan Kokkula

Why Your API Is Taking a World Tour (And How to Stop It)

Your user in Mumbai clicks a button. They’re expecting a quick JSON payload to render a product card.

What they don’t see is that their request is boarding a digital flight, hopping through Singapore, cruising undersea cables across the Pacific, and finally knocking on the door of an AWS server in North Virginia.

The API rubs its eyes, wakes up, hits the database, assembles some JSON, adds the right headers, and starts the return trip, all in under 750 milliseconds.

In the world of fast-moving React apps, every millisecond counts. A user landing on a page should see meaningful content before they blink. This isn't just about rendering components quickly; it's about delivering content intelligently across the globe.

You can build the fastest React components in the world, but if your content loads like it's coming from a sleepy dial-up modem in 2004, users are gone before the useEffect even fires.

At Contentstack, we believe frontend performance isn't just about rendering. It's about delivering content at light speed, across the globe, on demand and from the edge.

Let’s explore the how and the why.

The Problem: Components are Fast. Content Isn’t.

React gave developers superpowers: reusable UI, fast state updates, modular design. You build a beautifully optimized <ProductCard /> component. It's got all the right props: title, price, imageUrl, cta.

But something broke down when it came to content.

Where do those props come from? Often, an API call to some backend that’s far, far away.

When your component is ready to render but waiting on an API call to Europe, all your shiny performance promises collapse.

That’s the bottleneck. And it’s what CDNs are born to solve.

Latency is not just a backend problem. It’s a storytelling problem for your brand.

Every user request is routed to a single origin server. Whether your user is in New York or Nairobi, they all wait for a roundtrip to the same data center.

Average wait time? 2 to 3 seconds. Unacceptable in the age of AI agents, real-time video, and hyper-personalization.

A CDN is Basically a Content Teleporter

A Content Delivery Network (CDN) is a globally distributed network of servers that cache content close to users. Instead of hitting your origin server every time, users get data from the nearest cache server.

If your backend is in Virginia but your user is in Mumbai, that’s a 300ms roundtrip just for the data. If that content is cached on a server in Delhi, it’s there in 20ms.

“Think of it as keeping your favorite snacks in every city, instead of shipping them from one warehouse.”

Think Like a Gamer

Imagine playing Valorant. If your shots fire 2 seconds after you click, you're toast. Contentstack engineers think of page rendering like a high-stakes FPS. Every delay kills the experience.

CDNs work like global respawn points — fast, local, and always ready.

Understanding Caching: Hits, Misses, and Headers

Every time a request is served, the server responds with a cache header — a little note saying whether the content came from cache or not.

  • X-Cache: HIT or X-Cache: MISS (did this response come from cache, or from origin?)
  • X-Cache-Hits: 5 (the cached copy has been served five times)
  • X-Cache-Hits: 0 (nothing came from cache; the origin did the work)

Headers like these tell you whether, and how often, a cached version was served. The goal is to maximize HITs and minimize MISSes: every MISS means your user waited for the origin to generate the content.

How We Maximize Hits

  • We use long-lived cache headers for static content (images, styles, scripts).
  • For dynamic content, we tag responses with ETags and Last-Modified headers.
  • We leverage stale-while-revalidate strategies to ensure freshness without waiting.

“Cache is a promise that your content will be fast, as long as your strategy is smart.”
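To make the last two bullets concrete, here's a minimal sketch of a dynamic API route that sends validators alongside a short cache lifetime. The route name, the entry shape, and the hashing choice are illustrative assumptions, not part of any Contentstack API.

// /pages/api/article.js (illustrative route, not from the article)

import crypto from 'crypto';

export default function handler(req, res) {
  // Pretend this came from your CMS
  const article = { id: 42, title: "Edge Caching 101", updatedAt: "2024-01-15T10:00:00.000Z" };
  const body = JSON.stringify(article);

  // Validators let clients and CDNs revalidate cheaply instead of re-downloading
  const etag = `"${crypto.createHash('sha1').update(body).digest('hex')}"`;
  res.setHeader("ETag", etag);
  res.setHeader("Last-Modified", new Date(article.updatedAt).toUTCString());

  // Short TTL, generous stale window: serve stale instantly, refresh in the background
  res.setHeader("Cache-Control", "public, max-age=60, stale-while-revalidate=300");

  // If the cached copy is still valid, answer with 304 and no body
  if (req.headers["if-none-match"] === etag) {
    return res.status(304).end();
  }

  res.status(200).json(article);
}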

Step 1: Make Your API Cacheable

First, you need your responses to say: “Hey CDN, feel free to save this and reuse it.”

You do that using cache headers:

// /pages/api/product.js (Next.js API Route)

export default function handler(req, res) {
  const product = { id: 1, title: "Lemonade", price: "$5" };

  res.setHeader("Cache-Control", "public, max-age=3600, stale-while-revalidate=59");

  res.status(200).json(product);
}

What does this do?

  • public: Anyone can cache this response
  • max-age=3600: Cache it for 1 hour
  • stale-while-revalidate=59: Even after expiry, serve stale for 59s while refreshing in background

CDNs like Cloudflare, Akamai, and AWS CloudFront love these headers.
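The first bullet in "How We Maximize Hits" (long-lived headers for static content) can be handled once in the Next.js config instead of per route. A hedged sketch, assuming your images live under /images; adjust the source pattern to your project:

// next.config.js (sketch; the /images path is an assumption)

module.exports = {
  async headers() {
    return [
      {
        source: "/images/:path*",
        headers: [
          // A year-long, immutable lifetime is safe for static, fingerprinted files
          { key: "Cache-Control", value: "public, max-age=31536000, immutable" },
        ],
      },
    ];
  },
};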

Step 2: Confirm You’re Actually Cached

Once you've deployed, check the response headers. You'll often see something like:

  • X-Cache: HIT
  • X-Cache-Hits: 12

If you see MISS, you’re going to origin. If you see HIT, you’re winning.

Here's how to inspect it from the command line:

curl -I https://yourdomain.com/api/product

Or check in DevTools → Network → Headers.
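If you'd rather script the check than eyeball headers, here's a small helper (not part of any CDN tooling, just Node 18+ and its global fetch) that prints the interesting ones:

// check-cache.mjs — run with: node check-cache.mjs https://yourdomain.com/api/product

const url = process.argv[2] || "https://yourdomain.com/api/product";
const res = await fetch(url);

// Print the headers that tell the caching story
for (const name of ["cache-control", "age", "etag", "x-cache", "x-cache-hits"]) {
  console.log(`${name}: ${res.headers.get(name) ?? "(not set)"}`);
}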

Step 3: Use Next.js Middleware for Smart Routing at the Edge

Let’s say you want to show a different version of a product page to users in the US vs. Europe — without compromising cacheability.

Enter Next.js Middleware, which runs at the edge, before a page is rendered.

// middleware.js

import { NextResponse } from 'next/server';

export function middleware(request) {
  // request.geo is populated at the edge by the hosting platform (e.g., Vercel)
  const region = request.geo?.country || 'US';

  const url = request.nextUrl.clone();
  url.pathname = `/${region}${url.pathname}`;

  return NextResponse.rewrite(url);
}

Now /product/1 becomes /US/product/1 or /DE/product/1, which can be separately cached.

This is how we geo-optimize delivery at Contentstack while keeping our cache HIT rates high.
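For the rewrite to land somewhere, the app needs a route that includes the region segment. A minimal sketch, assuming a pages-router layout; the file path and rendering are illustrative:

// /pages/[region]/product/[id].js (hypothetical page receiving the rewritten path)

import { useRouter } from 'next/router';

export default function Product() {
  // The middleware rewrite fills in region (e.g. "US" or "DE") and id
  const { region, id } = useRouter().query;
  return <h1>Product {id} ({region} edition)</h1>;
}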

Step 4: Invalidate What You Cache (When You Must)

Caching is great, right up until a typo gets cached across 100 PoPs (points of presence) worldwide.

So how do we invalidate cache intelligently?

With Contentstack Webhooks

When content changes, we trigger a webhook that purges only what’s needed:

POST /api/purge-cache

{ "paths": ["/blog/my-latest-post", "/api/blog/my-latest-post"]

}

Then in Next.js:


// /pages/api/purge-cache.js

import fetch from 'node-fetch';

export default async function handler(req, res) {
  const { paths } = req.body;

  for (const path of paths) {
    // Purge each path at the CDN (the endpoint here is a generic placeholder)
    await fetch(`https://api.your-cdn.com/purge?path=${encodeURIComponent(path)}`, {
      method: 'POST',
      headers: { Authorization: `Bearer ${process.env.CDN_API_KEY}` }
    });
  }

  res.status(200).json({ message: "Cache purged!" });
}
This is precise, fast, and doesn’t blow away your entire cache.
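One caution worth adding: anyone who can reach /api/purge-cache can purge your cache. A hedged sketch of guarding the route with a shared secret; the header name and environment variable are assumptions, not Contentstack specifics:

// /pages/api/purge-cache.js (excerpt; x-purge-secret and PURGE_WEBHOOK_SECRET are assumed names)

export default async function handler(req, res) {
  if (req.method !== "POST") {
    return res.status(405).json({ message: "Method not allowed" });
  }

  // Reject calls that don't carry the shared secret configured in the webhook
  if (req.headers["x-purge-secret"] !== process.env.PURGE_WEBHOOK_SECRET) {
    return res.status(401).json({ message: "Unauthorized" });
  }

  // ...then loop over req.body.paths and purge, as shown above
  res.status(200).json({ message: "Cache purged!" });
}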

Bonus: Use ISR for the Best of Both Worlds

If you're using Next.js with Contentstack, turn on Incremental Static Regeneration:

export async function getStaticProps() {
  const data = await fetchYourContentstackEntry();

  return {
    props: { data },
    revalidate: 60 // Regenerate this page in the background, at most once every 60 seconds
  };
}

“Cache this until it maybe gets stale. Then gently refresh it in the background.”
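ISR also pairs nicely with the webhook approach from Step 4: instead of (or alongside) purging the CDN, you can regenerate specific pages on demand. A sketch using Next.js on-demand revalidation (available in the pages router since 12.2); the secret name is an assumption, and the path reuses the blog example above:

// /pages/api/revalidate.js (sketch of on-demand ISR revalidation)

export default async function handler(req, res) {
  // The webhook must present the same secret configured in the environment
  if (req.query.secret !== process.env.REVALIDATE_SECRET) {
    return res.status(401).json({ message: "Invalid token" });
  }

  try {
    // Regenerate just this page; other cached pages are untouched
    await res.revalidate("/blog/my-latest-post");
    return res.json({ revalidated: true });
  } catch (err) {
    return res.status(500).send("Error revalidating");
  }
}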

What You Get When You Get It Right

Once you start optimizing for the edge:

  • Your React apps feel instant globally
  • Your server load drops dramatically
  • Your bandwidth and compute bills shrink, because cached responses never touch your origin
  • Your deploys get safer, faster, cleaner

It’s not magic. It’s just engineering the web the way it’s meant to scale.
