Roman

Mastering Custom Cache Strategy in Next.js

Everybody knows that Next.js has powerful built-in caching capabilities — it usually just works. When deploying your app to Vercel or Netlify, you don’t need to worry about what’s happening under the hood: how caching is scaled, how it’s shared, or even where it lives (CDN or application cache).

But there are still many cases where you need to run your Next.js application on a cloud provider (a.k.a. self-hosting) like AWS, Azure, or GCP. This comes with its own trade-offs. Now you — or your DevOps — are responsible for managing infrastructure, instances, scaling, failover, networking… and yes, caching.

This article isn’t about deploying Next.js to the cloud — it’s about how you can build a custom cache strategy that improves reliability, performance, and opens the door for advanced features like A/B testing.

Default Caching Behavior in Next.js

Next.js uses the file system to cache rendered output (HTML, JSON, RSC), and supplements this with an in-memory cache for frequently accessed data.

Sounds solid, but here are some real-world limitations:

  1. Auto-scaling: Cache is tied to the local instance. When you scale up, each new instance has a different, empty cache.
  2. Persistence: If an instance is replaced (scale-in, crash, etc.), its cache is lost — recreating cache can be costly.
  3. Fragmentation: Businesses often want A/B testing. That means rendering and caching multiple versions of the same page for different user segments.

So, to build a robust solution, you need to:

  • ✅ Share cache across instances
  • ✅ Persist cache between restarts
  • ✅ Fragment cache for feature buckets or A/B testing

All of that becomes possible with a custom cache strategy.

Creating a Custom Cache Handler

Next.js allows you to provide a custom cacheHandler. You can configure this in your next.config.js:

// next.config.js
module.exports = {
  cacheHandler: require.resolve('./cache-handler.js'),
  // disable default in-memory caching
  cacheMaxMemorySize: 0,
}

Your cache-handler.js file should export a class. One important detail: Next.js creates a new instance of this class for every request to your server, not once per Node.js process.

Let’s create a simple in-memory cache handler to see how it works:

// cache-handler.js

// The store lives outside the class, so it is created once per instance
// and shared between requests.
const store = new Map();

class CacheHandler {
  constructor(options) {
    this.options = options;
  }

  // Retrieves the cached entry for a page.
  // Returning null sends the request to the actual render function;
  // returning cached data skips rendering entirely.
  async get(key) {
    const record = store.get(key);
    return record ? record.value : null;
  }

  // Called each time Next.js writes cache for a page.
  // Note: with the pages router, Next.js only caches special page types (ISR);
  // with the app router, it caches whenever you provide cache / revalidate options.
  // Fully dynamic / SSR pages never reach the cache handler.
  async set(key, data, ctx) {
    store.set(key, {
      value: data,
      lastModified: Date.now(),
      tags: ctx.tags || [],
    });
  }

  async revalidateTag(tags) {
    tags = Array.isArray(tags) ? tags : [tags];
    for (const [key, record] of store) {
      if (record.tags.some((tag) => tags.includes(tag))) {
        store.delete(key);
      }
    }
  }

  // Optional. Resets the temporary in-memory cache for a single request
  // before the next request.
  resetRequestCache() {}
}

module.exports = CacheHandler;

Even this simple example shows how much control we gain. With it, we can now tackle all three problems from the list above.
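To make the flow concrete, here's a standalone sketch that exercises the same handler outside of Next.js. In a real app Next.js calls these methods for you; the keys, tags, and payloads here are made up for illustration:

```javascript
// Standalone version of the in-memory handler above, driven by hand.
const store = new Map();

class CacheHandler {
  async get(key) {
    const record = store.get(key);
    return record ? record.value : null;
  }

  async set(key, data, ctx) {
    store.set(key, { value: data, lastModified: Date.now(), tags: ctx.tags || [] });
  }

  async revalidateTag(tags) {
    tags = Array.isArray(tags) ? tags : [tags];
    for (const [key, record] of store) {
      if (record.tags.some((tag) => tags.includes(tag))) store.delete(key);
    }
  }
}

(async () => {
  const handler = new CacheHandler();
  await handler.set('/blog', '<html>…</html>', { tags: ['posts'] });
  console.log(await handler.get('/blog'));  // '<html>…</html>' — served from cache
  await handler.revalidateTag('posts');     // what revalidateTag('posts') triggers
  console.log(await handler.get('/blog'));  // null — cache was invalidated
})();
```

`get` returning null is the signal for Next.js to re-render the page and call `set` again.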

Scaling & Persisting Cache

I decided to combine the first two issues — cache sharing and persistence — because they’re pretty closely connected. In most real-world setups, the most effective solution is to move your cache into some form of shared storage. That could be S3 (or any blob storage), a shared file system, or even Redis if you prefer in-memory caching. The idea is simple: make your cache available outside the individual instance.

This way, all of your instances — old or new — can access the same centralized cache. So even when auto-scaling kicks in and new instances spin up, they’re not starting from scratch. They just pick up where the others left off. Problem solved.

Here’s a minimal example of a custom cache handler that reads and writes directly to S3:

// cache-handler.js
const { S3Client, PutObjectCommand, GetObjectCommand } = require('@aws-sdk/client-s3');

const client = new S3Client({});

class CacheHandler {
  constructor(options) {
    this.options = options;
  }

  // Read cache directly from S3. If there is no data,
  // Next.js renders the page and calls `set` to write the cache.
  async get(key) {
    try {
      const response = await client.send(
        new GetObjectCommand({
          Bucket: process.env.NEXT_BUCKET_KEY,
          Key: key,
        })
      );

      if (!response.Body) return null;

      return JSON.parse(await response.Body.transformToString('utf-8'));
    } catch {
      // A missing object (NoSuchKey) is just a cache miss.
      return null;
    }
  }

  async set(key, data, ctx) {
    await client.send(
      new PutObjectCommand({
        Bucket: process.env.NEXT_BUCKET_KEY,
        Key: key,
        // Optional, but nice to have if you want to serve the response
        // directly from S3, so a CDN knows how to cache it.
        ...(data.revalidate ? { CacheControl: `max-age=${data.revalidate}` } : {}),
        Body: JSON.stringify(data),
      })
    );
  }
}

module.exports = CacheHandler;

That’s it! Super straightforward, and yet we’ve already solved a lot. And what’s even better — we now have full control. You can extend this however you like and build a caching solution that fits your app’s architecture and scaling setup perfectly.

Now for my favorite part: let’s make it A/B testing–friendly.

Fragmenting Cache for A/B Testing

A/B testing is something most product teams want, and it’s actually pretty easy to support when you control your own cache handler.

Here’s a quick way to fragment your cache per user bucket — so users in different experiments don’t share cache entries:

// imports and setup stay the same ^^^

class CacheHandler {
    constructor(options) {
        this.options = options
    }

    // Identify the user's A/B test bucket from the request.
    // this.options carries context about the request and route,
    // so we can derive the bucket from it (here: a simple cookie check).
    getRequestBucket() {
        const cookie = this.options.req?.headers?.cookie ?? '';
        return cookie.includes('bucket_a') ? 'a' : 'b'
    }

    async get(key) {
        const currentBucket = this.getRequestBucket()
        // That's it: prefix the key with the bucket
        // to read from that particular cache segment.
        const cacheKey = `${currentBucket}:${key}`
        //...
    }

    async set(key, data, ctx) {
        const currentBucket = this.getRequestBucket()
        // Same here: write under a bucket-specific key
        // so we can read it back later.
        const cacheKey = `${currentBucket}:${key}`
        // ...
    }
}

Easy! Now we’ve got separate cache storage per experiment group. No more cache collisions between multiple experiments!
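One practical note: the cookie check above assumes something upstream already assigned the bucket. A common approach is to derive the bucket deterministically from a stable user id, so the same visitor always lands in the same group without any server-side state. A sketch — the hash function, bucket names, and even split are my assumptions:

```javascript
// Deterministic A/B bucketing: hash a stable id (cookie value, user id)
// into one of the given buckets. Same input always yields the same bucket.
function getBucket(userId, buckets = ['a', 'b']) {
  let hash = 0;
  for (const ch of String(userId)) {
    hash = (hash * 31 + ch.charCodeAt(0)) >>> 0; // simple unsigned 32-bit rolling hash
  }
  return buckets[hash % buckets.length];
}
```

You could call a helper like this inside `getRequestBucket`, then echo the bucket back as a cookie so the assignment stays sticky and a CDN can vary on it.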

Final Thoughts

So now we’ve solved all the problems we started with. Cache is shared across instances, persistent across restarts, and even segmented per user group for A/B testing. And the best part — it’s all under your control.

Of course, real-world implementations can be more complex, especially when you throw in edge caching, CDN rules, or enterprise-scale observability. But the core idea remains simple: own your caching strategy, and you’ll unlock a ton of flexibility and performance.

And remember, there are only two hard things in programming:

  1. Naming things
  2. Managing cache 😅

Every strategy comes with trade-offs, but with this approach, you’ve got the tools to make the best choice for your setup. Happy caching!
