DEV Community

S M Tahosin


Next.js 16: Revalidating Per-User Dynamic Fetches on Demand (3 Patterns That Actually Work)

If you've ever tried to revalidate a user-scoped fetch in Next.js App Router and watched revalidateTag('...') silently do nothing, you've run into one of the subtler gotchas of the 16.x data cache. The short version:

Once a fetch reads from cookies() or headers(), Next marks it as Dynamic and bypasses the data cache entirely — so next: { tags: [...] } is silently ignored, and your tag-based revalidation has nothing to invalidate.

This bites hardest on auth-gated dashboards: every fetch forwards the session cookie to your backend, so every fetch is Dynamic, so none of them are cached, so revalidateTag is a no-op. You end up writing action handlers that "revalidate everything" with an empty tag key — and that actually does work, but it's a sledgehammer that obliterates cross-user cache isolation you didn't know you wanted.

I ran into this last week while helping someone in vercel/next.js#92829, and realised I've been using three distinct patterns depending on the data shape. Writing them up here because the docs don't connect the dots between the Dynamic IO model and the "per-user revalidation" use case.

All examples target Next.js 16.1+. I'll note where 16.0 and earlier diverge.

The pattern you probably tried first (and why it fails)

// app/lib/api.ts
import { cookies } from 'next/headers';

export const fetchAPI = async () => {
  const cookieStore = await cookies();
  return fetch('https://api.example.com/dashboard', {
    method: 'POST',
    headers: { Cookie: cookieStore.toString() },
    next: { tags: ['dashboard-data'] },   // ← this is silently ignored
  });
};

Then in a server action:

'use server';
import { revalidateTag } from 'next/cache';

export async function refreshDashboard() {
  revalidateTag('dashboard-data');   // ← nothing to invalidate; cache was never populated
}

The fetch is considered Dynamic because it depends on cookies(), a request-scoped API, at the point where it runs. Dynamic fetches skip the data cache entirely — they're not cached per-user, they're not cached at all. next.tags is only consulted when an entry is actually written to the cache, so the tag never gets associated with any cache entry.
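To see why the revalidation is a no-op, here's a toy model of a tagged cache — an illustration of the mechanism, not Next's actual implementation. Tags are recorded only when an entry is written, so a store that's never written to has nothing to invalidate:

```typescript
// Minimal tagged cache: writes record tags; revalidateTag evicts by tag.
class TaggedCache {
  private entries = new Map<string, { value: unknown; tags: string[] }>();

  set(key: string, value: unknown, tags: string[] = []): void {
    this.entries.set(key, { value, tags });
  }

  // Evict every entry carrying the tag; return how many were removed.
  revalidateTag(tag: string): number {
    let evicted = 0;
    for (const [key, entry] of this.entries) {
      if (entry.tags.includes(tag)) {
        this.entries.delete(key);
        evicted += 1;
      }
    }
    return evicted;
  }
}

// A Dynamic fetch bypasses the cache, so set() never runs:
const dynamic = new TaggedCache();
console.log(dynamic.revalidateTag('dashboard-data')); // → 0: silent no-op

// A cached fetch registers its tags on write, so invalidation has a target:
const cached = new TaggedCache();
cached.set('POST https://api.example.com/dashboard', { widgets: 3 }, ['dashboard-data']);
console.log(cached.revalidateTag('dashboard-data')); // → 1
```

Every pattern below is a way to get that set() call to happen with the right tags attached.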

Your three options are:

  1. Opt back in to caching with an explicit key via unstable_cache (Pattern 1)
  2. The same opt-in with the newer 'use cache' directive (Pattern 2)
  3. Accept it's dynamic, use React.cache for same-request dedupe, and revalidatePath for rerenders (Pattern 3)

Let's walk through each.

Pattern 1: unstable_cache with the cookie as a key part

unstable_cache builds its cache key from the explicit key parts you pass it (plus the wrapped function's arguments), not from anything request-scoped it reads at runtime. So you read the cookie outside the cached function and pass it in:

// app/lib/api.ts
import { unstable_cache } from 'next/cache';
import { cookies } from 'next/headers';
import { createHash } from 'node:crypto';

const sessionHash = (cookie: string) =>
  createHash('sha256').update(cookie).digest('hex').slice(0, 16);

const fetchAPIForUser = (sessionCookie: string) =>
  unstable_cache(
    async () => {
      const res = await fetch('https://api.example.com/dashboard', {
        method: 'POST',
        headers: { Cookie: sessionCookie },
      });
      if (!res.ok) throw new Error(`API ${res.status}`);
      return res.json();
    },
    // Cache key parts — different sessions get different cache entries
    ['fetchAPI', sessionCookie],
    {
      tags: [
        'dashboard-data',
        `dashboard-data:${sessionHash(sessionCookie)}`,
      ],
      revalidate: 60,
    },
  )();

export async function fetchAPI() {
  const sessionCookie = (await cookies()).toString();
  return fetchAPIForUser(sessionCookie);
}

Two things are doing work here:

  • The cookie is a key part, so every user ends up with their own cache entry. User A's revalidateTag doesn't nuke User B's data.
  • The tags list has both a global dashboard-data and a per-user dashboard-data:<hash>. This gives you granular control: revalidate one user's data after they mutate something, or nuke everyone's when a global config changes.
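The two-tier tag scheme is worth factoring into a small pure helper so the tag strings can't drift between the cached fetch and the action that invalidates it. A sketch — the helper names here are mine, not part of Next's API:

```typescript
import { createHash } from 'node:crypto';

// Stable, non-reversible suffix derived from the raw session cookie.
// 16 hex chars (~64 bits) is plenty to keep sessions from colliding.
const sessionHash = (cookie: string): string =>
  createHash('sha256').update(cookie).digest('hex').slice(0, 16);

// Global tag + per-user tag for a given session.
export const dashboardTags = (cookie: string): [global: string, perUser: string] => [
  'dashboard-data',
  `dashboard-data:${sessionHash(cookie)}`,
];
```

Both the unstable_cache options and the server actions can then import dashboardTags instead of rebuilding the string by hand in two places.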

Then your server action becomes:

'use server';
import { revalidateTag } from 'next/cache';
import { cookies } from 'next/headers';
import { createHash } from 'node:crypto';

const sessionHash = (c: string) =>
  createHash('sha256').update(c).digest('hex').slice(0, 16);

export async function refreshMyDashboard() {
  const cookie = (await cookies()).toString();
  revalidateTag(`dashboard-data:${sessionHash(cookie)}`);   // just me
}

export async function refreshEveryonesDashboard() {
  revalidateTag('dashboard-data');   // global flush
}

When to use this: user-scoped data that's expensive to fetch and read more than once per session — dashboards, settings pages, user-specific feeds. You get the latency win of caching and tag-based revalidation.

Gotcha: don't accidentally cache PII in a way that survives the user's logout. The per-user tag + a reasonable revalidate ceiling (60s–5min) keeps the blast radius sane.

Pattern 2: 'use cache' directive (the modern shape)

If you're on 16.1+ with experimental.dynamicIO enabled, 'use cache' is the newer, less verbose form — same idea, less ceremony:

// app/lib/api.ts
import { cookies } from 'next/headers';
import { cacheTag, cacheLife } from 'next/cache';
import { createHash } from 'node:crypto';

const sessionHash = (cookie: string) =>
  createHash('sha256').update(cookie).digest('hex').slice(0, 16);

async function fetchAPIForUser(sessionCookie: string) {
  'use cache';
  cacheLife('minutes');
  cacheTag('dashboard-data', `dashboard-data:${sessionHash(sessionCookie)}`);

  const res = await fetch('https://api.example.com/dashboard', {
    method: 'POST',
    headers: { Cookie: sessionCookie },
  });
  if (!res.ok) throw new Error(`API ${res.status}`);
  return res.json();
}

export async function fetchAPI() {
  const sessionCookie = (await cookies()).toString();
  return fetchAPIForUser(sessionCookie);   // same pattern — read cookie outside
}

cacheTag / cacheLife from next/cache are the equivalents of the unstable_cache options, and the function's arguments become the cache key automatically.

The key discipline — read cookies() outside the cached function and pass it as an argument — is identical to Pattern 1. The framework still can't introspect into cookies() from inside a cached region; it just sees a function that takes a string and caches by string.

Enable it in next.config.ts:

import type { NextConfig } from 'next';

const config: NextConfig = {
  experimental: {
    dynamicIO: true,
    useCache: true,
  },
};

export default config;

Check your 16.x changelog for exact flag names — they shifted between 16.0 and 16.1.

Pattern 3: Accept the dynamic, dedupe with React.cache, refresh with revalidatePath

Sometimes the data just isn't cacheable — it changes every request, or it's cheap enough that caching adds latency instead of removing it. In that case, don't fight the framework; work with it.

// app/lib/api.ts
import { cache } from 'react';
import { cookies } from 'next/headers';

export const fetchAPI = cache(async () => {
  const sessionCookie = (await cookies()).toString();
  const res = await fetch('https://api.example.com/dashboard', {
    method: 'POST',
    headers: { Cookie: sessionCookie },
  });
  return res.json();
});

React.cache dedupes the fetch across components within the same request, so if five Server Components call fetchAPI() during one render, you still only hit the backend once. Different requests get fresh data — exactly what you want for per-user live data.
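To make the dedupe behaviour concrete, here's a minimal stand-in for what React.cache does within one request — a toy memoizer, not React's actual implementation:

```typescript
// Toy per-request dedupe: the first call runs the loader and stores the
// promise; every later call in the same "request" reuses that promise.
function dedupe<T>(loader: () => Promise<T>) {
  let inflight: Promise<T> | undefined;
  let executions = 0;
  const call = (): Promise<T> => {
    if (!inflight) {
      executions += 1; // the loader actually ran
      inflight = loader();
    }
    return inflight;
  };
  return { call, executions: () => executions };
}

// Five "components" call the fetcher during one render…
const api = dedupe(async () => ({ user: 'a', widgets: 3 }));
void Promise.all([api.call(), api.call(), api.call(), api.call(), api.call()])
  .then(() => console.log(api.executions())); // → 1: the backend was hit once
```

The real React.cache additionally scopes the memo to the current server request, so the cached promise never leaks across users — that's what makes it safe for per-user data.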

Then your server action rerenders the page instead of revalidating a cache entry:

'use server';
import { revalidatePath } from 'next/cache';

export async function refreshDashboard() {
  revalidatePath('/dashboard');   // forces re-render, which re-runs fetchAPI
}

When to use this: user-scoped data that's small, cheap, or genuinely fresh-per-request. Most dashboards I've built fall here — the latency of a direct backend call is dominated by network anyway, and skipping the cache layer saves you from a whole class of staleness bugs.

Decision rule I actually use

After writing a few of these, this is the rule I apply:

  1. "The data is user-scoped, expensive, and reads dominate writes" → Pattern 1 or 2 with per-user tags. The 5× latency win on cache hits usually justifies the complexity.

  2. "The data is user-scoped, cheap, and reads roughly equal writes" → Pattern 3. Don't cache; dedupe per-request, rerender on mutation.

  3. "The data is global but personalised at the margin (e.g. reading a session cookie only for feature flags)" → Pattern 1 with a single tag, no per-user keying. Feature flag data is worth caching even though it reads a cookie.

  4. "I need real-time-ish data (< 30s)" → Pattern 3 + poll-on-client with React Query / SWR. Caching on the server layer just pushes the staleness problem around.
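Purely as illustration, the four rules collapse into a toy decision function (the shape fields and return labels are mine):

```typescript
type DataShape = {
  expensive: boolean;           // backend call worth avoiding on repeat reads
  readsDominateWrites: boolean; // mostly read, rarely mutated
  realtime: boolean;            // freshness needed under ~30s
};

type Choice =
  | 'pattern-1-or-2-tags'       // cache with tags (per-user or single)
  | 'pattern-3-react-cache'     // per-request dedupe + revalidatePath
  | 'pattern-3-client-polling'; // dedupe on server, poll on client

function choosePattern(d: DataShape): Choice {
  if (d.realtime) return 'pattern-3-client-polling';            // rule 4
  if (d.expensive && d.readsDominateWrites)
    return 'pattern-1-or-2-tags';                               // rules 1 & 3
  return 'pattern-3-react-cache';                               // rule 2
}
```

Whether the tag set is per-user (rule 1) or a single global tag (rule 3) then depends only on whether the response body itself is user-scoped.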

The sledgehammer (and why to avoid it)

You can make the original code "work" by calling revalidateTag('') on every mutation — it nukes every tagged entry in the cache, and your Dynamic fetch also re-runs because the page gets marked for revalidation. I've seen this in production a few times and every time it caused an incident later:

  • One user's mutation invalidates every other user's cache → thundering herd on the backend
  • Global feature flags that were cacheable get flushed on every user action → effective cache hit rate drops to ~0%
  • Debugging becomes impossible because "why did User A see stale data?" has no local explanation

Per-user tags (Pattern 1 / 2) or per-request React.cache (Pattern 3) are both strictly better. Pick one, be consistent within a feature area, and document which pattern a given fetch is using.

A word on the mental model

The thing that clicked for me about the 16.x Dynamic IO model: the data cache is fundamentally a global key-value store keyed by URL + options hash. When your fetch reads something request-scoped (cookies, headers, searchParams), the cache layer has no good default for "who does this entry belong to?" — so it bails out entirely rather than silently cache PII across users.

You opt back in by making the user-scoping explicit (passing the cookie as a key part), which moves the security decision into your code where you can reason about it. That's the same tradeoff React Server Components made around 'use server' — the framework refuses to guess, and gives you a small API to tell it exactly what you mean.

Once I started thinking of unstable_cache / 'use cache' as "declare your cache key explicitly, include whatever request-scoped stuff you want to partition on", the rest of the API fell into place.


If you're hitting a variation of this problem — say, SSE streams that need to drop their connection on revalidation, or RSC payloads that race with client-side tag invalidations — drop a comment, I've probably tripped on it too.
