Tanzim Hossain
The Precompute Pattern: How to Stop One Cookie from Wrecking Your Entire Next.js App

A deep dive into the pattern that keeps pages static when your app needs to know who's logged in


You build a product page in Next.js. Statically generated. Served from the CDN. Fast.

Then someone says: "The header should show the user's name when they're logged in."

You add a cookies() call to the root layout. You push to production. And suddenly — every single page in your app is dynamically rendered. Your CDN cache is useless. Your TTFB tanks. And you're staring at the terminal wondering what just happened.

This is one of the most common performance traps in Next.js. And the Precompute Pattern is one of the cleverest ways people have worked around it.


The problem: one dynamic call poisons the whole tree

Here's the thing about Next.js rendering. When a component calls a dynamic API — cookies(), headers(), searchParams — it opts that component into dynamic rendering. That's expected.

What catches people off guard is the cascade. When the call happens in the root layout, every page nested under it becomes dynamic too. The whole route tree falls over.

A typical e-commerce root layout looks like this:

// app/layout.tsx
export default async function RootLayout({ children }: { children: React.ReactNode }) {
  const isLoggedIn = await isAuthenticated(); // reads cookies()

  return (
    <html>
      <body>
        <Header isLoggedIn={isLoggedIn} />
        <main>{children}</main>
      </body>
    </html>
  );
}

That one isAuthenticated() call — which reads cookies internally — drags the product listing, the category pages, the marketing homepage, everything into dynamic rendering. The only user-specific part might be a login button in the header. But the entire site pays the price.


The Precompute Pattern: encode it, don't read it

The idea is simple: instead of reading dynamic data inside your components, resolve it once in middleware (renamed to proxy in Next.js 16), encode it into the URL, and let the page read it from params.

Here's the flow:

  1. A request hits the proxy
  2. The proxy reads cookies() and figures out the user's state (logged in or not)
  3. It encodes that state as a base64url string
  4. It prepends that string to the URL as a hidden path segment
  5. Next.js routes to something like /eyJsb2dnZWRJbiI6dHJ1ZX0/products
  6. The page reads the encoded data from params — no cookies() call needed

The browser still sees /products. The encoded segment is invisible to the user. But the server treats each combination as a distinct, cacheable URL.
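Steps 3 through 5 are a pure string transformation. Here's a minimal standalone sketch of that mapping (the helper names are mine, not from any Next.js API):

```typescript
// Sketch: the path mapping the proxy performs between the browser-facing
// URL and the server-facing URL. Helper names are illustrative.

function toServerPath(encodedContext: string, browserPath: string): string {
  return `/${encodedContext}${browserPath}`;
}

function toBrowserPath(serverPath: string): string {
  // Drop the first segment (the encoded context) to recover the public path.
  const [, , ...rest] = serverPath.split('/');
  return '/' + rest.join('/');
}

const encoded = 'eyJsb2dnZWRJbiI6dHJ1ZX0';
console.log(toServerPath(encoded, '/products'));
// -> /eyJsb2dnZWRJbiI6dHJ1ZX0/products
```

The proxy only ever applies the first mapping; the second exists implicitly because the rewrite is internal and the browser URL never changes.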

Because the page never calls cookies() or headers() directly, Next.js can statically generate it. And because the URL is unique per user state, you can pre-build every known variant at deploy time.


How to build it

Let's walk through the actual code. This is based on a real commerce demo, simplified for clarity.

Step 1: Define your context shape

First, decide what data you want to encode. Start small — auth state is the classic example. You can always add more later (locale, user type, feature flags).

// utils/request-context.ts
export interface RequestContextData {
  loggedIn: boolean;
  // Other things you might add later:
  // locale?: string;       // 'en', 'no', 'sv'
  // currency?: string;     // 'USD', 'EUR', 'NOK'
  // userType?: 'b2c' | 'b2b';
}

// Turn context into a short URL-safe string
export function encodeRequestContext(data: RequestContextData): string {
  const json = JSON.stringify(data);
  return Buffer.from(json).toString('base64url');
}

// Decode it back, with a safe fallback if something goes wrong
export function decodeRequestContext(encoded: string): RequestContextData {
  try {
    const json = Buffer.from(encoded, 'base64url').toString();
    const data = JSON.parse(json);
    return {
      loggedIn: typeof data.loggedIn === 'boolean' ? data.loggedIn : false,
    };
  } catch {
    return { loggedIn: false };
  }
}

// Convenient wrapper for use inside components
export function getRequestContext(params: { requestContext: string }): RequestContextData {
  return decodeRequestContext(params.requestContext);
}

The encoding produces short strings like eyJsb2dnZWRJbiI6dHJ1ZX0. Not pretty, but URL-safe and fast to decode.
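You can verify the round trip in isolation. This sketch repeats the Step 1 helpers so it runs standalone in Node:

```typescript
// Standalone check of the encode/decode round trip from Step 1.

interface RequestContextData {
  loggedIn: boolean;
}

function encodeRequestContext(data: RequestContextData): string {
  return Buffer.from(JSON.stringify(data)).toString('base64url');
}

function decodeRequestContext(encoded: string): RequestContextData {
  try {
    const data = JSON.parse(Buffer.from(encoded, 'base64url').toString());
    return { loggedIn: typeof data.loggedIn === 'boolean' ? data.loggedIn : false };
  } catch {
    // Malformed or tampered segments fall back to the logged-out state.
    return { loggedIn: false };
  }
}

console.log(encodeRequestContext({ loggedIn: true }));
// -> eyJsb2dnZWRJbiI6dHJ1ZX0

console.log(decodeRequestContext('not-real-json'));
// -> { loggedIn: false }
```

Note that the fallback matters: anyone can type an arbitrary segment into the URL bar, so the decoder must never trust its input.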


Step 2: Encode in the proxy

The proxy runs on every request. It reads the cookie, encodes the context, and rewrites the URL.

// proxy.ts
import { NextResponse } from 'next/server';
import { encodeRequestContext } from '@/utils/request-context';
import type { NextRequest } from 'next/server';

function isUserAuthenticated(request: NextRequest): boolean {
  return !!request.cookies.get('selectedAccountId')?.value;
}

export function proxy(request: NextRequest) {
  const encodedContext = encodeRequestContext({
    loggedIn: isUserAuthenticated(request),
  });

  // Prepend the encoded context as the first URL segment
  const nextUrl = new URL(
    `/${encodedContext}${request.nextUrl.pathname}${request.nextUrl.search}`,
    request.url
  );

  // Internal rewrite — the browser URL stays unchanged
  return NextResponse.rewrite(nextUrl, { request });
}

export const config = {
  matcher: ['/((?!api|_next/static|_next/image|favicon.ico|.*\\..*).*)'],
};

The key here is NextResponse.rewrite. It changes what the server serves, but the browser URL stays as /products. The user sees nothing different.


Step 3: Read context in components

Components that previously called cookies() now read from the decoded params instead.

import { getRequestContext } from '@/utils/request-context';

export default async function UserProfile({
  params,
}: {
  params: Promise<{ requestContext: string }>;
}) {
  const { requestContext } = await params;
  const { loggedIn } = getRequestContext({ requestContext });

  if (!loggedIn) {
    return <LoginButton />;
  }

  return <ProfileMenu />;
}

No cookies() call. No dynamic rendering. The component just reads from params and renders accordingly.

The layout itself becomes simple — it just passes children through without needing to resolve any auth state:

// app/[requestContext]/layout.tsx
export default function RequestContextLayout({
  children,
  params,
}: {
  children: React.ReactNode;
  params: Promise<{ requestContext: string }>;
}) {
  return (
    <>
      <Header rightContent={<UserProfile params={params} />} />
      <main>{children}</main>
    </>
  );
}

Step 4: Pre-generate variants with generateStaticParams

This is where the pattern pays off. With just auth state, you have two variants: logged in and logged out. You can pre-generate both at build time.

// app/[requestContext]/products/page.tsx
import { encodeRequestContext } from '@/utils/request-context';

export async function generateStaticParams() {
  return [
    { requestContext: encodeRequestContext({ loggedIn: false }) },
    { requestContext: encodeRequestContext({ loggedIn: true }) },
  ];
}

Any combination not pre-generated falls back to ISR — more on why that matters in a moment.
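As you add dimensions, writing those entries by hand stops scaling. A small cartesian-product helper can enumerate every combination for generateStaticParams; this is my own sketch, and the dimension values are illustrative, not from the demo:

```typescript
// Sketch: enumerate every context variant so generateStaticParams
// can pre-build them all. Dimension values below are illustrative.

type Dimensions = Record<string, readonly unknown[]>;

function allVariants(dims: Dimensions): Record<string, unknown>[] {
  return Object.entries(dims).reduce<Record<string, unknown>[]>(
    (acc, [key, values]) =>
      acc.flatMap(partial => values.map(v => ({ ...partial, [key]: v }))),
    [{}]
  );
}

const variants = allVariants({
  loggedIn: [false, true],
  locale: ['en', 'no', 'sv'],
});

console.log(variants.length); // 6

// In generateStaticParams, each variant would then be passed through
// encodeRequestContext to produce a { requestContext: ... } entry.
```

Notice how the count is the product of the dimension sizes. That product is exactly the variant explosion discussed below, so a helper like this is also a cheap way to audit how many pages you are asking the build to produce.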


Vercel's Flags SDK formalizes this exact pattern

This isn't an ad-hoc trick invented for this article. Vercel's Flags SDK formalizes it under the name "precompute": the same concept, with tooling on top.

The SDK encrypts flag values into a URL segment. The proxy reads flags from a feature flag provider, encodes the result, and rewrites the request. Pages read the encoded hash from params and use it to look up flag values.

// proxy.ts (Flags SDK version)
import { NextResponse } from 'next/server';
import { precompute } from 'flags/next';
import type { NextRequest } from 'next/server';

import { marketingFlags } from './flags';

export async function proxy(request: NextRequest) {
  const code = await precompute(marketingFlags);

  const nextUrl = new URL(
    `/${code}${request.nextUrl.pathname}${request.nextUrl.search}`,
    request.url
  );

  return NextResponse.rewrite(nextUrl, { request });
}

The pattern is identical to what we built manually. The SDK adds: encryption of flag values, integrations with providers like LaunchDarkly and Statsig, and a helper for generating all flag permutations.

The Flags SDK documentation also recommends a discipline worth paying attention to: use multiple, scoped flag groups rather than one giant global group. This matters more than it seems, as the next section shows.
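The scoping advice is quantitative, not just organizational. Each boolean flag doubles the number of precomputed variants a page can take, so splitting one global group into per-page groups shrinks the exponent. The numbers here are illustrative:

```typescript
// Sketch: why scoped flag groups beat one global group.
// Each boolean flag doubles the number of precomputed variants.

const permutations = (booleanFlagCount: number): number => 2 ** booleanFlagCount;

// One global group of 6 boolean flags: every page covers all combinations.
console.log(permutations(6)); // 64

// Two scoped groups of 3 flags each: each page only covers its own group.
console.log(permutations(3)); // 8
```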


The trap: variant count explodes fast

Here's where teams get into trouble. Look at your file system with the Precompute Pattern:

app/
└── [requestContext]/
    ├── page.tsx              # home
    ├── all/page.tsx          # product listing
    ├── product/[id]/page.tsx # product detail (thousands of products)
    ├── cart/page.tsx
    └── about/page.tsx

With just auth state: 2 variants per page. Fine.

Add 3 locales: 6 variants. Still manageable.

Add 4 currencies: 24 variants. Getting bigger.

Add 2 A/B test flags: 96 variants per page.

Now multiply 96 by 10,000 product pages: 960,000 page variants to build or cache.

ISR was designed for regeneration of existing pages, not progressive generation of new ones. When a request hits a variant that hasn't been built yet, the user waits for a full synchronous render before seeing anything. No fallback shell. No streaming. Just a cold start.

And every deploy blows out the ISR cache — a CSS change can affect every page, leading to a flood of writes relative to reads when the cache warms back up.

The teams that handle this well are selective. Auth state and locale go in because they have low cardinality and affect large chunks of the page. Feature flags with many possible values stay out, or get scoped to specific pages. The Flags SDK recommendation to use multiple scoped flag groups isn't just ergonomics — it's how you avoid making this problem worse.


Does 'use cache' in Next.js 16 make this unnecessary?

Short answer: for the auth-in-layout problem specifically, yes. For the broader use case, not entirely.

Next.js 16 introduced cacheComponents — a way to annotate individual components with 'use cache' so they cache independently from the rest of the page. Combined with Partial Prerendering, this changes the trade-off completely.

Instead of encoding auth state into the URL, you pass the auth check as a promise through a provider, without awaiting it in the layout:

// app/layout.tsx
export default async function RootLayout({ children }: LayoutProps) {
  const loggedIn = getIsAuthenticated(); // no await — returns a promise

  return (
    <html lang="en">
      <body>
        <AuthProvider loggedIn={loggedIn}>
          <Header />
          <main>{children}</main>
        </AuthProvider>
      </body>
    </html>
  );
}

The layout doesn't block. The promise flows into components that need it. Server components resolve it directly. Client components unwrap it with React's use():

import { use } from 'react';

export const useLoggedIn = () => {
  const { loggedIn } = useAuth(); // context value set by AuthProvider
  return use(loggedIn);           // suspends until the promise resolves
};

Static components cache themselves independently:

// features/product/components/FeaturedProducts.tsx
import { cacheTag } from 'next/cache';

export default async function FeaturedProducts() {
  'use cache';
  cacheTag('featured-product');

  const products = await getFeaturedProducts(4);
  return (
    <div>
      {products.map(product => (
        <ProductCard key={product.id} {...product} />
      ))}
    </div>
  );
}

The cookies() call in UserProfile makes only that component dynamic. FeaturedProducts, Hero, FeaturedCategories — everything marked with 'use cache' — becomes part of the static shell that ships immediately. The dynamic user profile streams in behind it.

That said, 'use cache' solves the auth-in-layout case. It doesn't solve everything.

If your cached components need to vary by locale, currency, user type, or feature flags — if those values affect what the cached component itself renders — then you can't just suspend them as dynamic. For those cases, the Precompute Pattern (or the Flags SDK) remains the right tool, even alongside 'use cache'.


The missing piece: rootParams

One of the roughest edges of the Precompute Pattern is prop drilling. The requestContext param needs to flow from the root segment down to every component that reads it. In a deep component tree, that gets tedious fast.

An upcoming Next.js feature called rootParams addresses this directly. Instead of threading values through the tree, components can import root parameters directly:

import { locale } from 'next/root-params';

async function CachedComponent() {
  'use cache';
  const currentLocale = await locale();
  // ...
}

The value automatically becomes a cache key for 'use cache' — so cached components can vary by locale or any other root parameter without manual prop passing.

For the Precompute Pattern specifically, rootParams would mean the encoded hash can be accessed anywhere in the tree without drilling it through props. Teams using precomputed feature flags wouldn't need placeholder generateStaticParams on every page just to satisfy the build.

It's not shipped yet, but it's the piece that makes the ergonomics of this pattern genuinely pleasant.


When to use this pattern

Use the Precompute Pattern when:

  • You have low-cardinality dynamic data that affects large portions of the page (auth state, locale, user type)
  • You want full static generation with CDN-level caching for pages that are otherwise identical across requests
  • You're working with feature flags on marketing pages where the variants can be pre-built at deploy time

Think twice before using it when:

  • Your context has many possible values — variant count multiplies fast
  • You're relying on ISR for on-demand generation — cold starts hurt on every new variant
  • You're on Next.js 16 with cacheComponents available — 'use cache' likely solves your specific problem more cleanly

The honest summary

The Precompute Pattern is a genuine piece of production engineering. It's not something you'll use on every project — but when you have a high-traffic site where CDN cache hit rate directly translates to cost and user experience, understanding this pattern matters.

The Vercel Flags SDK formalizes it. E-commerce teams at scale live by it. And even as 'use cache' in Next.js 16 makes it unnecessary for many common cases, the pattern remains the right answer for feature flags, A/B testing, and multi-dimensional personalization where content needs to be statically distinct per combination.

The cardinality problem is real, and it requires discipline. Start with the lowest-cardinality dimensions. Scope flags to the pages that need them. Pre-generate only the most common combinations. Let ISR handle the rest — but know that every cold start costs you.

And keep an eye on rootParams. When it ships, the last rough edge of this pattern disappears.
