DEV Community

BeanBean

Posted on • Originally published at nextfuture.io.vn

How to Cache Drizzle ORM Queries with Redis in Next.js 16 (2026)


The problem

Drizzle ORM gives you type-safe SQL but ships with no query-level cache — every db.select() call hits PostgreSQL, even for data that changes once an hour. On the low-RAM VPS most indie projects run on, every page load then competes for database connections. Wrapping queries with Redis eliminates round-trips on hot paths without touching your schema. The Drizzle relational query API makes the fetch side easy; the cache layer is the part you have to wire up yourself.

Prerequisites

  • Node.js 22+, TypeScript 5 strict

  • Next.js 16 project using App Router and Server Actions

  • Drizzle ORM with drizzle-orm/pg-core already configured

  • Redis 7 instance — local via Docker or managed

  • ioredis package: npm i ioredis

  • REDIS_URL env var (e.g. redis://localhost:6379)

Step 1: Create a Redis client singleton

Instantiate ioredis once per process so connections are reused across hot-reload cycles and concurrent requests.

// lib/redis.ts
import Redis from "ioredis";

let _redis: Redis | null = null;

export function getRedis(): Redis {
  if (!_redis) {
    _redis = new Redis(process.env.REDIS_URL!, {
      maxRetriesPerRequest: 3,
      lazyConnect: true,
    });
  }
  return _redis;
}

Step 2: Build a typed cache wrapper

One generic helper handles JSON serialization, TTL, and cache-miss fallback — each query call site stays a one-liner.

// lib/cache.ts
import { getRedis } from "./redis";

export async function cached<T>(
  key: string,
  ttlSeconds: number,
  fn: () => Promise<T>
): Promise<T> {
  const redis = getRedis();
  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit) as T;
  const value = await fn();
  await redis.setex(key, ttlSeconds, JSON.stringify(value));
  return value;
}

export async function invalidate(...keys: string[]): Promise<void> {
  if (keys.length) await getRedis().del(...keys);
}
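Before wiring Redis in, the cache-aside flow itself is easy to sanity-check in isolation. Here is a minimal in-memory sketch of the same read-through logic — the MemoryCache class and millisecond TTL handling are a hypothetical stand-in for illustration, not an ioredis API:

```typescript
// Illustrative stand-in: same cache-aside flow as cached(), backed by a Map.
type Entry = { value: string; expiresAt: number };

class MemoryCache {
  private store = new Map<string, Entry>();

  get(key: string): string | null {
    const entry = this.store.get(key);
    if (!entry || entry.expiresAt < Date.now()) return null;
    return entry.value;
  }

  setex(key: string, ttlSeconds: number, value: string): void {
    this.store.set(key, { value, expiresAt: Date.now() + ttlSeconds * 1000 });
  }
}

async function cachedInMemory<T>(
  cache: MemoryCache,
  key: string,
  ttlSeconds: number,
  fn: () => Promise<T>
): Promise<T> {
  const hit = cache.get(key);
  if (hit) return JSON.parse(hit) as T; // cache hit: skip fn entirely
  const value = await fn();             // cache miss: run the query
  cache.setex(key, ttlSeconds, JSON.stringify(value));
  return value;
}
```

The flow is identical to cached() above: check the store, fall through to the query function on a miss, write the result back with a TTL.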

Step 3: Wrap Drizzle queries in Server Actions

Replace bare db.query.* calls with cached(). Key the cache by query identity; set TTL to match how stale the data is allowed to be.

// actions/product.actions.ts
import { createDb } from "@repo/database/drizzle/client";
import { techProducts } from "@repo/database/drizzle";
import { cached } from "@/lib/cache";
import { eq } from "drizzle-orm";

const db = createDb();

export async function getProductBySlug(slug: string) {
  return cached(`product:slug:${slug}`, 3600, () =>
    db.query.techProducts.findFirst({
      where: eq(techProducts.slug, slug),
      with: { category: true, brand: true },
    })
  );
}
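One easy way to keep read and write paths from drifting apart is to centralize key construction. A shared key-builder module (the file name and helper names here are illustrative) might look like:

```typescript
// lib/cache-keys.ts — single source of truth for cache key formats.
// Both the cached() read path and invalidate() write path import from here.
export const productKey = (slug: string): string => `product:slug:${slug}`;

export const productListKey = (categorySlug: string, limit: number): string =>
  `products:cat:${categorySlug}:${limit}`;
```

Then getProductBySlug uses productKey(slug) and the admin mutation passes the same productKey(slug) to invalidate(), so a key-format change happens in exactly one place.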

Step 4: Invalidate on mutations

After any write, delete the matching key so the next read repopulates from the database rather than serving stale data.

// actions/admin/product-admin.actions.ts
import { invalidate } from "@/lib/cache";
import { createDb } from "@repo/database/drizzle/client";
import { techProducts } from "@repo/database/drizzle";
import { eq } from "drizzle-orm";
import { auth } from "@/lib/auth";

const db = createDb();

export async function updateProduct(
  id: string,
  slug: string,
  data: Partial<typeof techProducts.$inferInsert>
) {
  const session = await auth();
  if (session?.user?.role !== "ADMIN") throw new Error("Unauthorized");
  await db.update(techProducts).set(data).where(eq(techProducts.id, id));
  await invalidate(`product:slug:${slug}`);
}
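Deleting the detail key is simple, but list keys (such as the products:cat:* keys in the full example) are harder to target without scanning the keyspace. One common workaround is namespace versioning: embed a version number in list keys and bump it on writes, letting the old entries expire via TTL. A rough sketch, using an in-memory counter where a real setup would use Redis INCR — all names here are illustrative, not from the article's code:

```typescript
// Namespace versioning: instead of finding and deleting every list key after a
// write, bump a per-namespace version. New reads build new keys; stale keys
// simply stop being read and are evicted when their TTL lapses.
// In production the counter would live in Redis (e.g. INCR "version:products").
const versions = new Map<string, number>();

export function currentVersion(ns: string): number {
  return versions.get(ns) ?? 1;
}

export function bumpVersion(ns: string): number {
  const next = currentVersion(ns) + 1;
  versions.set(ns, next);
  return next;
}

export function versionedListKey(categorySlug: string, limit: number): string {
  return `v${currentVersion("products")}:products:cat:${categorySlug}:${limit}`;
}
```

The trade-off: invalidation becomes a single counter bump, at the cost of slightly longer keys and some dead entries living out their TTL in Redis memory.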

Full working example

// lib/redis.ts
import Redis from "ioredis";

let _redis: Redis | null = null;

export function getRedis(): Redis {
  if (!_redis) {
    _redis = new Redis(process.env.REDIS_URL!, {
      maxRetriesPerRequest: 3,
      lazyConnect: true,
    });
  }
  return _redis;
}

// lib/cache.ts
import { getRedis } from "./redis";

export async function cached<T>(
  key: string,
  ttlSeconds: number,
  fn: () => Promise<T>
): Promise<T> {
  const redis = getRedis();
  const hit = await redis.get(key);
  if (hit) return JSON.parse(hit) as T;
  const value = await fn();
  // Dates serialize to ISO strings; cast back with new Date(value.createdAt) where needed
  await redis.setex(key, ttlSeconds, JSON.stringify(value));
  return value;
}

export async function invalidate(...keys: string[]): Promise<void> {
  if (keys.length) await getRedis().del(...keys);
}

// actions/product.actions.ts
import { createDb } from "@repo/database/drizzle/client";
import { techProducts } from "@repo/database/drizzle";
import { cached } from "@/lib/cache";
import { eq } from "drizzle-orm";

const db = createDb();

export async function getProductBySlug(slug: string) {
  return cached(`product:slug:${slug}`, 3600, () =>
    db.query.techProducts.findFirst({
      where: eq(techProducts.slug, slug),
      with: { category: true, brand: true },
    })
  );
}

export async function listProducts(categorySlug: string, limit = 20) {
  return cached(`products:cat:${categorySlug}:${limit}`, 300, () =>
    db.query.techProducts.findMany({
      where: eq(techProducts.categorySlug, categorySlug),
      limit,
      orderBy: (t, { desc }) => [desc(t.createdAt)],
    })
  );
}

// actions/admin/product-admin.actions.ts
import { invalidate } from "@/lib/cache";
import { createDb } from "@repo/database/drizzle/client";
import { techProducts } from "@repo/database/drizzle";
import { eq } from "drizzle-orm";
import { auth } from "@/lib/auth";

const db = createDb();

export async function updateProduct(
  id: string,
  slug: string,
  data: Partial<typeof techProducts.$inferInsert>
) {
  const session = await auth();
  if (session?.user?.role !== "ADMIN") throw new Error("Unauthorized");
  await db.update(techProducts).set(data).where(eq(techProducts.id, id));
  await invalidate(`product:slug:${slug}`);
}

Testing it

Start Redis locally with docker run -d -p 6379:6379 redis:7-alpine, then call getProductBySlug("some-slug") twice in sequence. After the first call, run redis-cli keys "product:*" — the key should appear. The second call should return in under 1 ms versus the typical 5-15 ms PostgreSQL round-trip.

Troubleshooting

  • ECONNREFUSED on startup: Redis is not running or REDIS_URL points to the wrong host. In Docker Compose, use the service name (redis://redis:6379), not localhost.

  • Stale data persists after update: The invalidation key does not match your read key. Export key-builder functions (e.g. const productKey = (slug: string) => `product:slug:${slug}`) from a shared file so read and write paths cannot drift.

  • Date fields come back as strings: JSON.stringify coerces Date to ISO strings. Cast them back with new Date(product.createdAt), or exclude date fields from cached objects and re-attach after fetch.
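For the date issue specifically, a JSON.parse reviver can restore Date objects automatically on every cache read. A sketch — the regex and helper name are illustrative, not part of the article's code:

```typescript
// Revive ISO-8601 date strings back into Date objects during JSON.parse.
const ISO_DATE = /^\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}(\.\d+)?(Z|[+-]\d{2}:\d{2})$/;

function parseWithDates<T>(json: string): T {
  return JSON.parse(json, (_key, value) =>
    typeof value === "string" && ISO_DATE.test(value) ? new Date(value) : value
  ) as T;
}
```

Swapping the JSON.parse(hit) call inside cached() for parseWithDates(hit) applies this to every cached query, at the cost of a regex test per string field.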

Where to go next

This Redis client can power more than caching — see background job queues with BullMQ and Redis in Node.js to reuse the same connection for durable async work. If you need to expose these cached queries via a lightweight HTTP layer, deploying a Hono API with Postgres on Railway is a natural next step. For cache invalidation at the framework level, the Next.js revalidateTag docs cover ISR-level control that complements this Redis approach.



