DEV Community

Atlas Whoff


Redis Caching Patterns for Node.js: Cache-Aside, Write-Through, and TTL Strategies


The right caching pattern can reduce database load by 90% and cut API latency from 200ms to 5ms.
Here are the patterns that actually work in production.

Setup

npm install ioredis
// lib/redis.ts
import Redis from 'ioredis'

const redis = new Redis(process.env.REDIS_URL!, {
  maxRetriesPerRequest: 3,
  retryStrategy: (times) => Math.min(times * 50, 2000),
})

redis.on('error', (err) => console.error('Redis error:', err))

export default redis

Pattern 1: Cache-Aside (Lazy Loading)

Check cache first, fall back to DB, then populate cache:

async function getUser(userId: string): Promise<User> {
  const cacheKey = `user:${userId}`

  // Check cache
  const cached = await redis.get(cacheKey)
  if (cached) return JSON.parse(cached)

  // Cache miss — fetch from DB
  const user = await db.user.findUniqueOrThrow({ where: { id: userId } })

  // Populate cache with TTL
  await redis.setex(cacheKey, 3600, JSON.stringify(user))  // 1 hour

  return user
}

// Invalidate on update
async function updateUser(userId: string, data: Partial<User>): Promise<User> {
  const user = await db.user.update({ where: { id: userId }, data })
  await redis.del(`user:${userId}`)  // invalidate cache
  return user
}

Pattern 2: Write-Through

Write to the DB, then update the cache in the same code path, so reads never see stale data:

async function updateProduct(id: string, data: Partial<Product>): Promise<Product> {
  const product = await db.product.update({ where: { id }, data })

  // Update cache immediately (no stale reads)
  await redis.setex(`product:${id}`, 3600, JSON.stringify(product))

  return product
}

Best for read-heavy data where you can't tolerate stale reads. Every write pays the cost of an extra cache update, so it's a poor fit for data that's written far more often than it's read.
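One failure mode worth handling: if the DB write succeeds but the cache write fails, readers keep seeing the old cached value until the TTL expires. A minimal sketch of one way to handle it (the `writeThrough` helper and its callback shape are my own, not from ioredis or the code above), falling back to invalidation so readers repopulate lazily via cache-aside:

```typescript
async function writeThrough<T>(
  write: () => Promise<T>,
  cache: (value: T) => Promise<void>,
  invalidate: () => Promise<void>,
): Promise<T> {
  const value = await write()
  try {
    await cache(value)
  } catch {
    // Cache write failed: delete the key instead of leaving a stale
    // entry, so the next read falls through to the DB and repopulates.
    await invalidate().catch(() => {})
  }
  return value
}
```

With the redis and db objects from earlier, `updateProduct` would become `writeThrough(() => db.product.update({ where: { id }, data }), p => redis.setex(\`product:${id}\`, 3600, JSON.stringify(p)).then(() => {}), () => redis.del(\`product:${id}\`).then(() => {}))`.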

Generic Cache Wrapper

async function withCache<T>(
  key: string,
  ttl: number,
  fetchFn: () => Promise<T>
): Promise<T> {
  const cached = await redis.get(key)
  if (cached) return JSON.parse(cached) as T

  const data = await fetchFn()
  await redis.setex(key, ttl, JSON.stringify(data))
  return data
}

// Usage
const products = await withCache(
  'products:featured',
  300,  // 5 minutes
  () => db.product.findMany({ where: { featured: true } })
)

Pattern 3: Cache Stampede Prevention

When a hot key expires, every in-flight request misses the cache and hits the DB at the same time (a cache stampede). A short-lived lock lets one request rebuild the cache while the others wait and retry:

async function getWithLock<T>(
  key: string,
  ttl: number,
  fetchFn: () => Promise<T>
): Promise<T> {
  const cached = await redis.get(key)
  if (cached) return JSON.parse(cached)

  const lockKey = `lock:${key}`
  const lockAcquired = await redis.set(lockKey, '1', 'EX', 10, 'NX')

  if (!lockAcquired) {
    // Another process is fetching — wait and retry
    await new Promise(resolve => setTimeout(resolve, 100))
    return getWithLock(key, ttl, fetchFn)
  }

  try {
    const data = await fetchFn()
    await redis.setex(key, ttl, JSON.stringify(data))
    return data
  } finally {
    await redis.del(lockKey)
  }
}
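Locking is one defense; a complementary one is making sure a batch of keys cached at the same moment doesn't expire at the same moment. Adding random jitter to the TTL spreads the refills out. A small helper (the name and jitter ratio are my own choices, not from the post above):

```typescript
// Add up to `jitterRatio` of random extra lifetime to a base TTL so
// keys cached together don't all expire in the same instant.
function jitteredTtl(baseSeconds: number, jitterRatio = 0.1): number {
  const jitter = Math.floor(baseSeconds * jitterRatio * Math.random())
  return baseSeconds + jitter
}
```

Used with the earlier wrapper: `redis.setex(key, jitteredTtl(3600), JSON.stringify(data))` caches for somewhere between 1 hour and 1 hour 6 minutes.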

Rate Limiting with Redis

async function checkRateLimit(
  identifier: string,
  limit: number,
  windowSeconds: number
): Promise<{ allowed: boolean; remaining: number }> {
  const key = `ratelimit:${identifier}`
  const count = await redis.incr(key)

  if (count === 1) {
    await redis.expire(key, windowSeconds)
  }

  return {
    allowed: count <= limit,
    remaining: Math.max(0, limit - count),
  }
}

// In API route
const { allowed, remaining } = await checkRateLimit(
  `api:${userId}`,
  100,   // 100 requests
  3600   // per hour
)

if (!allowed) return Response.json({ error: 'Rate limited' }, { status: 429 })
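One caveat with the INCR-then-EXPIRE approach: if the process dies between the two commands, the counter key never expires and that identifier stays rate-limited forever. A Lua script run via EVAL makes the pair atomic. This is a sketch; the `RateLimitClient` interface is my own abstraction so the logic isn't tied to ioredis (whose `eval(script, numKeys, ...keysAndArgs)` matches it):

```typescript
interface RateLimitClient {
  eval(script: string, numKeys: number, ...args: (string | number)[]): Promise<unknown>
}

// INCR and EXPIRE execute atomically inside the script, so a crash
// can't leave a counter without a TTL.
const RATE_LIMIT_SCRIPT = `
local count = redis.call('INCR', KEYS[1])
if count == 1 then
  redis.call('EXPIRE', KEYS[1], ARGV[1])
end
return count
`

async function checkRateLimitAtomic(
  client: RateLimitClient,
  identifier: string,
  limit: number,
  windowSeconds: number,
): Promise<{ allowed: boolean; remaining: number }> {
  const count = Number(
    await client.eval(RATE_LIMIT_SCRIPT, 1, `ratelimit:${identifier}`, windowSeconds),
  )
  return { allowed: count <= limit, remaining: Math.max(0, limit - count) }
}
```

The calling code is unchanged: pass the redis instance from the setup section as `client`.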

Session Storage

async function createSession(userId: string): Promise<string> {
  const sessionId = crypto.randomUUID()
  await redis.setex(
    `session:${sessionId}`,
    86400,  // 24 hours
    JSON.stringify({ userId, createdAt: Date.now() })
  )
  return sessionId
}

async function getSession(sessionId: string) {
  const session = await redis.get(`session:${sessionId}`)
  if (!session) return null
  // Extend TTL on access
  await redis.expire(`session:${sessionId}`, 86400)
  return JSON.parse(session)
}
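The missing piece for logout is deleting the session key. A sketch, written against a minimal `SessionStore` interface of my own (ioredis's `del` satisfies it) so the logic is easy to test:

```typescript
interface SessionStore {
  del(key: string): Promise<number>
}

async function destroySession(store: SessionStore, sessionId: string): Promise<boolean> {
  // DEL returns the number of keys removed: 1 if the session existed, 0 if not.
  return (await store.del(`session:${sessionId}`)) === 1
}
```

Thanks to the TTL, forgotten sessions expire on their own; explicit deletion matters for logout and for force-revoking a compromised session.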

TTL Strategy Guide

Data Type            Recommended TTL
User profile         1 hour
Product catalog      5-15 minutes
Search results       1-5 minutes
Dashboard stats      30 seconds
Sessions             24 hours
Rate limit windows   Match the window
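One way to apply the guide is to centralize the numbers as named constants instead of scattering magic seconds through call sites. The constant names are my own; the values follow the table (10 minutes picked from the 5-15 range, 3 from 1-5):

```typescript
// TTLs in seconds, mirroring the guide above.
const TTL = {
  userProfile: 60 * 60,       // 1 hour
  productCatalog: 10 * 60,    // 5-15 minutes
  searchResults: 3 * 60,      // 1-5 minutes
  dashboardStats: 30,         // 30 seconds
  session: 24 * 60 * 60,      // 24 hours
} as const
```

Then calls read as intent: `redis.setex(cacheKey, TTL.userProfile, JSON.stringify(user))`.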

The AI SaaS Starter Kit includes Redis cache-aside helpers, rate limiting middleware, and session management pre-configured. $99 one-time.
