Hey dev.to community! 👋
I've been working on a side project for a while now and finally got it to a point where I feel comfortable sharing it publicly. It's called layercache — a multi-layer caching toolkit for Node.js.
I'd really appreciate any feedback, honest criticism, or ideas from folks who deal with caching in production. Here's the quick overview:
## Why I built this
Almost every Node.js service I've worked on eventually hits the same caching problem:
- Memory-only cache → Fast, but each instance has its own isolated view of data
- Redis-only cache → Shared across instances, but every request still pays a network round-trip
- Hand-rolled hybrid → Works at first, then you need stampede prevention, tag invalidation, stale serving, observability... and it spirals fast
I couldn't find a library that handled all of this cleanly in one place, so I built one.
## What layercache does
layercache lets you stack multiple cache layers (Memory → Redis → Disk) behind a single unified API. On a cache hit, it serves from the fastest available layer and backfills the rest. On a miss, the fetcher runs exactly once — even under high concurrency.
```
                ┌────────────────────────────────────────┐
 your app ----> │ layercache                             │
                │                                        │
                │   L1 Memory   ~0.01ms  (per-process)   │
                │       |                                │
                │   L2 Redis    ~0.5ms   (shared)        │
                │       |                                │
                │   L3 Disk     ~2ms     (persistent)    │
                │       |                                │
                │   Fetcher     ~20ms    (runs once)     │
                └────────────────────────────────────────┘
```
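To make the hit/backfill flow concrete, here is a generic sketch of the idea (this is an illustrative `Layer` interface and stack I wrote for this post, not layercache's actual internals): try each layer top-down, backfill the faster layers on a hit, and only run the fetcher on a full miss.

```typescript
// Minimal layered read-through sketch (illustrative, not layercache's code).
interface Layer<V> {
  get(key: string): Promise<V | undefined>
  set(key: string, value: V): Promise<void>
}

class SimpleStack<V> {
  constructor(private layers: Layer<V>[]) {}

  async get(key: string, fetcher: () => Promise<V>): Promise<V> {
    for (let i = 0; i < this.layers.length; i++) {
      const hit = await this.layers[i].get(key)
      if (hit !== undefined) {
        // Backfill every faster layer we missed on the way down.
        await Promise.all(this.layers.slice(0, i).map(l => l.set(key, hit)))
        return hit
      }
    }
    // Full miss: run the fetcher once and fill every layer.
    const value = await fetcher()
    await Promise.all(this.layers.map(l => l.set(key, value)))
    return value
  }
}

// Trivial in-memory layer for demonstration.
class MapLayer<V> implements Layer<V> {
  private store = new Map<string, V>()
  async get(key: string) { return this.store.get(key) }
  async set(key: string, value: V) { this.store.set(key, value) }
}
```

The key design point is that a hit at a slower layer repairs the faster layers above it, so repeat reads get progressively cheaper.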
## Basic usage
```bash
npm install layercache
```

```ts
import { CacheStack, MemoryLayer, RedisLayer } from 'layercache'
import Redis from 'ioredis'

const cache = new CacheStack([
  new MemoryLayer({ ttl: 60, maxSize: 1_000 }),        // L1: in-process
  new RedisLayer({ client: new Redis(), ttl: 3600 }),  // L2: shared
])

// Read-through: fetcher runs once, all layers filled automatically
const user = await cache.get('user:123', () => db.findUser(123))
```
You can also start with just memory (no Redis required) and add layers as your needs grow.
## Key features I'm most proud of
**Stampede prevention** — 100 concurrent requests for the same key trigger only 1 fetcher execution. Distributed dedup via Redis locks works across multiple server instances too.
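The in-process half of this is the classic "single-flight" pattern: concurrent callers for the same key share one in-flight promise. Here's a generic sketch of that pattern (the distributed Redis-lock half is out of scope here, and this is not layercache's source):

```typescript
// Single-flight sketch: concurrent callers for the same key share the
// same in-flight promise, so the fetcher runs exactly once per key.
const inflight = new Map<string, Promise<unknown>>()

async function singleFlight<V>(key: string, fetcher: () => Promise<V>): Promise<V> {
  const existing = inflight.get(key)
  if (existing) return existing as Promise<V>

  // No request in flight: start one, and clean up when it settles
  // so later calls can trigger a fresh fetch.
  const p = fetcher().finally(() => inflight.delete(key))
  inflight.set(key, p)
  return p
}
```

Because the map entry is removed only when the promise settles, every caller that arrives while the fetch is running piggybacks on the same result.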
**Tag-based invalidation** — Invalidate groups of related keys by tag, including across all layers at once. Useful for things like "invalidate all user-related cache entries."
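Conceptually this works by keeping a reverse index from tag to keys. A minimal in-memory sketch of that idea (my own illustration, not layercache's implementation):

```typescript
// Tag-index sketch: remember which keys carry which tag, so one
// invalidateTag() call can drop a whole group of related entries.
class TaggedCache<V> {
  private store = new Map<string, V>()
  private tagIndex = new Map<string, Set<string>>()

  set(key: string, value: V, tags: string[] = []) {
    this.store.set(key, value)
    for (const tag of tags) {
      if (!this.tagIndex.has(tag)) this.tagIndex.set(tag, new Set())
      this.tagIndex.get(tag)!.add(key)
    }
  }

  get(key: string) {
    return this.store.get(key)
  }

  invalidateTag(tag: string) {
    // Delete every key registered under this tag, then drop the index entry.
    for (const key of this.tagIndex.get(tag) ?? []) this.store.delete(key)
    this.tagIndex.delete(tag)
  }
}
```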
**Stale-while-revalidate / stale-if-error** — Serve the stale cached value immediately while refreshing in the background, or keep serving stale data when the upstream is down.
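The mechanics behind both behaviors are the same: entries carry a freshness deadline, a stale hit returns immediately while a background refresh runs, and a failed refresh simply leaves the stale value in place. A simplified sketch (again my own illustration, not the library's code):

```typescript
// SWR sketch: stale hits are served immediately; the refresh happens
// in the background, and refresh errors fall back to the stale value.
interface Entry<V> { value: V; freshUntil: number }

class SwrCache<V> {
  private store = new Map<string, Entry<V>>()

  constructor(private ttlMs: number) {}

  async get(key: string, fetcher: () => Promise<V>): Promise<V> {
    const entry = this.store.get(key)
    if (entry) {
      if (Date.now() >= entry.freshUntil) {
        // Stale: kick off a background refresh, but don't await it.
        fetcher()
          .then(v => this.store.set(key, { value: v, freshUntil: Date.now() + this.ttlMs }))
          .catch(() => { /* stale-if-error: keep serving the old value */ })
      }
      return entry.value
    }
    // Cold miss: nothing stale to serve, so we must wait for the fetcher.
    const value = await fetcher()
    this.store.set(key, { value, freshUntil: Date.now() + this.ttlMs })
    return value
  }
}
```

Note the trade-off: only a cold miss pays fetcher latency; once a value exists, readers never block on the upstream again.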
**Framework integrations** — Middleware helpers for Express, Fastify, Hono, tRPC, GraphQL, and a NestJS module with a `@Cacheable()` decorator.

**Observability out of the box** — Prometheus exporter, OpenTelemetry tracing, per-layer latency metrics, event hooks, and an HTTP stats endpoint.

**Admin CLI** — `npx layercache stats|keys|invalidate` for Redis-backed caches.
## NestJS example (because I use NestJS a lot)
```ts
// app.module.ts
@Module({
  imports: [
    CacheStackModule.forRoot({
      layers: [
        new MemoryLayer({ ttl: 20 }),
        new RedisLayer({ client: redis, ttl: 300 }),
      ],
    }),
  ],
})
export class AppModule {}
```

```ts
// user.service.ts
@Injectable()
export class UserService {
  constructor(@InjectCacheStack() private readonly cache: CacheStack) {}

  async getUser(id: number) {
    return this.cache.get(`user:${id}`, () => this.db.findUser(id))
  }
}
```
## Benchmark numbers (on my machine, grain of salt)
| Scenario | Avg Latency |
|---|---|
| L1 memory hit | ~0.006 ms |
| L2 Redis hit | ~0.020 ms |
| No cache (simulated DB) | ~1.08 ms |
Stampede prevention: 100 concurrent requests → 1 fetcher execution.
## What I'm looking for feedback on
Honestly, everything! But a few things I'm specifically unsure about:
- **API design** — Does the `CacheStack` + layer composition model feel intuitive? Are there footguns I'm missing?
- **The feature set** — Is this too much? Too little? Are there things here that should just be separate libraries?
- **Production readiness** — What would you need to see before using something like this in production? (More tests? Better docs? Battle-tested examples?)
- **Naming / discoverability** — Does `layercache` as a name communicate clearly enough what it does?
- **Anything else** — I'm sure there are patterns or edge cases I haven't thought of.
## Links
- 📦 npm: npmjs.com/package/layercache
- 🐙 GitHub: github.com/flyingsquirrel0419/layercache
- 📖 Docs: API Reference | Tutorial
If you try it out or browse the source and have thoughts — good, bad, or indifferent — I'm all ears. Comments here, GitHub Issues, or Discussions all work.
Thanks for reading! 🙏