You've probably written this before:
async function getUser(userId: string, requestId: string, logger: Logger) {
logger.info('Fetching user', { requestId, userId });
const user = await db.query('SELECT * FROM users WHERE id = $1', [userId]);
return processUser(user, requestId, logger); // pass it down... again
}
async function processUser(user: User, requestId: string, logger: Logger) {
logger.info('Processing user', { requestId }); // still passing it
return enrichUser(user, requestId, logger); // and again
}
This is context prop drilling — the Node.js equivalent of React's most notorious anti-pattern. Every function in your call stack needs requestId, userId, or traceId, so you thread it through every argument list. It bloats function signatures, leaks implementation details, and makes refactoring painful.
In 2026, there's a better way: AsyncLocalStorage, Node.js's built-in solution for propagating request-scoped context without prop drilling. It's been stable since Node.js 16, battle-tested in production at scale, and is the same mechanism used internally by OpenTelemetry's Node.js SDK.
This guide shows you how to use it to build clean, traceable, production-grade APIs.
What is AsyncLocalStorage?
AsyncLocalStorage is part of Node.js's async_hooks module. It creates a store that's automatically propagated through the entire async call chain — promises, callbacks, setTimeout, everything — without you passing it explicitly.
Think of it like a thread-local variable in Java or Python, but for Node.js's async model.
import { AsyncLocalStorage } from 'node:async_hooks';
const storage = new AsyncLocalStorage<{ requestId: string }>();
storage.run({ requestId: 'abc-123' }, () => {
// Everything called from here has access to the store
console.log(storage.getStore()); // { requestId: 'abc-123' }
setTimeout(() => {
console.log(storage.getStore()); // Still { requestId: 'abc-123' }
}, 1000);
});
The store is scoped to each run() call — in a web server that means per-request, so concurrent requests each get their own isolated store and there's no cross-contamination even under heavy load.
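To convince yourself of that isolation, here is a minimal, self-contained sketch (the `handle` function is an illustrative stand-in for a request handler) that runs two overlapping "requests" and checks that each sees only its own store:

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

const als = new AsyncLocalStorage<{ id: number }>();

// Each call simulates one in-flight request with its own store.
async function handle(id: number): Promise<number> {
  return als.run({ id }, async () => {
    // Yield so the two "requests" interleave on the event loop.
    await new Promise((resolve) => setTimeout(resolve, Math.random() * 10));
    // Despite the interleaving, we still read our own store.
    return als.getStore()!.id;
  });
}

const results = await Promise.all([handle(1), handle(2)]);
console.log(results); // [1, 2]
```

Whichever timer fires first, each continuation resumes with the store that was active when its run() began.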
Setting Up Request Context in Express
Here's a production pattern for attaching context to every incoming request:
// src/context/request-context.ts
import { AsyncLocalStorage } from 'node:async_hooks';
import { randomUUID } from 'node:crypto';
export interface RequestContext {
requestId: string;
userId?: string;
tenantId?: string;
startTime: number;
method: string;
path: string;
}
// Single global store — safe because each run() is isolated
const store = new AsyncLocalStorage<RequestContext>();
export const RequestContext = {
/** Get the current request context (throws if called outside a request) */
get(): RequestContext {
const ctx = store.getStore();
if (!ctx) throw new Error('No request context available');
return ctx;
},
/** Get the current request context (returns null if called outside a request) */
tryGet(): RequestContext | null {
return store.getStore() ?? null;
},
/** Run a function with the given context */
run<T>(ctx: RequestContext, fn: () => T): T {
return store.run(ctx, fn);
},
/** Create a new context with a generated request ID */
create(partial: Partial<RequestContext>): RequestContext {
return {
requestId: randomUUID(),
startTime: Date.now(),
method: 'UNKNOWN',
path: 'UNKNOWN',
...partial,
};
},
};
// src/middleware/context.middleware.ts
import { Request, Response, NextFunction } from 'express';
import { RequestContext } from '../context/request-context.js';
export function contextMiddleware(req: Request, res: Response, next: NextFunction) {
const upstreamId = req.headers['x-request-id'];
const ctx = RequestContext.create({
// Respect upstream tracing headers (from load balancer, API gateway, etc.).
// Only include requestId when the header is present, so the spread in
// create() doesn't overwrite the generated UUID with undefined.
...(typeof upstreamId === 'string' ? { requestId: upstreamId } : {}),
method: req.method,
path: req.path,
});
// Always echo the request ID back to the client
res.setHeader('X-Request-Id', ctx.requestId);
// Run the rest of the request inside the context
RequestContext.run(ctx, () => next());
}
// src/app.ts
import express from 'express';
import { contextMiddleware } from './middleware/context.middleware.js';
const app = express();
app.use(express.json());
app.use(contextMiddleware); // Must be first!
export { app };
Now every handler, service, and database call within that request has automatic access to the context.
Clean Logging Without Prop Drilling
The killer use case is structured logging. Instead of passing requestId everywhere, your logger reads it from the context automatically:
// src/logger/index.ts
import { RequestContext } from '../context/request-context.js';
type LogLevel = 'debug' | 'info' | 'warn' | 'error';
interface LogEntry {
timestamp: string;
level: LogLevel;
message: string;
requestId?: string;
userId?: string;
[key: string]: unknown;
}
function log(level: LogLevel, message: string, extra?: Record<string, unknown>) {
const ctx = RequestContext.tryGet();
const entry: LogEntry = {
timestamp: new Date().toISOString(),
level,
message,
// Automatically pulls from the current request context
requestId: ctx?.requestId,
userId: ctx?.userId,
...extra,
};
// In production, use pino or winston instead of console.log
console.log(JSON.stringify(entry));
}
export const logger = {
debug: (msg: string, extra?: Record<string, unknown>) => log('debug', msg, extra),
info: (msg: string, extra?: Record<string, unknown>) => log('info', msg, extra),
warn: (msg: string, extra?: Record<string, unknown>) => log('warn', msg, extra),
error: (msg: string, extra?: Record<string, unknown>) => log('error', msg, extra),
};
Now your services look like this:
// src/services/user.service.ts
import { logger } from '../logger/index.js';
import { db } from '../db/index.js';
// No requestId parameter needed!
export async function getUserById(userId: string) {
logger.info('Fetching user', { userId });
const user = await db.query('SELECT * FROM users WHERE id = $1', [userId]);
if (!user.rows[0]) {
logger.warn('User not found', { userId });
return null;
}
logger.info('User fetched successfully', { userId });
return user.rows[0];
}
export async function processUserOrder(userId: string, orderId: string) {
logger.info('Processing order', { userId, orderId });
// No context parameter needed anywhere in this chain
const user = await getUserById(userId);
// ...
}
Every log line automatically includes requestId and userId. When a bug hits production, you grep by requestId and see every operation in that request's full call chain.
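The same idea compressed into one runnable sketch (`logLine` and the call chain are illustrative stand-ins for the logger and services above): the deepest function emits a log line carrying the requestId without ever receiving it as a parameter.

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

const store = new AsyncLocalStorage<{ requestId: string }>();

// The logger reads the context itself; callers never pass requestId.
function logLine(message: string): string {
  return JSON.stringify({ level: 'info', message, requestId: store.getStore()?.requestId });
}

// Three levels deep, zero context parameters anywhere.
async function enrichUser(): Promise<string> { return logLine('enriching user'); }
async function processUser(): Promise<string> { return enrichUser(); }
async function getUser(): Promise<string> { return processUser(); }

const line = await store.run({ requestId: 'req-9' }, () => getUser());
console.log(line); // {"level":"info","message":"enriching user","requestId":"req-9"}
```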
Enriching Context After Authentication
One common pattern: you don't have userId when the request arrives (it comes from the JWT). Add a middleware that enriches the context after auth:
// src/middleware/auth.middleware.ts
import { Request, Response, NextFunction } from 'express';
import { RequestContext } from '../context/request-context.js';
import { verifyJWT } from '../auth/jwt.js';
export async function authMiddleware(req: Request, res: Response, next: NextFunction) {
const token = req.headers.authorization?.replace('Bearer ', '');
if (!token) {
return res.status(401).json({ error: 'Missing token' });
}
try {
const payload = await verifyJWT(token);
const ctx = RequestContext.get();
// Mutate the existing context to add user info
// This works because ctx is a reference to the stored object
Object.assign(ctx, {
userId: payload.sub,
tenantId: payload.tenantId,
});
next();
} catch {
res.status(401).json({ error: 'Invalid token' });
}
}
Now every log line after authMiddleware automatically includes userId and tenantId — zero prop drilling.
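Why the mutation works: getStore() returns a reference to the same object for the rest of the chain, so enriching it in one middleware is visible everywhere downstream. A small sketch (`authenticate` is a hypothetical stand-in for the middleware above):

```typescript
import { AsyncLocalStorage } from 'node:async_hooks';

interface Ctx { requestId: string; userId?: string }
const als = new AsyncLocalStorage<Ctx>();

function authenticate(userId: string): void {
  // Mutates the stored object in place; later reads see the change
  // because getStore() returns a reference, not a copy.
  Object.assign(als.getStore()!, { userId });
}

const seen = als.run({ requestId: 'r1' }, () => {
  authenticate('usr_42');
  return als.getStore()!.userId;
});
console.log(seen); // usr_42
```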
Database Query Instrumentation
Instrument your database layer to automatically log slow queries with full context:
// src/db/instrumented.ts
import { Pool } from 'pg';
import { logger } from '../logger/index.js';
const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const SLOW_QUERY_THRESHOLD_MS = 100;
export const db = {
async query<T = unknown>(sql: string, params?: unknown[]): Promise<{ rows: T[] }> {
const start = performance.now();
try {
const result = await pool.query(sql, params);
const duration = performance.now() - start;
if (duration > SLOW_QUERY_THRESHOLD_MS) {
logger.warn('Slow query detected', {
sql: sql.slice(0, 200), // truncate for safety
duration: Math.round(duration),
rowCount: result.rowCount,
});
}
return result;
} catch (error) {
const duration = performance.now() - start;
logger.error('Query failed', {
sql: sql.slice(0, 200),
duration: Math.round(duration),
error: error instanceof Error ? error.message : String(error),
});
throw error;
}
},
};
No requestId parameter. It's always available via logger → RequestContext.
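The timing logic itself is independent of pg. Here is the same pattern as a generic wrapper you can run standalone (`timed`, `SLOW_MS`, and the warning list are illustrative names, not part of any library):

```typescript
// Generic version of the instrumentation pattern above, runnable without pg.
const SLOW_MS = 5; // illustrative threshold
const warnings: string[] = [];

async function timed<T>(label: string, fn: () => Promise<T>): Promise<T> {
  const start = performance.now();
  try {
    return await fn();
  } finally {
    // Runs on success and on throw, so failures are timed too.
    const duration = performance.now() - start;
    if (duration > SLOW_MS) warnings.push(`${label} took ${Math.round(duration)}ms`);
  }
}

const value = await timed('slow-op', async () => {
  await new Promise((resolve) => setTimeout(resolve, 20));
  return 42;
});
console.log(value, warnings); // 42, plus one slow-op warning
```

Using try/finally instead of duplicating the timing code in catch keeps the success and failure paths measured identically.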
Request Duration Tracking
Track request duration and log it automatically on response:
// src/middleware/response-logger.middleware.ts
import { Request, Response, NextFunction } from 'express';
import { RequestContext } from '../context/request-context.js';
import { logger } from '../logger/index.js';
export function responseLoggerMiddleware(req: Request, res: Response, next: NextFunction) {
// Capture the context now: 'finish' listeners run in whatever async context
// is active when the event is emitted, which isn't guaranteed to be this
// request's context.
const ctx = RequestContext.tryGet();
res.on('finish', () => {
if (!ctx) return;
const duration = Date.now() - ctx.startTime;
const level = res.statusCode >= 500 ? 'error'
: res.statusCode >= 400 ? 'warn'
: 'info';
logger[level]('Request completed', {
statusCode: res.statusCode,
duration,
method: ctx.method,
path: ctx.path,
});
});
next();
}
Using Context in Error Handlers
Global error handlers automatically have context too:
// src/middleware/error.middleware.ts
import { Request, Response, NextFunction } from 'express';
import { logger } from '../logger/index.js';
import { RequestContext } from '../context/request-context.js';
export function errorMiddleware(
error: Error,
req: Request,
res: Response,
_next: NextFunction
) {
logger.error('Unhandled error', {
error: error.message,
stack: process.env.NODE_ENV === 'development' ? error.stack : undefined,
});
const ctx = RequestContext.tryGet();
res.status(500).json({
error: 'Internal server error',
// Always include requestId so clients can report it in bug tickets
requestId: ctx?.requestId ?? 'unknown',
});
}
Putting It All Together
Here's the full middleware chain:
// src/app.ts
import express from 'express';
import { contextMiddleware } from './middleware/context.middleware.js';
import { responseLoggerMiddleware } from './middleware/response-logger.middleware.js';
import { authMiddleware } from './middleware/auth.middleware.js';
import { errorMiddleware } from './middleware/error.middleware.js';
import { userRouter } from './routes/users.js';
const app = express();
app.use(express.json());
app.use(contextMiddleware); // 1. Attach request context
app.use(responseLoggerMiddleware); // 2. Log every response
// Public routes (no auth)
app.get('/health', (req, res) => res.json({ ok: true }));
// Protected routes
app.use('/api', authMiddleware); // 3. Enrich context with userId
app.use('/api/users', userRouter);
app.use(errorMiddleware); // 4. Global error handler
export { app };
A request log output now looks like this — every line correlated by requestId:
{"timestamp":"2026-03-27T01:00:00.000Z","level":"info","message":"Fetching user","requestId":"f47ac10b-58cc-4372-a567-0e02b2c3d479","userId":"usr_abc123"}
{"timestamp":"2026-03-27T01:00:00.012Z","level":"info","message":"User fetched successfully","requestId":"f47ac10b-58cc-4372-a567-0e02b2c3d479","userId":"usr_abc123"}
{"timestamp":"2026-03-27T01:00:00.015Z","level":"info","message":"Request completed","requestId":"f47ac10b-58cc-4372-a567-0e02b2c3d479","userId":"usr_abc123","statusCode":200,"duration":15,"method":"GET","path":"/api/users/usr_abc123"}
Performance Considerations
A common concern: does AsyncLocalStorage add overhead? In modern Node.js releases (22 and later), async context propagation has been optimized substantially, and benchmarks typically show under 2% overhead for I/O-bound API workloads, which is negligible for most services.
The AsyncLocalStorage.getStore() call is a cheap, synchronous O(1) read of the current execution context; it involves no locking, synchronization, or I/O.
For CPU-bound workloads, consider using worker_threads instead — each worker has its own independent AsyncLocalStorage scope.
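Since context does not cross the thread boundary, you hand it over explicitly. A minimal sketch of carrying context into a worker via workerData (the worker body is inlined with eval for brevity; a real app would use a separate worker file):

```typescript
import { Worker } from 'node:worker_threads';

// The worker receives a plain-data copy of the context via workerData
// and would typically seed its own AsyncLocalStorage with it.
const worker = new Worker(
  `const { workerData, parentPort } = require('node:worker_threads');
   parentPort.postMessage(workerData.requestId);`,
  { eval: true, workerData: { requestId: 'req-7' } }
);

const echoed = await new Promise<string>((resolve) => worker.once('message', resolve));
console.log(echoed); // req-7
```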
Integration with OpenTelemetry
If you're using OpenTelemetry in production, the two integrate seamlessly:
import { context, trace } from '@opentelemetry/api';
import { RequestContext } from './context/request-context.js';
export function getTraceContext() {
const ctx = RequestContext.tryGet();
const span = trace.getActiveSpan();
const spanContext = span?.spanContext();
return {
requestId: ctx?.requestId,
traceId: spanContext?.traceId,
spanId: spanContext?.spanId,
};
}
OTel's SDK uses AsyncLocalStorage internally too, so the two systems work together without any conflicts.
TypeScript Tip: Typed Context Access
For fully type-safe context access in handlers, use a typed accessor:
// src/types/express.d.ts
// Optional: if you prefer req.context instead of RequestContext.get()
import type { RequestContext } from '../context/request-context.js';

declare global {
namespace Express {
interface Request {
context: RequestContext;
}
}
}

export {};
// In contextMiddleware, after creating the context:
req.context = ctx;
RequestContext.run(ctx, () => next());
This gives you two access patterns — use whichever fits your team's style.
When NOT to Use AsyncLocalStorage
A few cases where you should be careful:
- Worker threads: Each worker has its own isolated AsyncLocalStorage. Don't expect context to propagate across worker_threads boundaries — you need to serialize and deserialize it manually.
- Event emitters with long lifetimes: If an EventEmitter outlives the request that registered its listener, the context may be stale. Use AsyncResource.bind() to capture the correct context.
- Cron jobs / background tasks: These don't have a "request" — create a synthetic context with a job ID instead.
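The event-emitter pitfall and the AsyncResource.bind() fix can be demonstrated in a few lines (the event names here are arbitrary):

```typescript
import { AsyncLocalStorage, AsyncResource } from 'node:async_hooks';
import { EventEmitter } from 'node:events';

const als = new AsyncLocalStorage<{ requestId: string }>();
const bus = new EventEmitter(); // long-lived emitter shared across requests
const results: Array<string | undefined> = [];

als.run({ requestId: 'req-A' }, () => {
  // Unbound listener: runs in whatever context is active at emit time.
  bus.on('unbound', () => results.push(als.getStore()?.requestId));
  // Bound listener: AsyncResource.bind() snapshots the context active now.
  bus.on('bound', AsyncResource.bind(() => results.push(als.getStore()?.requestId)));
});

// Both events fire outside any request context:
bus.emit('unbound'); // listener sees no store
bus.emit('bound');   // listener still sees req-A
console.log(results); // [undefined, 'req-A']
```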
Summary
AsyncLocalStorage eliminates the most painful boilerplate in Node.js API development. Once set up, you get:
- Request correlation across every log line, automatically
- Clean function signatures — no requestId parameter threading
- Zero-cost context in error handlers — always know which request failed
- Automatic enrichment — add userId, tenantId, and traceId as they become available
- Negligible performance overhead on Node.js 22+
The pattern scales from simple Express apps to complex microservices. It's the same mechanism OpenTelemetry uses under the hood — battle-tested and production-proven.
Need a production API to test your frontend against? 1xAPI on RapidAPI offers sports data, verification, and utility APIs with built-in request tracing on every response.