How to Debug Memory Leaks in Node.js Production Apps
Your Node.js app works fine for hours, then slows to a crawl. Memory climbs steadily. A restart fixes it temporarily — then it happens again.
That's a memory leak. Here's how to find and fix it.
Confirm It's a Leak
# Watch RSS memory of your Node process over time
watch -n 30 'ps -o pid,rss,vsz,comm -p $(pgrep -f "node server")'
# Or log it from inside the app
setInterval(() => {
  const m = process.memoryUsage();
  console.log(JSON.stringify({
    rss: Math.round(m.rss / 1024 / 1024) + 'MB',
    heap: Math.round(m.heapUsed / 1024 / 1024) + 'MB',
    time: new Date().toISOString()
  }));
}, 60000);
A sawtooth pattern is normal: the GC frees memory in cycles. But if the baseline after each drop keeps climbing and never returns to its starting point, you have a leak.
Step 1: Generate a Heap Snapshot
# Start Node with inspector
node --inspect server.js
# Or send signal to running process
kill -USR1 <PID> # Opens inspector on port 9229
Then in Chrome: chrome://inspect → Open dedicated DevTools → Memory → Take heap snapshot.
Take a snapshot, exercise the suspect endpoints, then take another. Use the Comparison view and look for object types whose count keeps growing between snapshots.
Step 2: Use clinic.js (Easiest Method)
npm install -g clinic
# Profile your app under load
clinic heapprofiler -- node server.js
# In another terminal, run load
npx autocannon -c 10 -d 60 http://localhost:3000/api/endpoint
# Ctrl+C clinic — it generates a flamegraph
The flamegraph shows where memory is being allocated. Wide bars = lots of allocation from that function; bar height is just call-stack depth.
Step 3: Common Leak Patterns
Pattern 1: Event Listener Accumulation
// Leak: adding a listener on every request without removing it
app.get('/stream', (req, res) => {
  emitter.on('data', (chunk) => res.write(chunk)); // Never removed!
});

// Fix: remove the listener when the connection closes
app.get('/stream', (req, res) => {
  const handler = (chunk) => res.write(chunk);
  emitter.on('data', handler);
  req.on('close', () => emitter.off('data', handler)); // Cleanup
});
Pattern 2: Global Cache Without Expiry
// Leak: cache grows forever
const cache = {};
app.get('/user/:id', async (req, res) => {
  if (!cache[req.params.id]) {
    cache[req.params.id] = await db.getUser(req.params.id);
  }
  res.json(cache[req.params.id]);
});

// Fix: use a proper cache with a TTL and a size cap
const NodeCache = require('node-cache');
const cache = new NodeCache({ stdTTL: 300, maxKeys: 1000 });
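If you'd rather not pull in a dependency, a Map with timestamps covers the same ground. A minimal sketch (lazy expiry on read, oldest-entry eviction; a real cache would also sweep expired entries periodically):

```javascript
// Minimal TTL cache: entries expire after ttlMs, and the map is capped
// so it can't grow without bound (oldest entry is evicted first).
class TTLCache {
  constructor({ ttlMs = 300_000, maxKeys = 1000 } = {}) {
    this.ttlMs = ttlMs;
    this.maxKeys = maxKeys;
    this.map = new Map(); // key -> { value, expires }
  }

  get(key) {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.map.delete(key); // lazy expiry on read
      return undefined;
    }
    return entry.value;
  }

  set(key, value) {
    if (this.map.size >= this.maxKeys && !this.map.has(key)) {
      // Map preserves insertion order, so the first key is the oldest.
      this.map.delete(this.map.keys().next().value);
    }
    this.map.set(key, { value, expires: Date.now() + this.ttlMs });
  }
}
```

Usage mirrors the node-cache example: `const cache = new TTLCache({ ttlMs: 300_000, maxKeys: 1000 })`. The key property either way is the size cap: a TTL alone won't save you if traffic writes keys faster than they expire.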
Pattern 3: Closures Holding References
// Leak: closure keeps the large object alive. V8 shares one context per
// scope, so if any closure here references largeArray, every closure
// created in this scope retains it.
function processData(largeArray) {
  const summary = computeSummary(largeArray);
  return function getSummary() {
    return summary; // largeArray can stay in memory via the shared context
  };
}

// Fix: only keep what you need
function processData(largeArray) {
  const summary = computeSummary(largeArray);
  largeArray = null; // Drop the reference so the GC can collect it
  return function getSummary() { return summary; };
}
Pattern 4: Uncleared Timers
// Leak: interval never cleared
function startMonitoring() {
  setInterval(() => {
    checkHealth();
  }, 5000); // No reference kept, so it can never be cleared
}

// Fix: keep a reference and clear it on shutdown
let monitorInterval;
function startMonitoring() {
  monitorInterval = setInterval(() => checkHealth(), 5000);
}
process.on('SIGTERM', () => clearInterval(monitorInterval));
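When several subsystems each start their own timers, collecting them in one place makes shutdown reliable. A sketch of that idea (the `every`/`stopAll` helpers are illustrative, not a library API); `.unref()` additionally tells Node the timer alone shouldn't keep the process alive:

```javascript
// Track every interval in one set so shutdown can clear them all.
const timers = new Set();

function every(ms, fn) {
  const t = setInterval(fn, ms);
  t.unref(); // this timer won't hold the process open by itself
  timers.add(t);
  return t;
}

function stopAll() {
  for (const t of timers) clearInterval(t);
  timers.clear();
}

process.on('SIGTERM', stopAll);
process.on('SIGINT', stopAll);
```

This also helps in tests: a suite that hangs after passing is very often an interval nobody cleared.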
Step 4: Detect with Automated Testing
// Add to your test suite (run Node with --expose-gc so global.gc exists)
const v8 = require('v8');

test('endpoint does not leak memory', async () => {
  global.gc(); // Force a full GC for a clean baseline
  const before = v8.getHeapStatistics().used_heap_size;

  // Run the operation 100 times
  for (let i = 0; i < 100; i++) {
    await request(app).get('/api/users');
  }

  global.gc();
  const after = v8.getHeapStatistics().used_heap_size;
  const growthMB = (after - before) / 1024 / 1024;
  expect(growthMB).toBeLessThan(5); // Allow <5MB of growth
});