DEV Community

HK Lee

Posted on • Originally published at pockit.tools

Debugging Node.js Memory Leaks: A Complete Troubleshooting Guide with Real-World Examples

It starts subtly. Your Node.js server runs fine for hours, then days. Then you notice the memory usage graph in your monitoring dashboard climbing—slowly but relentlessly. Restarts become a weekly ritual. Then daily. Then you're writing cron jobs to restart your server every few hours, and you know something is deeply wrong.

Memory leaks in Node.js are among the most frustrating bugs to diagnose. Unlike a syntax error that screams at you immediately, a memory leak whispers. It reveals itself only under sustained load, often in production environments where debugging tools are limited. And when it finally crashes your server at 3 AM with an out-of-memory error, you're left staring at cryptic heap dumps wondering where to even begin.

This guide will transform you from someone who fears memory leaks into someone who hunts them systematically. We'll cover the fundamental concepts, walk through practical debugging sessions with real tools, examine the most common leak patterns in Node.js applications, and build a mental framework for attacking these issues in any codebase.


Understanding Memory in Node.js: The Foundation

Before we can fix memory leaks, we need to understand how Node.js manages memory. Node.js uses the V8 JavaScript engine, which implements automatic memory management through garbage collection.

The V8 Memory Model

V8 divides memory into several spaces:

┌─────────────────────────────────────────────────────────────────┐
│                        V8 HEAP MEMORY                           │
├─────────────────────────────────────────────────────────────────┤
│                                                                 │
│  ┌─────────────────────┐    ┌─────────────────────────────────┐ │
│  │    NEW SPACE        │    │         OLD SPACE               │ │
│  │  (Young Generation) │    │     (Old Generation)            │ │
│  │                     │    │                                 │ │
│  │  • Short-lived      │───▶│  • Long-lived objects           │ │
│  │    objects          │    │  • Promoted from New Space      │ │
│  │  • Fast GC: Scavenge│    │  • Slower GC (Mark-Sweep)       │ │
│  └─────────────────────┘    └─────────────────────────────────┘ │
│                                                                 │
│  ┌─────────────────────┐    ┌─────────────────────────────────┐ │
│  │    LARGE OBJECT     │    │         CODE SPACE              │ │
│  │       SPACE         │    │   (Compiled JS functions)       │ │
│  └─────────────────────┘    └─────────────────────────────────┘ │
│                                                                 │
└─────────────────────────────────────────────────────────────────┘

New Space (Young Generation): Where new objects are allocated. Small and frequently garbage collected using a fast "Scavenger" algorithm.

Old Space (Old Generation): Objects that survive multiple garbage collection cycles in New Space get promoted here. Collected less frequently using the slower Mark-Sweep-Compact algorithm.

What Is a Memory Leak?

A memory leak occurs when your application allocates memory that it no longer needs but fails to release. In garbage-collected languages like JavaScript, this typically means holding references to objects that should have been discarded.

The garbage collector only frees objects that are unreachable—objects with no references pointing to them. If your code accidentally maintains a reference to an object, that object will never be collected, even if you never use it again.

Common patterns that cause leaks:

  1. Global variables that accumulate data
  2. Closures that capture variables unintentionally
  3. Event listeners that are added but never removed
  4. Caches with no eviction policy
  5. Timers (setInterval) that are never cleared
  6. Circular references in certain contexts
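A minimal sketch of the reachability rule in action, combining patterns 1 and 2: a module-level array holds closures, and each closure pins everything it captured, so none of it can ever be collected.

```javascript
// Each handler closes over its own `big` array; because the global
// registry holds the handlers, none of the arrays can ever be collected.
const registry = [];

function makeHandler() {
  const big = new Array(10000).fill('payload'); // would-be garbage
  const handler = () => big.length;             // closure pins `big`
  registry.push(handler);                       // strong reference
  return handler;
}

for (let i = 0; i < 100; i++) makeHandler();
console.log(registry.length); // 100 handlers, each retaining ~10k strings
```

The arrays are never read again, but the reference chain (registry, then handler, then `big`) keeps them reachable, so the garbage collector must keep them.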

Recognizing a Memory Leak: The Symptoms

How do you know you have a memory leak versus just high memory usage? Look for these patterns:

Symptom 1: The Sawtooth Pattern Gone Wrong

Normal memory usage in Node.js looks like a sawtooth: memory climbs as objects are allocated, then drops sharply during garbage collection.

Normal Memory Pattern:
     ▲
     │    /\    /\    /\    /\
     │   /  \  /  \  /  \  /  \
     │  /    \/    \/    \/    \
     └────────────────────────────▶
                  Time

A memory leak shows a different pattern: the sawtooth baseline keeps rising:

Memory Leak Pattern:
     ▲
     │                      /\
     │                 /\  /  \
     │            /\  /  \/
     │       /\  /  \/
     │  /\  /  \/
     │ /  \/
     └────────────────────────────▶
                  Time

Symptom 2: Growing Old Space

Use the --expose-gc flag and periodically force garbage collection while logging memory:

// memory-monitor.js
if (global.gc) {
  setInterval(() => {
    global.gc();
    const used = process.memoryUsage();
    console.log({
      heapUsed: Math.round(used.heapUsed / 1024 / 1024) + 'MB',
      heapTotal: Math.round(used.heapTotal / 1024 / 1024) + 'MB',
      external: Math.round(used.external / 1024 / 1024) + 'MB',
      rss: Math.round(used.rss / 1024 / 1024) + 'MB',
    });
  }, 10000);
}

Run with:

node --expose-gc memory-monitor.js

If heapUsed keeps growing even immediately after GC, you have a leak.
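The "growing even after GC" check can be automated. Here is a small heuristic (the function name and thresholds are illustrative, not a standard) that flags a rising post-GC baseline:

```javascript
// Heuristic sketch: given heapUsed samples taken right after forced GCs,
// flag a likely leak when the post-GC baseline keeps rising.
// (Thresholds are illustrative, not a standard.)
function isLikelyLeak(samples, minGrowthBytes = 1024 * 1024) {
  if (samples.length < 3) return false;
  let rising = 0;
  for (let i = 1; i < samples.length; i++) {
    if (samples[i] > samples[i - 1]) rising++;
  }
  const totalGrowth = samples[samples.length - 1] - samples[0];
  // Leak-like: almost every step is higher, and total growth is material
  return rising >= (samples.length - 1) * 0.8 && totalGrowth > minGrowthBytes;
}

console.log(isLikelyLeak([50e6, 55e6, 61e6, 68e6, 74e6])); // true: rising baseline
console.log(isLikelyLeak([50e6, 48e6, 51e6, 49e6, 50e6])); // false: stable sawtooth
```

Feed it the `heapUsed` values from the monitor above and you have a crude but serviceable leak alarm.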

Symptom 3: Increasing Response Times

As the heap grows, garbage collection cycles become longer and more frequent. This manifests as increasing latency and periodic "pauses" in request handling.


Debugging Tools: Your Arsenal

Node.js offers several powerful tools for memory debugging. Let's explore each one.

1. Chrome DevTools (The Heavy Hitter)

Node.js integrates with Chrome DevTools for powerful heap analysis.

Start your app in inspect mode:

node --inspect server.js
# or for break on first line:
node --inspect-brk server.js

Connect from Chrome:

  1. Open chrome://inspect in Chrome
  2. Click "inspect" under your Node.js target
  3. Navigate to the "Memory" tab

Taking Heap Snapshots:

The most powerful technique is comparing heap snapshots over time:

  1. Take a snapshot at baseline (after server startup)
  2. Perform the action you suspect is leaking (send requests, etc.)
  3. Force garbage collection (click the trash can icon)
  4. Take another snapshot
  5. Compare the snapshots

The comparison view shows you what objects were allocated between snapshots and never freed.

2. Node.js Built-in: process.memoryUsage()

Quick and dirty memory monitoring:

function logMemory(label = '') {
  const used = process.memoryUsage();
  console.log(`Memory ${label}:`, {
    rss: `${Math.round(used.rss / 1024 / 1024)} MB`,
    heapTotal: `${Math.round(used.heapTotal / 1024 / 1024)} MB`,
    heapUsed: `${Math.round(used.heapUsed / 1024 / 1024)} MB`,
    external: `${Math.round(used.external / 1024 / 1024)} MB`,
  });
}

// Use it to bracket suspicious operations
logMemory('before operation');
await suspiciousOperation();
logMemory('after operation');

Understanding the metrics:

  • rss (Resident Set Size): Total memory allocated for the process
  • heapTotal: Total allocated heap
  • heapUsed: Actually used heap memory
  • external: Memory used by C++ objects bound to JavaScript (Buffers, etc.)
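A quick way to see `external` in action: Buffer backing stores live off the V8 heap, so a large allocation moves `external` (and `rss`) while barely touching `heapUsed`.

```javascript
// Allocate 32 MB off-heap and observe where the accounting shows up.
const before = process.memoryUsage();
const buf = Buffer.alloc(32 * 1024 * 1024); // zero-filled, off the V8 heap
const after = process.memoryUsage();

const grewMB = Math.round((after.external - before.external) / 1024 / 1024);
console.log(`external grew by ~${grewMB} MB (buffer length: ${buf.length})`);
```

This is why a leak in `Buffer` or native-addon memory shows up as `rss` climbing while `heapUsed` looks healthy.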

3. Heap Snapshots via Code

You can generate heap snapshots programmatically without Chrome DevTools:

const v8 = require('v8');
const fs = require('fs');
const path = require('path');

function takeHeapSnapshot() {
  const filename = path.join(
    __dirname, 
    `heap-${Date.now()}.heapsnapshot`
  );

  v8.writeHeapSnapshot(filename);
  console.log(`Heap snapshot written to: ${filename}`);
  return filename;
}

// Trigger via HTTP endpoint for production debugging
app.get('/debug/heap-snapshot', (req, res) => {
  const file = takeHeapSnapshot();
  res.json({ file });
});

You can then load these .heapsnapshot files in Chrome DevTools for analysis.

4. Clinic.js: The Production-Ready Suite

Clinic.js is a fantastic suite of tools for Node.js performance analysis:

npm install -g clinic

# Detect various issues including memory leaks
clinic doctor -- node server.js

# Focused heap profiling  
clinic heap -- node server.js

# Flame graphs for CPU profiling
clinic flame -- node server.js

Clinic Doctor profiles your server while you drive load at it (it can also generate load itself via its autocannon integration) and produces a visual report highlighting potential issues.

5. Memwatch-next: Leak Detection in Code

For automated leak detection (note: memwatch-next is no longer actively maintained and may fail to build on current Node.js releases; the @airbnb/node-memwatch fork is a drop-in alternative):

const memwatch = require('memwatch-next');

memwatch.on('leak', (info) => {
  console.error('Memory leak detected:', info);
  // info.growth: bytes leaked
  // info.reason: why it's considered a leak
});

memwatch.on('stats', (stats) => {
  console.log('GC stats:', stats);
  // Detailed garbage collection statistics
});

The 7 Deadly Leak Patterns (And How to Fix Them)

Pattern 1: The Unbounded Cache

The Problem:

// ❌ Memory leak: cache grows forever
const cache = {};

function getCachedData(key) {
  if (cache[key]) {
    return cache[key];
  }

  const data = expensiveComputation(key);
  cache[key] = data;  // Never evicted!
  return data;
}

The Fix: Use an LRU (Least Recently Used) cache with a maximum size:

// ✅ Fixed: bounded cache with eviction
// Note: the v6-era lru-cache API is shown; v7+ renamed `maxAge` to `ttl`,
// and recent versions export the class as `LRUCache`.
const LRU = require('lru-cache');

const cache = new LRU({
  max: 500,              // Maximum 500 items
  maxAge: 1000 * 60 * 5, // Items expire after 5 minutes
  updateAgeOnGet: true,  // Reset age on access
});

function getCachedData(key) {
  const cached = cache.get(key);
  if (cached !== undefined) {
    return cached;
  }

  const data = expensiveComputation(key);
  cache.set(key, data);
  return data;
}

Pattern 2: Event Listener Accumulation

The Problem:

// ❌ Memory leak: adding listeners in a hot path without removal
function handleConnection(socket) {
  const onData = (data) => {
    processData(data);
  };

  // Every connection adds a new listener, never removed
  eventEmitter.on('data', onData);
}

The Fix: Always remove listeners when they're no longer needed:

// ✅ Fixed: remove listener on disconnect
function handleConnection(socket) {
  const onData = (data) => {
    processData(data);
  };

  eventEmitter.on('data', onData);

  socket.on('close', () => {
    eventEmitter.removeListener('data', onData);
  });
}

Or use once for one-time listeners:

eventEmitter.once('data', onData);

Pattern 3: Closures Capturing Context

The Problem:

// ❌ Memory leak: closure retains the whole request
function processRequest(req, res) {
  const largePayload = req.body; // 10MB of data

  // This closure captures `req` (and through it the entire 10MB body),
  // even though only req.body.id is ever used
  someAsyncOperation(() => {
    console.log('Processed:', req.body.id);
    res.send('done');
  });
}

The Fix: Extract only what you need:

// ✅ Fixed: only capture necessary data
function processRequest(req, res) {
  const { id } = req.body;  // Extract only what's needed

  someAsyncOperation(() => {
    console.log('Processed:', id);  // Only 'id' is captured
    res.send('done');
  });
}

Pattern 4: Orphaned Timers

The Problem:

// ❌ Memory leak: setInterval never cleared
class DataPoller {
  constructor(url) {
    this.url = url;
    this.intervalId = setInterval(() => {
      this.poll();
    }, 5000);
  }

  poll() {
    fetch(this.url).then(/* ... */);
  }

  // No cleanup method! Instance can never be garbage collected
}

The Fix: Always provide cleanup:

// ✅ Fixed: explicit cleanup
class DataPoller {
  constructor(url) {
    this.url = url;
    this.intervalId = setInterval(() => {
      this.poll();
    }, 5000);
  }

  poll() {
    fetch(this.url).then(/* ... */);
  }

  destroy() {
    clearInterval(this.intervalId);
  }
}

// Usage
const poller = new DataPoller('/api/data');
// When done:
poller.destroy();

Pattern 5: Growing Global State

The Problem:

// ❌ Memory leak: global array grows forever
const requestLog = [];

app.use((req, res, next) => {
  requestLog.push({
    timestamp: Date.now(),
    url: req.url,
    method: req.method,
    // ...potentially large headers and body
  });
  next();
});

The Fix: Bound your data structures:

// ✅ Fixed: bounded log with rotation
const MAX_LOG_SIZE = 1000;
const requestLog = [];

app.use((req, res, next) => {
  requestLog.push({
    timestamp: Date.now(),
    url: req.url,
    method: req.method,
  });

  // Evict old entries
  while (requestLog.length > MAX_LOG_SIZE) {
    requestLog.shift();
  }

  next();
});

Or better yet, use a ring buffer or stream logs to external storage.
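A ring buffer for this use case is only a few lines. A fixed-capacity sketch where new entries overwrite the oldest, so memory stays constant regardless of traffic:

```javascript
// Minimal fixed-capacity ring buffer: writes overwrite the oldest entry,
// so memory usage is bounded no matter how many items flow through.
class RingBuffer {
  constructor(capacity) {
    this.buf = new Array(capacity);
    this.capacity = capacity;
    this.index = 0;  // next write position
    this.count = 0;  // items stored so far, capped at capacity
  }

  push(item) {
    this.buf[this.index] = item;
    this.index = (this.index + 1) % this.capacity;
    if (this.count < this.capacity) this.count++;
  }

  toArray() { // oldest to newest
    const start = this.count < this.capacity ? 0 : this.index;
    const out = [];
    for (let i = 0; i < this.count; i++) {
      out.push(this.buf[(start + i) % this.capacity]);
    }
    return out;
  }
}

const log = new RingBuffer(3);
[1, 2, 3, 4, 5].forEach((n) => log.push(n));
console.log(log.toArray()); // [3, 4, 5]
```

Unlike `shift()` on a large array (which is O(n) per eviction), overwriting a slot is O(1).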

Pattern 6: Forgotten Promises

The Problem:

// ❌ Risky: fire-and-forget promises with no error handling
function fetchAll(urls) {
  urls.forEach(url => {
    fetch(url).then(response => {
      process(response);
    });
    // No .catch(): rejections go unhandled (fatal by default in Node 15+),
    // and unbounded in-flight requests pile up with no backpressure
  });
}

The Fix: Always handle promise rejections:

// ✅ Fixed: proper error handling
async function fetchAll(urls) {
  const promises = urls.map(async url => {
    try {
      const response = await fetch(url);
      await process(response);
    } catch (error) {
      console.error(`Failed to fetch ${url}:`, error);
      // Properly handled, no accumulation
    }
  });

  await Promise.all(promises);
}
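When you want every result even if some fetches fail, `Promise.allSettled` is a cleaner fit than try/catch per item. A sketch (the helper name `fetchAllSettled` and the injected `fetchFn` parameter are illustrative, used here so the function is easy to test):

```javascript
// Alternative sketch using Promise.allSettled: one failed URL can never
// reject the whole batch, and every settlement is accounted for.
async function fetchAllSettled(urls, fetchFn) {
  const results = await Promise.allSettled(urls.map((u) => fetchFn(u)));
  return results.map((r, i) =>
    r.status === 'fulfilled'
      ? { url: urls[i], ok: true, value: r.value }
      : { url: urls[i], ok: false, error: r.reason }
  );
}
```

Unlike `Promise.all`, a single rejection doesn't discard the other results, so nothing is left dangling.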

Pattern 7: Circular References with Closures

The Problem:

// ❌ Tricky leak: circular reference through closure
function createConnection() {
  const connection = {
    data: new Array(1000000).fill('x'), // Large data
  };

  connection.onClose = function() {
    // This closure references 'connection', creating a circular reference
    console.log('Connection closed', connection.id);
    cleanup(connection);
  };

  return connection;
}

Modern V8's garbage collector handles plain circular references fine. The danger is a closure that outlives the object (an event handler registered elsewhere, for example), which keeps the whole cycle reachable.

The Fix: Break the cycle explicitly:

// ✅ Fixed: break circular reference on cleanup
function createConnection() {
  const connection = {
    data: new Array(1000000).fill('x'),
  };

  connection.onClose = function() {
    console.log('Connection closed', connection.id);
    cleanup(connection);
    connection.onClose = null;  // Break the cycle
    connection.data = null;     // Release large data
  };

  return connection;
}

A Real-World Debugging Session

Let's walk through debugging a realistic memory leak scenario step by step.

The Scenario

You have an Express server that handles WebSocket connections. Memory keeps growing.

// server.js - The leaky code
const express = require('express');
const WebSocket = require('ws');

const app = express();
const wss = new WebSocket.Server({ port: 8080 });

// Suspicious: global storage for connections
const connections = new Map();

// Suspicious: global message history
const messageHistory = [];

wss.on('connection', (ws, req) => {
  const userId = req.url.split('?userId=')[1];
  connections.set(userId, ws);

  ws.on('message', (message) => {
    // Store all messages forever
    messageHistory.push({
      userId,
      message: message.toString(),
      timestamp: Date.now(),
    });

    // Broadcast to all
    connections.forEach((client) => {
      if (client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    });
  });

  // BUG: No cleanup when connection closes!
});

app.listen(3000);

Step 1: Confirm the Leak

Add memory monitoring:

setInterval(() => {
  const used = process.memoryUsage();
  console.log(`[Memory] Heap: ${Math.round(used.heapUsed / 1024 / 1024)}MB, ` +
              `Connections: ${connections.size}, ` +
              `Messages: ${messageHistory.length}`);
}, 5000);

After running with simulated traffic, you see:

[Memory] Heap: 45MB, Connections: 100, Messages: 1000
[Memory] Heap: 67MB, Connections: 98, Messages: 2500
[Memory] Heap: 89MB, Connections: 102, Messages: 4200
[Memory] Heap: 112MB, Connections: 95, Messages: 6100

Memory grows even though connection count stays stable. The messageHistory array is the obvious culprit here, but let's also check connections.

Step 2: Take Heap Snapshots

Using Chrome DevTools:

  1. Connect to node --inspect server.js
  2. Take snapshot after startup
  3. Simulate 100 connections, disconnect them
  4. Take another snapshot
  5. Compare

In the comparison view, you might see:

  • Many Array objects (the message history)
  • WebSocket objects that weren't removed from the Map

Step 3: Apply Fixes

// server.js - Fixed version
const express = require('express');
const WebSocket = require('ws');

const app = express();
const wss = new WebSocket.Server({ port: 8080 });

const connections = new Map();

// Fixed: bounded message history
const MAX_HISTORY = 1000;
const messageHistory = [];

wss.on('connection', (ws, req) => {
  const userId = req.url.split('?userId=')[1];
  connections.set(userId, ws);

  ws.on('message', (message) => {
    messageHistory.push({
      userId,
      message: message.toString(),
      timestamp: Date.now(),
    });

    // Evict old messages
    while (messageHistory.length > MAX_HISTORY) {
      messageHistory.shift();
    }

    connections.forEach((client) => {
      if (client.readyState === WebSocket.OPEN) {
        client.send(message.toString());
      }
    });
  });

  // Fixed: cleanup on close
  ws.on('close', () => {
    connections.delete(userId);
  });

  ws.on('error', () => {
    connections.delete(userId);
  });
});

app.listen(3000);

Step 4: Verify the Fix

Run the same load test. Memory should now stabilize:

[Memory] Heap: 45MB, Connections: 100, Messages: 1000
[Memory] Heap: 52MB, Connections: 98, Messages: 1000
[Memory] Heap: 49MB, Connections: 102, Messages: 1000
[Memory] Heap: 51MB, Connections: 95, Messages: 1000

Victory!


Production Strategies

1. Memory Limits and Alerts

Set explicit memory limits to prevent runaway crashes:

node --max-old-space-size=512 server.js  # 512MB limit

Implement alerting:

const MEMORY_THRESHOLD = 450 * 1024 * 1024; // 450MB

setInterval(() => {
  const used = process.memoryUsage();
  if (used.heapUsed > MEMORY_THRESHOLD) {
    alertOperations('Memory threshold exceeded', {
      heapUsed: used.heapUsed,
      threshold: MEMORY_THRESHOLD,
    });
  }
}, 60000);

2. Graceful Restarts

When memory gets too high, restart gracefully:

const RESTART_THRESHOLD = 500 * 1024 * 1024;

setInterval(() => {
  if (process.memoryUsage().heapUsed > RESTART_THRESHOLD) {
    console.log('Memory threshold exceeded, initiating graceful shutdown');

    // Stop accepting new connections
    server.close(() => {
      // Allow existing requests to finish
      setTimeout(() => {
        process.exit(0);  // PM2 or container orchestrator will restart
      }, 5000);
    });
  }
}, 60000);

3. Health Check Endpoints

Expose memory metrics for monitoring:

app.get('/health', (req, res) => {
  const memory = process.memoryUsage();
  const uptime = process.uptime();

  res.json({
    status: 'ok',
    uptime: Math.round(uptime),
    memory: {
      heapUsed: Math.round(memory.heapUsed / 1024 / 1024),
      heapTotal: Math.round(memory.heapTotal / 1024 / 1024),
      rss: Math.round(memory.rss / 1024 / 1024),
    },
    // Custom metrics
    connections: connections.size,
    cacheSize: cache.size,
  });
});

4. Heap Dump on Demand

Enable production heap dumps for post-mortem analysis:

app.post('/debug/heap', authenticateAdmin, (req, res) => {
  const v8 = require('v8');
  const filename = v8.writeHeapSnapshot();
  res.json({ filename });
});

Prevention: Writing Leak-Resistant Code

Rule 1: Always Clean Up Event Listeners

// Use AbortController for easy cleanup (works on any EventTarget,
// including DOM elements and Node's built-in EventTarget since v15)
const controller = new AbortController();

element.addEventListener('click', handler, {
  signal: controller.signal,
});

// Later: removes all listeners added with this controller
controller.abort();

Rule 2: Bound All Collections

// Instead of unbounded arrays
const items = [];
items.push(newItem);

// Use bounded collections
const MAX_ITEMS = 10000;
if (items.length >= MAX_ITEMS) {
  items.shift();
}
items.push(newItem);

Rule 3: Use WeakMap and WeakSet for Metadata

When attaching metadata to objects:

// ❌ Creates strong references, prevents GC
const metadata = new Map();
metadata.set(someObject, { extra: 'data' });

// ✅ Weak references, doesn't prevent GC
const metadata = new WeakMap();
metadata.set(someObject, { extra: 'data' });
// When someObject is no longer referenced elsewhere, 
// the metadata entry is automatically removed

Rule 4: Implement Dispose Patterns

class ResourceManager {
  #resources = [];
  #disposed = false;

  acquire(resource) {
    if (this.#disposed) {
      throw new Error('Manager is disposed');
    }
    this.#resources.push(resource);
    return resource;
  }

  dispose() {
    this.#disposed = true;
    for (const resource of this.#resources) {
      resource.close?.();
      resource.destroy?.();
    }
    this.#resources.length = 0;  // Clear array
  }
}

Conclusion: The Mindset Shift

Debugging memory leaks requires a shift in how you think about your code. Instead of just asking "does this work?", you need to ask:

  1. Where does this data go? Every allocation should have a clear lifecycle.
  2. When is this cleaned up? Every add should have a corresponding remove.
  3. What's the upper bound? Every collection should have a maximum size.
  4. What holds references to this? Understand your reference graph.

Memory leaks aren't mysterious. They're just objects being held onto longer than necessary. With the tools and patterns in this guide, you can systematically hunt down any leak and build applications that run reliably for months without restart.

The next time your server's memory starts climbing, you'll know exactly where to look—and exactly how to fix it.


Happy debugging. May your heaps stay small and your garbage collector stay idle.


💡 Note: This article was originally published on the Pockit Blog.

Check out Pockit.tools for 50+ free developer utilities (JSON Formatter, Diff Checker, etc.) that run 100% locally in your browser.
