The 2AM Outage That Changed Everything
Our monitoring alerts screamed at us in the middle of the night:
"API latency spike: 900ms p99 → 12 seconds"
After hours of debugging, we discovered the culprit:
Node.js’s built-in http
module was choking under high traffic.
Switching to undici
(Node.js’s modern HTTP/1.1 client) reduced latency by 65% and doubled throughput. Here’s why.
1. The Problem with Node.js’s http
Module
🚨 Issue #1: Connection Pool Inefficiency
The native http
module:
- Opens a new TCP connection per request by default (HTTP/1.1)
-
Poor connection reuse (even with
keep-alive
)
// http.request() creates a new connection if pool is busy
http.request('http://api.com', (res) => { /* ... */ });
🚨 Issue #2: Memory Overhead
- Each request allocates new buffers (no recycling)
- Garbage collector overload under high traffic
🚨 Issue #3: Slow Pipelining
- No HTTP pipelining (requests wait for prior responses)
2. How undici
Fixes These Problems
⚡ Solution #1: Smarter Connection Pooling
- Pre-allocated connections (no TCP handshake delays)
- Aggressive reuse (up to 15x more efficient)
import { request } from 'undici';
// Reuses connections automatically
const { body } = await request('http://api.com');
⚡ Solution #2: Zero-Copy Optimization
-
Recycles memory buffers (unlike
http
) - Reduces GC pressure by ~40%
⚡ Solution #3: Pipeline Support
- Queues requests efficiently (like HTTP/2 multiplexing)
3. Benchmarks: undici
vs http
vs axios
Metric | http |
axios |
undici |
---|---|---|---|
Requests/sec | 3,200 | 2,800 | 18,000 |
Latency (p99) | 450ms | 600ms | 85ms |
Memory Usage | 120MB | 150MB | 45MB |
Tested with Node.js 20, 1K concurrent connections
4. When to Use undici
(And When Not To)
✅ Best For:
✔ High-throughput APIs (microservices, proxies)
✔ Low-latency requirements (user-facing apps)
✔ Serverless functions (faster cold starts)
❌ Avoid If:
✖ You need browser compatibility (use fetch
or axios
)
✖ Your dependencies require http
(some SDKs aren’t compatible)
5. Migration Tips
From http
:
// Before
const res = await fetch('http://api.com');
// After
const { body } = await undici.request('http://api.com');
From axios
:
// Before
axios.get('http://api.com');
// After
const { body } = await undici.request('http://api.com');
Pro Tip: Use undici-fetch
for fetch()
compatibility!
Key Takeaways
🚀 undici
is 3-5x faster than core http
🧠 Smarter connection pooling = lower latency
💾 Memory efficient = better scaling
Have you tried undici
? Share your benchmarks!
Top comments (0)