DEV Community

AttractivePenguin

From Node.js to Bun: How We Got 5x More Throughput (and Lived to Tell the Tale)


If you're running Node.js in production and haven't looked at Bun yet, you might be leaving a ridiculous amount of performance on the table. We did the migration, hit a memory leak, fixed it, and came out with 5x the throughput. Here's the full story — including the ugly parts.

Why This Matters

Runtime performance isn't an academic exercise. When your job queue starts backing up at 2 AM, the difference between processing 100 tasks/second and 500 tasks/second is the difference between "we'll handle it in the morning" and "page the on-call."

Trigger.dev runs a task execution engine. Every millisecond of runtime per task translates directly into how many tasks they can process, how much infrastructure they need, and how much they pay in cloud bills. When they benchmarked Bun against Node.js on their real workload — not a synthetic Hello World — the results were dramatic enough to warrant a full migration.

The kicker? Bun has matured. This isn't the "cool experiment" phase anymore. It's production-ready for the right workloads, and this article will help you figure out if yours is one of them.

The Benchmark That Changed Everything

Before diving into the "how," let's look at the "why" with real numbers. Trigger.dev tested their task execution engine on both runtimes with identical workloads:

| Metric | Node.js | Bun | Improvement |
| --- | --- | --- | --- |
| Tasks/sec | ~100 | ~500 | 5x |
| Cold start time | ~300ms | ~8ms | 37x |
| Memory (baseline) | ~60MB | ~20MB | 3x |

These aren't synthetic Fibonacci benchmarks. This is a real application doing real work — hitting databases, parsing JSON, making HTTP calls. The throughput gap narrowed slightly after a Bun memory leak was discovered and patched (more on that below), but even post-fix, Bun held a comfortable 4-5x lead.
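The only benchmark that matters is your own workload. A minimal throughput harness (illustrative, not Trigger.dev's actual benchmark rig) that you can run under both `node` and `bun` to compare apples to apples:

```typescript
// Run one async task in a tight loop for a fixed window and report
// tasks/sec, so the same script can be compared across runtimes.
async function measureThroughput(
  task: () => Promise<void>,
  durationMs: number,
): Promise<number> {
  let completed = 0;
  const deadline = Date.now() + durationMs;
  while (Date.now() < deadline) {
    await task();
    completed++;
  }
  return (completed / durationMs) * 1000; // tasks per second
}

// Example workload: JSON parsing — closer to real work than a no-op.
// Swap in a slice of your actual task logic for a meaningful number.
measureThroughput(async () => {
  JSON.parse('{"id":1,"payload":"test"}');
}, 200).then(tps => {
  console.log(`~${Math.round(tps)} tasks/sec on this runtime`);
});
```

Run it once with `bun harness.ts` and once with `npx tsx harness.ts`, and you have a first-order comparison on your own code path.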

Migration Guide: Node.js → Bun

Step 1: Install Bun

```bash
curl -fsSL https://bun.sh/install | bash
```

Yeah, it's one line. The hardest part of your migration just happened.

Step 2: Replace Your Package Manager

Bun's package manager is a drop-in replacement for npm/yarn/pnpm, and it's fast:

```bash
# Instead of npm install
bun install

# Instead of npx
bunx create-next-app my-app

# Instead of npm run dev
bun run dev
```

Your package.json works as-is. Your lockfile gets converted. Your scripts run. This is the easy part.

Step 3: Swap the Runtime

Replace node with bun in your Dockerfile and startup scripts:

```dockerfile
# Before
FROM node:20-slim
WORKDIR /app
COPY package*.json ./
RUN npm ci --only=production
COPY . .
CMD ["node", "dist/index.js"]
```

```dockerfile
# After
FROM oven/bun:1-slim
WORKDIR /app
COPY package.json bun.lockb ./
RUN bun install --production
COPY . .
CMD ["bun", "dist/index.js"]
```

Step 4: Handle Node.js API Differences

Bun implements most Node.js APIs, but not all. Here's what you'll likely need to adjust:

```typescript
// ✅ These work out of the box
import { readFileSync } from "fs";
import { createServer } from "http";
import crypto from "crypto";

// ⚠️ These need attention
// - node:child_process (supported, but with behavioral differences)
// - node:dgram (UDP — limited support)
// - Native .node addons (need recompilation)

// ❌ These don't work yet
// - node:v8 serialization API
// - Some obscure stream edge cases
```

The practical advice: run your test suite under Bun first. If it passes, you're probably fine.

```bash
bun test  # Yes, Bun has a built-in test runner too
```

Step 5: Optimize for Bun's Strengths

Once you're running on Bun, lean into what makes it fast:

```typescript
// Bun.serve() is faster than Node's http.createServer
const server = Bun.serve({
  port: 3000,
  fetch(req) {
    return new Response("Hello from Bun!");
  },
});

console.log(`Listening on http://localhost:${server.port}`);
```
```typescript
// Built-in SQLite — no driver needed
import { Database } from "bun:sqlite";

const db = new Database("mydb.sqlite");
const results = db.query("SELECT * FROM users WHERE id = ?").get(42);
```
```typescript
// Built-in test runner — no jest dependency
import { test, expect } from "bun:test";

test("task processes correctly", () => {
  const result = processTask({ id: 1, payload: "test" });
  expect(result.status).toBe("completed");
});
```

Real-World Scenarios

Scenario 1: High-Throughput Job Processing

This is where Bun shines brightest. If you're running BullMQ, Agenda, or a custom job queue:

```typescript
// worker.ts — processing thousands of tasks
interface Task {
  id: string;
  type: string;
  payload: unknown;
}

async function processTask(task: Task): Promise<void> {
  switch (task.type) {
    case "send_email":
      await sendEmail(task.payload);
      break;
    case "process_data":
      await processData(task.payload);
      break;
  }
}

// Bun's event loop handles this dramatically faster
// because of V8->JavaScriptCore differences in promise resolution
async function main() {
  const tasks = await fetchPendingTasks();

  // Kick off every pending task and settle successes/failures individually
  const results = await Promise.allSettled(
    tasks.map(task => processTask(task))
  );

  const failures = results.filter(r => r.status === "rejected");
  if (failures.length) {
    console.error(`${failures.length} tasks failed`);
  }
}
```
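Note that `Promise.allSettled` over the whole batch launches every task at once. If you genuinely need bounded concurrency (say, to avoid flooding a downstream database), you need a small worker pool. A hand-rolled sketch — this is not part of Bun's API; in production you might reach for a library like `p-limit` instead:

```typescript
// Run `worker` over `items` with at most `limit` tasks in flight.
// Each "lane" pulls the next unclaimed index when it finishes a task.
async function runWithConcurrency<T, R>(
  items: T[],
  limit: number,
  worker: (item: T) => Promise<R>,
): Promise<PromiseSettledResult<R>[]> {
  const results: PromiseSettledResult<R>[] = new Array(items.length);
  let next = 0; // safe without locks: JS is single-threaded between awaits

  async function lane(): Promise<void> {
    while (next < items.length) {
      const i = next++;
      try {
        results[i] = { status: "fulfilled", value: await worker(items[i]) };
      } catch (reason) {
        results[i] = { status: "rejected", reason };
      }
    }
  }

  // Start at most `limit` lanes and wait for all of them to drain the queue.
  await Promise.all(
    Array.from({ length: Math.min(limit, items.length) }, lane)
  );
  return results;
}
```

Usage mirrors the snippet above: `const results = await runWithConcurrency(tasks, 10, processTask);` — same `PromiseSettledResult[]` shape, but never more than 10 tasks in flight.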

Scenario 2: API Gateway / BFF Layer

If you're running a backend-for-frontend that aggregates multiple services:

```typescript
// Bun.serve() handles concurrent fetches more efficiently
Bun.serve({
  port: 3000,
  async fetch(req) {
    const url = new URL(req.url);

    if (url.pathname === "/api/dashboard") {
      // Parallel fetching — Bun handles this beautifully
      const [user, orders, notifications] = await Promise.all([
        fetch(`${USER_SERVICE}/me`, { headers: req.headers }),
        fetch(`${ORDER_SERVICE}/recent`, { headers: req.headers }),
        fetch(`${NOTIF_SERVICE}/unread`, { headers: req.headers }),
      ]);

      return Response.json({
        user: await user.json(),
        orders: await orders.json(),
        notifications: await notifications.json(),
      });
    }

    return new Response("Not found", { status: 404 });
  },
});
```

Scenario 3: Cold Start-Heavy Serverless

Bun's ~8ms cold start (vs Node's ~300ms) makes it compelling for serverless:

```typescript
// With AWS Lambda, use a custom runtime
// runtime.sh:
//   #!/bin/sh
//   exec bun /opt/bun-handler.js

// The handler itself can use top-level APIs directly
import { Database } from "bun:sqlite";

const db = new Database("/opt/mydb.sqlite"); // Opens once, reused across invocations

export default {
  async fetch(req: Request): Promise<Response> {
    const users = db.query("SELECT * FROM users LIMIT 10").all();
    return Response.json(users);
  },
};
```

Troubleshooting & FAQ

Q: I hit a memory leak. Is Bun production-ready?

Yes, but be aware. The Trigger.dev team discovered a memory leak in Bun's HTTP client under high concurrency. The Bun team shipped a fix within days. The lesson: monitor memory in production, regardless of runtime.

```bash
# Monitor Bun's memory usage
ps aux | grep bun

# Or attach Bun's built-in inspector
bun --inspect your-app.ts
```

If you see unbounded memory growth:

  1. Check your Bun version — upgrade to latest
  2. Look for unclosed HTTP connections or streaming responses
  3. Use BUN_DEBUG_QUIET_LOGS=1 to reduce noise, then inspect

Q: My native Node module doesn't work under Bun.

Bun supports native addons compiled for it, but they need recompilation:

```bash
# Rebuild native modules for Bun
bun install --force
```

If that doesn't work, check if there's a pure JS alternative or file an issue at bun.sh/issues.
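While you wait on a fix, you can feature-detect the runtime and route around the addon. `process.versions.bun` really is defined only under Bun; the fallback strategy below is a sketch (here, swapping a hypothetical native hashing addon for `node:crypto`, which works under both runtimes):

```typescript
import { createHash } from "node:crypto";

// Detect the runtime at startup — "bun" appears in process.versions
// only when running under Bun.
function currentRuntime(): "bun" | "node" {
  return "bun" in process.versions ? "bun" : "node";
}

// Example fallback path: a pure built-in implementation that a worker
// could use under Bun instead of a native .node addon.
function sha256Hex(input: string): string {
  return createHash("sha256").update(input).digest("hex");
}
```

The same pattern works for any addon with a pure-JS or built-in equivalent: detect once, pick the implementation, and keep the rest of the code path identical.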

Q: Should I migrate everything at once?

No. Start with stateless workers and job processors — the simplest, most CPU-bound parts of your stack. Validate performance, then expand.

```yaml
# Run both runtimes in parallel during migration (Docker Compose example)
services:
  worker-node:
    build:
      dockerfile: Dockerfile.node
  worker-bun:
    build:
      dockerfile: Dockerfile.bun
  # Route a percentage of traffic to Bun
  # and compare metrics side by side
```

Q: What about TypeScript?

Bun runs TypeScript natively. No tsx, no ts-node, no compilation step:

```bash
bun run src/index.ts  # Just works
```

Q: Is Bun a drop-in replacement?

For most server applications — yes. For CLI tools with complex native dependencies — maybe not yet. Check the compatibility list before committing.

Conclusion

Bun isn't a toy anymore. It's a production runtime that can deliver real, measurable performance improvements — especially for high-throughput job processing and API layers. The Trigger.dev team proved it: 5x throughput, faster cold starts, lower memory, and a migration that took days, not months.

Should you switch? If you're running Node.js in performance-sensitive scenarios, you owe it to yourself to benchmark Bun against your actual workload. The numbers might surprise you.

Start small. Migrate a worker. Run the tests. Watch the metrics. Then decide.

The runtime wars aren't over, but Bun just won a significant battle.


Inspired by Trigger.dev's real-world migration story. Go read their original post for the raw benchmarks and the memory leak deep-dive.
