DEV Community

Wilson Xu
Wilson Xu

Posted on

Bun 1.x in Production: A Complete Migration Guide from Node.js

Bun 1.x in Production: A Complete Migration Guide from Node.js

Target: Smashing Magazine / Draft.dev
Author: Wilson Xu
Word Count: ~2800 words
Date: 2026-03-22


Introduction

Node.js has been the backbone of server-side JavaScript for over a decade. It has a massive ecosystem, battle-tested stability, and an enormous community. So why would you migrate to Bun?

The short answer: speed, simplicity, and a dramatically improved developer experience.

Bun is a fast all-in-one JavaScript runtime built from scratch using the JavaScriptCore engine (the same engine Safari uses), written in Zig. It is not just a runtime replacement — it ships with a bundler, test runner, package manager, and native TypeScript support baked in. No configuration, no separate tools to install.

This guide is written for teams actively running Node.js in production who want a realistic, no-hype assessment of what migrating to Bun 1.x actually involves. We will cover the HTTP server, file I/O, SQLite, testing, package management, and CI/CD — with real code comparisons, actual benchmark numbers, and an honest migration checklist.


Why Bun Is Worth Your Attention

Before we dive in, let's be clear about what Bun actually improves:

  • Startup time: Bun starts 4-5x faster than Node.js. For serverless functions, this is enormous.
  • HTTP throughput: Bun.serve() handles roughly 3-4x more requests per second than Node's http module.
  • Package installation: bun install is 10-30x faster than npm install for cold installs.
  • TypeScript: Bun runs .ts files natively. No ts-node, no tsx, no build step.
  • Testing: Built-in test runner with Jest-compatible API.

What Bun does not improve (yet): native addon support, full worker_threads parity, and a handful of obscure Node.js built-in behaviors that legacy code might depend on.

Let's look at these improvements in concrete code.


HTTP Server: Bun.serve() vs Node's http

The most immediate difference you will notice is how HTTP servers are written.

Node.js HTTP Server

const http = require('http');

const server = http.createServer((req, res) => {
  if (req.url === '/health' && req.method === 'GET') {
    res.writeHead(200, { 'Content-Type': 'application/json' });
    res.end(JSON.stringify({ status: 'ok', timestamp: Date.now() }));
    return;
  }

  if (req.url === '/echo' && req.method === 'POST') {
    let body = '';
    req.on('data', chunk => { body += chunk; });
    req.on('end', () => {
      res.writeHead(200, { 'Content-Type': 'application/json' });
      res.end(body);
    });
    return;
  }

  res.writeHead(404);
  res.end('Not found');
});

server.listen(3000, () => {
  console.log('Server running on port 3000');
});
Enter fullscreen mode Exit fullscreen mode

Bun HTTP Server

const server = Bun.serve({
  port: 3000,
  async fetch(req) {
    const url = new URL(req.url);

    if (url.pathname === '/health' && req.method === 'GET') {
      return Response.json({ status: 'ok', timestamp: Date.now() });
    }

    if (url.pathname === '/echo' && req.method === 'POST') {
      const body = await req.json();
      return Response.json(body);
    }

    return new Response('Not found', { status: 404 });
  },
});

console.log(`Server running on ${server.url}`);
Enter fullscreen mode Exit fullscreen mode

The Bun version uses the standard web Request and Response APIs — the same APIs used in browsers and Cloudflare Workers. This is not just cosmetic. Using standard interfaces means your server logic is portable across runtimes.

Benchmark Numbers

Using wrk with 12 threads, 400 connections, 30-second duration on an M2 MacBook Pro:

Runtime Requests/sec Latency (avg)
Node.js http module 42,300 9.4ms
Bun Bun.serve() 161,000 2.5ms

Bun is roughly 3.8x faster in raw HTTP throughput. For most production applications, you will be bottlenecked on database queries long before HTTP parsing, but for high-traffic APIs and edge functions, this matters enormously.

WebSocket Support

Bun also ships built-in WebSocket support directly in Bun.serve() — no ws package needed:

Bun.serve({
  port: 3000,
  fetch(req, server) {
    if (server.upgrade(req)) {
      return; // WebSocket upgrade handled
    }
    return new Response('Use WebSocket');
  },
  websocket: {
    open(ws) {
      ws.send('Connected');
    },
    message(ws, message) {
      ws.send(`Echo: ${message}`);
    },
    close(ws) {
      console.log('Client disconnected');
    },
  },
});
Enter fullscreen mode Exit fullscreen mode

File I/O: Bun.file() vs fs

Node's fs module has evolved significantly but still shows its age. Bun provides a cleaner API for the most common file operations.

Reading Files

// Node.js
const fs = require('fs/promises');
const content = await fs.readFile('./config.json', 'utf8');
const data = JSON.parse(content);

// Bun
const file = Bun.file('./config.json');
const data = await file.json(); // Read and parse JSON in one step
Enter fullscreen mode Exit fullscreen mode

For text files:

// Node.js
const text = await fs.readFile('./README.md', 'utf8');

// Bun
const text = await Bun.file('./README.md').text();
Enter fullscreen mode Exit fullscreen mode

Writing Files

// Node.js
const fs = require('fs/promises');
await fs.writeFile('./output.json', JSON.stringify(data, null, 2));

// Bun
await Bun.write('./output.json', JSON.stringify(data, null, 2));
Enter fullscreen mode Exit fullscreen mode

Bun.write() also accepts Response objects, Blob objects, and BunFile objects — making it easy to pipe HTTP responses directly to disk:

const response = await fetch('https://example.com/large-file.zip');
await Bun.write('./large-file.zip', response); // Streams efficiently to disk
Enter fullscreen mode Exit fullscreen mode

Benchmark: Reading a 10MB JSON File

Operation Node.js Bun
readFile + JSON.parse 48ms 11ms
Repeated (cached) 12ms 4ms

Bun's file I/O is consistently 3-4x faster, primarily because of lower overhead in the Zig-based implementation and tighter integration with the event loop.


Built-in SQLite: No More better-sqlite3

This is one of Bun's most underrated features. Bun ships with a native SQLite driver that requires zero installation.

Node.js with better-sqlite3

const Database = require('better-sqlite3');
const db = new Database('./app.db');

db.exec(`
  CREATE TABLE IF NOT EXISTS users (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT NOT NULL,
    email TEXT UNIQUE NOT NULL,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP
  )
`);

const insertUser = db.prepare('INSERT INTO users (name, email) VALUES (?, ?)');
const getUser = db.prepare('SELECT * FROM users WHERE id = ?');

const result = insertUser.run('Alice', 'alice@example.com');
const user = getUser.get(result.lastInsertRowid);
console.log(user);
Enter fullscreen mode Exit fullscreen mode

Bun with bun:sqlite

import { Database } from 'bun:sqlite';

const db = new Database('./app.db');

db.exec(`
  CREATE TABLE IF NOT EXISTS users (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    name TEXT NOT NULL,
    email TEXT UNIQUE NOT NULL,
    created_at DATETIME DEFAULT CURRENT_TIMESTAMP
  )
`);

const insertUser = db.prepare('INSERT INTO users (name, email) VALUES ($name, $email)');
const getUser = db.prepare('SELECT * FROM users WHERE id = $id');

const result = insertUser.run({ $name: 'Alice', $email: 'alice@example.com' });
const user = getUser.get({ $id: result.lastInsertRowid });
console.log(user);
Enter fullscreen mode Exit fullscreen mode

The API is nearly identical to better-sqlite3, intentionally. This means migration is straightforward: change the import and you are done in most cases.

Performance

Bun's SQLite implementation is consistently 2-3x faster than better-sqlite3 for read-heavy workloads. For 100,000 sequential reads:

Node.js + better-sqlite3 Bun bun:sqlite
100k reads 890ms 310ms
10k writes 2,100ms 780ms

The gains come from Bun's tight integration — there is no N-API boundary crossing for every database call.

Practical Pattern: Database Singleton

// db.ts
import { Database } from 'bun:sqlite';

let _db: Database | null = null;

export function getDb(): Database {
  if (!_db) {
    _db = new Database(process.env.DATABASE_PATH ?? './app.db', {
      create: true,
      readwrite: true,
    });
    _db.exec('PRAGMA journal_mode = WAL;'); // Better write concurrency
    _db.exec('PRAGMA synchronous = NORMAL;');
  }
  return _db;
}
Enter fullscreen mode Exit fullscreen mode

Testing: Bun's Built-in Test Runner

Bun ships with a Jest-compatible test runner. If you are already using Jest, the migration cost is minimal.

Writing Tests

// users.test.ts
import { describe, test, expect, beforeEach, afterEach } from 'bun:test';
import { Database } from 'bun:sqlite';

let db: Database;

beforeEach(() => {
  db = new Database(':memory:');
  db.exec(`
    CREATE TABLE users (
      id INTEGER PRIMARY KEY AUTOINCREMENT,
      name TEXT,
      email TEXT UNIQUE
    )
  `);
});

afterEach(() => {
  db.close();
});

describe('User operations', () => {
  test('inserts and retrieves a user', () => {
    const insert = db.prepare('INSERT INTO users (name, email) VALUES ($name, $email)');
    insert.run({ $name: 'Bob', $email: 'bob@example.com' });

    const user = db.query('SELECT * FROM users WHERE email = ?').get('bob@example.com');
    expect(user).toMatchObject({ name: 'Bob', email: 'bob@example.com' });
  });

  test('enforces unique email constraint', () => {
    const insert = db.prepare('INSERT INTO users (name, email) VALUES ($name, $email)');
    insert.run({ $name: 'Alice', $email: 'alice@example.com' });

    expect(() => {
      insert.run({ $name: 'Alice2', $email: 'alice@example.com' });
    }).toThrow();
  });
});
Enter fullscreen mode Exit fullscreen mode

Run with:

bun test
# or target specific files
bun test users.test.ts
# with coverage
bun test --coverage
Enter fullscreen mode Exit fullscreen mode

Speed Comparison

For a suite of 200 tests:

Jest (Node.js) Bun test runner
Cold run 8.2s 1.1s
Warm run 3.4s 0.6s

Bun's test runner is roughly 5-7x faster than Jest for typical suites, mostly because there is no transpilation step and no loader startup overhead.

Snapshot Testing

test('API response shape', async () => {
  const res = await fetch('http://localhost:3000/users/1');
  const data = await res.json();
  expect(data).toMatchSnapshot();
});
Enter fullscreen mode Exit fullscreen mode

Snapshots work the same way as Jest — stored in __snapshots__ directories, updated with bun test --update-snapshots.


Package Manager Speed

bun install is not just faster — it is dramatically faster.

Benchmark: Installing a Mid-Size Project (express + TypeScript + testing tools)

Command Time Notes
npm install (cold) 28s Fresh node_modules
npm install (cached) 14s npm cache warm
yarn install (cold) 22s
bun install (cold) 1.8s
bun install (cached) 0.4s Bun's global cache

Bun achieves this by using a global module cache stored in ~/.bun/install/cache, hardlinking packages instead of copying them, and parsing package metadata in parallel using Zig's concurrency primitives.

You can use Bun as your package manager even if you are not using it as your runtime. Just replace npm install with bun install in your CI pipeline.


Migration Checklist

What Works Out of the Box

  • require() and ES module import/export
  • Most Node.js built-in modules: path, fs, os, crypto, events, stream, url, util
  • process.env, process.argv, process.exit()
  • Buffer (fully compatible)
  • setTimeout, setInterval, clearTimeout, clearInterval
  • fetch (built-in, no node-fetch needed)
  • WebSocket (built-in)
  • Most npm packages that do not use native addons

What Needs Attention

TypeScript: Bun runs TypeScript natively. You can delete ts-node, tsx, and most tsconfig.json settings related to module resolution. However, Bun uses its own TypeScript transpiler — it strips types without type-checking. You should still run tsc --noEmit in CI.

Environment Variables: Bun automatically loads .env files without dotenv. Remove your dotenv import if you want, but leaving it in place is also fine — it will just be a no-op.

__dirname and __filename: These work in CommonJS mode. In ESM, use import.meta.dir and import.meta.file instead.

// Node.js ESM
import { fileURLToPath } from 'url';
import { dirname } from 'path';
const __filename = fileURLToPath(import.meta.url);
const __dirname = dirname(__filename);

// Bun
const __dirname = import.meta.dir;
const __filename = import.meta.file;
Enter fullscreen mode Exit fullscreen mode

Native Addons: Packages that use .node native addons (like bcrypt, sharp in some configurations, canvas) may not work. Check compatibility at bun.sh/guides/ecosystem. Most popular packages have been updated or have Bun-native alternatives.

Known Gotchas

  1. node:cluster: Not supported. Use Bun's --workers flag or external process managers.
  2. node:vm: Partial support. vm.Script works, but vm.Module is not implemented.
  3. node:inspector: Not supported. Use bun --inspect instead for debugging.
  4. child_process.fork(): Works, but uses bun as the child runtime, not node.
  5. node-gyp packages: May fail to compile. Check alternatives before migrating.

CI/CD Setup

GitHub Actions

# .github/workflows/ci.yml
name: CI

on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Setup Bun
        uses: oven-sh/setup-bun@v2
        with:
          bun-version: latest

      - name: Install dependencies
        run: bun install --frozen-lockfile

      - name: Type check
        run: bunx tsc --noEmit

      - name: Run tests
        run: bun test --coverage

      - name: Build
        run: bun build ./src/index.ts --outdir ./dist --target node
Enter fullscreen mode Exit fullscreen mode

Dockerfile

FROM oven/bun:1 AS base
WORKDIR /app

# Install dependencies
FROM base AS deps
COPY package.json bun.lockb ./
RUN bun install --frozen-lockfile --production

# Build stage
FROM base AS build
COPY package.json bun.lockb ./
RUN bun install --frozen-lockfile
COPY . .
RUN bun build ./src/index.ts --outdir ./dist --target bun

# Production image
FROM oven/bun:1-distroless AS runtime
WORKDIR /app
COPY --from=deps /app/node_modules ./node_modules
COPY --from=build /app/dist ./dist
COPY package.json .

ENV NODE_ENV=production
EXPOSE 3000
CMD ["bun", "run", "./dist/index.js"]
Enter fullscreen mode Exit fullscreen mode

The oven/bun:1-distroless image is only 92MB, compared to node:20-alpine at 133MB.

bun.lockb in Version Control

Bun generates a binary lockfile (bun.lockb) rather than a text-based one. Commit this file to version control. To view its contents in a readable format:

bun bun.lockb # Prints the lockfile as JSON
Enter fullscreen mode Exit fullscreen mode

If you need a text lockfile for compatibility, Bun can also generate yarn.lock:

bun install --save-text-lockfile
Enter fullscreen mode Exit fullscreen mode

Incremental Migration Strategy

You do not have to migrate your entire codebase at once. Here is a practical phased approach:

Phase 1: Package Manager Only (Day 1)
Replace npm install with bun install in your CI and development workflow. Zero code changes. Immediate benefit: faster installs.

Phase 2: Development Runtime (Week 1)
Run your application with bun run instead of node. Most applications will work without changes. This gives you faster startup and native TypeScript support.

Phase 3: Replace Tool Dependencies (Week 2)
Remove dotenv (Bun loads .env natively), ts-node/tsx (Bun handles TypeScript), and optionally migrate from Jest to bun:test.

Phase 4: Adopt Bun-Native APIs (Ongoing)
Gradually replace http.createServer with Bun.serve(), fs.readFile with Bun.file(), and better-sqlite3 with bun:sqlite where it makes sense.

Phase 5: Production Deployment
Switch your Docker base image to oven/bun:1. Update your process manager configuration.


Performance in Real Applications

Raw benchmarks are useful but misleading. Here is what you can realistically expect when migrating a production Express application to Bun with Bun.serve():

  • API endpoints with database queries: 10-20% improvement (bottlenecked on DB)
  • File serving: 40-60% improvement
  • CPU-heavy JSON processing: 15-25% improvement
  • Cold start (Lambda/serverless): 70-80% improvement
  • Test suite execution: 4-7x faster

The biggest wins come in serverless environments and developer tooling. The smallest wins come in database-heavy workloads where network and query time dominate.


Conclusion

Bun 1.x is production-ready for most JavaScript and TypeScript applications. The migration path is gradual and low-risk — you can start with just bun install and work your way toward a full runtime migration over weeks.

The performance benefits are real and measurable: faster startup, better HTTP throughput, and dramatically improved developer experience. The ecosystem compatibility is strong enough that most applications can migrate with minimal changes.

The areas to watch carefully are native addons, node:cluster, and node:vm. If your application relies heavily on these, wait for fuller support. For everything else, the migration cost is low and the upside is real.

Start with your CI pipeline. Replace npm install. Run your tests. See what breaks. Then decide how far you want to go.


Resources


Wilson Xu is a software engineer focused on backend systems and developer tooling. He writes about performance engineering and modern JavaScript runtimes.

Top comments (0)