
ANKUSH CHOUDHARY JOHAL

Originally published at johal.in

Internals of Bun 1.2: How the JavaScriptCore Engine Speeds Up Execution by 2x


Bun, the fast all-in-one JavaScript runtime, shipped version 1.2 in early 2025 with a headline performance claim: 2x faster execution for common workloads compared to the 1.1 release. This leap isn't from a new engine; Bun has always relied on Apple's JavaScriptCore (JSC) under the hood. It comes from deep, targeted optimizations to how Bun interfaces with JSC's internals.

Why JavaScriptCore? A Quick Refresher

JavaScriptCore is the open-source JS engine powering Safari and many Apple ecosystem tools. It’s known for low startup latency and efficient JIT (Just-In-Time) compilation, making it a natural fit for Bun’s goal of a lightweight, fast runtime for scripting, APIs, and edge workloads. Unlike V8 (used by Node.js and Deno), JSC prioritizes quick cold start times and minimal memory overhead, which aligns with Bun’s design philosophy.

Bun 1.2’s Key JSC-Driven Optimizations

The 2x speedup in Bun 1.2 comes from three core internal changes to Bun’s JSC integration:

  • Direct Bytecode Caching: Bun 1.2 now caches JSC’s compiled bytecode for frequently run scripts, skipping re-parsing and initial JIT passes for repeated executions. This cuts startup time for CLI tools and serverless functions by up to 60% in testing.
  • Custom JSC Heap Tuning: The Bun team worked with JSC contributors to tune the engine’s memory heap for Bun’s common workloads (HTTP handling, file I/O, transpilation). This reduces garbage collection pauses by 45% and lowers per-request memory usage for web servers.
  • Native Module Inlining: Bun 1.2 inlines calls to Bun's native APIs (such as Bun.file and Bun.serve) directly into JSC's JIT output, eliminating the overhead of the cross-runtime boundary calls that slowed down previous versions.
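Ahead-of-time bytecode generation is something you can also opt into explicitly through Bun's bundler via its --bytecode flag. The commands below are a minimal sketch; the entry point and output paths are placeholder names, not ones from Bun's own benchmarks:

```shell
# Bundle a script and emit JSC bytecode alongside the JavaScript output,
# so repeated runs can skip parsing and the initial compile pass.
bun build ./cli.ts --target=bun --bytecode --outdir ./dist

# Or bake the bytecode into a single-file executable:
bun build ./cli.ts --compile --bytecode --outfile mycli
```

Bytecode caching trades larger on-disk artifacts for faster startup, which is why it pays off most for CLI tools and serverless functions that are launched repeatedly.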

Benchmarking the 2x Gain

Bun’s official benchmarks for 1.2 show 2x faster execution for:

  • HTTP server throughput (measured with wrk on a 4-core Linux machine)
  • TypeScript transpilation speed (using Bun’s built-in transpiler)
  • CLI script startup time for small to medium-sized scripts
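Throughput numbers like these are typically gathered against a minimal server. The sketch below assumes Bun's Bun.serve API and the wrk load generator; the port, thread count, and duration are illustrative choices, not the values from Bun's benchmark setup:

```typescript
// Minimal HTTP server for throughput benchmarking (requires the Bun runtime).
// Save as server.ts and start with: bun run server.ts
const server = Bun.serve({
  port: 3000,
  fetch(_req: Request): Response {
    // Return a small static body so the benchmark measures request-handling
    // overhead rather than payload size.
    return new Response("Hello, Bun!");
  },
});

console.log(`Listening on http://localhost:${server.port}`);

// Then, from another shell, drive load against it:
//   wrk -t4 -c64 -d30s http://localhost:3000/
// and compare requests/sec between Bun 1.1 and 1.2.
```

Keeping the handler trivial isolates the runtime's HTTP path, which is exactly where Bun 1.2's heap tuning and native-call inlining are claimed to help.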

Independent third-party benchmarks report similar gains, some showing up to 2.3x faster execution for I/O-heavy workloads compared to Bun 1.1.

Limitations and Tradeoffs

The 2x speedup is not universal: CPU-bound workloads that trigger JSC’s most aggressive JIT optimizations (like long-running numerical computations) see smaller gains, as Bun’s changes focus on startup and I/O paths. Additionally, Bun 1.2’s JSC integration is optimized for Linux and macOS; Windows performance gains are slightly lower (around 1.7x) as of the 1.2 release.

What’s Next for Bun and JSC?

The Bun team has signaled that future updates will deepen JSC integration, including support for JSC’s newer concurrent JIT mode and better profiling tools for Bun-specific workloads. For developers, the 1.2 release makes Bun a compelling choice for performance-critical JS projects, especially those prioritizing startup speed and I/O throughput.
