Sam

How I made BeeThreads and what it solves

How I built BeeThreads in 2 days (and now it has 3k+ downloads)

A week ago I was complaining (again) about the same old thing:

“Man, I just want to write normal sync code, run the heavy parts in the background, and keep a very simple, manageable DX.”

We’ve all been there. One heavy crypto operation, one giant JSON parse, one dumb recursive Fibonacci… and boom — the event loop is dead. Everyone hates it.

So two days ago I decided to test Claude Opus 4.5 and try to make this real.

I was expecting a half-broken prototype.

Four hours later we had something that actually worked.

And somehow, in less than two days, it’s already at 3k+ downloads.

The dream vs reality

Before (native worker_threads)

// ~50+ lines of pain, condensed
const { Worker } = require('worker_threads');

const result = await new Promise((resolve, reject) => {
  const worker = new Worker('./worker.js');
  worker.on('message', resolve);
  worker.on('error', reject);
  worker.on('exit', () => { /* cleanup */ });
  worker.postMessage(data);
});
// plus pooling, timeouts, retries, queue limits… you get it

After (bee-threads)

// bee-threads: 1 line
const result = await bee((x) => x * 2)(21); // 42

But I didn’t want to stop at “it works”. I wanted rock-solid worker management and an absolutely killer developer experience (DX).

Here’s exactly how I made it happen:

I turned the event-driven nature of worker_threads into real Promises on the main thread — so threading feels exactly like writing normal async/await code.
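
Conceptually, the wrapper looks something like this. This is a minimal sketch of the pattern, not bee-threads' actual internals; workerFile and runInWorker are illustrative names:

const { Worker } = require('worker_threads');

// Sketch: turn worker events into a Promise so callers can just await.
function runInWorker(workerFile, payload) {
  return new Promise((resolve, reject) => {
    const worker = new Worker(workerFile);
    worker.once('message', (result) => {
      resolve(result);
      worker.terminate();
    });
    worker.once('error', reject);
    worker.postMessage(payload);
  });
}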

Errors work the same way too: they’re proper Error instances with all custom properties, .cause, stack traces — everything preserved perfectly across thread boundaries.
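
The trick is flattening the error in the worker and rebuilding a real Error instance on the main thread. A minimal sketch of that round trip (my assumed shape, not the actual bee-threads code):

// In the worker: flatten the Error into a plain, postable object.
function serializeError(err) {
  return {
    ...err, // custom enumerable properties first
    name: err.name,
    message: err.message,
    stack: err.stack,
    cause: err.cause,
  };
}

// On the main thread: rebuild a genuine Error instance,
// restoring the original stack and custom properties.
function deserializeError(data) {
  const err = new Error(data.message, { cause: data.cause });
  Object.assign(err, data);
  return err;
}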

The biggest challenge (and the magic) is running completely dynamic code inside workers without ever forcing you to create a separate worker file. We do this safely with vm.Script (never eval), and we aggressively cache the compiled functions using an LRU cache.

That means no AST parsing or recompilation on repeated calls — we go from ~0.4 ms (first run) down to ~1–3 µs on cache hits.
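
In simplified form, the compile-once pattern looks like this; a plain Map stands in for the real LRU cache, and the names are illustrative rather than taken from the source:

const vm = require('vm');

// Shared base context, contextified once and reused.
const baseContext = vm.createContext({ require, console });

const compiledCache = new Map(); // LRU stand-in, keyed by source text

function getCompiledFn(source) {
  let fn = compiledCache.get(source);
  if (fn) return fn; // cache hit: microseconds, no recompilation
  const script = new vm.Script(`(${source})`); // vm.Script, never eval
  fn = script.runInContext(baseContext);
  compiledCache.set(source, fn);
  return fn;
}

const double = getCompiledFn('(x) => x * 2'); // first call compiles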

There are tons of production-ready features (see the retry sketch after this list):

  • Retry with exponential backoff
  • AbortSignal cancellation
  • Timeout support
  • Streaming generators (for await)
  • Transferable Buffers (coming in the next release)
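
To illustrate just one of these, here is a generic retry-with-exponential-backoff pattern. The function and option names are my own illustration, not bee-threads' actual API:

// Generic pattern, not bee-threads' real option names.
async function withRetry(task, { retries = 3, baseDelayMs = 100, signal } = {}) {
  for (let attempt = 0; ; attempt++) {
    if (signal?.aborted) throw new Error('Aborted');
    try {
      return await task();
    } catch (err) {
      if (attempt >= retries) throw err;
      // Delays grow 100 ms, 200 ms, 400 ms, …
      await new Promise((r) => setTimeout(r, baseDelayMs * 2 ** attempt));
    }
  }
}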

There's also a ridiculous amount of V8-level optimization: monomorphic object shapes (no hidden-class churn), raw for loops in hot paths, O(1) counters instead of array scans, and a shared base vm.Context. All the scary stuff, but you never see it: the API stays clean and buttery smooth.
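
Two of those techniques are easy to show in isolation. These snippets illustrate the idea and are not the actual source:

// Monomorphic shapes: every task record gets the same fields in the
// same order, so V8 keeps a single hidden class for all of them.
function createTask(id, fn, args) {
  return { id, fn, args, status: 'pending', result: null, error: null };
}

// Hot path: a raw for loop avoids the closure and array allocations
// of .filter().length.
function countPending(tasks) {
  let n = 0;
  for (let i = 0; i < tasks.length; i++) {
    if (tasks[i].status === 'pending') n++;
  }
  return n;
}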

Under the hood, BeeThreads is smart about picking the best worker for each task. It uses function affinity: if a worker has already executed your function, it gets priority. That keeps the code “hot” in V8’s TurboFan JIT, giving you near-native performance after just a few calls.
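
The scheduling idea, roughly (my assumed logic, not the real scheduler):

// Prefer an idle worker that has already run this function, so the
// JIT-optimized machine code gets reused.
function pickWorker(workers, fnKey) {
  let fallback = null;
  for (let i = 0; i < workers.length; i++) {
    const w = workers[i];
    if (w.busy) continue;
    if (w.seenFunctions.has(fnKey)) return w; // "hot" worker wins
    if (fallback === null) fallback = w;
  }
  return fallback; // any idle worker, or null: queue the task
}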

Here are some examples:

const { bee } = require('bee-threads');

// Run any function in a separate thread, Promise-style
const result = await bee((x) => x * 2)(21);  // 42

// Keep the event loop unblocked during CPU-intensive operations
const hash = await bee((pwd) => 
  require('crypto').pbkdf2Sync(pwd, 'salt', 100000, 64, 'sha512').toString('hex')
)('password123');

// Run tasks in parallel with Promise.all
const [a, b, c] = await Promise.all([
  bee((x) => x * 2)(21),
  bee((x) => x + 1)(41),
  bee(() => 'hello')()
]);

You can also use method chaining via a more complete API called beeThreads:

const { beeThreads } = require('bee-threads');

await beeThreads
  .run((x) => x * 2)
  .usingParams(21)
  .execute();  // → 42

If you'd like to give it a try:

npm i bee-threads

GitHub: https://github.com/samsantosb/BeeThreads
npm: https://www.npmjs.com/package/bee-threads
