The JavaScript Runtime Handbook - Deno, Bun and Node.js in 10 minutes

Posted by Sk on DEV Community

A runtime is the only way JavaScript becomes a systems language. If you truly understand that, you'll be unstoppable.

And if you're a backend engineer, or aspiring to operate at the lower levels, this post is for you. Because how you view runtimes directly dictates what you’re able to build.

If you think runtimes are just for CRUD, you’ll only ever write CRUD.

But if you see runtimes as bridges into the system layer, whole new worlds open up.

Here’s the fact:
A runtime sits on top of a systems language and gives JavaScript bindings into it.

  • Node.js is backed by C++
  • Bun is built on Zig
  • Deno is powered by Rust

In essence, Node.js is C++ abstracted.

So when I pick a runtime, I’m hunting for three things that let me build at an unhinged level:

  1. Network capability - HTTP and raw TCP. I want to serve APIs and talk binary over sockets. Like in BunniMq, a message broker I built in JavaScript that speaks a low-level binary protocol.
  2. System access - Filesystems, streams, process control. I want to touch the kernel.
  3. Extensibility - Can I hook into the runtime's internals? Load native binaries? Inject C++ or Rust where it matters? That’s how I built imgui.js, a threaded, sparse-set-powered C++ GUI framework over Dear ImGui for Node:

[imgui.js running in Node]

It also runs in Bun (via N-API; more on this in the Bun section):

[imgui.js running in Bun]

Once you flip this switch, you stop building basic apps and start crafting infrastructure.

This is why I’m on a crusade to push deeper runtime literacy.
If that hits? You're in the right place.

So we'll start with Node.js. It’s familiar territory, and the perfect anchor to compare other runtimes against.


Node.js - The C++ Runtime

Node is the most mature of the three major JavaScript runtimes, and probably the most widely used. I’ve been pushing it to absurd levels for years. Memory leaks? Segfaults? Love 'em. That's how deep you can go.

Its maturity means you can do things fast and find things fast, thanks to extensive docs and a rich ecosystem.


Servers and Network Communication

Setting up raw HTTP/HTTPS servers in Node is as simple as an import:

// Example: HTTP Server
const http = require('http')

const server = http.createServer((req, res) => {
  res.writeHead(200)
  res.end('Hello, World!')
})

server.listen(3000)

Raw TCP is just as easy:

// Example: TCP Server
const net = require('net')

const server = net.createServer(socket => {
  socket.write('Welcome!\n')
  socket.on('data', data => {
    console.log('Received:', data.toString())
  })
})

server.listen(4000)

Note: You can’t just curl a raw TCP server; it doesn’t speak HTTP. You’ll need a raw TCP client:

// Example: TCP Client
const net = require('net')

const client = net.createConnection({ port: 4000 }, () => {
  client.write('Hello server!')
})

client.on('data', data => {
  console.log('Server says:', data.toString())
})

That’s the two core layers: HTTP and raw TCP. Everything else under the hood? That’s C++, powered by libuv.


Buffers - Low-Level JavaScript

Buffers are raw memory. They’re how you tap into low-level JavaScript, and they’re fast. Most developer tools in the Node ecosystem are built around Buffer for this reason.

Example: my pure JavaScript message broker BunniMq is littered with them.

Here's how easy they are to use:

const buf = Buffer.alloc(4)  // 4 bytes
buf.writeInt32BE(10)
console.log(buf.readInt32BE(0)) // Outputs: 10

With just this, you can start building serious low-level abstractions.
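For instance, length-prefixed framing, the bread and butter of binary protocols like the one BunniMq speaks over TCP, is just a few lines of Buffer code. A hypothetical sketch (`encodeFrame`/`decodeFrame` are illustrative names, not BunniMq's actual API):

```javascript
// Hypothetical sketch: a 4-byte big-endian length prefix, then the payload
function encodeFrame(payload) {
  const body = Buffer.from(payload, 'utf8')
  const frame = Buffer.alloc(4 + body.length)
  frame.writeUInt32BE(body.length, 0) // bytes 0-3: payload length
  body.copy(frame, 4)                 // bytes 4+: payload
  return frame
}

function decodeFrame(frame) {
  const len = frame.readUInt32BE(0)
  return frame.subarray(4, 4 + len).toString('utf8')
}

console.log(decodeFrame(encodeFrame('PING'))) // PING
```

The receiver reads 4 bytes, learns how many more to expect, and knows exactly where one message ends and the next begins.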


Streams - All About Moving Data

Streams are how you move massive amounts of data efficiently; think Google's distributed file systems. Unlike buffered APIs that wait for everything to load, streams read in chunks (you control the size: 64 KB, 64 MB, etc.).

They don’t just read; they write too. Perfect for piping files, building proxies, or transforming data on the fly.

Example: simple file stream in Node:

const fs = require('fs');

const readable = fs.createReadStream('input.txt');
const writable = fs.createWriteStream('output.txt');

readable.pipe(writable);

writable.on('finish', () => {
  console.log('Copy complete');
});

That’s it. Chunked, non-blocking, memory-safe.


Extensibility - N-API and FFI

Here’s where things get wild, hooking into native code.

N-API (Node Addons)

N-API lets you write C++ plugins that Node can ingest like any regular JS module. You go here when JavaScript just doesn’t cut it, either for speed or capability.

You write C++ and expose it to JS. When you call the JS side, V8 translates the call and executes the C++ under the hood.

Examples:

  • better-sqlite3 → native SQLite, N-API bound.
  • @tensorflow/tfjs-node → full TensorFlow, exposed to JavaScript.

Here’s a tiny example stub of what an N-API module might look like:

// hello.cpp
#include <napi.h>

Napi::String Hello(const Napi::CallbackInfo& info) {
  return Napi::String::New(info.Env(), "Hello from C++!");
}

Napi::Object Init(Napi::Env env, Napi::Object exports) {
  exports.Set("hello", Napi::Function::New(env, Hello));
  return exports;
}

NODE_API_MODULE(hello, Init)

Then you compile it and load it in Node. It's magic.
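The compile step usually goes through node-gyp, driven by a binding.gyp file at the project root. A minimal sketch, assuming the node-addon-api package and the hello.cpp stub above (the target name "hello" is my choice, not a requirement):

```json
{
  "targets": [
    {
      "target_name": "hello",
      "sources": ["hello.cpp"],
      "include_dirs": ["<!@(node -p \"require('node-addon-api').include\")"]
    }
  ]
}
```

Then `node-gyp configure build` emits build/Release/hello.node, which `require()` loads like any JS module.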

N-API is stable, well-documented, and honestly a pleasure to work with.


Dynamic Libraries - DLLs and .so Files (FFI)

If you grew up modding games, you know DLLs. That’s how systems load external code dynamically.

FFI (Foreign Function Interface) lets you load and call dynamic libraries written in languages like C, C++, or Rust at runtime, without a build step. Node has no built-in FFI, so you reach for a community module like ffi-napi (or the newer koffi).

That’s how I ported the Golang Charm CLI libraries to JavaScript, first via WASM, then via DLLs; see How To Build Beautiful Terminal UIs (TUIs) in JavaScript! and the repo.

With Node out of the way, it's Deno time.


Deno - The Infra Runtime "Nobody Talks About" (But Everyone Uses)

Deno is the runtime quietly powering your favorite cloud and edge platforms. If someone tells you “nobody uses Deno,” they’re either (1) a noob or (2) lying.

Because of its built-in security model and sane defaults, Deno is widely adopted in infra environments:

  • Deno Deploy
  • Netlify Edge Functions
  • Supabase Edge Functions

Yep, many of these run on Deno under the hood.

I’ve personally shipped with Deno. Took down an 8-month-old monolith and decomposed it into services in just a week. The developer experience (DX) is chef’s kiss: TypeScript out of the box, high-level APIs, zero-config builds.


Servers and Network Communication

Prerequisites

Install the runtime

In VSCode, make sure the Deno plugin is installed.
Then run Ctrl+Shift+P → “Deno: Enable” so the Deno namespace and types stop VSCode from crying.


HTTP Server (Simple)

Deno.serve({ port: 4242, hostname: "0.0.0.0" }, (req) => {
  // req.url is a full URL, not a path, so parse it first
  const { pathname } = new URL(req.url);

  if (pathname === "/") {
    return new Response("hello World", {
      status: 200,
      // default header: text/plain
    });
  } else {
    const body = JSON.stringify({ message: "NOT FOUND" });
    return new Response(body, {
      status: 404,
      headers: {
        "content-type": "application/json; charset=utf-8",
      },
    });
  }
});

Run it:

deno run --allow-net server.ts

Want HTTPS? Plug in cert options:

Deno.serve({
  port: 443,
  cert: Deno.readTextFileSync("./cert.pem"),
  key: Deno.readTextFileSync("./key.pem"),
}, handler);

Raw TCP Server

// run: deno run --allow-net tcp.ts
const listener = Deno.listen({ port: 3000 });
console.log("TCP server listening on :3000");

for await (const conn of listener) {
  // handle each connection concurrently
  (async () => {
    const buf = new Uint8Array(1024);
    const n = await conn.read(buf);
    console.log("Received:", new TextDecoder().decode(buf.subarray(0, n ?? 0)));
    await conn.write(new TextEncoder().encode("PONG\n"));
    conn.close();
  })();
}

TCP Client

const conn = await Deno.connect({ port: 3000 });
await conn.write(new TextEncoder().encode("Hello Server!"));
conn.close();

So yeah, it’s mostly syntax differences; all these runtimes are Node-inspired under the hood.


Buffers - Raw Memory Access

Deno gives you two flavors of buffer handling:

Native Deno Style

// Allocate a zeroed buffer of 1 KB
const buf = new Uint8Array(1024);
// Fill it
buf.fill(0xFF, 0, 256); // set first 256 bytes to 0xFF
console.log(buf.byteLength); // 1024


Node.js Compatibility Shim

import { Buffer } from "node:buffer";

const buf = Buffer.alloc(4);
buf.writeUInt32BE(42);
console.log(buf.readUInt32BE(0)); // 42

Streams - Still the Way

Deno streams work just like modern web streams:

const output = await Deno.open("example.txt", {
  create: true,
  append: true,
});
const outputWriter = output.writable.getWriter();
await outputWriter.ready;
const encoded = new TextEncoder().encode("I love Deno!");
await outputWriter.write(encoded);
await outputWriter.close();


const input = await Deno.open("example.txt");
const inputReader = input.readable.getReader();
const decoder = new TextDecoder();
while (true) {
  const result = await inputReader.read();
  if (result.done) {
    break;
  }
  console.log(`Read chunk: ${decoder.decode(result.value)}`);
}

Efficient, native, chunked file writing, no buffer bloating required.


Extensibility - FFI and Native Plugins

Deno comes with built-in FFI support, no extra libs or build steps needed. Here's a native .dll/.so call in pure Deno (run it with deno run --allow-ffi):

const libName = {
  windows: "./lib.dll",
  linux: "./liblib.so",
  darwin: "./liblib.dylib",
}[Deno.build.os];

const dylib = Deno.dlopen(
  libName,
  {
    fibonacci: { parameters: ["u32"], result: "u32" },
  } as const,
);

console.log("Fibonacci(10) =", dylib.symbols.fibonacci(10)); // 55

dylib.close();

Want to go lower? Deno's (now-legacy) native plugin API let you register Rust ops directly with the runtime (it has since been superseded by FFI and deno_core extensions, but the idea survives):

#[no_mangle]
pub fn deno_plugin_init(interface: &mut dyn Interface) {
    interface.register_op("hello_world", hello_world);
}

This is runtime-level integration, no glue, just raw access.


That’s Deno in a nutshell:
Secure by default, fast, native-friendly, and actually used in production-grade edge systems.

Next, we’ll hit Bun, the chaotic performance beast.
Let’s go.


Bun - The Fast, Zig-Powered Runtime

Bun is pure speed. Built in Zig, optimized to the teeth.

Before diving in, make sure Bun is installed.

But, just like Deno, you’ll want proper type support in VSCode. Luckily, Bun makes it easy:

bun init  # pulls in relevant modules and types

Now you’ve got auto-complete, no red squiggles, and a ready-to-go project.


Servers and Network Communication

HTTP Server

Bun.serve({
  port: 3000,
  fetch(req) {
    return new Response("Hello from Bun!");
  },
});

Raw TCP Server

const server = Bun.listen({
  hostname: "0.0.0.0",
  port: 8080,
  socket: {
    open(socket) {
      socket.write("Welcome to TCP server!\n");
    },
    data(socket, data) {
      console.log("Received:", data.toString());
    },
    close(socket) {
      console.log("Client disconnected.");
    },
  },
});

TCP Client

// Bun.connect is async and takes its handlers up front
const conn = await Bun.connect({
  hostname: "localhost",
  port: 8080,
  socket: {
    data(socket, data) {
      console.log("Server says:", data.toString());
    },
  },
});

conn.write("Hello server!");

Same core concepts, different flavor.


Buffers - Bun Style (Blobs + Typed Arrays)

Bun leans into Blob for raw memory, but you’ll still use Uint8Array under the hood:

const blob = new Blob([ new Uint8Array(256) ]);
const arr = await blob.arrayBuffer(); 
console.log(arr.byteLength); // 256

Remember: Uint8Array is your friend for byte-level control, 1 byte = 8 bits.
That’s how you play with memory directly in JS.
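When one byte at a time isn't enough, DataView gives you multi-byte reads and writes with explicit endianness over the same Uint8Array memory. A small sketch (runs in both Bun and Node):

```javascript
// Byte-level control with a DataView over a Uint8Array
const bytes = new Uint8Array(8)         // 8 bytes of zeroed memory
const view = new DataView(bytes.buffer)

view.setUint32(0, 0xdeadbeef)           // 4 bytes, big-endian by default
view.setUint16(4, 512, true)            // 2 bytes, little-endian

console.log(view.getUint32(0).toString(16)) // deadbeef
console.log(bytes[4], bytes[5])             // 0 2 (512 = 0x0200, little-endian)
```

This is exactly the tool you want when parsing binary protocol headers off a socket.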


Streams - Well... Kinda?

I tried Bun streams. I really did. But they just stared back at me and refused to work. I followed their official example, and this is what I got:

const rs = new ReadableStream({
  start(controller) {
    controller.enqueue(new TextEncoder().encode("hello\n"));
    controller.close();
  },
});

const response = new Response(rs);

await Bun.write("out.log", response);

console.log("✅ Done writing to out.log");

process.exit(0);

In theory, this should write to a file.
In practice? Nothing. Maybe I’m doing something wrong.


Extensibility - FFI & Node-API

Bun supports FFI, but it's experimental (read: ⚠️ buggy, incomplete, and definitely not production-ready).

Here’s their official warning:

⚠️ Warning: bun:ffi is experimental, with known bugs and limitations. The most stable way to interact with native code from Bun is to write a Node-API module.

So Bun leans on Node-API (N-API) for native plugins, the same interface Node uses for C++ modules.
And Bun implements about 95% of it from scratch, meaning most .node files just work.

You’ve got two options to load them:

require

const napi = require("./my-node-module.node");

process.dlopen

let mod = { exports: {} };
process.dlopen(mod, "./my-node-module.node");

It’s fast, compatible, and if you're already building native modules for Node, Bun probably just runs them.


That’s Bun in a nutshell:
Ridiculously fast, wild edge cases, evolving quickly, and built to compete.


In this post we covered the three major runtimes across three levels (networking, system access, and extensibility), with Node.js being the most mature.

So if you want to build lower-level systems and infra, let this be your mantra and your guide to what to learn next.

I’ll be posting more deep dives on backend topics, JavaScript, Golang, C++, and low-level systems on Substack. Would love to have you there; come say hi:

Coffee & Kernels | skdev | Substack

Where we segfault Node.js for fun (yes, it’s possible; just Google N-API).

Thanks for reading.
