Most TypeScript auth libraries assume Node.js. They reach for crypto.randomBytes, Buffer, the Node fs module, sometimes process.env directly. That works on Vercel serverless, AWS Lambda with the Node runtime, Railway, Fly. It does not work on Cloudflare Workers.
Workers do not have Node. They have Web APIs. crypto means Web Crypto, not the Node crypto module. Buffer is gone. fs is gone. process.env does not exist. Bindings are injected into a request handler.
I rewrote KavachOS to run on Workers in February. Here is what I had to change, in case you are going through the same migration.
Why bother
Workers are cheap. They run close to the user. They cold-start in under 5ms. For an auth library that gets called on every authenticated request, that latency floor matters. If your auth library adds 80ms of cold start every time someone hits an endpoint, your app feels slow even when the actual logic is fast.
Most AI agent infrastructure is also moving to the edge. MCP servers, agent runtimes, function-call handlers, they all want to be near the user. If your auth layer cannot follow them there, it becomes the slow link.
Buffer is gone. Use Uint8Array.
This was the biggest change. I had Buffer.from(x) in maybe 60 places. All of it had to go.
Before:
const bytes = Buffer.from(secret, "utf-8");
const hex = bytes.toString("hex");
After:
const bytes = new TextEncoder().encode(secret);
const hex = Array.from(bytes)
  .map((b) => b.toString(16).padStart(2, "0"))
  .join("");
This is more verbose. It is the price of running on Web APIs.
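The hex snippet above is worth wrapping in a helper, since the Web Crypto examples later in this post call a bytesToHex function. Here is one way to write it, along with its inverse for parsing hex tokens back into bytes (hexToBytes is my name for the inverse; the post does not show the library's own version):

```typescript
// Hex encoding for Uint8Array, replacing Buffer#toString("hex").
function bytesToHex(bytes: Uint8Array): string {
  return Array.from(bytes)
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}

// The reverse: parse a hex string back into bytes.
function hexToBytes(hex: string): Uint8Array {
  const bytes = new Uint8Array(hex.length / 2);
  for (let i = 0; i < bytes.length; i++) {
    bytes[i] = parseInt(hex.slice(i * 2, i * 2 + 2), 16);
  }
  return bytes;
}
```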
For base64 I added a small helper:
function bytesToBase64Url(bytes: Uint8Array): string {
  let binary = "";
  for (const b of bytes) binary += String.fromCharCode(b);
  return btoa(binary)
    .replace(/\+/g, "-")
    .replace(/\//g, "_")
    .replace(/=+$/, "");
}
That replaced every Buffer.from(bytes).toString("base64url") call.
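An auth library usually needs the decode direction too, for reading tokens back. A sketch of the counterpart (base64UrlToBytes is an illustrative name, not necessarily what KavachOS exports; atob is a global on Workers and on Node 16+):

```typescript
// Inverse of bytesToBase64Url: decode a base64url string back to bytes.
function base64UrlToBytes(s: string): Uint8Array {
  // Restore the standard base64 alphabet and padding before calling atob.
  const base64 = s.replace(/-/g, "+").replace(/_/g, "/");
  const padded = base64 + "=".repeat((4 - (base64.length % 4)) % 4);
  const binary = atob(padded);
  const bytes = new Uint8Array(binary.length);
  for (let i = 0; i < binary.length; i++) bytes[i] = binary.charCodeAt(i);
  return bytes;
}
```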
Node crypto out, Web Crypto in
Node:
import { randomBytes, createHash } from "node:crypto";
const token = randomBytes(32).toString("hex");
const hash = createHash("sha256").update(input).digest("hex");
Web Crypto:
const bytes = new Uint8Array(32);
crypto.getRandomValues(bytes);
const token = bytesToHex(bytes);
const data = new TextEncoder().encode(input);
const hashBuffer = await crypto.subtle.digest("SHA-256", data);
const hash = bytesToHex(new Uint8Array(hashBuffer));
Web Crypto is async. Node createHash is sync. That cascaded through the library because functions that called createHash had to become async too. About 20 internal helpers gained an await.
For JWT signing I switched from jsonwebtoken (Node-only) to jose, which has a Web Crypto path. The two libraries cover the same operations, but jose runs on Workers, Bun, and Deno without a polyfill.
SQLite to D1
Cloudflare D1 is SQLite under the hood, but the API is different. You cannot pass a URL string; you pass a binding:
const kavach = await createKavach({
  database: {
    provider: "d1",
    binding: env.DB, // injected by the Workers runtime
  },
});
I added a d1 provider next to the existing sqlite provider. They share most of the schema and migration code. The difference is the prepared statement API. D1 uses db.prepare(sql).bind(...).run() instead of the better-sqlite3 style db.prepare(sql).run(...).
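To see that difference in one place, here is a sketch of the two call shapes behind a single internal entry point. The types and the runStatement name are illustrative, not KavachOS's actual provider interface:

```typescript
// D1 style:             db.prepare(sql).bind(...params).run()
// better-sqlite3 style:  db.prepare(sql).run(...params)

interface D1Statement {
  bind(...params: unknown[]): { run(): Promise<unknown> };
}
interface SqliteStatement {
  run(...params: unknown[]): unknown;
}

// One internal entry point that dispatches on the active provider,
// so the schema and migration code above it stays shared.
async function runStatement(
  stmt: D1Statement | SqliteStatement,
  params: unknown[],
  provider: "d1" | "sqlite",
): Promise<unknown> {
  if (provider === "d1") {
    return (stmt as D1Statement).bind(...params).run();
  }
  return (stmt as SqliteStatement).run(...params);
}
```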
The migration utility had to ship two paths. On Node, it reads the schema file from disk. On Workers, the schema is bundled at build time as a string. The library imports it conditionally based on which provider is active.
process.env is gone
Workers do not have process.env. Bindings come in on the request:
export default {
  async fetch(request: Request, env: Env) {
    const kavach = await createKavach({
      secret: env.AUTH_SECRET,
      database: { provider: "d1", binding: env.DB },
    });
    return kavach.handle(request);
  },
};
I had to push every secret read into a config object instead of having the library reach for process.env.X internally. That turned out to be good for testing too: the library is now easier to use in tests where you do not want real env vars.
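One pattern that keeps Node ergonomics without baking process.env into the library: resolve each secret from explicit config first, and fall back to the environment only when a process global actually exists. A sketch, with resolveSecret as my own name for the helper:

```typescript
// Resolve a secret from explicit config, falling back to process.env
// only on runtimes that have it (Node). On Workers, config is required.
function resolveSecret(
  config: { secret?: string },
  envKey = "AUTH_SECRET",
): string {
  if (config.secret) return config.secret;
  // globalThis.process only exists on Node; guard before touching it.
  const env = (globalThis as any).process?.env as
    | Record<string, string | undefined>
    | undefined;
  const fromEnv = env?.[envKey];
  if (fromEnv) return fromEnv;
  throw new Error(`Missing secret: pass it in config or set ${envKey}`);
}
```

On Workers the fallback never fires, so callers are forced to pass env.AUTH_SECRET explicitly, which is the behavior you want there anyway.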
TypeScript 5.8 ArrayBuffer / Uint8Array typing
This was the most annoying change. After upgrading to TypeScript 5.8, code like this stopped compiling:
const bytes: Uint8Array = await crypto.subtle.digest("SHA-256", data);
crypto.subtle.digest returns ArrayBuffer, not Uint8Array. You have to wrap it:
const bytes = new Uint8Array(await crypto.subtle.digest("SHA-256", data));
There were 30 places where I was implicitly casting. Each one needed an explicit new Uint8Array(...) constructor.
What I gained
Bundle size dropped from 240KB to 90KB after pulling out Node-only dependencies. Cold start on Workers is around 4ms for a typical handler. The library now runs on Workers, Bun, Deno, and Node from the same source tree, with no polyfills.
The "no Node-specific APIs" rule is also why KavachOS works in Vercel Edge runtime, AWS Lambda@Edge, Netlify Edge, and Deno Deploy. One auth library, every edge runtime.
What I would skip if I did this again
I spent two days on a custom AES-GCM helper before realizing Web Crypto already has it. Read the Web Crypto API docs first. Most of what you reach for in Node crypto has a direct counterpart. You just have to learn the new API surface.
I also wrote my own bytesToBase64 before realizing btoa works fine for ASCII. Use the platform.
If you are migrating and want a reference, KavachOS is open source. The PR that introduced D1 support is small enough to read in 10 minutes. The PR that switched away from node:crypto is bigger but documented commit by commit.
https://github.com/kavachos/kavachos
If you have ported a Node library to Workers or Bun, what was the gnarliest API surface you ran into? Buffer was easy in retrospect. process.env and the async cascade from Web Crypto cost me real days.