One router, two runtimes: itty-router on Cloudflare Workers and Node

I wanted one tiny router that runs unchanged on Cloudflare Workers and a plain Node HTTP server. The secret sauce is itty-router + the Fetch standard (Request/Response). Here’s how I wired it up, plus a quick note on the “do Node and Cloudflare use the same Response?” question.

Do Node and Cloudflare use the same Response object?
Short answer: not literally the same class, but the same Fetch-standard Response API: compatible surfaces with different implementations under the hood.

  • Cloudflare Workers exposes a Response that follows the WHATWG Fetch spec.
  • Modern Node (v18+) also exposes a Fetch-compatible Response (implemented by undici).

So they're API-compatible (same props/methods: status, headers, text(), json(), arrayBuffer(), body as a Web ReadableStream, new Response(), etc.), but they are different implementations under the hood, so don't rely on brand checks such as instanceof across runtimes (or across realms in general).

A few practical gotchas:

  • Streaming: both use Web Streams; in Node you sometimes convert to/from Node streams (Readable.toWeb, Writable.toWeb, Readable.fromWeb) when bridging http.IncomingMessage / http.ServerResponse.
  • Node-only quirk (Request): when you construct a Request with a Node stream body, Node needs duplex: 'half'. This is not needed/used in Workers. (It doesn't affect Response itself.)
  • Helpers: Response.json() exists in Workers and in recent Node. If you need to support older Node, fall back to new Response(JSON.stringify(data), { headers: { 'content-type': 'application/json; charset=utf-8' } }); see the sketch below.
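If you want that fallback behind one call site, a tiny helper does it. A minimal sketch (the jsonResponse name is my own; it only relies on the standard Response constructor and the optional Response.json static):

// jsonResponse (sketch): prefer the standard Response.json() helper when the
// runtime provides it, otherwise build the JSON Response by hand.
const jsonResponse = (data: unknown, init: ResponseInit = {}): Response => {
  if (typeof Response.json === 'function') return Response.json(data, init)
  const headers = new Headers(init.headers)
  headers.set('content-type', 'application/json; charset=utf-8')
  return new Response(JSON.stringify(data), { ...init, headers })
}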

If you stick to the spec surface (like in the router below), you can reuse the same code in both environments. For a more detailed explanation, check What’s the Request object in the browser, Cloudflare, and Node?.


The shared router (one codepath)

itty-router’s AutoRouter can produce a single fetch handler. Keep the routes and tiny HTTP helpers here.

// src/router.ts
import { AutoRouter } from 'itty-router'

// tiny helpers
const withCORS = (res: Response, status = res.status) =>
  new Response(res.body, {
    status,
    headers: {
      'content-type': res.headers.get('content-type') ?? 'application/json; charset=utf-8',
      'access-control-allow-origin': '*',
      'access-control-allow-methods': 'GET,POST,OPTIONS',
      'access-control-allow-headers': 'content-type',
    },
  })

const ok = (data: unknown) => withCORS(Response.json(data))
const badRequest = (msg: string) => withCORS(Response.json({ error: msg }), 400)
const noContent = () => withCORS(new Response(null, { status: 204 }), 204)

const router = AutoRouter({ base: '/' })

// CORS preflight
router.options('/*', () => noContent())

// sample routes (keep it simple)
router.get('/hello', () => ok({ hello: 'world' }))

router.post('/echo', async (req: Request) => {
  try {
    const body = await req.json()
    return ok({ youSent: body })
  } catch {
    return badRequest('Expected JSON body')
  }
})

// 404
router.all('*', () => withCORS(Response.json({ error: 'Not found' }), 404))

// export a single fetch (works on Workers; reusable in Node)
export default { fetch: router.fetch }

That’s it. This fetch function is our “universal” entry point.
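Because it's just a Fetch handler, you can exercise it directly in a test or a scratch script, no HTTP server needed. A quick sketch (assumes Node 18+ or any Fetch runtime, and ESM so top-level await is allowed):

// scratch.ts (sketch): call the shared fetch handler with a plain Request
import router from './router'

const res = await router.fetch(new Request('http://localhost/hello'))
console.log(res.status)        // 200
console.log(await res.json())  // { hello: 'world' }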


Cloudflare Worker entry (zero glue)

Cloudflare Workers already speak fetch:

// src/worker.ts
import router from './router'

export default {
  fetch: router.fetch,
}

Deploy with your usual Wrangler config.
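If you don't have one yet, a minimal wrangler.toml sketch for this layout (the name is a placeholder; pick your own compatibility_date):

# wrangler.toml (minimal sketch)
name = "itty-universal-demo"
main = "src/worker.ts"
compatibility_date = "2024-01-01"

Then wrangler deploy publishes the Worker.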


Node adapter (a tiny bridge)

For Node, we just translate IncomingMessage → Request, call the shared fetch, then write the Response back.

// src/node-server.ts
/// <reference types="node" />
import http from 'node:http'
import { Readable, Writable } from 'node:stream'
import router from './router'

const PORT = Number(process.env.PORT ?? 8787)

http.createServer(async (req, res) => {
  try {
    const url = new URL(req.url ?? '/', `http://${req.headers.host ?? 'localhost'}`)
    const method = req.method ?? 'GET'

    // build Headers
    const headers = new Headers()
    for (const [k, v] of Object.entries(req.headers)) {
      if (Array.isArray(v)) v.forEach(vv => headers.append(k, vv))
      else if (v != null) headers.set(k, String(v))
    }

    // only attach a body for non-GET/HEAD requests
    const hasBody = !['GET', 'HEAD'].includes(method.toUpperCase())
    // convert the Node request stream into a Web ReadableStream for the Fetch Request
    const body = hasBody ? (Readable.toWeb(req) as unknown as BodyInit) : undefined

    const request = new Request(url.toString(), {
      method,
      headers,
      body,
      // Node (undici) quirk when passing a stream body:
      // @ts-expect-error duplex is an undici extension, not in the DOM types
      duplex: hasBody ? 'half' : undefined,
    })

    const response = await router.fetch(request)

    res.statusCode = response.status
    response.headers.forEach((v, k) => res.setHeader(k, v))

    if (!response.body) return res.end()

    // stream to Node: pipe the Web ReadableStream into the Node response
    // (Writable.toWeb bridges http.ServerResponse to a Web WritableStream)
    await response.body.pipeTo(Writable.toWeb(res) as any)
  } catch (err: any) {
    res.statusCode = 500
    res.setHeader('content-type', 'application/json; charset=utf-8')
    res.end(JSON.stringify({ error: 'Internal error', detail: String(err?.message ?? err) }))
  }
}).listen(PORT, () => {
  console.log(`Node server: http://localhost:${PORT}`)
})

TypeScript tip

Make sure your tsconfig.json includes DOM types so Request/Response are available in Node:

{
  "compilerOptions": {
    "target": "ES2022",
    "module": "ESNext",
    "moduleResolution": "Bundler",
    "lib": ["ES2022", "DOM"],
    "strict": true
  }
}

Why this works so well

  • Single mental model: you write to the Fetch standard once. itty-router just plugs in.
  • No “environment if”s: the router and helpers are runtime-agnostic.
  • Portability: the same handler can move to other Fetch runtimes (Bun, Deno Deploy, etc.) with little or no glue; see the sketch below.
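For instance, Deno speaks the same Fetch surface, so wrapping the shared handler is one line. A sketch (not wired up in this post):

// deno-server.ts (sketch): Deno.serve takes a (Request) => Response handler
import router from './router'

Deno.serve((req: Request) => router.fetch(req))

Bun goes a step further: a module whose default export has a fetch method, which is exactly what src/worker.ts exports, should be servable directly with bun run.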

Quick curl test

# Node server
curl http://localhost:8787/hello
# => {"hello":"world"}

curl -X POST http://localhost:8787/echo -H 'content-type: application/json' -d '{"x":1}'
# => {"youSent":{"x":1}}

For Workers, hit your deployed URL with the same requests.
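Something like this (placeholders, substitute your own workers.dev URL):

curl https://<your-worker>.<your-subdomain>.workers.dev/hello
curl -X POST https://<your-worker>.<your-subdomain>.workers.dev/echo \
  -H 'content-type: application/json' -d '{"x":1}'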


Gotchas & guardrails

  • Don't do instanceof Response checks across runtimes; treat them as structurally compatible, not identical classes (see the sketch below).
  • When constructing a Request from a Node stream, pass duplex: 'half'.
  • If you need cookies or advanced streaming, test both runtimes—semantics are compatible, but edge cases differ.
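If you really do need to detect a Response-like value at runtime, a structural check is safer than a brand check. A rough sketch (the isResponseLike name is my own):

// isResponseLike (sketch): duck-type the Fetch Response surface instead of
// relying on instanceof, which breaks across runtimes and realms.
const isResponseLike = (value: unknown): value is Response => {
  const v = value as Response
  return (
    typeof value === 'object' && value !== null &&
    typeof v.status === 'number' &&
    typeof v.headers?.get === 'function' &&
    typeof v.arrayBuffer === 'function'
  )
}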
