- Book: TypeScript in Production — Tooling, Build, and Library Authoring Across Runtimes
- Also by me: The TypeScript Library — the 5-book collection
- My project: Hermes IDE | GitHub — an IDE for developers who ship with Claude Code and other AI coding tools
- Me: xgabriel.com | GitHub
You add a greet route on the server that takes { name: string } and returns a string. After saving, you flip to the client tab and start typing client.greet.query({, and autocomplete already shows name. Type a wrong value and the squiggle fires before you finish. Nothing was generated, no OpenAPI document was rebuilt, no watcher kicked in. The server route is plain TypeScript, the client is plain TypeScript, and they talk over JSON. How does tRPC do that?
The answer is two type-system tricks and a transport. Once you see them, both tricks look small. Small enough that ~200 lines of TypeScript gets you the same end-to-end type safety in your own codebase, without the dependency or the migration guide every two years.
Code targets TypeScript 5.6+ on Node, Bun, or Deno. Examples use Zod 4 because its schema-inference surface maps onto this example most directly, but every concept generalizes to Valibot, ArkType, or any validator that exposes an _output phantom type.
What tRPC is, in one paragraph
tRPC is a TypeScript library that lets a server export a router object and a client import type { AppRouter } to get fully typed .query and .mutation calls. No schema file, no OpenAPI spec, nothing generated. The router is a TypeScript value on the server. The client receives only its type, never the value, and uses that type to drive autocomplete and parameter checking. At runtime the client makes plain HTTP requests; the type system is what closes the loop.
That is the whole pitch. Everything else (middleware, batching, subscriptions) is layered on top.
Trick #1: path-string inference at the type level
When you call t.router({ greet: t.procedure.query(...) }), what comes back is a value whose type records every procedure name and its input/output types as keys in an object. The path is a tree of types the compiler walks at typecheck time.
Toy version:
type Procedure<I, O> = {
_input: I;
_output: O;
};
type Router<P extends Record<string, Procedure<any, any>>> = {
_procedures: P;
};
function procedure<I, O>(): Procedure<I, O> {
return {} as Procedure<I, O>;
}
function router<P extends Record<string, Procedure<any, any>>>(
procedures: P,
): Router<P> {
return { _procedures: procedures };
}
const appRouter = router({
greet: procedure<{ name: string }, string>(),
add: procedure<{ a: number; b: number }, number>(),
});
type AppRouter = typeof appRouter;
// AppRouter._procedures.greet._input is { name: string }
// AppRouter._procedures.greet._output is string
That is the spine. Procedure names and their I/O shapes both live in the type, so anything the client needs is reachable from typeof appRouter.
Real tRPC adds nesting (router({ users: router({ list: ... }) })), middleware chains, context propagation, and a few more layers of phantom generics, but the core trick is the same. The server's router is a typed map.
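Nesting composes the same way. Here is a sketch extending the toy above, where a sub-router sits inside the procedure map and indexed access types walk the nested tree (all names here are from the toy, not tRPC):

```typescript
// Extending the toy above: a nested router whose type tree nests with it.
type Procedure<I, O> = { _input: I; _output: O };
type Router<P> = { _procedures: P };

function procedure<I, O>(): Procedure<I, O> {
  return {} as Procedure<I, O>;
}
function router<P>(procedures: P): Router<P> {
  return { _procedures: procedures };
}

const appRouter = router({
  users: router({
    list: procedure<{ limit: number }, string[]>(),
  }),
});

// Indexed access walks the nested type tree at compile time
type ListInput =
  typeof appRouter["_procedures"]["users"]["_procedures"]["list"]["_input"];
// ListInput is { limit: number }
```

The compiler never flattens "users.list" into a string here; the path exists only as a chain of property lookups, which is exactly what the client proxy replays at the type level.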
Trick #2: schema-validator inference
The second trick is how z.infer<typeof input> makes its way to the client. You write the input schema once, on the server, as a runtime validator. The validator's static type becomes the function signature on the client. One schema, two consumers.
import { z } from "zod";
const GreetInput = z.object({ name: z.string().min(1) });
type GreetInput = z.infer<typeof GreetInput>;
// GreetInput is { name: string }
Zod 4's z.infer is a conditional type that walks the schema and assembles the output. tRPC reuses this. When you do procedure.input(GreetInput).query(handler), the procedure type carries z.infer<typeof GreetInput> as its _input, and the client reads that as the parameter type for .query().
The validator does double duty. At runtime it parses incoming JSON and rejects bad payloads. The same value is the source of truth for the input shape on both ends at compile time. Change the schema, both sides update.
This is the part that disappears if you write your own micro-router and skip the validator step. You can still get compile-time safety from a Procedure<I, O> declared with explicit type parameters, but without the validator you lose runtime validation of incoming payloads. For production code, keep the validator.
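The validator-agnostic core, a phantom output type plus a runtime parse function, can be sketched without any library. MiniSchema and Infer are invented names for illustration; Zod, Valibot, and ArkType each expose the same two halves under their own names:

```typescript
// A minimal stand-in for z.infer, assuming only that the validator carries
// a phantom output type. MiniSchema/Infer are illustrative, not a real API.
type MiniSchema<Out> = {
  _output: Out;                     // phantom: exists for the type system only
  parse: (value: unknown) => Out;   // runtime half: validate or throw
};

type Infer<S extends MiniSchema<unknown>> = S["_output"];

// A hand-rolled string schema showing both halves working together
const stringSchema: MiniSchema<string> = {
  _output: undefined as unknown as string,
  parse: (v) => {
    if (typeof v !== "string") throw new Error("expected string");
    return v;
  },
};

type S = Infer<typeof stringSchema>; // string
```

One value, two consumers: parse guards the wire at runtime, Infer reads the phantom field at compile time.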
Building it: the server side
Three pieces: a procedure helper that records input schema and handler, a router function that bundles them into a typed map, and a fetch handler that dispatches incoming POSTs to the right procedure.
// server.ts
import { z, ZodType } from "zod";
type Handler<I, O> = (input: I) => Promise<O> | O;
type Procedure<I, O> = {
_input: I;
_output: O;
schema: ZodType<I>;
handler: Handler<I, O>;
};
export function procedure<S extends ZodType>(schema: S) {
return {
query<O>(handler: Handler<z.infer<S>, O>): Procedure<z.infer<S>, O> {
return {
_input: undefined as unknown as z.infer<S>,
_output: undefined as unknown as O,
schema,
handler,
};
},
mutation<O>(handler: Handler<z.infer<S>, O>): Procedure<z.infer<S>, O> {
return {
_input: undefined as unknown as z.infer<S>,
_output: undefined as unknown as O,
schema,
handler,
};
},
};
}
type AnyProcedure = Procedure<any, any>;
export type Router<P extends Record<string, AnyProcedure>> = {
_procedures: P;
};
export function router<P extends Record<string, AnyProcedure>>(
procedures: P,
): Router<P> {
return { _procedures: procedures };
}
_input and _output are phantom: they exist on the type but their runtime values are never read. They are there so the client side has somewhere to look up the I/O shapes.
A small app router that uses it:
// app-router.ts
import { z } from "zod";
import { procedure, router } from "./server";
export const appRouter = router({
greet: procedure(z.object({ name: z.string().min(1) })).query(
({ name }) => `Hello, ${name}`,
),
add: procedure(
z.object({ a: z.number(), b: z.number() }),
).query(({ a, b }) => a + b),
createUser: procedure(
z.object({ email: z.email(), age: z.number().int().min(13) }),
).mutation(async ({ email, age }) => {
return { id: crypto.randomUUID(), email, age };
}),
});
export type AppRouter = typeof appRouter;
There are no path strings, OpenAPI tags, or schema exports. The router itself is the source of truth.
(If you are still on Zod 3, swap z.email() for z.string().email(). The top-level format helpers landed in Zod 4 and are the recommended form going forward.)
The dispatcher is the runtime side. It picks the procedure by name, validates input, runs the handler, returns JSON.
// handler.ts
import type { Router } from "./server";
export function createFetchHandler<P extends Record<string, any>>(
appRouter: Router<P>,
) {
return async (req: Request): Promise<Response> => {
if (req.method !== "POST") {
return new Response("Use POST", { status: 405 });
}
const url = new URL(req.url);
const procName = url.pathname.replace(/^\/rpc\//, "");
const proc = appRouter._procedures[procName];
if (!proc) {
return Response.json(
{ error: `Unknown procedure: ${procName}` },
{ status: 404 },
);
}
let body: unknown;
try {
body = await req.json();
} catch {
return Response.json({ error: "Invalid JSON" }, { status: 400 });
}
const parsed = proc.schema.safeParse(body);
if (!parsed.success) {
return Response.json(
{ error: "Invalid input", issues: parsed.error.issues },
{ status: 400 },
);
}
try {
const result = await proc.handler(parsed.data);
return Response.json({ data: result });
} catch (err) {
const message = err instanceof Error ? err.message : "Server error";
return Response.json({ error: message }, { status: 500 });
}
};
}
That is one POST endpoint per procedure, mounted under /rpc/<name>. You can wire it to any Fetch-compatible runtime: Bun, Deno, Node 20+, Cloudflare Workers, Hono, Elysia, or whichever platform handler your stack already speaks.
// bun-server.ts
import { appRouter } from "./app-router";
import { createFetchHandler } from "./handler";
const handler = createFetchHandler(appRouter);
Bun.serve({ port: 3000, fetch: handler });
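Deno is the same one-liner (Deno.serve({ port: 3000 }, handler)). Plain node:http does not speak Request/Response natively, so it needs a thin adapter. Here is a minimal sketch; serveNode is a name invented here, not a library API, and the header cast is a simplification that ignores multi-value headers:

```typescript
// node-server.ts — a sketch of bridging node:http to the Fetch-style handler.
import { createServer, type Server } from "node:http";
import { Readable } from "node:stream";

type FetchHandler = (req: Request) => Promise<Response> | Response;

export function serveNode(handler: FetchHandler, port: number): Server {
  return createServer(async (nodeReq, nodeRes) => {
    // Rebuild a Fetch Request from Node's IncomingMessage
    const url = `http://${nodeReq.headers.host ?? "localhost"}${nodeReq.url}`;
    const method = nodeReq.method ?? "GET";
    const body =
      method === "GET" || method === "HEAD"
        ? undefined
        : (Readable.toWeb(nodeReq) as ReadableStream);
    const request = new Request(url, {
      method,
      headers: nodeReq.headers as Record<string, string>,
      body,
      // Node requires duplex when a streaming body is attached
      duplex: "half",
    } as RequestInit);

    const response = await handler(request);
    nodeRes.writeHead(response.status, Object.fromEntries(response.headers));
    nodeRes.end(Buffer.from(await response.arrayBuffer()));
  }).listen(port);
}
```

Wire-up mirrors the Bun version: serveNode(createFetchHandler(appRouter), 3000).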
Building it: the client side
The client imports the router's type, never its value. That is what keeps server-only code (database imports, secrets, the validator's runtime) out of the client bundle. Then it builds a runtime proxy that, when you say client.greet.query({ name: "Sam" }), posts to /rpc/greet with that payload.
// client.ts
import type { Router } from "./server";
type AnyProcedure = { _input: any; _output: any };
type ProcedureClient<P extends AnyProcedure> = {
query: (input: P["_input"]) => Promise<P["_output"]>;
mutation: (input: P["_input"]) => Promise<P["_output"]>;
};
type Client<R extends Router<any>> = {
[K in keyof R["_procedures"]]: ProcedureClient<R["_procedures"][K]>;
};
export function createClient<R extends Router<any>>(opts: {
url: string;
}): Client<R> {
const call = async (procName: string, input: unknown) => {
const res = await fetch(`${opts.url}/rpc/${procName}`, {
method: "POST",
headers: { "content-type": "application/json" },
body: JSON.stringify(input),
});
const json = (await res.json()) as
| { data: unknown }
| { error: string; issues?: unknown };
if (!res.ok || "error" in json) {
const err = "error" in json ? json.error : `HTTP ${res.status}`;
throw new Error(err);
}
return (json as { data: unknown }).data;
};
return new Proxy({} as Client<R>, {
get(_, procName: string) {
return {
query: (input: unknown) => call(procName, input),
mutation: (input: unknown) => call(procName, input),
};
},
});
}
The Client<R> mapped type is where the proxy gets its types. [K in keyof R["_procedures"]] walks the procedure map and produces a query/mutation function whose parameter is P["_input"] and whose return is Promise<P["_output"]>. The proxy is the runtime, the mapped type is the static surface.
Use it from anywhere (browser, Bun script, Deno worker):
// usage.ts
import { createClient } from "./client";
import type { AppRouter } from "./app-router";
const trpc = createClient<AppRouter>({ url: "http://localhost:3000" });
const greeting = await trpc.greet.query({ name: "Sam" });
// ^? string
const sum = await trpc.add.query({ a: 2, b: 3 });
// ^? number
const user = await trpc.createUser.mutation({
email: "sam@example.com",
age: 22,
});
// ^? { id: string; email: string; age: number }
If you remove name from the call, TypeScript fires before you save. Change the server route to require firstName, and the client .query({ name: ... }) lights up red the next time the editor reads app-router.ts. No codegen, no rebuild step, and no shared schema file: the only artifact the two sides share is a single type-only import.
That is the whole loop.
What that costs and what tRPC v11 adds
The micro-router above is around 200 lines counting comments and blank lines. Enough for a small product, but not a replacement for tRPC v11, because v11 ships features you would otherwise have to build yourself the moment you needed them, and would regret skipping.
The two features that earn the dependency for most teams are middleware composition and the React Query bindings. tRPC's t.middleware().pipe() composes auth, logging, rate limiting, and per-request transactions with full type flow into the procedure context. You can hand-roll a middleware array, run before each procedure's handler, in another 50 lines, but the type-flow story is what makes the framework version pay rent. The React side is the second hook: @trpc/tanstack-react-query is the new package replacing @trpc/react-query for v11 (see the migration guide), and it gives you useQuery/useMutation hooks fully typed off your router. Building that on top of your own router takes another few hundred lines, and it is the single biggest reason teams stay with tRPC: the React DX is the product.
The rest of v11's surface is layered on top. Request batching collapses multiple .query calls in the same tick into one POST via httpBatchLink, which matters when a typical dashboard mount fires 8–12 queries and you want one round-trip. Subscriptions over SSE became stable in tRPC v11, released in 2025; if you need pub/sub, you want this, and you want to not write the reconnect logic yourself. Transformers like superjson plug in so Date, BigInt, Map, and Set round-trip across JSON; the micro-router will silently turn a Date into a string until you add the same serialize/deserialize hooks (a JSON.stringify replacer and a JSON.parse reviver) on both sides. And the adapter set (Express, Fastify, Next.js App Router, AWS Lambda, Cloudflare Workers, standalone, Hono) is a thin shim around the same createFetchHandler shape we built. Each one is small; there are just enough of them that shipping them yourself is real work.
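The Date round-trip, for example, is a replacer/reviver pair. This is a sketch of the idea, not superjson's wire format; the $date tag is invented here, and note that Date.prototype.toJSON runs before the replacer sees the value, hence the lookup through this:

```typescript
// A minimal Date-aware serialize/deserialize pair for both ends of the wire.
function serialize(value: unknown): string {
  return JSON.stringify(value, function (key, val) {
    // Date.prototype.toJSON has already run, so check the raw property value
    const raw = (this as Record<string, unknown>)[key];
    return raw instanceof Date ? { $date: raw.toISOString() } : val;
  });
}

function deserialize<T>(text: string): T {
  return JSON.parse(text, (_key, val) =>
    val && typeof val === "object" && "$date" in val
      ? new Date(val.$date)
      : val,
  );
}
```

Swap these in where the handler and client currently call JSON.stringify and res.json(), and Date survives the trip; BigInt, Map, and Set each need their own tag in the same style.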
When the hand-rolled version wins
The 200-line micro-router earns its keep when one team owns both ends of the wire and the surface stays small. A backend service talking to one frontend you control, fewer than ten endpoints, a stack that already uses Zod (or Valibot, or ArkType) for validation, and no plans for batching or subscriptions: that is the exact shape where tRPC's ecosystem is overhead and a typed fetch wrapper is the right tool. The other thing the hand-rolled version buys you is readability. A new hire can step into the Proxy get trap and see the whole shape in an afternoon, which is something you cannot say about createRecursiveProxy and the layered link system.
Reach for tRPC v11 once any of three things show up. You ship a non-trivial React app and want useQuery typed off the server, where the hours saved are real and writing the bindings yourself is the longest tail. Subscriptions matter and you do not want to own SSE reconnect logic. Or the team grows past the point where one person owns the wire format, and tRPC's conventions become a free shared language across services.
The real question is which of those layered features you actually want. Most production teams want the React Query bindings and the middleware story. Plenty of internal tools want neither and ship faster without the framework.
Verifying with the real thing
The two tricks are not hidden. The tRPC v11 source has a Procedure type with _input_in, _input_out, _output_in, _output_out phantom fields, separated for input transforms and output transforms. Same idea as the toy _input/_output above. The router builder lives in the server package's core router module, and the proxy client uses createRecursiveProxy to walk the path tree at runtime. Same Proxy move our 30-line client uses, generalized to nested routers.
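The recursive-proxy move generalizes in a few lines. This sketch mirrors the idea behind tRPC's createRecursiveProxy rather than its actual implementation; the wiring at the bottom is illustrative:

```typescript
// Each property access extends a path array; the eventual call fires with it.
type OnCall = (path: string[], input: unknown) => unknown;

function createRecursiveProxy(onCall: OnCall, path: string[] = []): any {
  // The target is a function so the proxy can trap calls as well as gets
  return new Proxy(() => {}, {
    get(_target, key) {
      if (typeof key !== "string") return undefined;
      // client.users.list.query accumulates ["users", "list", "query"]
      return createRecursiveProxy(onCall, [...path, key]);
    },
    apply(_target, _thisArg, args) {
      return onCall(path, args[0]);
    },
  });
}

// Wiring: the last segment is the verb, the rest is the route
const client = createRecursiveProxy((path, input) => {
  const route = path.slice(0, -1).join(".");  // e.g. "users.list"
  const verb = path[path.length - 1];         // "query" or "mutation"
  return { route, verb, input };              // a real client would fetch here
});
```

The flat client earlier in this article is this same trap with a one-deep path; nesting costs nothing at runtime because the path only exists per call.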
Hono RPC (hono/client) applies the same trick to Hono routes: server route type imported on the client, client proxy walks it at the type level, await client.greet.$post({ json: { name } }) is fully typed. The framing is different (request/response objects instead of input/output), but the two tricks underneath are the same.
The pattern is general. Once you see it once, you spot it everywhere.
Closing
If you have never written the 200-line version, write it once. You will read tRPC's source faster afterwards. You will also, the next time you scaffold an internal tool that does not need batching or subscriptions, type it directly on top of fetch and skip the framework.
The deeper material on Proxy-driven APIs, mapped types, and how to author libraries that expose this kind of inference without melting tsc is in TypeScript in Production. The path-inference and schema-inference patterns above are chapter-level material there.
If this was useful
If you want the tooling-and-authoring layer underneath this kind of code (tsconfig for libraries, dual ESM/CJS, JSR publishing, monorepo wire-up, the whole production layer), that is what TypeScript in Production covers. Pick the entry point that matches where you are.
The TypeScript Library — 5-book collection:
- TypeScript Essentials — From Working Developer to Confident TS, Across Node, Bun, Deno, and the Browser — types, narrowing, modules, async, daily-driver tooling
- The TypeScript Type System — From Generics to DSL-Level Types — generics, mapped/conditional types, infer, template literals, branded types
- Kotlin and Java to TypeScript — A Bridge for JVM Developers — variance, null safety, sealed→unions, coroutines→async/await
- PHP to TypeScript — A Bridge for Modern PHP 8+ Developers — sync→async paradigm, generics, discriminated unions
- TypeScript in Production — Tooling, Build, and Library Authoring Across Runtimes — tsconfig, build tools, monorepos, library authoring, dual ESM/CJS, JSR
Books 1 and 2 are the core path. If you are already a JVM or PHP dev, books 3 or 4 substitute for book 1. Book 5 is for anyone shipping TypeScript at work.
All five books ship in ebook, paperback, and hardcover.