You've done it. We all have.
A new endpoint shows up. You curl it, get back eighty lines of nested JSON, and start typing:
```typescript
interface User {
  id: string
  name: string
  // wait, is `email` always present?
}
```
Twenty minutes later you've shipped it. Ten minutes after that, production is throwing `Cannot read property 'name' of undefined` because the API returns `null` for soft-deleted accounts, and you're patching it on the train home.
Hand-writing TypeScript interfaces from JSON responses is one of those tasks that looks like five minutes of work and is actually a slow drip of bugs. This post walks through five better ways to do it, the tradeoffs of each, and the specific failure modes you'll still need to watch for.
Why "just type it out" fails more than people admit
Three things go wrong when humans translate JSON to TypeScript:
You miss optional fields. The sample payload has email: "x@y.com" but the API spec says email can be omitted. Your interface has a non-optional email: string. The compiler is happy. Production isn't.
You confuse null and undefined. APIs love returning null for missing values. TypeScript loves using ? for optional. These aren't the same. email?: string means the key might not exist; email: string | null means the key exists but the value can be null. Mixing them silently is one of the most common ways type-safe code becomes type-theater.
You assume one sample tells the whole story. A field that comes back as ["admin"] in your sample is string[]. The same field returns [] for unprivileged users — and [] could just as easily have been number[] if you'd seen a different sample. Your "type-safe" code is now wrong half the time.
These aren't theoretical. They're a big chunk of the runtime bugs that survive code review in well-typed codebases. Auto-generation doesn't eliminate them, but it cuts the surface area dramatically.
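To make the null-vs-undefined pitfall concrete, here's a minimal sketch (the `UserOptional` and `UserNullable` names are just for illustration):

```typescript
// Key may be absent entirely: reading it yields `undefined`.
interface UserOptional {
  email?: string
}

// Key is always present, but its value may be `null`.
interface UserNullable {
  email: string | null
}

// A payload with no `email` key satisfies the first shape but not the second:
const a: UserOptional = {}               // OK
// const b: UserNullable = {}            // compile error: missing `email`
const c: UserNullable = { email: null }  // OK

// At runtime the distinction shows up with the `in` operator:
console.log('email' in a) // false — key absent
console.log('email' in c) // true — key present, value is null
```

If the API can do both — omit the key *and* send `null` — the honest type is `email?: string | null`, and your code has to handle both cases.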
Method 1 — Paste it into a browser tool
For one-off conversions during development, the lowest-friction option is a browser tool that converts JSON to TypeScript in real time. The JSON-to-TypeScript converter on json.renderlog.in runs entirely client-side — paste the JSON, copy the interface, done. Nothing leaves your browser, which matters when the JSON contains anything sensitive (auth tokens, internal IDs, customer data you shouldn't be uploading to a random web tool).
When this is the right tool:
- Quick exploration ("what's the shape of this thing?")
- Pairing with a teammate or screen-sharing in a meeting
- One-time scripts where you just need to ship something today
When it isn't:
- Anything that runs in CI
- Repeatedly regenerating types from a moving schema
- Type generation that needs to live alongside your API client
For repeated work, you want a CLI.
Method 2 — quicktype (the workhorse)
quicktype is the most flexible converter for production use. It reads JSON, JSON Schema, GraphQL, or TypeScript and outputs almost any language. For a Node project:
```shell
npx quicktype --src-lang json --lang typescript \
  --top-level User user-sample.json > types/user.ts
```
It also handles arrays of samples, which is how you avoid the "single-sample lies" problem:
```shell
npx quicktype --src-lang json --lang typescript \
  --top-level Order order-1.json order-2.json order-3.json \
  > types/order.ts
```
Feed it three or four real responses and it'll correctly mark fields as optional that appear in some samples but not others.
The catch: quicktype's defaults are conservative. If a field is null in even one sample, it widens to T | null. This is correct but can produce noisy types for fields that are only nullable in degenerate cases. Use --no-maps, --no-enums, and --no-combine-classes to control the output shape.
Method 3 — generate from a JSON Schema (the right way, if you have one)
If your API publishes an OpenAPI or JSON Schema, throw away the response-sniffing approach entirely. Generate directly from the schema:
```shell
npx json-schema-to-typescript schema.json -o types.d.ts
```
This is the only method that's actually correct, because it uses the contract — not a snapshot of the contract on a particular Tuesday. If your backend doesn't publish a schema, this is the moment to push for one. It pays back forever.
For OpenAPI specifically, openapi-typescript is better:
```shell
npx openapi-typescript ./openapi.yaml -o ./src/api-types.ts
```
You get path-typed clients, request/response unions, and discriminated unions where the spec defines them. Pair it with a tool like openapi-fetch and your API client becomes effectively impossible to call wrong.
Method 4 — Zod schemas, then infer
If your code already validates incoming JSON at runtime — and it should, at any system boundary — you don't need a separate type-generation step. Zod gives you both:
```typescript
import { z } from 'zod'

const UserSchema = z.object({
  id: z.string().uuid(),
  email: z.string().email().nullable(),
  roles: z.array(z.enum(['admin', 'editor', 'viewer'])).default([]),
  createdAt: z.string().datetime(),
})

type User = z.infer<typeof UserSchema>
```
User is now a fully typed interface, and UserSchema.parse(data) validates incoming JSON at runtime. The type and the validator can't drift apart because they're literally the same object.
This is the pattern I reach for in any new TypeScript service. Auto-generation from JSON is fine for exploration, but the contract for your runtime should be expressed in code, not in a sample file someone forgot to update.
For converting a JSON sample to a Zod schema, the JSON-to-Zod converter on json.renderlog.in handles the mechanical part. Generate the rough schema, then tighten the validation rules — adding .uuid(), .email(), length bounds, refinements — by hand. That's the part that actually deserves human attention; the boilerplate isn't.
Method 5 — going the other direction
Sometimes you have TypeScript types and need a JSON Schema (for documenting an API, generating OpenAPI specs, validating environment variables). ts-json-schema-generator does the reverse:
```shell
npx ts-json-schema-generator --path 'src/types.ts' \
  --type 'User' --out 'schema/user.json'
```
This rounds out the toolkit. JSON ↔ TypeScript ↔ JSON Schema are all interconvertible; pick the direction that matches your source of truth and let tooling handle the rest.
Five mistakes auto-generation still makes
No tool is perfect. Here's what to fix by hand after generating:
1. Empty arrays infer as never[] (or worse, any[]). If your sample has roles: [], the generator can't know what type the array contains. Either feed it a sample with elements, or hand-edit to string[].
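A minimal before/after sketch (the `Generated` and `Fixed` names are illustrative):

```typescript
// What a single-sample generator typically emits for `{ "roles": [] }`:
interface Generated {
  roles: never[] // element type is unknowable from an empty sample
}

// Hand-edited, assuming the elements are strings — confirm against the
// API docs, not the sample:
interface Fixed {
  roles: string[]
}

const u: Fixed = { roles: ['admin'] }
console.log(u.roles.length) // 1
```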
2. Optional vs nullable. Tools that infer from a single sample treat { email: null } as email: null (the literal type). You almost always want string | null. Multi-sample tools handle this better, but watch for it.
3. Discriminated unions get flattened. A field that's { kind: 'user', name: string } in one sample and { kind: 'org', members: number } in another should be a tagged union. Most generators merge them into one interface with everything optional. Fix manually:
```typescript
// what you usually get
interface Account {
  kind: 'user' | 'org'
  name?: string
  members?: number
}

// what you actually want
type Account =
  | { kind: 'user'; name: string }
  | { kind: 'org'; members: number }
```
The second form is the one that actually narrows in switch statements.
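A minimal sketch of the narrowing in action (the `describe` helper is hypothetical):

```typescript
type Account =
  | { kind: 'user'; name: string }
  | { kind: 'org'; members: number }

function describe(a: Account): string {
  switch (a.kind) {
    case 'user':
      // `a` is narrowed to { kind: 'user'; name: string } here,
      // so `a.name` is a plain string — no optional chaining needed.
      return `user ${a.name}`
    case 'org':
      return `org with ${a.members} members`
  }
}

console.log(describe({ kind: 'user', name: 'Ada' })) // "user Ada"
console.log(describe({ kind: 'org', members: 3 }))   // "org with 3 members"
```

With the flattened interface, `a.name` would be `string | undefined` in both branches and you'd be back to defensive checks everywhere.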
4. Date strings stay as strings. createdAt: "2026-04-25T10:00:00Z" types as string, not Date. That's technically correct for JSON over the wire, but you usually want a parse step that converts it on receive. Zod's .datetime() plus a .transform(s => new Date(s)) handles this in one shot.
5. The unspoken any. When the generator hits a deeply dynamic field (a metadata blob, a "custom fields" map, a field whose value is itself JSON-encoded), it falls back to any or unknown. Search the generated file for any before committing — it's the single biggest source of "we have TypeScript but our runtime still crashes."
A workflow that scales
Here's the pattern that holds up across teams:
- Source of truth in the schema — OpenAPI or JSON Schema, owned by the API team.
- Generated types in CI — `openapi-typescript` runs on every backend deploy; types get committed to a shared package consumed by the frontend.
- Runtime validation at the boundary — Zod (or Valibot, or Yup) at every fetch call, even for "trusted" internal services. Internal services lie.
- Hand-written types for derived shapes — UI state, reducers, props. Don't generate these.
The rule: types that come from an external system should be generated. Types that you create should be hand-written. Mixing the two — generating UI state from a sample API response — is how you end up with a User type that has a _internalShoppingCartId field nobody understands, two years later.
When you should not auto-generate
A few cases where the overhead isn't worth it:
- Stable, small payloads. If the response is `{ ok: boolean, message: string }`, just type it. The generation tooling is more code than the type itself.
- Heavily customized types. If you're going to add `readonly`, branded types (`type UserId = string & { __brand: 'UserId' }`), or template-literal constraints, the generator output is a starting point you'll rewrite anyway.
- One-time scripts. Don't add tooling to a five-line migration script. `as any` is a fine choice sometimes.
The honest answer is that most production codebases benefit from generation, most exploratory code doesn't, and the gradient between them is what experienced engineers spend years calibrating.
A short detour: other JSON tools that pair well with this workflow
Type generation is one slice of the JSON-debugging work most engineers do every week. A few related browser tools that show up in the same workflow:
- Compare two JSON responses — when a frontend works locally but breaks in staging, diffing the two payloads side-by-side finds the schema drift in 30 seconds.
- Generate a JSON Schema from a sample — useful when you want to pin down a contract for an API that doesn't publish one yet.
- JSONPath tester — for validating queries against a payload before wiring them into a `jq` script or a JS reducer.
- JSON repair — handles trailing commas, missing quotes, and the other things that make `JSON.parse` throw in surprising ways.
- JSON to Go struct, JSON to Python, JSON to CSV, JSON to YAML — same auto-generation idea, different target language.
- JWT decoder — because half the time the "JSON" you're debugging is actually a base64-encoded token. (Worth its own post; coming next.)
All of these run client-side, so the JSON you paste never leaves your browser.
TL;DR
- Stop hand-writing TypeScript types from JSON responses for anything past trivial size.
- For one-off shapes, paste into a browser tool like the JSON-to-TypeScript converter.
- For repeatable, multi-sample inference: `quicktype`.
- If your API has an OpenAPI/JSON Schema: generate from that, not from samples.
- For runtime validation + types in one place: Zod with `z.infer` — start the schema with a JSON-to-Zod converter, then refine.
- After generation, always check for `never[]`, accidental nullability, missed discriminated unions, date strings, and lurking `any`s.
Five minutes setting up the right generation pipeline saves you the slow drip of `Cannot read property X of undefined` bugs forever.
If this was useful, I've also built a handful of other free, browser-based tools — no signup, no uploads, everything runs client-side:
- JSON Tools — https://json.renderlog.in (formatter, validator, JWT decoder, JSONPath tester, 40+ converters)
- Text Tools — https://text.renderlog.in (case converters, slug generator, HTML/markdown utilities, 70+ tools)
- PDF Tools — https://pdftools.renderlog.in (merge, split, OCR, compress to exact size, 40+ tools)
- Image Tools — https://imagetools.renderlog.in (compress, convert, resize, background remover, 50+ tools)
- QR Tools — https://qrtools.renderlog.in (WiFi, vCard, UPI, bulk QR codes with logos)
- Calc Tools — https://calctool.renderlog.in (60+ calculators for finance, health, math, dates)
- Notepad — https://notepad.renderlog.in (private, offline-first notes, no signup)