Gabriel Anhaia
Stop Reaching for `any`. The Modern `unknown` Pattern Fixes 90% of It


You're three weeks into a feature. Production gets a new version of an upstream service. A webhook lands in your handler that looks roughly like the one you tested against, except metadata.user arrived as a JSON-encoded string this time instead of an object, because someone on the other team toggled a serializer flag without telling anyone. Your code does this:

app.post("/webhook", (req, res) => {
  const payload = req.body as WebhookPayload;
  return res.json({ ok: true, userId: payload.metadata.user.id });
});

The cast says: trust me, this is a WebhookPayload. The runtime says: Cannot read properties of undefined (reading 'id'). The 500 hits Sentry, the on-call rotation pings, and somebody opens the PR diff to figure out which line lied to the type checker.

It was the as. It always was the as.

The any family is the same shape of bug. as any, : any, JSON.parse returning any by default — every one of them is a place where you told the compiler to stop checking and the runtime found out the data didn't match. The unknown keyword is the one tool that makes this category go away. It's been in TypeScript since 3.0 (July 2018), and plenty of codebases still reach for any in the spots where unknown would do the job.

This post is about the patterns that turn unknown from a curiosity into the daily-driver type for everything crossing your trust boundary. There are five of them, and each one removes a place where any was hiding the bug.

The mental shift: trusted inside, unknown outside

any and unknown are both top types. Anything assigns into them. The difference is what you can do with them after the assignment.

const a: any = JSON.parse(raw);
a.foo.bar.baz();        // compiles, may explode at runtime

const u: unknown = JSON.parse(raw);
u.foo;                  // error: 'u' is of type 'unknown'.

any says "every operation on this value is fine, and the result is also any, which is also fine." It's the off-switch for the type checker. unknown says "I have no idea what shape this is. Prove it before you touch it." It's the on-switch with a key.
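Narrowing is how you earn access back. An ordinary runtime check — typeof, Array.isArray, instanceof — is all the compiler needs to unlock an unknown value. A minimal sketch:

```typescript
// unknown is usable the moment you prove its shape with a runtime check.
function describeValue(u: unknown): string {
  if (typeof u === "string") return `a string of length ${u.length}`;
  if (typeof u === "number") return `the number ${u}`;
  if (Array.isArray(u)) return `an array of ${u.length} items`;
  return "something else";
}
```

Inside each branch, u has the narrowed type (string, number, unknown[]), and outside the checks it stays locked.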

The reframe that fixes most of a codebase is this. Inside your module, your types are real. Functions you wrote, classes you control, data structures you defined — those have shape, and the compiler enforces it. The moment a value crosses a boundary you don't own (network, disk, environment, third-party SDK, JSON.parse, the DOM), it becomes unknown until you've narrowed it. Treat the boundary as the place where parsing happens, and as stops looking like an option.

The five patterns below are the toolbox for that crossing.

Pattern 1: Type predicates for handwritten boundaries

The first reach is a function that takes unknown and returns a type predicate. The signature looks unusual the first time you see it.

type User = { id: string; email: string; name: string };

function isUser(value: unknown): value is User {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.id === "string" &&
    typeof v.email === "string" &&
    typeof v.name === "string"
  );
}

function greet(value: unknown) {
  if (isUser(value)) {
    return `Hello ${value.name}`; // value is User here
  }
  throw new Error("not a user");
}

The value is User return type is what makes the narrowing portable. Anywhere you call isUser(value), the compiler narrows the value to User for the truthy branch. Without that annotation, the predicate would still type-check, but every caller would only see boolean and the value would stay unknown.

A few things to know about hand-rolled predicates. The compiler does not validate that the body actually proves the type. If you forget to check email and your predicate returns true anyway, the compiler trusts the lie and the next line crashes. Treat the predicate body the same way you'd treat a parser: written carefully, tested, and ideally unit-tested with one valid and one malformed fixture each.

Also, the inner as Record<string, unknown> is a deliberate one-line escape after the typeof check has already proven the value is a non-null object. That single as lives inside the predicate so no caller has to write one. The whole point of the function is to be the only place a cast appears.

The predicate pattern earns its keep on the boundary types you write yourself: a webhook payload your team designed, an internal RPC response, a localStorage key. For anything from npm or anything with even mild nesting, pattern 3 is the better reach.
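For the localStorage case, the predicate slots in right after JSON.parse. A minimal sketch — the User type and isUser predicate repeat the example above so the snippet is self-contained, and the readStoredUser helper name is hypothetical (pass it the raw string from localStorage.getItem):

```typescript
type User = { id: string; email: string; name: string };

function isUser(value: unknown): value is User {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return (
    typeof v.id === "string" &&
    typeof v.email === "string" &&
    typeof v.name === "string"
  );
}

// Hypothetical helper: the raw string comes from localStorage.getItem(key).
// Bad JSON and wrong shapes both collapse to null instead of throwing.
function readStoredUser(raw: string | null): User | null {
  if (raw === null) return null;
  let parsed: unknown;
  try {
    parsed = JSON.parse(raw);
  } catch {
    return null;
  }
  return isUser(parsed) ? parsed : null;
}
```

Stale or hand-edited localStorage entries are exactly the kind of boundary data that looks typed and isn't; collapsing every failure mode to null keeps the caller's code to a single null check.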

Pattern 2: Assertion functions when the bad path is "throw"

A type predicate returns a boolean and lets the caller decide what to do. An assertion function throws and narrows the rest of the scope. The signature uses asserts instead of is.

function assertIsUser(value: unknown): asserts value is User {
  if (typeof value !== "object" || value === null) {
    throw new Error("Expected user object, got " + typeof value);
  }
  const v = value as Record<string, unknown>;
  if (typeof v.id !== "string") throw new Error("user.id missing");
  if (typeof v.email !== "string") throw new Error("user.email missing");
  if (typeof v.name !== "string") throw new Error("user.name missing");
}

function processUser(input: unknown) {
  assertIsUser(input);
  // input is User from this line on, no `if` block needed
  return `${input.name} <${input.email}>`;
}

After the call, the compiler treats input as narrowed for the rest of the function body. There's no nested if block, no early-return ladder, no input!.email. The invariant lives in one place and the rest of the code reads as if you had a typed value all along.

Two things worth knowing. Assertion functions are awkward with arrow functions: const assertIsUser = (v: unknown): asserts v is User => { ... } compiles at the declaration, but every call site then fails with TS2775 ("Assertions require every name in the call target to be declared with an explicit type annotation") unless you give the variable an explicit type, e.g. const assertIsUser: (v: unknown) => asserts v is User = .... A plain function declaration sidesteps the whole issue, and this has been a known design limitation since the feature shipped in TypeScript 3.7.

The other detail is when to reach for an assertion vs. a predicate. Use the predicate when the caller might want to recover (return a 400, fall back to a default, retry with a different fetcher). Use the assertion when "this thing is malformed" is a programmer error or an invariant violation, not a control-flow branch you expect to hit. Webhooks parse with predicates because the spec allows malformed input. Internal config loaded at boot asserts because malformed config is a deploy bug.
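The boot-time config case from that last sentence, sketched with an assertion. The AppConfig shape and field names are hypothetical, chosen for illustration:

```typescript
// Hypothetical config shape for illustration.
type AppConfig = { port: number; dbUrl: string };

function assertIsConfig(value: unknown): asserts value is AppConfig {
  if (typeof value !== "object" || value === null) {
    throw new Error("config must be an object");
  }
  const v = value as Record<string, unknown>;
  if (typeof v.port !== "number") throw new Error("config.port must be a number");
  if (typeof v.dbUrl !== "string") throw new Error("config.dbUrl must be a string");
}

// Boot-time load: malformed config is a deploy bug, so throwing is correct.
function loadConfig(rawJson: string): AppConfig {
  const parsed: unknown = JSON.parse(rawJson);
  assertIsConfig(parsed);
  return parsed; // narrowed to AppConfig by the assertion
}
```

A bad config file kills the process at boot with a message naming the field, which is exactly where you want a deploy bug to surface.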

Pattern 3: Zod (or Valibot, or ArkType) at every boundary

Hand-rolled predicates and assertions are the right tool for one-off types and library code that can't take a dependency. For application code touching JSON, the right tool is a runtime schema validator. Zod is the default reach: TypeScript-first, schema-as-code, and the parsed output type is inferred from the schema definition rather than declared twice.

Zod 4 is the current major release. It's a rewrite of the v3 internals, and it's what npm i zod gives you today. The v4 release notes cover the perf and bundle changes if you're migrating from v3.

import { z } from "zod";

const UserSchema = z.object({
  id: z.string(),
  email: z.email(),
  name: z.string().min(1),
  roles: z.array(z.enum(["admin", "editor", "viewer"])),
});

type User = z.infer<typeof UserSchema>;

function parseWebhook(body: unknown): User {
  return UserSchema.parse(body);
  // throws ZodError on failure; returns a typed User on success
}

function tryParseWebhook(body: unknown) {
  const result = UserSchema.safeParse(body);
  if (!result.success) {
    return { ok: false as const, error: result.error.issues };
  }
  return { ok: true as const, value: result.data };
}

What this gives you that the hand-rolled version doesn't. The schema is the type. z.infer<typeof UserSchema> produces the User type, and the schema is the parser. Add a field to the schema and the type updates on save. Remove a field and it disappears from both at once. The error messages tell you which field on which path failed validation, which is what your 400 response body should be carrying anyway.

safeParse is the form to reach for in HTTP handlers and message consumers. It returns a discriminated union { success: true, data } | { success: false, error }, and the caller decides whether to throw, log, or return a structured error. parse is the form for "this should always work or the deploy is broken" — boot-time config, internal RPCs you control end-to-end.

The cost is a real dependency. Zod adds bundle weight and a parse step on every boundary call. For a Node API that handles 200 webhooks a second, the parse cost is a rounding error against the JSON deserialization that already happens. For a browser bundle that needs to be small, Valibot is the lighter alternative with a similar API and a tree-shakable design that drops unused validators.

The principle is the same regardless of library choice: the parser is the boundary, the parsed value is typed, and the rest of the codebase never sees unknown again.

Pattern 4: try/catch and the unknown error

Before TypeScript 4.4, every catch clause variable was typed any by default. You wrote catch (err) and then err.message compiled, regardless of what the throw site actually threw. That was a hole the size of "anything in JavaScript can throw anything, including a string, a Promise, or undefined."

TypeScript 4.4 (August 2021) added the useUnknownInCatchVariables compiler flag, which changes the default to unknown. It's enabled automatically by strict: true, which is the configuration any new project should be on. With it on, this is what you write:

async function fetchProfile(id: string) {
  try {
    const res = await fetch(`/api/users/${id}`);
    return await res.json();
  } catch (err) {
    // err: unknown
    if (err instanceof Error) {
      logger.error({ message: err.message, stack: err.stack });
    } else {
      logger.error({ thrown: String(err) });
    }
    throw err;
  }
}

The narrowing matters because JavaScript lets you throw anything. Most code throws Error instances, but library authors throw plain objects, strings, numbers, custom subclasses with extra fields, and (yes, in production code I've read) Promises. The instanceof Error branch is the path for the 95% case; the else is what catches the rest without throwing a Cannot read properties of undefined while you're trying to log the original failure.

For typed error hierarchies, the instanceof chain extends naturally:

class AuthError extends Error { constructor(public code: string) { super(code); } }
class RateLimitError extends Error { constructor(public retryAfter: number) { super("rate-limited"); } }

try {
  await callApi();
} catch (err) {
  if (err instanceof AuthError) return redirectToLogin();
  if (err instanceof RateLimitError) return scheduleRetry(err.retryAfter);
  if (err instanceof Error) return logUnexpected(err);
  return logUnexpected(new Error(String(err)));
}

If you're on a codebase still on useUnknownInCatchVariables: false, flipping it on is a one-line config change that will surface every place a thrown value got blindly destructured. The fixes are mechanical. Do it.
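In tsconfig.json the flip looks like this — strict already implies the catch-variable flag, so the explicit entry only matters while you're not yet on full strict:

```json
{
  "compilerOptions": {
    "strict": true,
    "useUnknownInCatchVariables": true
  }
}
```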

Pattern 5: Third-party API responses

Every npm package that fetches data has a typed signature on paper and a JSON blob in reality. Some libraries do the parsing for you. Most return a typed object that's actually a cast over the deserialized JSON. APIs change shape: a field gets renamed, an enum gains a value, a date format flips from ISO to epoch milliseconds. Your code keeps compiling and runs into undefined exactly where the docs swore there was a string.

The fix is to treat third-party responses as unknown at the call site and parse them through your own schema. The library's TypeScript types are documentation; the schema is enforcement.

import { z } from "zod";

const GitHubRepoSchema = z.object({
  id: z.number(),
  full_name: z.string(),
  default_branch: z.string(),
  stargazers_count: z.number(),
  topics: z.array(z.string()),
});

type GitHubRepo = z.infer<typeof GitHubRepoSchema>;

async function fetchRepo(slug: string): Promise<GitHubRepo> {
  const res = await fetch(`https://api.github.com/repos/${slug}`);
  if (!res.ok) throw new Error(`GitHub ${res.status}`);
  const raw: unknown = await res.json();
  return GitHubRepoSchema.parse(raw);
}

Three things to call out. First, the unknown annotation on await res.json() is intentional: fetch's .json() returns Promise<any> in the lib types, and that any would silently propagate into your typed return. Annotating the local as unknown forces the parse step to actually happen.

Second, the schema only declares the fields you use. GitHub returns dozens of fields per repo, and Zod allows extra keys and strips them from the parsed output by default (the strip mode, still the default in Zod 4). If the upstream adds a new field, your code keeps working.

Third, if it renames stargazers_count, your parser fails loudly at the boundary instead of silently in a chart somewhere downstream.

The same shape applies to typed SDKs (Stripe, OpenAI, Twilio, AWS): they're better than raw fetch, but the type is still ahead of the runtime. For data you depend on for billing, auth, or anything regulated, parse it. For best-effort telemetry, the SDK type is fine.

When as actually earns its keep

There's a small set of situations where unknown-and-narrow doesn't apply, and as is the right tool. Knowing them is what stops the rule from rotting into dogma.

The DOM is the canonical example. document.getElementById("email") returns HTMLElement | null, and you know it's an HTMLInputElement because you wrote the HTML two files away.

const input = document.getElementById("email") as HTMLInputElement;

There is no useful predicate here. You're not validating data; you're telling the compiler something the type system genuinely can't infer (which DOM element ID maps to which tag). as is the right reach.

The other category is bridging across unknown after a parse step inside a function whose callers will never see the cast. Inside isUser, the as Record<string, unknown> after the typeof check is one of these. So is as const for literal-type preservation (which isn't the same as as a type assertion — as const doesn't change the value, it narrows the literal types). Both stay.
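A quick sketch of the as const case: the runtime value is untouched, only the literal types narrow, and the resulting union composes neatly with a predicate back at the unknown boundary:

```typescript
// `as const` freezes the literal types; the runtime array is unchanged.
const ROLES = ["admin", "editor", "viewer"] as const;
type Role = (typeof ROLES)[number]; // "admin" | "editor" | "viewer"

function isRole(value: unknown): value is Role {
  return (
    typeof value === "string" &&
    (ROLES as readonly string[]).includes(value)
  );
}
```

One array is both the runtime source of truth and the type definition, so adding a role updates the union and the check together.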

What you're trying to delete from the codebase is the as SomeBigType cast over JSON, the : any parameter on a parser, the // @ts-expect-error over a third-party response. Those are the bugs unknown is built to remove.

What this looks like as a habit

If you adopt one rule out of this post, make it the boundary rule. Anything entering your module from outside — fetch, JSON.parse, process.env, a file read, a postMessage, a Buffer from a queue, a worker message — is unknown at the call site. Pick the parser that fits (predicate for a one-off, assertion for an invariant, Zod for anything you'd put in a types/ file). The parsed value is typed. The rest of your code lives inside the typed island.
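process.env is the easiest boundary to start the habit on, because its lib type is already Record<string, string | undefined>. A dependency-free sketch; requireEnv is a hypothetical boot-time helper:

```typescript
// Hypothetical boot-time helper: fail fast on missing env vars.
function requireEnv(
  env: Record<string, string | undefined>,
  key: string
): string {
  const value = env[key];
  if (typeof value !== "string" || value.length === 0) {
    throw new Error(`Missing required env var: ${key}`);
  }
  return value;
}

// At boot: const dbUrl = requireEnv(process.env, "DATABASE_URL");
```

Calling it once per variable at startup turns "undefined leaked into a connection string at request time" into a named failure before the server accepts traffic.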

The first time you do it, you'll find five places where the old as was hiding a real shape mismatch. The runtime crash you used to ship turns into a ZodError with a path string, caught by the same handler that catches every other 400. Next time someone on another team toggles a serializer flag without telling you, the failure surfaces as a 400 in your error tracker with the line "expected object at metadata.user, got string." Same bug, different blast radius.


If this was useful

This is the boundary chapter from TypeScript Essentials, distilled into one post. The book walks the type system end-to-end across Node, Bun, Deno, and the browser, with unknown, narrowing, and parser-at-the-boundary as the spine of the daily-driver chapter. If your team is still casting JSON into types and shipping the runtime crashes, the book is where the long form of this argument lives, with the discriminated unions, the satisfies rule, and the per-runtime tooling chapters that pair with it.

The full collection (The TypeScript Library) is five books that share a vocabulary. Books 1 and 2 are the core path. Books 3 and 4 substitute for them if you're coming from JVM or PHP. Book 5 is for whoever owns the build, the monorepo, or the dual ESM/CJS publish problem.

  • TypeScript Essentials — From Working Developer to Confident TS, Across Node, Bun, Deno, and the Browser: amazon.com/dp/B0GZB7QRW3 — entry point: types, narrowing, modules, async, daily-driver tooling.
  • The TypeScript Type System — From Generics to DSL-Level Types: amazon.com/dp/B0GZB86QYW — generics, mapped and conditional types, infer, template literals, branded types.
  • Kotlin and Java to TypeScript — A Bridge for JVM Developers: amazon.com/dp/B0GZB2333H — variance, null safety, sealed classes to unions, coroutines to async/await.
  • PHP to TypeScript — A Bridge for Modern PHP 8+ Developers: amazon.com/dp/B0GZBD5HMF — sync to async paradigm, generics, discriminated unions for PHP-shaped brains.
  • TypeScript in Production — Tooling, Build, and Library Authoring Across Runtimes: amazon.com/dp/B0GZB7F471 — tsconfig, build tools, monorepos, library authoring, dual ESM/CJS, JSR.

All five books ship in ebook, paperback, and hardcover.

The TypeScript Library — the 5-book collection
