
Gabriel Anhaia


TypeScript Watch Mode in Big Projects: What --extendedDiagnostics Actually Tells You


Eleven seconds. That's the time it takes for the editor's red
squiggle to disappear after each save in a typical large
monorepo. pnpm dev is running. The watch process is running.
The TypeScript daemon says it's listening. The fan on the
laptop says everything else.

The team has been told to "buy a faster machine" or "split the
repo." Both work. Both miss the cheaper answer: ask the compiler
where the time is going. There's a flag for that, and it has
been there the whole time. tsc --extendedDiagnostics prints a
table that names the slow phase, the slow file count, and the
amount of memory the program is sitting on. Once you know which
phase is slow, the fixes are short. Most of them are one-line
edits.

This post walks through what the table actually says, which
numbers matter on a real codebase, and the four fixes that have
the highest hit rate on tsc --watch slowdowns in monorepos.

What the flag prints

Run any of these and the table shows up at the end of the
compile:

tsc --extendedDiagnostics
tsc --watch --extendedDiagnostics
tsc -b --extendedDiagnostics

On a real project the output is roughly:

Files:                            4123
Lines of Library:                52187
Lines of Definitions:           241893
Lines of TypeScript:            204117
Nodes of Library:               202413
Identifiers:                    711804
Symbols:                        921433
Types:                          314205
Instantiations:                3914022
Memory used:                  1842113K
Assignability cache size:        69141
Identity cache size:              4321
Subtype cache size:              19872
Strict subtype cache size:       11003
I/O Read time:                    0.42s
Parse time:                       1.91s
Bind time:                        0.94s
Check time:                      14.62s
transformTime:                    0.13s
commentTime:                      0.04s
I/O Write time:                   0.21s
Emit time:                        0.71s
printTime:                        0.55s
Total time:                      18.93s

Three numbers do most of the work for diagnosing watch mode:

Check time. This is type-checking. Almost every "tsc is
slow" complaint in a big project lives here. If Check time
swallows the wall clock, the fix is in your types, not in your
disk and not in your tsconfig's include glob.

Instantiations. Each entry is one generic type being
realised at one call site. A million is fine. Four million
starts to hurt. Forty million is the kind of number that comes
back to one million after a single change to one helper
function.

Types. The total count of type objects the checker has
allocated. Memory tracks this directly. A jump in Types
between two runs is the lead indicator that someone added a
mapped-conditional-recursive helper that scales badly.

The other numbers matter, but secondarily. Parse time going
up means files got bigger (or you accidentally included
node_modules). Bind time going up means the symbol table is
bigger (more declarations, more declare module, more globals).
Emit time going up while everything else is flat means
declaration emit is the new bottleneck. That's the topic of the
isolatedDeclarations post.

What watch mode actually does on a save

tsc --watch doesn't recompile from zero. It keeps the program
in memory, marks the file you saved as dirty, walks the import
graph to find what depends on that file, and reruns the checker
on those nodes. The number you feel (the eleven seconds before
the squiggle goes away) is Check time for the dirty subgraph.

That's why --extendedDiagnostics matters in watch mode. The
table prints once per incremental compile. You can save a file,
read the line that says Check time: 11.2s, and know that the
problem is in something that file imports.

A useful trick: save a known-cheap file (a constants module),
and a known-expensive file (the one that imports your generic-
heavy DSL helper). The diff between the two Check time
values is a free profile of that module's cost.

[Image: a magnifier hovering over a tsc diagnostics table, the Check time row glowing burnt orange.]

What's actually slow: the four usual suspects

Across hundreds of diagnostics tables, the same four shapes
turn up again and again as the cause of a Check-time blowup.

1. Deep generic instantiations

A helper that takes a generic, hands it to another helper that
takes a generic, that hands it to a third helper. Each layer
multiplies the work the checker has to do at the call site. If
your Instantiations line shoots into the millions, this is
where the cost lives.

The diagnosis pattern: pick a file that imports the suspect
helper and a file that doesn't. Compare Instantiations in
isolated tsc --noEmit --extendedDiagnostics runs over each.
The delta is the cost of that helper's generic depth, in raw
realised types.

The fix is rarely "rewrite the helper." It's usually
"materialise the type at the boundary." Once a generic helper
returns a result that the consumer treats as a finished shape,
giving that result an explicit return-type annotation freezes
the realised shape and stops the call-site walk:

// Before: every consumer re-realises the generic chain.
export function buildPipeline<T>(steps: Step<T>[]) {
  return compose(steps);
}

// After: the public type is materialised once.
export function buildPipeline<T>(
  steps: Step<T>[],
): Pipeline<T> {
  return compose(steps);
}

This is the same hygiene the --isolatedDeclarations flag
demands, and it pays the same dividend on watch-mode check
time.

2. Conditional-type recursion

Mapped + conditional + recursive types in one helper are the
combo move that turns a 2-million-instantiation project into a
30-million-instantiation project. The shape:

type DeepReadonly<T> = T extends Function
  ? T
  : T extends object
    ? { readonly [K in keyof T]: DeepReadonly<T[K]> }
    : T;

Every time the checker hits a DeepReadonly<SomeBigType>, it
walks the entire shape and recurses. If SomeBigType is a
union of API responses with 200 fields apiece, that's tens of
thousands of instantiations from a single use site.

Three things help, in order of how often they apply.

First, narrow the input. If the helper is only meant to deep-
freeze configuration, type the parameter as the configuration
shape, not as a free generic. The recursion still runs but on a
finite, hand-written input.

Second, cap the depth. TypeScript has a hard recursion limit,
but you can make yours shallower with a counter type:

// Prev maps a depth to depth - 1 (supports depths 0 through 4).
type Prev<D extends number> = [never, 0, 1, 2, 3][D];

type DeepReadonly<T, D extends number = 4> =
  D extends 0
    ? T
    : T extends Function
      ? T
      : T extends object
        ? {
            readonly [K in keyof T]: DeepReadonly<
              T[K],
              Prev<D>
            >
          }
        : T;

Third, pre-compute. If five files use DeepReadonly<Config>,
write type ReadonlyConfig = DeepReadonly<Config> once and use
the alias. The checker realises the recursion once and reuses
it. This is the single biggest watch-mode win for projects that
lean on type-level DSLs.
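A sketch of the pre-computed alias, using a hypothetical Config shape as a stand-in for a real configuration type:

```typescript
// Hypothetical configuration shape; stands in for your real one.
interface Config {
  host: string;
  port: number;
  flags: { verbose: boolean };
}

// The recursive helper from above.
type DeepReadonly<T> = T extends Function
  ? T
  : T extends object
    ? { readonly [K in keyof T]: DeepReadonly<T[K]> }
    : T;

// Realised once; every importer reuses this alias instead of
// re-walking the recursion at its own use site.
type ReadonlyConfig = DeepReadonly<Config>;

const config: ReadonlyConfig = {
  host: "localhost",
  port: 5432,
  flags: { verbose: false },
};
// config.port = 5433; // compile error: port is readonly
console.log(config.host);
```

Consumers import ReadonlyConfig rather than writing DeepReadonly<Config> themselves, so the checker pays the recursion cost exactly once per program.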

3. Unhinted union narrowing

A union type with a hundred members. A function that takes the
union and a switch that narrows by kind. No discriminant. Or
worse, a discriminant the checker can't statically pick from
without walking the entire union on every call:

function handle(event: AppEvent) {
  switch (event.type) {
    case "user.created": ...
    case "user.updated": ...
    // 98 more cases
  }
}

If AppEvent is a flat union of a hundred objects, the checker
narrows by walking the union linearly per case. A switch over
this union becomes O(n²) in checker work. On a watch-mode save
that touches the file with handle, you feel it.

The fix is a discriminated union with a string literal as the
discriminant (which the checker is hardwired to narrow in
constant time per case), combined with grouping. Group events
by domain, give each domain its own union, then combine:

type UserEvent =
  | { type: "user.created"; ... }
  | { type: "user.updated"; ... };

type OrderEvent =
  | { type: "order.placed"; ... }
  | { type: "order.fulfilled"; ... };

type AppEvent = UserEvent | OrderEvent;

A handler that only cares about user events takes UserEvent,
not AppEvent. Half the union is invisible to that file's
narrowing. The checker stops walking the other domains
entirely.
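A minimal sketch of the domain-scoped handler; the payload fields here are hypothetical stand-ins for the elided ones above:

```typescript
// Two-member domain union with a string-literal discriminant.
type UserEvent =
  | { type: "user.created"; userId: string }
  | { type: "user.updated"; userId: string; changes: string[] };

// Only UserEvent's members are ever walked during narrowing;
// order events (and every other domain) are invisible here.
function handleUserEvent(event: UserEvent): string {
  switch (event.type) {
    case "user.created":
      return `created ${event.userId}`;
    case "user.updated":
      return `updated ${event.userId} (${event.changes.length} changes)`;
  }
}

console.log(handleUserEvent({ type: "user.created", userId: "u1" }));
```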

[Image: two columns labelled deep generics and conditional recursion, with a stopwatch overlaid on the slow column.]

4. Implicit any in inferred returns

The slowest helpers in a real codebase are often the ones that
look fine. An exported function that returns the result of a
chain of .map() and .filter() calls, with no return-type
annotation. The checker has to infer the entire chain, every
time it sees a call site, in every file that imports it.

Annotating the return type once moves that work to the
definition site, where it runs once. The same hygiene as fix #1.
The same reason --isolatedDeclarations exists.
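A minimal before/after sketch of that annotation; the row and summary shapes are hypothetical:

```typescript
interface RawRow { id: number; active: boolean }
interface UserSummary { id: number }

// Before: the return type is inferred from the chain, and every
// importing file re-runs that inference at its call sites.
function activeUserIdsInferred(rows: RawRow[]) {
  return rows.filter((r) => r.active).map((r) => ({ id: r.id }));
}

// After: the inference runs once, at the definition site; consumers
// see the finished UserSummary[] shape and nothing else.
function activeUserIds(rows: RawRow[]): UserSummary[] {
  return rows.filter((r) => r.active).map((r) => ({ id: r.id }));
}

console.log(activeUserIds([{ id: 1, active: true }, { id: 2, active: false }]));
```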

If you flip the flag on a single subdirectory (the fence-and-grow pattern: turn it on in one folder and grow outward), tsc --watch over that subdirectory is measurably faster.
Even without the flag, annotating return types on the top
exported helpers moves their inference cost off every consumer's
call site, which on a 200k-LOC repo is measured in seconds of
Check time.

The two tsconfig flags that pay for themselves

--extendedDiagnostics tells you which type code is slow.
Two compiler flags reduce how often that work runs in the
first place.

--incremental with --tsBuildInfoFile. The compiler
writes the program graph to disk and reads it back on the next
run. A cold tsc -b then skips re-checking files that haven't
changed since the last build. On a CI runner with a warm cache,
this can turn a six-minute typecheck into a forty-second one.

{
  "compilerOptions": {
    "incremental": true,
    "tsBuildInfoFile": "./.tscache/tsbuildinfo.json"
  }
}

The cache file is a few MB. Add it to .gitignore and fold its
location into your CI cache key. The win is per-CI-build; on a
dev machine, watch mode keeps the same graph in memory anyway.

Project references. A monorepo with one giant tsconfig
type-checks everything every time. Broken into project
references (one tsconfig per package, with references pointing
at dependencies), each package type-checks against its
dependencies' .d.ts, not their source.

{
  "compilerOptions": { "composite": true },
  "references": [
    { "path": "../core" },
    { "path": "../db" }
  ]
}

The packages compile in topological order. Watch mode runs at
the package level. A change in a leaf package recompiles only
that package. A change in core recompiles core, then any
package that references it. Both stages are bounded by the
package, not by the repo.

The setup cost is a few hours of carving the existing tsconfig
into pieces and getting tsc -b to build them in the right
order. The wall-clock saving on a six-package monorepo is large:
easily a majority of the cost of an incremental build, depending
on package count and dependency depth.

Catch regressions before they ship

Wire --extendedDiagnostics into one CI job that prints the
table on every PR. Diff it against main. If Instantiations
or Check time jumps by more than 20%, post a PR comment with
the diff. The reviewer sees the cost before merging the new
helper that's about to make everyone's editor slower. A small
bash script that greps the output covers most regressions a team
ever ships.
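A sketch of the parsing half of that CI job in Node/TypeScript. The 20% threshold matches the text above; the function names and the idea of a saved baseline table are assumptions, and the regexes target the table format shown earlier:

```typescript
// Pull the two lines we care about out of a saved
// --extendedDiagnostics table.
interface Metrics {
  instantiations: number;
  checkTimeSeconds: number;
}

function parseDiagnostics(table: string): Metrics {
  const num = (re: RegExp): number => {
    const m = table.match(re);
    if (!m) throw new Error(`missing metric: ${re}`);
    return parseFloat(m[1]);
  };
  return {
    instantiations: num(/Instantiations:\s+(\d+)/),
    checkTimeSeconds: num(/Check time:\s+([\d.]+)s/),
  };
}

// Flag the PR when either metric grew by more than 20% over main.
function regressed(base: Metrics, pr: Metrics, limit = 1.2): boolean {
  return (
    pr.instantiations > base.instantiations * limit ||
    pr.checkTimeSeconds > base.checkTimeSeconds * limit
  );
}
```

Wire it up by saving the table from a main-branch run as an artifact, parsing both tables in the PR job, and posting a comment when regressed returns true.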

What to do tomorrow

Run tsc --extendedDiagnostics. Look at three lines: Check time,
Instantiations, Types. If Check time is over 5 seconds, you have
a watch-mode problem worth fixing. If Instantiations is over 5
million on a sub-million-LOC project, one of the four shapes
above is doing it.

Fix the loudest one. Re-run. Watch the number drop. Then turn
on incremental and project references and stop fighting
tsc --watch for the rest of the year.


If this was useful

--extendedDiagnostics is one of the diagnostic tools
TypeScript in Production covers alongside
--generateTrace, the project-references topology, build
sequencing, and the publishing dance that decides whether your
package's .d.ts reaches a consumer's IDE quickly. If your day
job is keeping a published TypeScript codebase fast at scale,
that's the volume in the set to put on the desk.

For the type shapes that turn slow (generics depth, conditional
types, mapped recursion), The TypeScript Type System is the
deeper home. TypeScript Essentials covers the language
fundamentals that feed into a clean public surface. From JVM,
Kotlin and Java to TypeScript makes the bridge; from PHP 8+,
PHP to TypeScript covers the same ground.

The five-book set:

  • TypeScript Essentials — From Working Developer to Confident TS, Across Node, Bun, Deno, and the Browser — entry point: amazon.com/dp/B0GZB7QRW3
  • The TypeScript Type System — From Generics to DSL-Level Types — deep dive: amazon.com/dp/B0GZB86QYW
  • Kotlin and Java to TypeScript — A Bridge for JVM Developers — bridge for JVM devs: amazon.com/dp/B0GZB2333H
  • PHP to TypeScript — A Bridge for Modern PHP 8+ Developers — bridge for PHP devs: amazon.com/dp/B0GZBD5HMF
  • TypeScript in Production — Tooling, Build, and Library Authoring Across Runtimes — production layer: amazon.com/dp/B0GZB7F471

All five books ship in ebook, paperback, and hardcover.

The TypeScript Library — the 5-book collection
