Bryan MacLee

scrml's Living Compiler

It's Alive

"It's alive! It's alive!" — Henry Frankenstein, 1931

If you read the intro post, you saw scrml's headline pitch: one file, full stack, compiler does everything.

This post is about the design choice that scares me the most. The one where, every time I describe it, somebody pauses and says, "wait, you're doing *what*?"

scrml has a living compiler (Phase 4 of the implementation plan). The codegen layer isn't fixed. Community-contributed transformations compete for usage, the population decides which graduate to canonical, and the compiler evolves with the ecosystem instead of with a release cycle.

That's the monster. Let me explain how it's stitched together.


The frozen compiler is convenient until it isn't

Compilers, in the normal world, are frozen things. A standards committee meets. RFCs are drafted. Implementations follow. New patterns wait years. The codegen layer, in particular, is sacred; it's where the language's identity lives.

That model has good reasons behind it. Stability. Reproducibility. A clear blame surface when something breaks. You know what your compiler does because someone with merge rights decided what it does.

The cost of that model is everything that doesn't fit committee timelines. A new browser API ships and you wait three years for the codegen to adopt it. A pattern emerges in real apps that would compile to better output, but the maintainers can't justify the risk to existing users. A specific domain (games, dashboards, embedded) has codegen needs that the general-purpose path will never optimise for.

In the JavaScript world, we route around this with userland libraries. We npm install the new pattern, write a runtime adapter, and pay the cost in bundle size, indirection, and supply-chain risk. The compiler stays frozen and the ecosystem grows messy around it.

scrml does the opposite. The compiler stays alive and the package layer goes away.


The argument I'm extending

This is not my argument. It's an argument gingerBill has been making for years with Odin, and I'm pushing it one step further.

gingerBill's case — in talks, blog posts, and the Odin language itself — is that the package layer is the problem. Central registries, transitive dependencies, opaque update cadences aren't solving a problem; they are the problem, restated. Odin's answer is radical and consistent: no package manager, no registry, vendor everything, dependencies-as-liabilities.

scrml inherits that premise completely. Every time you read "no npm, no transitive trust, vendor-everything" in this post, that's gingerBill's argument, and it's worth reading him directly before reading the rest of what I'm about to say.

The living compiler is scrml taking that rejection seriously, then going one step further. Odin rejects package management. scrml rejects it too, and then notices that even if you vendor every library, the compiler itself is still a frozen dependency, and freezing the compiler has its own costs. So we keep the registry idea, but make it distribute compiler transformations instead of runtime code, and let the population drive what graduates.

That's the moon-shot. It only exists because gingerBill already made (and, in my opinion, proved) the first half of the argument.


What "living" means

Inside the compiler, a scrml program decomposes into a graph of transformation signatures — typed shapes that say "this construct, in this context, with this type, becomes this output." A signature is the unit of codegen. The compiler ships with one canonical transformation per signature in the box, and that's what your code compiles down to today.
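To make "signature as the unit of codegen" concrete, here is a minimal sketch in TypeScript. Every identifier in it (Signature, Transformation, sigKey, the "list-render" example) is invented for illustration; the real compiler's shapes live in the scrml repo.

```typescript
// A signature names a codegen slot: this construct, in this context,
// with this type. (Field names are assumptions, not scrml's actual API.)
interface Signature {
  construct: string;   // e.g. "list-render"
  context: string;     // e.g. "component-body"
  type: string;        // e.g. "Array<Item>"
}

// A minimal AST node shape, purely illustrative.
interface AstNode {
  kind: string;
  children: AstNode[];
  text?: string;
}

// A transformation is the unit of codegen: AST node in, emitted JS out.
type Transformation = (node: AstNode) => string;

// The compiler ships one canonical transformation per signature.
const canonical = new Map<string, Transformation>();

const sigKey = (s: Signature) => `${s.construct}|${s.context}|${s.type}`;

canonical.set(
  sigKey({ construct: "list-render", context: "component-body", type: "Array<Item>" }),
  (_node) => `items.map(item => render(item))` // placeholder emitted JS
);
```

The key property is that the map's values are build-time functions, not runtime code: swapping one out changes what gets emitted, not what ships.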

Now imagine the next part:

A developer writes an alternative transformation for one of those signatures. Maybe their version emits faster code for hot paths. Maybe it produces smaller output for a specific browser target. Maybe it specializes for an architecture pattern the canonical path didn't anticipate. They publish it.

Other developers can opt in to that alternative — explicitly, by use-ing it, the same way you'd use a syntax extension or a stdlib module. The compiler picks it up, runs it through a verification pass, and starts emitting that codegen for matching signatures in projects that opted in.

The interesting part isn't that alternatives can exist. It's what happens next.

The compiler observes which alternatives are getting used. Population-level signals — adoption, regression rate, performance deltas, error counts — feed a quality gate. Alternatives that stay green and grow adoption past a threshold graduate to canonical. Alternatives that regress get demoted. The canonical transformation for a given signature is whatever the population, through actual use, has converged on.
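The quality gate described above can be sketched as a pure function over population signals. The field names and thresholds here are all invented (the real gates are Phase 4 work); the point is the shape: regressions demote immediately, graduation needs both adoption and a measured win.

```typescript
// Aggregate, population-level signals for one transformation alternative.
// All names and thresholds below are assumptions for illustration.
interface PopulationSignals {
  adoptingProjects: number;   // projects that opted in
  greenBuildRate: number;     // fraction of opted-in builds that pass
  regressionReports: number;  // verified regressions attributed to it
  perfDelta: number;          // mean improvement vs canonical (+ is better)
}

type Verdict = "graduate" | "hold" | "demote";

function gate(s: PopulationSignals): Verdict {
  // Any verified regression, or a build failure rate above 1%, demotes.
  if (s.regressionReports > 0 || s.greenBuildRate < 0.99) return "demote";
  // Graduation needs broad adoption AND a real win over canonical.
  if (s.adoptingProjects >= 500 && s.perfDelta > 0) return "graduate";
  return "hold"; // stays an opt-in alternative
}
```

Note that `gate` only recommends; per the governance section below, humans confirm the verdict.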

The compiler evolves with the ecosystem. Not with a release cycle. Not with a committee. But with the people writing apps in it.


This isn't a package registry. It's a registry of transformations.

npm distributes runtime code. Every dependency you install becomes JavaScript that runs in your app. Transitive dependencies pull more JavaScript. The blast radius of a single bad package is everything downstream that runs your code, including code you didn't write and didn't read.

scrml's living-compiler registry distributes compile-time codegen patterns. A transformation alternative isn't a runtime library — it's a function that takes an AST node and emits compiled output. It runs once, at build time, in the compiler's process. Its output is the same kind of plain JS the canonical transformation would have emitted. There's no transitive dependency tree because compiled output isn't a dependency. Your app links against the compiler, not against the alternatives the compiler considered.

That changes the threat model in ways that matter:

  • No transitive runtime exposure. A bad transformation can emit bad code, but it can't pull in 47 sub-dependencies that each get to run on your users' devices.
  • Output is verifiable. The compiler runs a verification pass over emitted code (sandboxed evaluation, type-shape check, regression suite). A transformation that emits something the verifier rejects never ships.
  • The blast radius is the signature, not the app. A bad transformation affects the codegen for one signature, in projects that explicitly opted in to that alternative. It doesn't get to touch unrelated parts of your code.
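The verification pass in the second bullet can be sketched as a pipeline of checks over emitted output. The two example checks here are crude stand-ins I invented; the real pass (sandboxed evaluation, type-shape check, regression suite) is far richer.

```typescript
// A check inspects emitted JS and passes or fails it.
type Check = (emitted: string) => boolean;

// Emitted output ships only if it clears every check.
function verify(emitted: string, checks: Check[]): boolean {
  return checks.every((check) => check(emitted));
}

// Illustrative checks only: output must be non-empty, and must not reach
// for Node APIs (a toy stand-in for the real sandbox policy).
const notEmpty: Check = (js) => js.trim().length > 0;
const noNodeApis: Check = (js) => !/\brequire\s*\(|process\./.test(js);
```

A transformation whose output fails any check never reaches a user's build, which is what keeps the blast radius at the signature rather than the app.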

npm's problem isn't packages. It's the runtime trust model around packages. The living-compiler registry has a different trust model because it distributes a different thing.


Yes, you're allowed to be suspicious — there's a trust gradient

Three tiers, named at project creation. Pick the one that matches your paranoia level:

scrml init (default — the living compiler in full). Quality-gated alternatives are pulled in based on signatures in your code. Audit log committed to your repo so you can see exactly which transformations the build used. Fast, evolves with the ecosystem, takes the population's word for what's canonical.

scrml init --stable (lock-file mode). Same registry, but pinned. Your project locks the transformation set at init time and only updates when you say so. You get the population's choices but on your release cadence, not theirs. This is the closest analogue to a typical "lockfile + versioned deps" workflow.

scrml init --secure (vendor-everything). No registry trust. All transformations live in your repo, copied in as source, audited by you. The compiler never reaches out at build time. The population's signals don't touch your build. This is the gingerBill / Odin philosophy: dependencies are liabilities; vendor everything.
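To make --stable's pin concrete, here is what one locked entry could look like. The post only says the transformation set is locked at init and the build's choices are auditable, so every field name here is an assumption (the hash is a placeholder value, not a real registry artifact).

```typescript
// One pinned transformation in a hypothetical --stable lock file.
interface LockedTransformation {
  signature: string;  // which codegen slot this pins
  id: string;         // registry id of the chosen alternative
  version: string;
  sha256: string;     // content hash, so the pin is tamper-evident
  authorKey: string;  // fingerprint of the author's signing key
}

const lockEntry: LockedTransformation = {
  signature: "list-render|component-body|Array<Item>",
  id: "fast-list-render",
  version: "0.3.1",
  // Placeholder hash for illustration only.
  sha256: "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
  authorKey: "ed25519:9f86d081",
};
```

Pinning by content hash rather than by version alone is what makes "only updates when you say so" enforceable rather than advisory.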

The default is the bold one. The other two exist because not every project wants to be on the bleeding edge of the ecosystem's collective opinion, and that's fine.


The objections, plus what we're doing about them

"You're going to need telemetry to make this work." Yes — opt-in, anonymised, signature-shaped (no source code, no project identifiers). Population signals are aggregate counts: how many projects use this transformation, what fraction green-build with it, what their build-time and bundle-size deltas are. Projects on --secure participate in nothing. Projects on --stable opt in to whichever surface they want. The default tier enables a minimal surface, and the consent prompt at scrml init is concrete about exactly what's measured.
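A sketch of what "signature-shaped, anonymised" could mean in practice, with field names I've invented: per-build events carry only counters and deltas, and the registry only ever sees aggregates.

```typescript
// One telemetry event. Deliberately absent: file paths, project names,
// user ids, source code, emitted code. (All field names are assumptions.)
interface TelemetryEvent {
  transformationId: string; // which alternative ran
  signature: string;        // the codegen slot it matched
  buildGreen: boolean;      // did the build pass
  buildTimeDeltaMs: number; // vs the canonical baseline
  bundleSizeDeltaB: number; // vs the canonical baseline
}

function makeEvent(id: string, sig: string, green: boolean,
                   dtMs: number, dB: number): TelemetryEvent {
  return { transformationId: id, signature: sig, buildGreen: green,
           buildTimeDeltaMs: dtMs, bundleSizeDeltaB: dB };
}

// The registry side only ever works with aggregates like these.
function aggregate(events: TelemetryEvent[]) {
  const green = events.filter((e) => e.buildGreen).length;
  return {
    builds: events.length,
    greenRate: events.length ? green / events.length : 0,
  };
}
```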

"The supply-chain attack surface is enormous." It's a real surface, and it's the reason the verification pass and signing exist. Every transformation runs in a sandbox at build time. Output is verified against a regression suite before it can graduate. Transformations are cryptographically signed by their authors. Compromised authors get revoked at the registry layer. The threat model is documented and the mitigations are first-class concerns, not afterthoughts.

"Who decides when an alternative graduates?" The population does, against quality gates the compiler enforces. Humans confirm. The graduation step is git-committable to the registry repo, which is itself open. This is closer to the canary-test model used in big distributed systems than to a maintainer-pick or vote. The system recommends, humans decide, the recommendation is auditable.

"This sounds like it could go very wrong." It could. The closing argument for it is also the closing argument against it: the compiler grows with its users. If the ecosystem produces bad transformations, the ecosystem owns that. There is no maintainer-of-last-resort to blame. The compiler's identity is whatever the people writing apps in it have collectively converged on. That's either the most exciting governance story in language design or a slow-motion catastrophe, and honestly the difference comes down to whether we build verification, signing, and graduation to be as trustworthy as the language itself.


Where it is today

scrml is pre-1.0. The full living-compiler vision — the registry, the graduation pipeline, the telemetry surface — is Phase 4 work. The foundations are landing now: the use keyword that lets you opt in to vendored extensions, the vendor/ model that makes copy-everything trivial, the ^{} meta layer that lets local code override compiler behaviour today. Those are the rails the registry rolls onto.

What that means for you, today:

  • The use vendor:my-extension import path works — you can write a transformation alternative as a vendored module and use it in your project right now. It just doesn't graduate anywhere.
  • The compiler is open and watchable. You can read the canonical transformations, write your own, and see what the registry will eventually distribute.
  • The threat model and verification design are in the open. The deep-dives are in the scrml-support repo under docs/deep-dives/.

If the idea sounds interesting, the most useful thing is to write a couple of vendor: transformations now and see how it feels. The shape of those is the shape of what the registry will distribute later.
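To show the shape of such an alternative, here is a toy pair: a canonical path and a hot-path alternative for the same slot. Everything here (the node shape, both function names, the emitted JS) is invented for illustration; the real vendor: module format is in the scrml repo.

```typescript
// Minimal illustrative AST node shape.
interface AstNode { kind: string; children: AstNode[]; text?: string; }

// Illustrative canonical path: readable, allocates a closure per call.
const canonicalListRender = (_node: AstNode) =>
  `items.map(item => render(item))`;

// A vendored alternative for the same signature: emits an explicit loop,
// trading readability for speed on hot paths (the post's own example of
// why someone would write an alternative).
const fastListRender = (_node: AstNode) => [
  `const out = new Array(items.length);`,
  `for (let i = 0; i < items.length; i++) out[i] = render(items[i]);`,
].join("\n");
```

Both functions take the same node and emit plain JS; the difference the registry would eventually measure is in the output, not in anything that runs on your users' devices.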


It's alive

Henry Frankenstein doesn't get to keep his monster. The story is about hubris, the cost of stitching things together you didn't fully understand, and the obligations you take on when you give something its own life.

I think about that a lot. The living compiler is the part of scrml's design that makes me, the maintainer, least comfortable — and the part I'm most certain about. Frozen compilers are conservative for good reasons. Frozen compilers also calcify. The web ecosystem already has one of those, and we paid for it with node_modules.

The bet here is that giving the codegen layer its own life, with verification and signing and population-level quality gates around it, lets the ecosystem grow without needing a kingmaker — and that the obligations that come with that are obligations worth taking on.

It's alive. Now we have to keep it that way.


Try it / follow along

  • Landing + quick start: https://scrml.dev
  • Repo + spec: https://github.com/bryanmaclee/scrmlTS
  • X / Twitter: @BryanMaclee
  • Living-compiler design notes: look in scrml-support/docs/deep-dives/ for transformation-registry-design-2026-04-08.md and debate-transformation-registry-2026-04-08.md — the architecture is open-source, including the arguments against it.

If this design choice seems sketchy, that's because it is. I'm building this language to push the envelope into the future, or off a cliff. Either way, it should be a fun ride.

Reply here, on X (@BryanMaclee), or open an issue on the repo.
