Getting Deno and pnpm workspaces to play nicely turned out to be trickier than expected. Between Deno's refusal to import `.ts` files from `node_modules` and the need to manually rewrite `package.json` fields, here's how we wrangled everything into a working Effect Cluster deployment.
At Cap, we're big fans of Effect and are using their Workflow system for our durable workflow needs. Deploying this requires us to run an Effect Cluster, containing a number of machines running our workflows and one dedicated machine for orchestrating the runners.
After seeing this example of an Effect cluster being deployed on AWS ECS via SST, I decided we would also deploy our cluster on ECS and distribute the application as a Docker image. The application itself is a bunch of TypeScript that I chose to run on Deno, due to my existing familiarity with it and its modern feature set - namely `--watch` mode in dev, and the ability to execute `.ts` files.
First, I verified that the code would run: `deno run --allow-all src/runner.ts` worked, and that was good enough for me. Onto the Docker image!
Since Cap is a large monorepo, I wanted the Docker image to only contain the code relevant to the application. To help with this I chose to utilise pnpm's `deploy` command, which copies a package and all of its dependencies into a specified output directory. Crucially, the dependencies are hard-linked rather than symlinked to the global store, so the resulting output is entirely self-contained and doesn't contain dependencies from other packages in the monorepo. I then tried running the app using the previous approach but ran into a problem: Deno doesn't support importing `.ts` files from `node_modules`.
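For reference, the deploy step itself is a one-liner, run from the repo root (`out` is just our chosen output directory):

```sh
# Copy @cap/web-cluster and its hard-linked dependencies into ./out
pnpm deploy --filter=@cap/web-cluster out
```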
My initial attempt to solve this was to use tsdown to bundle the whole app to a collection of `.js` files, but this resulted in the bundled output attempting to import Node-only modules like `node:module`. I couldn't be bothered figuring out why that happened, so I decided against bundling the app itself.
Since Deno is fine with importing the app's `.ts` files but none of our dependencies', I figured we could just bundle our libraries to `.js` files in `dist` folders. tsdown made this easy, and after giving each package a `tsdown.config.ts` and a `"build": "tsdown"` script, our app's build script is just `pnpm run --filter @cap/web-cluster^... build`.
`run --filter @cap/web-cluster^...` tells pnpm to run the specified script for all the workspace dependencies of `@cap/web-cluster` - see pnpm's filtering docs.
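Each package's config is tiny. A minimal sketch (the entry path is an assumption - ours vary per package):

```ts
// tsdown.config.ts
import { defineConfig } from "tsdown";

export default defineConfig({
	entry: ["./src/index.ts"],
});
```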
Unfortunately, there was another problem: in our packages' `package.json` files, we reference `.ts` files in `src`. For our other apps this works great, since Next.js and Vite bundle everything without complaint, but for Deno this meant it wasn't being pointed to our `dist` folders.
To solve this, I took a page from my experience building packages like SolidBase. We export `.ts` files during development so that testing code changes doesn't require a rebuild, and using pnpm's `publishConfig` we override the `exports` field to reference the built `.js` files. The only problem with this is that we're not actually publishing the packages in our monorepo, so the `publishConfig` never gets used. Not to worry, we can use a script to crawl the `node_modules` produced by `pnpm deploy` and apply the `publishConfig` to our packages:
```ts
// scripts/post-deploy.ts
import { FileSystem } from "@effect/platform";
import { NodeContext, NodeRuntime } from "@effect/platform-node";
import { Effect } from "effect";

Effect.gen(function* () {
	const fs = yield* FileSystem.FileSystem;

	const dotPnpm = "./node_modules/.pnpm";
	const deps = yield* fs.readDirectory(dotPnpm);
	const capDeps = deps.filter((dep) => dep.startsWith("@cap"));

	for (const key of capDeps) {
		const pkgName = key.split("@file")[0].replace("+", "/");
		const pkgJsonPath = `${dotPnpm}/${key}/node_modules/${pkgName}/package.json`;

		let pkgJson = JSON.parse(yield* fs.readFileString(pkgJsonPath));

		if (pkgJson.publishConfig) {
			pkgJson = { ...pkgJson, ...pkgJson.publishConfig };
		}

		yield* fs.writeFileString(pkgJsonPath, JSON.stringify(pkgJson));
	}
}).pipe(Effect.provide(NodeContext.layer), NodeRuntime.runMain);
```
I've taken a liking to writing scripts with Effect recently - not having to import from `node:fs/promises` feels nice 😄
This script finds all the dependencies in our deployed `node_modules` that are workspace packages (for us they all start with `@cap`), reads the `package.json` for each package, and updates it with the contents of `publishConfig`, overwriting any existing values, with the goal of changing any `.ts` imports to `.js`. This works great provided that all packages have the necessary `publishConfig`.
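To make the transformation concrete, here's a minimal sketch of what the script does to a single package, using a made-up `@cap/database` package and a `.pnpm` directory name whose exact shape is an assumption on my part:

```typescript
// A hypothetical .pnpm directory entry for a workspace package linked via file:
const key = "@cap+database@file+packages+database";

// split on "@file" drops the version/path suffix; replace turns "+" back into "/".
// String.replace with a string argument only replaces the first occurrence,
// which is exactly what a scoped name like "@cap+database" needs.
const pkgName = key.split("@file")[0].replace("+", "/");
console.log(pkgName); // "@cap/database"

// Hypothetical package.json: dev exports point at .ts sources,
// publishConfig points at the built .js output.
const pkgJson = {
	name: "@cap/database",
	exports: { ".": "./src/index.ts" },
	publishConfig: { exports: { ".": "./dist/index.js" } },
};

// The same spread the post-deploy script performs: publishConfig keys win.
const merged = { ...pkgJson, ...pkgJson.publishConfig };
console.log(merged.exports); // { ".": "./dist/index.js" }
```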
With this script sorted, Deno successfully runs our app's `.ts` files, happily consuming our newly built `.js` files for dependencies. Now to codify this build -> `pnpm deploy` -> `post-deploy.ts` process into a Dockerfile:
```dockerfile
FROM node:24-slim AS base
RUN corepack enable

FROM base AS builder
WORKDIR /app
COPY . .

RUN corepack enable pnpm
RUN --mount=type=cache,id=pnpm,target=/root/.local/share/pnpm/store pnpm i --frozen-lockfile

RUN pnpm run --filter=@cap/web-cluster build
RUN pnpm deploy --filter=@cap/web-cluster out
RUN cd out && node scripts/post-deploy.ts

FROM denoland/deno:2.5.3 AS runner
WORKDIR /app

COPY --from=builder --chown=deno:deno /app/out /app

USER deno

ENTRYPOINT ["deno", "run", "--allow-all"]
```
Our app has multiple entrypoint files - one for the shard manager and one for the runner - so `ENTRYPOINT` allows us to run `src/shard-manager.ts` or `src/runner/index.ts` depending on the situation.
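In practice that means the container command just appends the desired entrypoint file. A sketch of running it locally (`cap-web-cluster` is a made-up image tag):

```sh
# Run the shard manager
docker run cap-web-cluster src/shard-manager.ts

# Run a workflow runner
docker run cap-web-cluster src/runner/index.ts
```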
At last, we have an optimised Docker image to deploy on ECS that runs our Effect Cluster, without having to remove the `.ts` imports from our workspace packages - a change that would otherwise ruin our dev experience. Getting ECS working was another story, but I'll save you that and just link our SST config that controls our cluster - I've had enough writing for one day!