The office network had rules. Strict ones.
No apt-get. No brew. No npm. No downloading binaries from the internet. If it wasn't on PyPI, it didn't exist. The IT policy was clear, the firewall was clearer, and the list of exceptions was empty.
I had one job: automate the documentation pipeline. Diagrams, architecture charts, flow diagrams — all written in Mermaid, all living as .mmd files in the repo, all needing to be rendered to SVG on every build. Simple enough, in theory.
The first thing I found was mermaid-cli. The official tool. Maintained by the Mermaid team themselves. I opened the installation docs, and the first line was:
```shell
npm install -g @mermaid-js/mermaid-cli
```
Closed the tab.
I kept searching. There was a Python package — mermaid-cli on PyPI. I felt a small rush of hope. I ran pip install. It installed. I ran it.
It printed:
```shell
playwright install chromium
```
Of course. Under the hood, it needed a browser. And installing a browser meant downloading a binary from the internet, outside of PyPI, which the network blocked. Even if it hadn't — I didn't want a browser. A browser meant hundreds of megabytes of dependency for what was, at its core, a text-to-SVG conversion.
The hope disappeared.
I sat with the problem for a while.
What does Mermaid.js actually need? I read the source. It needs a DOM. Not a full browser with tabs and network requests and a GPU process — just a DOM. document.createElement. querySelector. CSS computed styles. The ability to measure text. That's it.
The reason everyone reaches for a browser is that browsers are where DOMs live. But a DOM and a browser aren't the same thing.
I remembered PhantomJS.
Most people think PhantomJS is dead. And for what it was originally built for — web scraping, UI testing, automated screenshots of modern sites — it is. Headless Chrome displaced it for those use cases, maintenance was suspended in 2018, and the project hasn't had a release since 2016.
But PhantomJS is, underneath all of that, a self-contained WebKit binary. It has a JavaScript engine. It has a real DOM. And it ships as a single executable file — no installation, no system dependencies, no apt-get required.
More importantly: it was on PyPI. Wrapped, bundled, ready to pip install.
The question was whether I could build something thin and clean on top of it. Not a web scraping tool. Not a browser automation framework. Just: run this JavaScript file, give it a DOM, capture what it prints to stdout.
That was phasma.
The first version was small. Almost embarrassingly small. A Python class that started a PhantomJS subprocess, wrote a JS file to a temp directory, ran it, and captured the output. No async, no fancy API, no browser context abstraction. Just driver.exec.
```python
from phasma.driver import exec as run_js

output = run_js("render_diagram.js", capture_output=True)
```
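For readers curious what sits behind a call like that, here's a minimal sketch of the pattern: spawn the PhantomJS binary as a subprocess, point it at a script written to a temp directory, and capture stdout. This is a hypothetical reconstruction, not phasma's actual internals — the `PhantomDriver` class, its method names, and the `binary` path are all illustrative.

```python
import subprocess
import tempfile
from pathlib import Path


class PhantomDriver:
    """Illustrative sketch of a thin PhantomJS driver (not phasma's real code)."""

    def __init__(self, binary="phantomjs"):
        # Path to the self-contained PhantomJS executable
        self.binary = binary

    def build_command(self, script_path):
        # PhantomJS takes the script to run as its first positional argument
        return [self.binary, str(script_path)]

    def exec(self, js_source, timeout=30):
        # Write the JS to a temp file, run it, return whatever it printed
        with tempfile.TemporaryDirectory() as tmp:
            script = Path(tmp) / "script.js"
            script.write_text(js_source)
            result = subprocess.run(
                self.build_command(script),
                capture_output=True,
                text=True,
                timeout=timeout,
            )
            return result.stdout
```

The whole trick is that "run this JS, give it a DOM, read stdout" needs nothing fancier than `subprocess.run` — the DOM comes for free because the binary is WebKit.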
I pointed it at Mermaid.js. Wrote a small script that loaded the library, created a DOM element, called mermaid.render(), and printed the SVG to stdout.
It worked.
The whole thing — PhantomJS starting up, loading Mermaid, rendering the diagram, printing SVG — took about 800 milliseconds. For a CI pipeline that ran once per push, that was completely acceptable.
mmdc was maybe two hundred lines of Python on top of that. Read the .mmd file. Pass the content to phasma. Capture the SVG. Write it to disk. Done.
```shell
pip install mmdc
mmdc --input architecture.mmd --output architecture.svg
```
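The core of that flow is simple enough to sketch. Everything below is illustrative, not mmdc's actual code: `render_file`, the `RENDER_JS` template, and the injected `run_js` callable are hypothetical stand-ins for the real read-render-write pipeline.

```python
from pathlib import Path

# Hypothetical JS template: in the real tool this would load the bundled
# mermaid.js, call mermaid.render(), and console.log the resulting SVG.
RENDER_JS = """
var source = {source!r};
// ... load mermaid.min.js, render `source`, print the SVG to stdout ...
"""


def render_file(input_path, output_path, run_js):
    """Read a .mmd file, hand the source to a JS runner, write the SVG.

    `run_js` is any callable that takes JS source and returns its stdout —
    for example, a phasma driver.
    """
    source = Path(input_path).read_text()
    svg = run_js(RENDER_JS.format(source=source))
    Path(output_path).write_text(svg)
    return output_path
```

Injecting the runner keeps the file-handling logic trivially testable without a PhantomJS binary on the machine, which is roughly why "two hundred lines of Python" was enough.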
No Node.js. No npm. No browser. No apt-get. Just pip — the one thing the network allowed.
There's a version of this story where I found a better solution. Where someone had already built the right thing and I just hadn't searched hard enough. Where the constraint turned out to be navigable with an existing tool.
That version didn't happen.
What happened instead is that the constraint — only PyPI, nothing else — pushed me into a corner narrow enough that the only way out was to build something. And the thing I built turned out to be useful beyond the original problem.
People use mmdc now in Docker containers where they don't want a browser. In CI pipelines where Node.js isn't available. In air-gapped environments where the internet doesn't exist. The constraint that created the tool turns out to be a constraint a lot of people have.
phasma grew a little after that. A Playwright-inspired async API got added — not because mmdc needed it, but because the lower layer was interesting enough to build on. That part is still rough around the edges, still needs work, still has edge cases that aren't handled cleanly. It's the part of the project that's most alive, and most in need of people who want to dig into Python async internals. The door is open.
But the core — driver.exec, a bundled PhantomJS binary, a DOM you can use from Python with nothing but pip — that part works. It works because it had to.
The firewall never had to open. The diagrams appeared in the documentation. The pipeline ran.
The constraint didn't block the solution — it was the solution.
Links:
- phasma on GitHub — if the async API interests you, PRs are open
- mmdc on GitHub
- phasma on PyPI · mmdc on PyPI