Your README has a Setup section. It has eight steps, two of which are out of date, and step 5 assumes you're on macOS. A new dev joins the team and spends their first afternoon figuring out which version of Node you're actually on.
You can do better than that. One .bp file in the repo and one command is all it takes.
What Blueprint Does
Blueprint is a declarative rule engine for development environments. You write a plain-text .bp file that describes what a machine needs (packages, language versions, dotfiles, SSH config, secrets, cron jobs), and Blueprint applies it.
install git curl on: [mac, linux]
mise python 3.12.0
mise node 20.11.0
clone https://github.com/yourorg/dotfiles.git to: ~/.dotfiles
run "pip install -r requirements.txt" after: mise-python
run "pre-commit install" after: pip-install
blueprint apply project.bp
Same result on a fresh laptop, an existing machine, or CI. That's reproducibility.
The Best Parts
Removing something actually removes it
This is the thing that makes Blueprint genuinely different from shell scripts and most setup tools.
Most tools are additive. You add a package, it installs. You remove it from the config, nothing happens. Over time every machine in your team drifts into a slightly different state.
Blueprint tracks everything it touches in ~/.blueprint/status.json. Remove a package from your .bp file, run blueprint apply again, and it uninstalls. Your environment stays in sync with your config. Always.
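For example, suppose yesterday's blueprint installed jq and today's revision drops it (the package names are illustrative, and the # comments are annotations here, not necessarily .bp syntax):

```
# yesterday
install git curl jq on: [mac, linux]

# today: jq removed from the rule
install git curl on: [mac, linux]
```

Running blueprint apply against today's file uninstalls jq, because ~/.blueprint/status.json records that Blueprint installed it in the first place.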
Preview before you touch anything
blueprint plan project.bp
Dry run. Shows exactly what would run, without running it. New teammate on an existing machine? They run plan first. No surprises.
It writes cross-platform for you
Write install git. Blueprint runs brew install git on macOS and apt-get install -y git on Linux. No conditionals in your config, no separate files per platform.
Use on: [mac] or on: [linux] when a rule really is platform-specific. Otherwise write it once.
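A sketch of how the two styles mix in one file (package names are illustrative):

```
install git curl ripgrep
install coreutils on: [mac]
install build-essential on: [linux]
```

The first rule runs everywhere through the native package manager; the other two only fire on the named platform.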
Apply directly from a Git repo
blueprint apply @github:yourorg/yourrepo project.bp
No cloning required. Send a new teammate one command and they're set up. Point it at a branch or a subdirectory; it handles both.
Your .bp file generates your Dockerfile
This one prevents "works on my machine" from becoming a Dockerfile problem.
You define your Python version once in .bp:
mise python 3.12.0
install build-essential on: [linux] stage: build
install curl on: [linux] stage: runtime
Then render your Dockerfile from a template:
blueprint render project.bp --template Dockerfile.tmpl --output Dockerfile
The template pulls straight from the blueprint:
FROM python:{{ mise "python" }}-slim AS builder
RUN apt-get install -y {{ packages "" "build" }}
FROM python:{{ mise "python" }}-slim
RUN apt-get install -y {{ packages "" "runtime" }}
When you bump the Python version in .bp, the Dockerfile updates with it. And if someone hand-edits the Dockerfile without updating the blueprint, CI catches it:
blueprint check project.bp --template Dockerfile.tmpl --against .
One source of truth. No drift.
Export to a shell script for CI and Docker
blueprint export project.bp --output setup.sh
Generates a standalone, idempotent shell script; no Blueprint is required to run it. Use it in CI pipelines, Docker builds, or onboarding scripts. Or pipe directly:
blueprint export @github:yourorg/setup | bash
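In a Docker build, the exported script can stand in for a long chain of RUN commands (a sketch; the base image and paths are placeholders):

```
FROM ubuntu:24.04
COPY setup.sh /tmp/setup.sh
RUN sh /tmp/setup.sh && rm /tmp/setup.sh
```

Because the script is idempotent, rerunning the layer after a cache bust is safe.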
Secrets in the repo, safely
blueprint encrypt ~/.ssh/id_rsa
Encrypts with AES-256-GCM. Commit the .enc file. Then in your blueprint:
decrypt id_rsa.enc to: ~/.ssh/id_rsa password-id: main
Blueprint prompts for the password at apply time, decrypts to the right place with 0600 permissions, and never persists the password. Multiple files sharing the same password-id share one prompt.
Local LLMs as a first-class action
ollama llama3 codellama on: [mac, linux]
Blueprint installs Ollama if it's not there, pulls the models, and, like everything else, removes them if you delete the rule and reapply. Local AI tooling is part of your reproducible setup, not a manual afterthought.
Modular configs for teams
include base.bp
include team/backend.bp
Shared base for everyone, specialized additions per role. The backend team gets their database tools; the ML team gets their Python stack.
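A hypothetical team/backend.bp that layers onto the shared base (tools and versions are placeholders, following the rule syntax shown above):

```
install postgresql redis on: [mac, linux]
mise go 1.22.0
```

Everyone applies the same entry file; the includes decide what each role actually gets.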
Full audit trail
Every apply is logged to ~/.blueprint/history.json with timestamps and output. blueprint status shows what's installed, from which blueprint file, and when. No more wondering why something is on a machine.
Getting Started
curl -fsSL https://install.getbp.dev | sh
Single binary. No runtime dependencies. Works on macOS and Linux.
Add a project.bp to your repo. Start with language versions and the packages from your README setup steps. Run blueprint plan to preview, blueprint apply to execute.
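A minimal project.bp to start from might look like this (the versions, packages, and the mise-node dependency id are assumptions patterned on the examples above):

```
mise node 20.11.0
install git jq on: [mac, linux]
run "npm ci" after: mise-node
```

Run blueprint plan project.bp first to see what it would do on your machine.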
Your README setup section becomes:
blueprint apply @github:yourorg/yourrepo project.bp
That's a reproducible project setup.