The standard Python AI tool install experience:
- Install Python (which version?)
- Create a venv
- pip install
- Hit ModuleNotFoundError: No module named 'textual'
- pip install again with correct extras
- Figure out PATH
Seven steps. Fifteen minutes. That's before you've even seen the tool.
We fixed this for pydantic-deep — the modular agent runtime for Python:
curl -fsSL https://raw.githubusercontent.com/vstorm-co/pydantic-deep/main/install.sh | bash
That's it.
What install.sh does
- Detects whether uv is installed
- If not: installs uv via the official Astral installer
- Runs uv tool install "pydantic-deep[cli]": isolated environment, binary available globally
- Verifies the install
- Prints PATH fix instructions if needed
No virtual environment management. No extras guessing. The cli extras group pulls in everything, including textual (the original bug: it was missing from the base install).
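The steps above can be sketched in a few lines of shell. This is a simplified illustration, not the actual install.sh from the repo (which handles more edge cases); the Astral installer URL and the PATH hint are assumptions worth checking against the real script.

```shell
#!/usr/bin/env bash
set -e

ensure_uv() {
  if command -v uv >/dev/null 2>&1; then
    echo "uv found: $(command -v uv)"
  else
    # Official Astral installer (assumed URL; verify at astral.sh)
    curl -LsSf https://astral.sh/uv/install.sh | sh
  fi
}

install_tool() {
  # Isolated tool environment, binary exposed globally, no venv to activate
  uv tool install "pydantic-deep[cli]"
}

verify_install() {
  if command -v pydantic-deep >/dev/null 2>&1; then
    echo "OK: $(command -v pydantic-deep)"
  else
    # uv ships a helper that appends its tool bin dir to the shell profile
    echo "pydantic-deep not on PATH; try: uv tool update-shell" >&2
    return 1
  fi
}
```

The check-before-install pattern is what makes the script safe to re-run: a second invocation skips the uv bootstrap entirely.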
Self-update
pydantic-deep update
Uses uv tool upgrade if available, falls back to pip. One command to stay current.
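The fallback logic behind a command like this can be sketched as follows; the real pydantic-deep update lives inside the CLI, so the function name here is illustrative.

```shell
# Prefer uv's tool-aware upgrade; fall back to pip when uv is absent.
do_update() {
  if command -v uv >/dev/null 2>&1; then
    uv tool upgrade pydantic-deep
  else
    pip install --upgrade "pydantic-deep[cli]"
  fi
}
```

Routing through uv keeps the upgrade inside the same isolated environment that uv tool install created, rather than touching whatever Python environment happens to be active.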
Startup notifications
Every invocation checks PyPI for updates silently (2-second timeout, 24-hour cache):
Update available: v0.3.6 → v0.3.7
Run: pydantic-deep update
Never blocks startup.
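The check described above can be sketched like this. The CLI presumably implements it in Python internally; this shell version only illustrates the cache-then-timeout logic, and the cache path is an assumption.

```shell
# Cache the latest published version for 24h; give the network 2s, tops.
CACHE="${HOME}/.cache/pydantic-deep/latest-version"

latest_version() {
  # Reuse the cached value if it is younger than 24 hours (1440 minutes)
  if [ -f "$CACHE" ] && find "$CACHE" -mmin -1440 | grep -q .; then
    cat "$CACHE"
    return 0
  fi
  mkdir -p "$(dirname "$CACHE")"
  # --max-time 2: give up after two seconds rather than delay startup
  curl -s --max-time 2 https://pypi.org/pypi/pydantic-deep/json \
    | sed -n 's/.*"version": *"\([^"]*\)".*/\1/p' | head -n1 \
    | tee "$CACHE"
}

notify_if_outdated() {
  current="$1"
  latest="$(latest_version || true)"   # network failure → empty → silence
  if [ -n "$latest" ] && [ "$latest" != "$current" ]; then
    echo "Update available: v${current} → v${latest}  Run: pydantic-deep update"
  fi
}
```

The important property is that every failure mode (no network, slow PyPI, missing cache dir) degrades to printing nothing, so startup is never blocked.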
Why uv?
uv tool install is designed exactly for this use case — isolated tool environments, global binary access, no activation required. Fast, well-maintained, increasingly standard for Python CLI tooling.
Alternatives considered: pipx (slower, needs separate install), Homebrew tap (maintenance overhead), native binary (too brittle for dynamic imports).
Full write-up with implementation details: oss.vstorm.co/blog/pydantic-deep-one-command-install-curl-bash
GitHub: github.com/vstorm-co/pydantic-deep
What's the worst install experience you've had with an AI/ML tool?