Turn prompts into real CLI programs: .prompt → command, with --help, args, stdin/stdout, and pipes. promptcmd (Rust, 300+ commits, 44★) supports multiple providers, caching, and model variants. If you like terminal tooling, read: https://news.ycombinator.com/item?id=47440358
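To make the ".prompt → command" idea concrete, here is a toy sketch: the file name summarize.prompt, its layout, and the sed-based render step are all my illustration, not promptcmd's actual format or pipeline. It just shows how a Handlebars-style template plus stdin can behave like a pipeable command.

```shell
# Hypothetical prompt file (Handlebars-style variables; illustrative only).
cat > summarize.prompt <<'EOF'
Summarize the following text in {{style}} style:
{{input}}
EOF

# Stand-in for what a generated command does before calling a model:
# fill template variables (here with sed, real tools use a template engine),
# with stdin bound to {{input}}.
render() {
  sed -e "s/{{style}}/$1/" -e "s/{{input}}/$(cat)/" summarize.prompt
}

# Composes like any other filter in a pipe.
echo "hello world" | render terse
```

The point of the design is that the shell, not the tool, handles composition: once a prompt is a command, redirects, pipes, and xargs all work unmodified.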
The clever bit: promptctl ssh user@host forwards your local prompt daemon so remote shells run LLM commands locally. No API keys or internet access required on the server. Great for air‑gapped hosts — but remember the tunnel extends your local agent's trust to the remote shell, and audit logging still matters.
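For intuition, plain OpenSSH can express the same shape of forwarding; the socket path below is an assumption of mine, not promptcmd's actual layout, and promptctl presumably wraps something along these lines.

```shell
# Rough shape of "forward the local daemon to the remote host" using
# stock OpenSSH Unix-socket remote forwarding (OpenSSH 6.7+):
#   -R remote_sock:local_sock   expose the local daemon socket on the server
#   StreamLocalBindUnlink=yes   replace a stale socket left by a prior session
# Paths are illustrative; keys and model traffic stay on your machine.
ssh -R /tmp/prompt.sock:/tmp/prompt.sock \
    -o StreamLocalBindUnlink=yes \
    user@host
```

Anything on the server that talks to /tmp/prompt.sock then reaches your local agent, which is exactly why the trust and logging caveat above matters: every remote user who can open that socket can spend your tokens.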
Practical flow: author a Handlebars .prompt, enable it with promptctl enable, then script it — e.g. cat compose.yml | askdocker "add a load balancer", or ai-analyze-logs --container nginx. Provider pooling (OpenAI/Anthropic/Ollama/OpenRouter/Google) plus a response cache keeps repeated calls from re-spending tokens.
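The caching half of that claim is easy to sketch. This is a toy stand-in, not promptcmd's implementation: key the cache on a hash of model + prompt, and serve repeats from disk so no tokens are spent. fake_llm_call and the cache directory are my placeholders.

```shell
fake_llm_call() { echo "reply to: $2"; }   # placeholder for a real provider call

# Toy response cache: identical (model, prompt) pairs hit the same file.
cached_ask() {
  model="$1"; prompt="$2"
  key=$(printf '%s\n%s' "$model" "$prompt" | sha256sum | cut -d' ' -f1)
  cache_file="${TMPDIR:-/tmp}/toy-llm-cache/${key}"
  if [ -f "$cache_file" ]; then
    cat "$cache_file"                       # hit: served from disk, zero tokens
  else
    mkdir -p "$(dirname "$cache_file")"
    fake_llm_call "$model" "$prompt" | tee "$cache_file"   # miss: call + store
  fi
}

cached_ask demo-model "What is a reverse proxy?"   # miss: calls the provider
cached_ask demo-model "What is a reverse proxy?"   # hit: read back from cache
```

Hashing model + prompt together is the important detail: the same prompt against a different provider or model variant must not collide, or pooling would return stale answers from the wrong backend.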
My take: solid tool for terminal-first builders who want composable prompts and secure remote use. Needs clearer auth/audit docs and failure-mode guidance for CI. Would you run LLM-backed steps via SSH-forwarding in your CI or keep keys on runners?