
Harsh Singh
I Built an Offline-First AI Copilot for the Terminal – Meet Promptly

Hey folks! πŸ‘‹
Like many developers, I often found myself Googling things like:
"kill process on port 3000"
"revert last git commit"
"list all large files in a directory"
So I built something to fix that: Promptly – an AI-powered, offline-first assistant for your terminal. 🧠⌨️

What is Promptly?
Promptly turns plain English into shell commands you can trust. It works like a copilot, but for your terminal:

πŸ’¬ "List all Python files modified today"
➑️ find . -name '*.py' -mtime 0
You get:
πŸ”Ή The command
πŸ”Ή A plain-English explanation
πŸ”Ή A confirmation prompt before execution
All of this happens either offline using a local LLM, or online via OpenAI (if enabled).
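The confirm-before-execute step can be sketched in a few lines of Rust. This is a minimal illustration of the idea, not Promptly's actual code; the function names are mine, and a real implementation would also show the explanation alongside the prompt.

```rust
use std::io::{self, BufRead, Write};
use std::process::Command;

// Interpret the user's typed answer: only an explicit "y"/"yes" confirms.
fn confirmed(answer: &str) -> bool {
    matches!(answer.trim().to_lowercase().as_str(), "y" | "yes")
}

// Run the command through the shell only if the user confirmed; return its
// stdout, or None if the user declined and nothing was executed.
fn run_if_confirmed(cmd: &str, answer: &str) -> io::Result<Option<String>> {
    if !confirmed(answer) {
        return Ok(None);
    }
    let out = Command::new("sh").arg("-c").arg(cmd).output()?;
    Ok(Some(String::from_utf8_lossy(&out.stdout).into_owned()))
}

fn main() -> io::Result<()> {
    print!("Run `echo hello`? [y/N] ");
    io::stdout().flush()?;
    let mut answer = String::new();
    io::stdin().lock().read_line(&mut answer)?;
    match run_if_confirmed("echo hello", &answer)? {
        Some(stdout) => print!("{stdout}"),
        None => println!("aborted"),
    }
    Ok(())
}
```

Defaulting to "No" on anything but an explicit yes is the safe choice here: an accidental Enter should never run a generated command.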

βš™οΈ Features
βœ… Natural Language β†’ Shell Commands
βœ… Offline mode (via Ollama or llm-rs)
βœ… Online fallback (OpenAI-compatible APIs)
βœ… Plugin system (Git, Docker, AWS coming soon)
βœ… Safe command execution with dry-run checks
βœ… Built in Rust for speed, reliability, and safety
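To make the dry-run safety check concrete, here is a hypothetical sketch of what such a gate could look like. The pattern list and function name are mine for illustration, not Promptly's actual rules: flag commands matching obviously destructive shapes and surface them for review instead of auto-running them.

```rust
// Hypothetical dry-run gate: refuse to auto-run commands that match
// obviously destructive patterns. The pattern list is illustrative only.
fn needs_review(cmd: &str) -> bool {
    const RISKY: [&str; 4] = ["rm -rf", "mkfs", "dd if=", "> /dev/sd"];
    RISKY.iter().any(|pattern| cmd.contains(pattern))
}

fn main() {
    for cmd in ["ls -la", "rm -rf /tmp/build"] {
        if needs_review(cmd) {
            println!("review required: {cmd}");
        } else {
            println!("safe to run:     {cmd}");
        }
    }
}
```

A production version would need smarter matching than substring checks, but the principle is the same: the model proposes, a deterministic check disposes.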

How It Works?
Promptly is built around a modular Rust core and can be used:
πŸ”Ή As a standalone CLI tool
πŸ”Ή As a plugin integrated into your terminal (e.g. in VS Code or Alacritty)
πŸ”Ή With a Tauri-based GUI in the works
You can choose which LLM powers your assistant β€” local-first models like CodeLlama, DeepSeek-Coder, or WizardCoder via Ollama.
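For the local-first path, talking to a model served by Ollama comes down to a POST against its local `http://localhost:11434/api/generate` endpoint; that endpoint and its `model`/`prompt`/`stream` fields are Ollama's documented API. The helper below, which builds the JSON body by hand rather than with serde, is just a sketch of mine, not Promptly's internals.

```rust
// Build the JSON body an Ollama-backed mode could POST to the local
// /api/generate endpoint. `stream: false` requests one complete JSON
// response instead of a stream of chunks. No quote escaping is done here;
// a real implementation would serialize with serde_json.
fn ollama_request_body(model: &str, prompt: &str) -> String {
    format!(
        r#"{{"model":"{}","prompt":"{}","stream":false}}"#,
        model, prompt
    )
}

fn main() {
    let body = ollama_request_body("codellama", "List all Python files modified today");
    println!("{body}");
}
```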

πŸ“¦ Try It Out
🌐 Website: https://shell-assistant-promptly.vercel.app
πŸ’» GitHub: https://github.com/Harshcreator/promptly/tree/main/shell-assistant
Install instructions are on the site β€” just clone and run. It’s free and open source.

πŸ“£ Why I Built This
I wanted something that worked without needing to call OpenAI for every command, especially for tasks that are repetitive or security-sensitive.
Privacy, speed, and local intelligence were key.
And I wanted devs to be able to extend it easily β€” so I built a plugin system from day one.

πŸ™Œ What’s Next?
Better vector-based local learning (Qdrant + embeddings)
More plugins (Terraform, K8s, AWS CLI)
TUI interface
VS Code extension

πŸ—£οΈ Feedback & Contributions
If you like the project or have feature ideas, I’d love your feedback or contributions! Drop a star on GitHub ⭐ or open an issue.
