Hey folks! 👋
Like many developers, I often found myself Googling things like:
"kill process on port 3000"
"revert last git commit"
"list all large files in a directory"
So I built something to fix that: Promptly, an AI-powered, offline-first assistant for your terminal. 🧠⌨️
What is Promptly?
Promptly turns plain English into shell commands you can trust. It works like a copilot, but for your terminal:
π¬ "List all Python files modified today"
β‘οΈ find . -name '*.py' -mtime 0
You get:
🔹 The command
🔹 A plain-English explanation
🔹 A confirmation prompt before execution
All of this happens either offline using a local LLM, or online via OpenAI (if enabled).
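To make that concrete, here's roughly what a session looks like (the output layout below is my own illustration, not a verbatim transcript of the CLI):

```sh
# Ask in plain English; Promptly proposes a command, explains it,
# and asks for confirmation before anything runs.
$ promptly "kill the process listening on port 3000"

  Command:     kill $(lsof -ti :3000)
  Explanation: lsof -ti :3000 prints the PID(s) bound to port 3000;
               kill sends them SIGTERM.

  Run this command? [y/N]
```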
⚙️ Features
✅ Natural Language → Shell Commands
✅ Offline mode (via Ollama or llm-rs)
✅ Online fallback (OpenAI-compatible APIs)
✅ Plugin system (Git, Docker, AWS coming soon)
✅ Safe command execution with dry-run checks (sketched below)
✅ Built in Rust for speed, reliability, and safety
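Here's a sketch of how the dry-run check might look in practice (the `--dry-run` flag name is my placeholder; check the repo for the real option):

```sh
# Hypothetical flag: preview the generated command without executing it.
$ promptly --dry-run "delete all .log files older than 7 days"

  Command: find . -name '*.log' -mtime +7 -delete
  Dry run: nothing was executed.
```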
How It Works?
Promptly is built around a modular Rust core and can be used:
- As a standalone CLI tool
- As a plugin integrated into your terminal (like in VS Code or Alacritty)
- With a Tauri-based GUI in the works
You can choose which LLM powers your assistant: local-first models like CodeLlama, DeepSeek-Coder, or WizardCoder via Ollama.
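If you want to go fully local, getting a model ready through Ollama takes the two standard commands below; how you point Promptly at it depends on its config, so treat the commented line as a placeholder:

```sh
# Download a local code model and start the Ollama server (real Ollama CLI).
ollama pull codellama
ollama serve

# Placeholder invocation; see the repo for how Promptly selects its backend.
# promptly --model codellama "list all large files in this directory"
```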
📦 Try It Out
🌐 Website: https://shell-assistant-promptly.vercel.app
💻 GitHub: https://github.com/Harshcreator/promptly/tree/main/shell-assistant
Install instructions are on the site: just clone and run. It's free and open source.
📣 Why I Built This
I wanted something that worked without needing to call OpenAI for every command, especially for tasks that are repetitive or security-sensitive.
Privacy, speed, and local intelligence were key.
And I wanted devs to be able to extend it easily, so I built a plugin system from day one.
🚀 What's Next?
- Better vector-based local learning (Qdrant + embeddings)
- More plugins (Terraform, K8s, AWS CLI)
- TUI interface
- VS Code extension
🗣️ Feedback & Contributions
If you like the idea or have feature requests, I'd love your feedback or contributions! Drop a star on GitHub ⭐ or open an issue.