The goal was simple: I wanted AI help from the terminal without defaulting to a hosted API, and I wanted the shell workflow to feel like a real tool instead of a novelty.
So far it supports:
- shell mode
- code mode
- explain mode
- saved sessions
- file context
- local cache
- reusable prompt roles
- guarded command execution
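To illustrate the guarded-execution idea: before a model-suggested shell command runs, the tool asks for explicit confirmation. Here is a minimal sketch of that pattern — the function name and structure are my own for illustration, not ollama-sgpt's actual internals:

```python
import shlex
import subprocess

def confirm_and_run(command, ask=input):
    """Show a model-suggested shell command and run it only after
    explicit confirmation -- a guard against blindly executing model output."""
    answer = ask(f"Run `{command}`? [y/N] ").strip().lower()
    if answer != "y":
        return None  # user declined; nothing is executed
    # shlex.split + no shell=True, so the string isn't re-interpreted by a shell
    return subprocess.run(shlex.split(command), capture_output=True, text=True)

# Declining leaves the command unexecuted:
result = confirm_and_run("echo hello", ask=lambda _: "n")
print(result)  # None
```

The `ask` parameter is injected so the guard is testable without a real TTY; the same shape works for an `--auto-approve` flag by passing a function that always returns `"y"`.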
Example usage:

```shell
ollama-sgpt --shell "list all Python files recursively"
ollama-sgpt --code "write a Python function to reverse a string"
ollama-sgpt --context app.py "review this code"
```
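For context on the local-first part: a tool like this talks to Ollama's REST API on `localhost:11434` instead of a hosted endpoint. A minimal sketch of building such a request (the model name and helper are assumptions, not ollama-sgpt's code):

```python
import json
import urllib.request

def build_request(prompt, model="llama3", host="http://localhost:11434"):
    """Build a request against Ollama's /api/generate endpoint.
    stream=False asks for a single JSON response instead of a token stream."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return urllib.request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("list all Python files recursively")
print(req.full_url)  # http://localhost:11434/api/generate
```

Sending it with `urllib.request.urlopen(req)` requires a running Ollama daemon; everything stays on your machine.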
Install:

```shell
pipx install git+https://github.com/sadorect/ollama-sgpt.git
```
Repo:
https://github.com/sadorect/ollama-sgpt
If you try it, I'd love feedback on the shell UX, PowerShell behavior, and whether the local-first workflow is actually compelling in day-to-day use.