Ever written a commit message like "fix stuff" or "updates"? Yeah, me too. Last weekend, I built a solution: AI Commit - a CLI tool that generates intelligent commit messages using local AI.
## The Problem
We've all been there. You've spent hours writing code, and when it's time to commit:
```shell
git commit -m "fix stuff"
git commit -m "updates"
git commit -m "final version"
git commit -m "actually final this time"
```
This creates several issues:
- Unclear history - Good luck finding that bug fix from 3 months ago
- Poor collaboration - Team members can't understand what changed
- Difficult code reviews - Reviewers waste time deciphering changes
- Broken automation - Changelog generators fail without proper commits
## The Solution
AI Commit analyzes your git diff and generates meaningful commit messages following best practices.
### Features

- **AI-Powered** - Uses local Ollama models (llama2, codellama, mistral)
- **Privacy-First** - Everything runs locally, no cloud APIs
- **Free** - No API keys, no costs, unlimited usage
- **Fast** - Local generation in seconds
- **Multiple Styles** - Conventional, Semantic, or Detailed formats
- **Offline** - Works without internet connection
### Example Output

Input:

```shell
git add src/auth.py
ai-commit
```

Output:

```
┌─────────────────────────────────────────────┐
│           AI Commit Message Tool            │
└─────────────────────────────────────────────┘

Generated Commit Message:
──────────────────────────────────────────────────
feat(auth): implement JWT-based authentication

- Add user login and logout endpoints
- Implement token validation middleware
- Create secure session management
- Add password hashing with bcrypt
──────────────────────────────────────────────────

Options:
  y - Accept and commit
  r - Regenerate message
  e - Edit message
  n - Cancel
```
## How It Works

### Architecture

```
┌──────────────┐
│     User     │
└──────┬───────┘
       │ git add .
       │ ai-commit
       ▼
┌──────────────┐
│  GitService  │──── get diff
└──────┬───────┘
       ▼
┌─────────────────┐
│ CommitGenerator │
└──────┬──────────┘
       │ analyze & prompt
       ▼
┌──────────────┐
│ OllamaClient │
└──────┬───────┘
       │ API call
       ▼
┌──────────────┐
│ Local Ollama │
└──────┬───────┘
       │ generated message
       ▼
┌──────────────┐
│ User (review)│
└──────────────┘
```
### Workflow

- **Stage Changes** - User runs `git add`
- **Analyze Diff** - Tool reads staged changes
- **Generate Prompt** - Creates context for AI
- **Call Ollama** - Sends to local LLM
- **Parse Response** - Cleans up AI output
- **User Review** - Interactive options
- **Create Commit** - Executes `git commit`
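The steps above can be sketched as one pipeline function. This is a minimal sketch, not the tool's actual code: `run_pipeline` is a hypothetical helper, and the `get_diff`, `generate`, `ask`, and `commit` callables stand in for the real services so the loop can be exercised without git or Ollama.

```python
from typing import Callable, Optional


def run_pipeline(
    get_diff: Callable[[], Optional[str]],
    generate: Callable[[str], Optional[str]],
    ask: Callable[[str], str],
    commit: Callable[[str], bool],
) -> bool:
    """Glue the workflow steps together; returns True if a commit was made."""
    diff = get_diff()
    if not diff:
        return False          # no repo or nothing staged
    while True:
        message = generate(diff)
        if message is None:
            return False      # model unavailable
        choice = ask(message)
        if choice == "y":
            return commit(message)
        if choice == "r":
            continue          # regenerate with the same diff
        return False          # 'n' (the 'e' edit path is omitted here)
```

Injecting the dependencies keeps the control flow trivially testable with plain lambdas.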
## Technical Implementation

### 1. Git Integration

```python
import subprocess
from typing import Optional


class GitService:
    @staticmethod
    def get_staged_diff() -> Optional[str]:
        """Get the diff of staged changes."""
        try:
            result = subprocess.run(
                ['git', 'diff', '--cached'],
                capture_output=True,
                text=True,
                check=True
            )
            return result.stdout
        except subprocess.CalledProcessError:
            return None

    @staticmethod
    def commit(message: str) -> bool:
        """Create a git commit."""
        try:
            subprocess.run(
                ['git', 'commit', '-m', message],
                check=True
            )
            return True
        except subprocess.CalledProcessError:
            return False
```
### 2. Ollama Integration

```python
import requests
from typing import Optional


class OllamaClient:
    def __init__(self, base_url: str = "http://localhost:11434",
                 model: str = "llama2"):
        self.base_url = base_url.rstrip('/')
        self.model = model

    def generate(self, prompt: str) -> Optional[str]:
        """Generate text using Ollama."""
        try:
            response = requests.post(
                f"{self.base_url}/api/generate",
                json={
                    "model": self.model,
                    "prompt": prompt,
                    "stream": False
                },
                timeout=30
            )
            if response.status_code == 200:
                return response.json().get('response', '').strip()
            return None
        except requests.exceptions.RequestException as e:
            print(f"Error: {e}")
            return None
```
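A related need is checking that the configured model is actually installed. Ollama lists installed models at `GET /api/tags`. Here is a minimal stdlib-only sketch (the client above uses `requests`, but `urllib` avoids the extra dependency; `parse_model_names` and `list_models` are illustrative helpers, not part of the tool):

```python
import json
from typing import List
from urllib.request import urlopen


def parse_model_names(tags_json: dict) -> List[str]:
    """Pull model names out of the GET /api/tags payload."""
    return [m.get("name", "") for m in tags_json.get("models", [])]


def list_models(base_url: str = "http://localhost:11434") -> List[str]:
    """Ask a local Ollama server which models are installed."""
    with urlopen(f"{base_url}/api/tags", timeout=10) as resp:
        return parse_model_names(json.load(resp))
```

Separating the pure parser from the network call keeps the parsing logic testable without a running server.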
### 3. Prompt Engineering

The key to good output is a well-crafted prompt:

```python
prompt = """You are a git commit message expert. Analyze the
following git diff and generate a commit message following the
Conventional Commits format.

Rules:
- Use format: <type>(<scope>): <subject>
- Types: feat, fix, docs, style, refactor, test, chore
- Subject should be lowercase, no period at end
- Keep it concise (max 50 characters for subject)
- If needed, add a body explaining what and why (not how)

Git diff:
{diff}

Generate ONLY the commit message, nothing else:"""
```
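One practical wrinkle: large diffs can blow past a local model's context window. A hedged sketch of filling the template with a truncation guard (`build_prompt` is a hypothetical helper and `MAX_DIFF_CHARS` an assumed budget, not values from the project):

```python
MAX_DIFF_CHARS = 12_000  # assumed budget; tune for your model's context window


def build_prompt(template: str, diff: str) -> str:
    """Fill the prompt template, truncating oversized diffs so the
    request stays within the model's context."""
    if len(diff) > MAX_DIFF_CHARS:
        diff = diff[:MAX_DIFF_CHARS] + "\n... (diff truncated)"
    return template.format(diff=diff)
```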
### 4. Interactive CLI

```python
def get_user_choice(message: str) -> str:
    """Get user input."""
    while True:
        choice = input(f"{message} [y/n/r/e]: ").lower()
        if choice in ['y', 'n', 'r', 'e']:
            return choice
        print("Invalid choice")
```
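The `e` option implies opening the message in an editor. One common approach (a sketch; `edit_message` is a hypothetical helper, and the tool may do this differently) is to round-trip through a temp file and `$EDITOR`:

```python
import os
import subprocess
import tempfile
from typing import Optional


def edit_message(message: str, editor: Optional[str] = None) -> str:
    """Open the proposed message in the user's editor and return
    whatever they saved."""
    editor = editor or os.environ.get("EDITOR", "nano")
    fd, path = tempfile.mkstemp(suffix=".txt", text=True)
    try:
        with os.fdopen(fd, "w") as f:
            f.write(message)
        subprocess.run([editor, path], check=True)  # blocks until editor exits
        with open(path) as f:
            return f.read().rstrip("\n")
    finally:
        os.unlink(path)
```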
### 5. Beautiful Terminal Output

```python
class Colors:
    GREEN = '\033[92m'
    YELLOW = '\033[93m'
    RED = '\033[91m'
    CYAN = '\033[96m'
    BOLD = '\033[1m'
    END = '\033[0m'

# Usage
print(f"{Colors.GREEN}✓ Success!{Colors.END}")
print(f"{Colors.RED}✗ Error!{Colors.END}")
```
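One caveat worth adding: raw ANSI codes garble output when it is piped to a file or another tool. A small sketch of a guard (`supports_color` and `paint` are illustrative, not from the project) that also honors the common `NO_COLOR` convention:

```python
import os
import sys


def supports_color() -> bool:
    """Color only when writing to a real terminal and NO_COLOR is unset."""
    return sys.stdout.isatty() and "NO_COLOR" not in os.environ


def paint(text: str, code: str) -> str:
    """Wrap text in an ANSI code, or pass it through unchanged."""
    return f"{code}{text}\033[0m" if supports_color() else text
```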
## Demo

Here's a real example from the development of AI Commit itself:

Changes made:
- Added OllamaClient class
- Implemented API integration
- Added error handling

Traditional commit:

```shell
git commit -m "added ollama stuff"
```

AI Commit generated:

```
feat(ollama): implement Ollama API client integration

- Add OllamaClient class for API communication
- Implement model listing and availability check
- Add error handling for connection failures
- Support configurable base URL and model selection
```
Much better, right?
## Lessons Learned

### 1. Subprocess Management

Working with git commands from Python taught me:
- Always capture both stdout and stderr
- Handle exit codes properly
- Use `check=True` for automatic error handling
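Those three lessons can be distilled into one wrapper. A sketch (`run_cmd` is an illustrative helper, not the tool's API) that surfaces stderr instead of swallowing it:

```python
import subprocess
from typing import List, Tuple


def run_cmd(cmd: List[str]) -> Tuple[bool, str]:
    """Run a command; return (True, stdout) on success or
    (False, stderr) on a nonzero exit, so callers see *why* it failed."""
    try:
        result = subprocess.run(
            cmd, capture_output=True, text=True, check=True
        )
        return True, result.stdout
    except subprocess.CalledProcessError as e:
        return False, e.stderr

# e.g. ok, out = run_cmd(["git", "diff", "--cached"])
```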
### 2. API Design

The Ollama API is beautifully simple:

```
POST /api/generate
{
  "model": "llama2",
  "prompt": "your prompt here",
  "stream": false
}
```
### 3. User Experience
CLI tools need:
- Clear visual feedback
- Color-coded output
- Progress indicators
- Graceful error handling
### 4. Prompt Engineering
Getting good AI output requires:
- Clear, specific instructions
- Examples of desired format
- Constraints (length, style)
- Context about the task
### 5. Error Handling
Users will do unexpected things:
- No git repo? Handle it.
- No staged changes? Handle it.
- Ollama not running? Handle it.
- Network issues? Handle it.
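A cheap way to cover the first two cases is to classify the raw diff result before ever calling the model. `preflight` here is a hypothetical helper illustrating the idea:

```python
from typing import Optional


def preflight(staged_diff: Optional[str]) -> Optional[str]:
    """Return a user-facing error for bad states, or None when it is
    safe to proceed. None input means `git diff --cached` itself failed."""
    if staged_diff is None:
        return "Not a git repository (or git is not installed)."
    if not staged_diff.strip():
        return "No staged changes - run 'git add' first."
    return None
```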
## Performance Comparison

| Model | Speed | Quality | Memory |
|-------|-------|---------|--------|
| llama2 | 2-3s | ★★★★★ | 4GB |
| codellama | 2-3s | ★★★★★ | 4GB |
| mistral | 1-2s | ★★★★ | 4GB |
| phi | <1s | ★★★ | 2GB |
| llama3 | 3-4s | ★★★★★ | 8GB |
Recommendation: codellama for best balance of speed and quality.
## What's Next

### Short-term (1-2 months)

- [ ] Configuration file support (`.ai-commit.yml`)
- [ ] Custom prompt templates
- [ ] Git hooks integration
- [ ] Emoji support in commits
### Medium-term (3-6 months)
- [ ] VSCode extension
- [ ] Batch commit support
- [ ] Commit history analysis
- [ ] Team collaboration features
### Long-term (6+ months)
- [ ] Fine-tuned model for commits
- [ ] GitHub/GitLab integration
- [ ] Multi-language support
- [ ] Enterprise features
## Try It Yourself

### Installation

```shell
# 1. Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# 2. Pull a model
ollama pull llama2

# 3. Install AI Commit
git clone https://github.com/himanshu231204/ai-commit.git
cd ai-commit
bash install.sh
```
### Usage

```shell
# In any git repository
git add .
ai-commit
```
## Contributing
AI Commit is open source and contributions are welcome!
Ways to contribute:
- Report bugs
- Suggest features
- Improve documentation
- Submit pull requests
GitHub: https://github.com/himanshu231204/ai-commit
## Conclusion
Building AI Commit was a fantastic learning experience. I went from idea to working product in a weekend, and I'm excited to see where the community takes it.
Key takeaways:
- Local AI is powerful - Privacy + Performance
- CLI tools are fun - Direct, no-nonsense UX
- Open source rocks - Community collaboration
- Scratch your own itch - Build what you need
What commit message format do you prefer? Let me know in the comments!
## Connect With Me

- GitHub: @himanshu231204
- LinkedIn: himanshu231204
- Twitter/X: @himanshu231204
- Buy Me a Coffee
If you found this helpful, please:
- Star the repo
- Share with friends
- Leave a comment
Happy coding!
This is my first major open source project. Feedback and suggestions are more than welcome!