DEV Community

HIMANSHU KUMAR

How I Built an AI-Powered Git Commit Tool Using Ollama in a Weekend

Ever written a commit message like "fix stuff" or "updates"? Yeah, me too. Last weekend, I built a solution: AI Commit - a CLI tool that generates intelligent commit messages using local AI.

The Problem

We've all been there. You've spent hours writing code, and when it's time to commit:

git commit -m "fix stuff"
git commit -m "updates"
git commit -m "final version"
git commit -m "actually final this time"

This creates several issues:

  1. Unclear history - Good luck finding that bug fix from 3 months ago
  2. Poor collaboration - Team members can't understand what changed
  3. Difficult code reviews - Reviewers waste time deciphering changes
  4. Broken automation - Changelog generators fail without proper commits

The Solution

AI Commit analyzes your git diff and generates meaningful commit messages following best practices.

Features

πŸ€– AI-Powered - Uses local Ollama models (llama2, codellama, mistral)
πŸ”’ Privacy-First - Everything runs locally, no cloud APIs
πŸ’° Free - No API keys, no costs, unlimited usage
⚑ Fast - Local generation in seconds
🎨 Multiple Styles - Conventional, Semantic, or Detailed formats
🌐 Offline - Works without internet connection

Example Output

Input:

git add src/auth.py
ai-commit

Output:

╔═══════════════════════════════════════════╗
β•‘         πŸ€– AI Commit Message Tool         β•‘
β•šβ•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•β•

Generated Commit Message:
──────────────────────────────────────────────────
feat(auth): implement JWT-based authentication

- Add user login and logout endpoints
- Implement token validation middleware
- Create secure session management
- Add password hashing with bcrypt
──────────────────────────────────────────────────

Options:
  y - Accept and commit
  r - Regenerate message
  e - Edit message
  n - Cancel

How It Works

Architecture

β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚     User     β”‚
β””β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”˜
       β”‚
       β”‚ git add .
       β”‚ ai-commit
       β”‚
       β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚  GitService  │──┐
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜  β”‚
                  β”‚ get diff
       β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
       β”‚
       β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ CommitGenerator β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”€β”˜
         β”‚
         β”‚ analyze & prompt
         β”‚
         β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ OllamaClient β”‚
β””β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”˜
       β”‚
       β”‚ API call
       β”‚
       β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚ Local Ollama β”‚
β””β”€β”€β”€β”€β”€β”€β”¬β”€β”€β”€β”€β”€β”€β”€β”˜
       β”‚
       β”‚ generated message
       β”‚
       β–Ό
β”Œβ”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”
β”‚     User     β”‚
β””β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”€β”˜
   (review)

Workflow

  1. Stage Changes - User runs git add
  2. Analyze Diff - Tool reads staged changes
  3. Generate Prompt - Creates context for AI
  4. Call Ollama - Sends to local LLM
  5. Parse Response - Cleans up AI output
  6. User Review - Interactive options
  7. Create Commit - Executes git commit

Technical Implementation

1. Git Integration

import subprocess
from typing import Optional


class GitService:
    @staticmethod
    def get_staged_diff() -> Optional[str]:
        """Get diff of staged changes"""
        try:
            result = subprocess.run(
                ['git', 'diff', '--cached'],
                capture_output=True,
                text=True,
                check=True
            )
            return result.stdout
        except subprocess.CalledProcessError:
            return None

    @staticmethod
    def commit(message: str) -> bool:
        """Create a git commit"""
        try:
            subprocess.run(
                ['git', 'commit', '-m', message],
                check=True
            )
            return True
        except subprocess.CalledProcessError:
            return False

2. Ollama Integration

import requests
from typing import Optional


class OllamaClient:
    def __init__(self, base_url: str = "http://localhost:11434", 
                 model: str = "llama2"):
        self.base_url = base_url.rstrip('/')
        self.model = model

    def generate(self, prompt: str) -> Optional[str]:
        """Generate text using Ollama"""
        try:
            response = requests.post(
                f"{self.base_url}/api/generate",
                json={
                    "model": self.model,
                    "prompt": prompt,
                    "stream": False
                },
                timeout=30
            )

            if response.status_code == 200:
                return response.json().get('response', '').strip()
            return None
        except requests.exceptions.RequestException as e:
            print(f"Error: {e}")
            return None

3. Prompt Engineering

The key to good output is a well-crafted prompt:

PROMPT_TEMPLATE = """You are a git commit message expert. Analyze the
following git diff and generate a commit message following the
Conventional Commits format.

Rules:
- Use format: <type>(<scope>): <subject>
- Types: feat, fix, docs, style, refactor, test, chore
- Subject should be lowercase, no period at end
- Keep it concise (max 50 characters for subject)
- If needed, add a body explaining what and why (not how)

Git diff:
{diff}

Generate ONLY the commit message, nothing else:"""

# Fill the {diff} placeholder with the staged diff before sending
prompt = PROMPT_TEMPLATE.format(diff=diff)
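One practical wrinkle: a large refactor can produce a diff far bigger than the model's context window. A small helper (my own addition, not part of the original tool; the 8,000-character budget is a rough assumption, since actual limits vary by model) can truncate the diff before filling the placeholder:

```python
MAX_DIFF_CHARS = 8000  # rough budget; real context limits vary by model


def build_prompt(template: str, diff: str, limit: int = MAX_DIFF_CHARS) -> str:
    """Fill the {diff} placeholder, truncating oversized diffs first."""
    if len(diff) > limit:
        diff = diff[:limit] + "\n... (diff truncated)"
    return template.format(diff=diff)
```

Truncating by characters is crude but keeps the request from failing outright; a smarter version might drop whole files from the diff instead.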

4. Interactive CLI

def get_user_choice(message: str) -> str:
    """Prompt until the user enters one of y/n/r/e"""
    while True:
        choice = input(f"{message} [y/n/r/e]: ").strip().lower()
        if choice in ('y', 'n', 'r', 'e'):
            return choice
        print("Invalid choice, please enter y, n, r, or e")
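The choice handler slots into a small review loop. Here's a sketch with the collaborators passed in as callables so each path is easy to test; in the real tool you'd pass `get_user_choice`, the Ollama generator, and `GitService.commit` (the exact wiring here is my own illustration):

```python
def review_loop(message, generate, ask, commit, edit):
    """Drive the y/n/r/e cycle: accept, regenerate, edit, or cancel."""
    while True:
        choice = ask(message)
        if choice == 'y':
            return commit(message)   # accept: create the commit
        if choice == 'n':
            return False             # cancel without committing
        if choice == 'r':
            message = generate()     # ask the model for a fresh message
        elif choice == 'e':
            message = edit(message)  # let the user tweak it by hand
```

Injecting the dependencies means the loop can be unit-tested with plain lambdas, no git repo or model required.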

5. Beautiful Terminal Output

class Colors:
    GREEN = '\033[92m'
    YELLOW = '\033[93m'
    RED = '\033[91m'
    CYAN = '\033[96m'
    BOLD = '\033[1m'
    END = '\033[0m'

# Usage
print(f"{Colors.GREEN}βœ“ Success!{Colors.END}")
print(f"{Colors.RED}βœ— Error!{Colors.END}")
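One refinement worth making: raw ANSI codes turn into garbage when output is piped to a file or another program, so CLIs usually disable color off-terminal. This gate is my own addition (the `NO_COLOR` environment variable is a common community convention, not something the original code checks):

```python
import os
import sys


def supports_color(stream=sys.stdout) -> bool:
    """Enable ANSI colors only on a real terminal, honoring NO_COLOR."""
    return stream.isatty() and "NO_COLOR" not in os.environ


def colorize(text: str, code: str, enabled: bool) -> str:
    """Wrap text in an ANSI code, or pass it through untouched."""
    return f"{code}{text}\033[0m" if enabled else text


# Usage
use_color = supports_color()
print(colorize("βœ“ Success!", '\033[92m', use_color))
```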

Demo

Here's a real example from the development of AI Commit itself:

Changes made:

  • Added OllamaClient class
  • Implemented API integration
  • Added error handling

Traditional commit:

git commit -m "added ollama stuff"

AI Commit generated:

feat(ollama): implement Ollama API client integration

- Add OllamaClient class for API communication
- Implement model listing and availability check
- Add error handling for connection failures
- Support configurable base URL and model selection

Much better, right?


Lessons Learned

1. Subprocess Management

Working with git commands from Python taught me:

  • Always capture both stdout and stderr
  • Handle exit codes properly
  • Use check=True for automatic error handling

2. API Design

The Ollama API is beautifully simple:

POST /api/generate
{
  "model": "llama2",
  "prompt": "your prompt here",
  "stream": false
}

3. User Experience

CLI tools need:

  • Clear visual feedback
  • Color-coded output
  • Progress indicators
  • Graceful error handling

4. Prompt Engineering

Getting good AI output requires:

  • Clear, specific instructions
  • Examples of desired format
  • Constraints (length, style)
  • Context about the task

5. Error Handling

Users will do unexpected things:

  • No git repo? Handle it.
  • No staged changes? Handle it.
  • Ollama not running? Handle it.
  • Network issues? Handle it.

Performance Comparison

Model       Speed   Quality      Memory
llama2      2-3s    ⭐⭐⭐⭐⭐   4GB
codellama   2-3s    ⭐⭐⭐⭐⭐   4GB
mistral     1-2s    ⭐⭐⭐⭐     4GB
phi         <1s     ⭐⭐⭐       2GB
llama3      3-4s    ⭐⭐⭐⭐⭐   8GB

Recommendation: codellama for best balance of speed and quality.


What's Next

Short-term (1-2 months)

  • [ ] Configuration file support (.ai-commit.yml)
  • [ ] Custom prompt templates
  • [ ] Git hooks integration
  • [ ] Emoji support in commits

Medium-term (3-6 months)

  • [ ] VSCode extension
  • [ ] Batch commit support
  • [ ] Commit history analysis
  • [ ] Team collaboration features

Long-term (6+ months)

  • [ ] Fine-tuned model for commits
  • [ ] GitHub/GitLab integration
  • [ ] Multi-language support
  • [ ] Enterprise features

Try It Yourself

Installation

# 1. Install Ollama
curl -fsSL https://ollama.ai/install.sh | sh

# 2. Pull a model
ollama pull llama2

# 3. Install AI Commit
git clone https://github.com/himanshu231204/ai-commit.git
cd ai-commit
bash install.sh

Usage

# In any git repository
git add .
ai-commit

Contributing

AI Commit is open source and contributions are welcome!

Ways to contribute:

  • πŸ› Report bugs
  • πŸ’‘ Suggest features
  • πŸ“ Improve documentation
  • πŸ”§ Submit pull requests

GitHub: https://github.com/himanshu231204/ai-commit


Conclusion

Building AI Commit was a fantastic learning experience. I went from idea to working product in a weekend, and I'm excited to see where the community takes it.

Key takeaways:

  1. Local AI is powerful - Privacy + Performance
  2. CLI tools are fun - Direct, no-nonsense UX
  3. Open source rocks - Community collaboration
  4. Scratch your own itch - Build what you need

What commit message format do you prefer? Let me know in the comments!


Connect With Me

If you found this helpful, please:

  • ⭐ Star the repo
  • πŸ”„ Share with friends
  • πŸ’¬ Leave a comment

Happy coding! πŸš€


This is my first major open source project. Feedback and suggestions are more than welcome!



