An open-source tool to measure and analyze AI coding tool adoption (GitHub Copilot, Windsurf, Cursor, ChatGPT, Claude) by analyzing git commit history
## The Problem: Measuring AI Tool Adoption
Organizations are rapidly adopting AI coding assistants like GitHub Copilot, Windsurf, Cursor, and ChatGPT. But here's the challenge: How do you actually measure AI tool usage across your development teams?
Most AI coding tools don't provide granular usage data at the commit level. You might know how many seats are active, but not:
- Which commits were AI-assisted?
- Which developers are leveraging AI tools most effectively?
- What's the trend of AI adoption over time?
- Which AI tools are being used?
## The Solution: AI Usage Measurement Framework
I built an open-source framework that analyzes git commit history and Agents.md files to detect AI-assisted development. It's privacy-friendly (analyzes local data only) and works with any git repository.
## Demo

Here's the framework analyzing the github/awesome-copilot repository:

Results from the demo:
- 377 total commits analyzed
- 86 AI-assisted commits detected (22.81%)
- 3 AI tools identified: GitHub Copilot, ChatGPT, Claude
- 159 contributors tracked
## Key Features

### Multi-Tool Detection

The framework detects commits from various AI coding tools:

| Tool | Detection Patterns |
|---|---|
| GitHub Copilot | `copilot`, `github copilot`, `co-authored-by:.*copilot` |
| Windsurf | `windsurf` |
| Cursor | `cursor` |
| ChatGPT | `chatgpt`, `gpt-4`, `gpt-3` |
| Claude | `claude`, `anthropic` |
| Devin | `devin` |
| Amazon Q | `amazon q` |
| Codeium | `codeium` |
| Tabnine | `tabnine` |
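The table above boils down to matching regexes against commit messages. Here is a minimal sketch of that idea in plain Python; the pattern dictionary and `detect_tools` function are illustrative, not the framework's actual internals:

```python
import re

# Hypothetical pattern table mirroring the one above; the framework's
# real definitions live in its patterns.py and may differ.
TOOL_PATTERNS = {
    "GitHub Copilot": [r"copilot", r"github copilot", r"co-authored-by:.*copilot"],
    "Windsurf": [r"windsurf"],
    "Cursor": [r"cursor"],
    "ChatGPT": [r"chatgpt", r"gpt-4", r"gpt-3"],
    "Claude": [r"claude", r"anthropic"],
}

def detect_tools(commit_message: str) -> set[str]:
    """Return the set of AI tools whose patterns match the commit message."""
    message = commit_message.lower()
    return {
        tool
        for tool, patterns in TOOL_PATTERNS.items()
        if any(re.search(p, message) for p in patterns)
    }

msg = "Fix parser edge case\n\nCo-authored-by: Copilot <copilot@github.com>"
print(detect_tools(msg))  # {'GitHub Copilot'}
```

Matching on the lowercased message keeps the patterns simple; a `Co-authored-by:` trailer is one of the strongest signals, since Copilot adds it automatically on some workflows.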
### Interactive Web Dashboard
The Streamlit-based dashboard provides:
- Overview: Pie charts showing AI vs regular commits
- Timeline: Track AI usage trends over time
- Authors: See which contributors use AI tools
- AI Commits: Browse detected AI-assisted commits
- Agents.md Files: Parse AI documentation files
### GitHub Teams Integration
Analyze all repositories under a GitHub team with a single click. Perfect for engineering managers who want organization-wide insights.
### CLI for Automation

```bash
# Analyze a single repository
ai-usage-measurement-framework analyze https://github.com/owner/repo

# Analyze all repos for a GitHub team
ai-usage-measurement-framework team my-org my-team --token $GITHUB_TOKEN

# Export results to JSON
ai-usage-measurement-framework analyze /path/to/repo --output results.json
```
## Quick Start

### Prerequisites

- Python 3.11+
- Git
- uv (fast Python package manager)

### Installation

```bash
# Clone the repository
git clone https://github.com/your-org/ai-usage-measurement-framework.git
cd ai-usage-measurement-framework

# Install dependencies
uv sync

# Launch the web dashboard
uv run streamlit run src/ai_usage_measurement_framework/webapp/app.py
```
### Using the CLI

```bash
# Analyze a public GitHub repository
uv run ai-usage-measurement-framework analyze https://github.com/github/awesome-copilot

# Output:
# Analysis Results: awesome-copilot
# Total Commits: 377
# AI-Assisted Commits: 86 (22.81%)
# Tools Detected: GitHub Copilot, ChatGPT, Claude
```
## How It Works

The framework uses pattern matching on commit messages to detect AI-assisted development:

```python
from ai_usage_measurement_framework.analyzers import GitAnalyzer

with GitAnalyzer("https://github.com/owner/repo") as analyzer:
    results = analyzer.analyze()
    print(f"Total commits: {results.total_commits}")
    print(f"AI-assisted: {results.ai_assisted_commits}")
    print(f"AI percentage: {results.ai_percentage}%")
    print(f"Tools detected: {results.tools_detected}")
```
### Detection Confidence
Each detection has a confidence score based on the pattern matched:
- High confidence (0.9): Explicit tool mentions like "copilot", "windsurf"
- Medium confidence (0.7-0.85): Tool-related patterns like "gpt-4", "claude"
- Low confidence (0.3-0.6): Generic patterns like "ai-generated", "ai-assisted"
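The scoring above can be sketched as a small lookup: each pattern carries a weight, and a detection reports the highest weight among all matches. The pattern list and weights below are illustrative assumptions, not the framework's actual values:

```python
import re

# Illustrative (pattern, confidence) pairs; the framework's real
# weights in patterns.py may differ.
SCORED_PATTERNS = [
    (r"copilot", 0.9),       # explicit tool mention -> high confidence
    (r"windsurf", 0.9),
    (r"gpt-4", 0.8),         # tool-related pattern -> medium confidence
    (r"claude", 0.8),
    (r"ai-generated", 0.5),  # generic pattern -> low confidence
    (r"ai-assisted", 0.5),
]

def detection_confidence(commit_message: str) -> float:
    """Return the highest confidence among all matching patterns (0.0 if none)."""
    message = commit_message.lower()
    scores = [conf for pattern, conf in SCORED_PATTERNS if re.search(pattern, message)]
    return max(scores, default=0.0)

print(detection_confidence("Generated with GitHub Copilot"))  # 0.9
print(detection_confidence("ai-assisted refactor"))           # 0.5
```

Taking the maximum rather than summing keeps scores interpretable: a commit that says both "copilot" and "ai-assisted" is still just a high-confidence Copilot detection.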
## Use Cases

### 1. ROI Measurement

Quantify AI tool adoption to justify your Copilot/Windsurf investment to leadership.

### 2. Team Benchmarking

Compare AI usage across teams to identify adoption gaps and training opportunities.

### 3. Trend Analysis

Track how AI usage grows over time as your organization matures in AI-assisted development.

### 4. Compliance & Auditing

Some organizations need to track AI-generated code for compliance purposes.
## Extending the Framework

### Add Custom Patterns

```python
# In patterns.py
TOOL_PATTERNS["Your Custom Tool"] = {
    "patterns": [r"custom-tool", r"customtool"],
    "weight": 0.9,
}
```
## Contributing
This is an Apache 2.0 licensed open-source project. Contributions are welcome!
Ways to contribute:
- Add detection patterns for new AI tools
- Improve the web dashboard UI
- Add new export formats
- Write tests and documentation
## Get Started Today
- Star the repo: github.com/satinath-nit/ai-usage-measurement-framework
- Try it out: Analyze your own repositories
- Share feedback: Open issues or discussions
What AI coding tools does your team use? How do you currently measure AI adoption? Let me know in the comments!