We use Cursor at our company on a shared Enterprise account. One budget, one blanket that every developer pulls in their direction. Staying within budget is a challenge every engineering team is dealing with right now.
What Happened
Engineering costs used to be two things: headcount and cloud infrastructure. You had tools for both. Then AI coding assistants showed up, and suddenly there's a third cost center that nobody has good tooling for.
There's also model proliferation -- with so many models on offer, it's genuinely hard to tell which is better, which is cheaper, and which will quietly cost more. The names don't help. So one developer, in all innocence, picked a model with "Fast" in the name thinking it was the lighter, cheaper option. Turns out it was 10x more expensive per request than what everyone else was using.
$1,500. One day. Nobody knew.
Cursor's Dashboard Won't Save You
Cursor gives you an admin panel with raw usage numbers. But it won't tell you when something is off. No anomaly detection, no alerts, no spending limits per developer. You find out about cost spikes when the invoice arrives -- weeks after the damage is done.
For a company with 50, 100, 500 developers, this is a serious blind spot.
So We Built Something
After paying that tuition, I decided to build a monitoring tool that connects to Cursor's Enterprise APIs, runs anomaly detection, and sends Slack alerts when something looks off. When a developer's daily spend spikes to 4x their average, we know within the hour, not next month.
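The core of that "4x their average" check is simple enough to sketch. This is a minimal illustration of the idea, not the tool's actual implementation -- the function name, the baseline window, and the near-zero-baseline guard are all assumptions:

```python
from statistics import mean

def check_spend_anomaly(daily_spend_history, today_spend, threshold=4.0):
    """Flag a developer whose spend today exceeds `threshold` times
    their recent daily average (the 4x rule described above)."""
    if not daily_spend_history:
        return False  # no baseline yet, nothing to compare against
    baseline = mean(daily_spend_history)
    # Skip near-zero baselines: a jump from $0.10/day to $2/day is
    # technically 20x but not worth an alert.
    if baseline < 1.0:
        return False
    return today_spend >= threshold * baseline

# Example: averaging ~$10/day, then $45 in one day trips the alert.
history = [8.0, 12.0, 10.0, 9.0, 11.0]
print(check_spend_anomaly(history, 45.0))  # True
print(check_spend_anomaly(history, 20.0))  # False
```

In practice you would run something like this on a schedule against the usage data you pull from the API, and post a message to a Slack webhook whenever it returns True.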
Here's a 90-second demo of how it works:
But monitoring was just the beginning. The dashboard now answers questions we didn't even know we had:
- Cost monitoring -- who's spending how much, on which models, and is it reasonable?
- Adoption tracking -- is everyone actually using the tool we're paying for? We found "empty chairs" -- developers with active licenses who weren't using Cursor at all.
- Model optimization -- which models cost more and which cost less, shown with actual per-request pricing from your own usage data.
- Team comparison -- how does each team's usage compare, and where are the optimization opportunities?
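The model-optimization piece boils down to one aggregation: total cost per model divided by request count. Here's a rough sketch of that calculation -- the event shape ({"model", "cost_usd"}) and the model names are placeholders, not the actual API schema:

```python
from collections import defaultdict

def per_request_cost(usage_events):
    """Aggregate raw usage events into effective cost per request,
    grouped by model. Each event is assumed to carry the model name
    and the cost of that single request."""
    totals = defaultdict(lambda: {"cost": 0.0, "requests": 0})
    for event in usage_events:
        totals[event["model"]]["cost"] += event["cost_usd"]
        totals[event["model"]]["requests"] += 1
    return {
        model: round(t["cost"] / t["requests"], 4)
        for model, t in totals.items()
    }

events = [
    {"model": "model-a", "cost_usd": 0.04},
    {"model": "model-a", "cost_usd": 0.06},
    {"model": "model-b-fast", "cost_usd": 0.50},
]
print(per_request_cost(events))  # {'model-a': 0.05, 'model-b-fast': 0.5}
```

Computing this from your own usage data, rather than from published price lists, is what surfaces surprises like the 10x "Fast" model above.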
Thanks to this tool, we could figure out which models to recommend, which to block, and how to use Cursor more effectively across the entire dev department.
Open Source
I built it open source so any team dealing with the same problem can deploy it themselves. It's a Next.js dashboard backed by PostgreSQL, self-hosted with Docker, takes about 10 minutes to set up.
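A self-hosted Next.js-plus-PostgreSQL stack like this typically comes down to a short Compose file. The sketch below is illustrative only -- service names, environment variables, and the API-key variable are assumptions; the repository's README has the real setup:

```yaml
# Hypothetical docker-compose sketch, not the project's actual file.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: cursor_usage
      POSTGRES_PASSWORD: change-me
    volumes:
      - pgdata:/var/lib/postgresql/data
  dashboard:
    build: .
    environment:
      DATABASE_URL: postgres://postgres:change-me@db:5432/cursor_usage
      CURSOR_API_KEY: your-enterprise-api-key   # assumed variable name
      SLACK_WEBHOOK_URL: https://hooks.slack.com/services/...
    ports:
      - "3000:3000"
    depends_on:
      - db
volumes:
  pgdata:
```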
MIT licensed, free forever: cursor-usage-tracker on GitHub
The Question Every Engineering Leader Should Be Asking
AI coding tools are becoming a real cost center -- right alongside headcount and cloud infrastructure. But unlike AWS or GCP, there's no mature observability tooling for this category yet.
If your team is on Cursor Enterprise, ask yourself: do you know how much each developer is spending? Do you know which models they're using? Would you find out if someone accidentally switched to a model that costs 10x more?
If the answer is no, you might want to check your bill.