
wentao long

Stop letting your AI repeat mistakes: I built an open-source MCP observability dashboard (React 19 + ECharts) 🚀

Vibe coding with tools like Claude Code or Cursor feels like magic—until your AI repeats the exact same bug it made 10 minutes ago.

As developers, we are dealing with two massive pain points in AI-assisted development right now:

The Black Box: We have no idea how many tokens we are burning or where the time actually goes.

AI Amnesia: You correct the AI, but in the next session, it forgets everything and breaks your codebase again.

To solve this, I built ai-dev-analytics (AIDA).

Meet AIDA 🕵️‍♂️
AIDA is an open-source, 100% local Model Context Protocol (MCP) server that acts as an observability layer for your AI sessions.

Instead of just being a "token counter," AIDA is a Rule Auto-Codifier.

✨ The Killer Feature
When AIDA detects that the AI has gone off-track or introduced an architectural deviation, it doesn't just log it. It automatically distills that failure into persistent project rules (compatible with .cursorrules or .clauderules).

Your AI actually learns from its failures and stops repeating them.
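To make the idea concrete, here is a minimal, illustrative-only sketch of what "distilling a failure into a persistent rule" could look like. The `Deviation` shape, the rule wording, and the detection signal are all invented for illustration; AIDA's actual codifier is more sophisticated.

```typescript
import { appendFileSync } from "node:fs";

// Hypothetical shape for a detected deviation; field names are assumptions.
interface Deviation {
  file: string;
  summary: string; // e.g. "used default export despite named-export convention"
}

// Turn a detected failure into a persistent rule line appended to the
// rules file the AI reads on every session (e.g. .cursorrules).
function codifyRule(dev: Deviation, rulesFile = ".cursorrules"): string {
  const rule = `- In ${dev.file} and similar modules: avoid repeating "${dev.summary}".`;
  appendFileSync(rulesFile, rule + "\n");
  return rule;
}
```

The point is the feedback loop: the correction lands in a file the AI reads at the start of every future session, so the mistake only has to happen once.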

🛠️ The Tech Stack
As a web frontend developer, I wanted the dashboard to be modern, fast, and beautiful:

Dashboard: Built from scratch with React 19, Tailwind CSS 4, and ECharts for real-time ROI and bottleneck visualizations.

Runtime: Node.js + TypeScript.

Security: 100% Local. Zero runtime dependencies. Verified A/A/A Score on Glama.ai.

⚡ Quick Start (Zero Config)
You don't need to clone the repo or install heavy dependencies. Just add this snippet to your MCP client config:

```json
{
  "mcpServers": {
    "aida": {
      "command": "npx",
      "args": ["-y", "ai-dev-analytics", "mcp"]
    }
  }
}
```

🤝 Let's build together
I just released v1.0.0 today. If you are tired of AI amnesia and want to make your vibe coding measurable, give it a try!

🔗 GitHub Repository: https://github.com/LWTlong/ai-dev-analytics

Drop a ⭐ if it helps your workflow, and let me know your thoughts in the comments! What's the most annoying mistake your AI keeps repeating?

Top comments (2)

Botánica Andina

Totally relate to the AI amnesia problem. The 'Rule Auto-Codifier' to make the AI actually learn from its failures is genius! I'm curious, how does AIDA reconcile new rules with existing ones, especially when the project context evolves?

wentao long

Great question! AIDA handles rule reconciliation at multiple levels.

1. Exact dedup via fingerprinting

Every rule gets a SHA-256 fingerprint of its normalized content. If the same rule is suggested twice, across runs or branches, it's silently skipped. No duplicates ever enter the registry.
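A minimal sketch of that fingerprint-dedup idea, assuming a simple normalization (lowercase, collapsed whitespace); AIDA's actual normalization rules may differ:

```typescript
import { createHash } from "node:crypto";

// Normalize so trivially different spellings of the same rule collide.
// (Lowercasing + whitespace collapsing is an assumption, not AIDA's exact algorithm.)
function normalizeRule(text: string): string {
  return text.trim().toLowerCase().replace(/\s+/g, " ");
}

function fingerprint(text: string): string {
  return createHash("sha256").update(normalizeRule(text)).digest("hex");
}

// Dedup: a rule whose fingerprint is already registered is silently skipped.
function addRule(registry: Map<string, string>, rule: string): boolean {
  const fp = fingerprint(rule);
  if (registry.has(fp)) return false;
  registry.set(fp, rule);
  return true;
}
```

Because the fingerprint is computed from content rather than identity, the same rule suggested on two branches resolves to one registry entry without any coordination.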

2. Similarity detection

aida rules dedupe uses Jaccard keyword overlap to surface rules with >40% similarity. This catches cases where a new rule is essentially a stricter version of an old one; you decide manually which one wins.
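Jaccard keyword overlap is intersection over union of the two rules' keyword sets. Here's a sketch, assuming plain word tokenization (AIDA's tokenizer may do more, e.g. stop-word removal):

```typescript
// Tokenize a rule into a set of lowercase alphanumeric keywords.
function keywords(text: string): Set<string> {
  return new Set(text.toLowerCase().match(/[a-z0-9]+/g) ?? []);
}

// Jaccard similarity: |A ∩ B| / |A ∪ B|, in [0, 1].
function jaccard(a: string, b: string): number {
  const ka = keywords(a);
  const kb = keywords(b);
  if (ka.size === 0 && kb.size === 0) return 1;
  let inter = 0;
  for (const w of ka) if (kb.has(w)) inter++;
  return inter / (ka.size + kb.size - inter);
}

// Flag pairs above the 40% threshold for manual review.
function isSimilar(a: string, b: string, threshold = 0.4): boolean {
  return jaccard(a, b) > threshold;
}
```

For example, "always run tests before commit" vs. "run tests before every commit" share 4 of 6 distinct keywords (about 0.67), so they'd be flagged, while unrelated rules score near zero and pass silently.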

3. Status lifecycle

Rules have a status field: active / pending / conflict / deprecated. When project conventions evolve, you deprecate old rules; they drop out of the generated .md files your AI reads, but stay in rules.json as an audit trail.
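The lifecycle can be sketched as a status field plus a filter at render time. The field names below are assumptions based on the description, not the actual rules.json schema:

```typescript
// Hypothetical shape of a rule entry; the real schema may differ.
type RuleStatus = "active" | "pending" | "conflict" | "deprecated";

interface Rule {
  fingerprint: string;
  text: string;
  status: RuleStatus;
}

// Only active rules reach the generated .md views the AI reads;
// deprecated rules survive in rules.json as the audit trail.
function renderMarkdown(rules: Rule[]): string {
  return rules
    .filter((r) => r.status === "active")
    .map((r) => `- ${r.text}`)
    .join("\n");
}
```

Deprecation is therefore non-destructive: the rule disappears from the AI's view without rewriting history.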

4. Branch merge conflicts, step by step

The source of truth is .aidevos/rules.json (committed to git). The .aidevos/rules/*.md files are auto-generated views and gitignored, so conflicts only ever happen in one JSON file.

When two developers add rules on different branches:

```shell
# Step 1: git merge produces a conflict in rules.json (standard conflict markers)

# Step 2: one command resolves it
aida rules merge
# → ✓ Merged: 4 total rules (1 new from incoming branch)

# Step 3: rebuild markdown views
aida rules build

# Step 4: commit normally
git add .aidevos/rules.json
git commit -m "merge: resolve rules conflict"
```

How conflicts are handled:

  • A and B add the exact same rule → fingerprint match, only one kept
  • A and B add similar but different rules → both kept, aida rules dedupe flags them for manual review
  • Three-way conflict → aida rules merge handles multiple conflict marker blocks
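The merge behavior in those bullets can be sketched as a fingerprint-keyed union, where exact duplicates collapse and similar-but-different rules are all kept for later review. This is a sketch of the described semantics, not aida rules merge's actual implementation:

```typescript
// Minimal rule shape for the sketch; assumes fingerprints are precomputed.
interface Rule {
  fingerprint: string;
  text: string;
}

// Union two branches' rule lists, keyed by fingerprint.
// Exact duplicates (same fingerprint) collapse to one entry;
// similar-but-different rules all survive, to be flagged by `aida rules dedupe`.
function mergeRules(ours: Rule[], theirs: Rule[]): Rule[] {
  const seen = new Map<string, Rule>();
  for (const r of [...ours, ...theirs]) {
    if (!seen.has(r.fingerprint)) seen.set(r.fingerprint, r);
  }
  return [...seen.values()];
}
```

Keying the merge on content fingerprints rather than array positions is what makes it deterministic regardless of which branch is "ours".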

One thing to keep in mind: after any git pull where rules.json changed, run aida rules build to regenerate your local rule views. Two lines:

```shell
git pull
aida rules build
```

The key design decision: reconciliation always happens in structured data (rules.json), never in markdown — so it's deterministic and scriptable, no manual JSON editing required.