Codequal

AI Writes Your Code. Who Watches for Drift?

The problem

AI coding tools are incredible — until they quietly drift off course.

You're using Cursor, Copilot, or Claude Code. The code looks fine. Tests pass. But over a few commits, subtle shifts accumulate: more files touched per commit than usual, dependency trees growing unexpectedly, CI times creeping up.

By the time you notice, the damage has compounded across dozens of commits.

What Evolution Engine does

Evolution Engine is a local-first CLI that monitors your SDLC signals and flags statistical anomalies:

  • Git patterns — file dispersion, change locality, co-change novelty
  • CI pipelines — duration trends, failure patterns
  • Dependencies — count changes, depth shifts
  • Deployments — release cadence, prerelease patterns
  • Testing — failure rates, skip rates, suite duration
  • Coverage — line and branch rate changes

When a metric deviates significantly from your project's baseline, EE raises an advisory — not a bug report, a drift alarm.
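The post doesn't document EE's exact statistical test, so here is a minimal sketch of how a baseline-deviation check like this might work, assuming a simple z-score threshold over a per-commit metric series (the function name, threshold, and sample data are illustrative, not EE's implementation):

```python
from statistics import mean, stdev

def is_drift(history: list[float], current: float, threshold: float = 3.0) -> bool:
    """Flag `current` when it sits more than `threshold` standard
    deviations from the historical baseline (a plain z-score check).
    `history` is one metric per commit, e.g. files touched."""
    if len(history) < 2:
        return False  # not enough history to form a baseline
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return current != mu  # flat baseline: any change is a deviation
    return abs(current - mu) / sigma > threshold

# Files touched per commit over the last ten commits:
baseline = [3, 4, 2, 5, 3, 4, 3, 2, 4, 3]
print(is_drift(baseline, 4))   # → False (within normal range)
print(is_drift(baseline, 18))  # → True (far outside the baseline)
```

The same shape applies to any of the signals above: keep a rolling window per metric, compare the newest observation against it, and raise an advisory only past the threshold.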

No AI APIs required

This was a deliberate design choice. Your code never leaves your machine. EE does pure statistical analysis locally.

When you want deeper investigation, EE generates a structured prompt you can paste into your own AI tool — ChatGPT, Claude, Cursor, whatever you trust. You control what leaves your machine.
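EE's actual prompt format isn't shown in the post; as a hypothetical illustration of the idea, a structured prompt is just the anomaly's context serialized into text you can paste anywhere (`build_prompt` and its fields are assumptions for this sketch):

```python
def build_prompt(metric: str, baseline: float, observed: float) -> str:
    """Hypothetical sketch: package a drift advisory as a prompt for
    whatever AI tool the user chooses to paste it into."""
    return (
        "A drift advisory was raised for my project.\n"
        f"Metric: {metric}\n"
        f"Baseline (historical mean): {baseline}\n"
        f"Observed value: {observed}\n"
        "Suggest likely causes and what to inspect in recent commits."
    )

print(build_prompt("files touched per commit", 3.3, 18))
```

The point of the design is in the data flow: only this text, which you can read before sending, ever leaves your machine.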

How to use it

pip install evolution-engine
cd your-project
evo analyze .

That's it. EE builds a baseline from your git history and flags deviations.
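How EE derives its baseline isn't detailed in the post, but one signal, files touched per commit, can be recovered from plain `git log --name-only` output. A rough sketch (the parser and the `COMMIT:` delimiter are assumptions, not EE's code):

```python
# Parse output of: git log --name-only --pretty=format:COMMIT:%h
def files_per_commit(log_output: str) -> list[int]:
    """Return the number of files touched by each commit, newest first."""
    counts: list[int] = []
    for line in log_output.splitlines():
        if line.startswith("COMMIT:"):
            counts.append(0)          # start a new commit's tally
        elif line.strip() and counts:
            counts[-1] += 1           # each non-blank line is a file path
    return counts

sample = """COMMIT:a1b2c3d
src/app.py
tests/test_app.py

COMMIT:e4f5a6b
src/app.py
src/utils.py
src/config.py
"""
print(files_per_commit(sample))  # → [2, 3]
```

A series like this is exactly the kind of per-commit metric a baseline-and-deviation check runs over.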

As a GitHub Action

- uses: alpsla/evolution-engine@v1
  with:
    github-token: ${{ secrets.GITHUB_TOKEN }}

As a git hook

evo init --hooks

Open source

Evolution Engine is available on PyPI and GitHub. Git analysis is free — no account, no license key needed.

Would love feedback from the community. What signals matter most in your workflow?
