Brian Shen

I built an open-source system to track how engineers actually adapt to AI

The problem

Everyone has opinions about what engineers should do in response to AI. Almost no one has data about what they actually do.

I wanted data.

What I built

HumanExodus is a longitudinal observation system. It captures how engineers respond to AI pressure at the moment it's happening, then follows up 30 days later to find out what actually happened.

The gap between intention and reality is the dataset.

How it works

  1. Engineer enters role, experience, and tech stack
  2. Rule-based engine estimates AI exposure level (HIGH / MEDIUM / LOW)
  3. Claude API generates personalised adjacent moves based on their profile
  4. Engineer selects their intended next step
  5. Session saved to Supabase
  6. 30 days later: automated email via Resend asks what actually happened
  7. Outcome saved as a follow-up record
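
Step 2 is the simplest part of the pipeline, so here is a minimal sketch of what a rule-based exposure engine can look like. The role lists and the experience cutoff below are illustrative assumptions, not the project's actual rules:

```javascript
// Sketch of a rule-based exposure classifier (step 2).
// Role names and thresholds are illustrative, not HumanExodus's real rules.
const HIGH_EXPOSURE_ROLES = new Set(['frontend', 'qa', 'content']);
const LOW_EXPOSURE_ROLES = new Set(['sre', 'embedded', 'security']);

function classifyExposure({ role, yearsExperience }) {
  if (HIGH_EXPOSURE_ROLES.has(role)) return 'HIGH';
  if (LOW_EXPOSURE_ROLES.has(role)) return 'LOW';
  // Everything else lands in the middle; deep specialisation nudges down.
  return yearsExperience >= 15 ? 'LOW' : 'MEDIUM';
}
```

Keeping this deterministic (rather than asking the LLM) means the exposure label is cheap, auditable, and stable across sessions; only the personalised moves in step 3 need the Claude API.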

What we're seeing so far

With just 11 sessions, a pattern is already emerging:

  • HIGH exposure engineers: high uncertainty (40% "not sure")
  • MEDIUM exposure engineers: mostly staying put (80% "stay same")
  • LOW exposure engineers: 100% staying same

This is early and small. But it's real behavioural data, not survey responses.
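
Percentages like the ones above fall out of a simple tally over the stored sessions, grouped by exposure level. A sketch (the `exposure` and `intention` field names are assumptions about the schema):

```javascript
// Group sessions by exposure level, then convert intention counts
// to rounded percentages per level.
function tallyIntentions(sessions) {
  const groups = {};
  for (const { exposure, intention } of sessions) {
    const g = (groups[exposure] ??= { total: 0, counts: {} });
    g.total += 1;
    g.counts[intention] = (g.counts[intention] ?? 0) + 1;
  }
  const result = {};
  for (const [exposure, g] of Object.entries(groups)) {
    result[exposure] = Object.fromEntries(
      Object.entries(g.counts).map(([i, n]) => [i, Math.round((n / g.total) * 100)])
    );
  }
  return result;
}
```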

The tech stack

  • Frontend: single HTML file, no build step, no framework
  • Database: Supabase (Postgres)
  • AI: Claude API for move generation
  • Email: Resend
  • Hosting: GitHub Pages
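
The only moving part in that stack is the 30-day follow-up. The scheduling logic can stay a pure function, with Resend doing nothing but delivery; the field names and subject line here are assumptions, not the real schema:

```javascript
// Compute when the follow-up email is due and shape its payload (step 6).
// Session fields (email, createdAt) are assumed; delivery itself goes
// through Resend and is not shown.
const FOLLOW_UP_DAYS = 30;

function followUpDate(sessionDate) {
  const d = new Date(sessionDate);
  d.setUTCDate(d.getUTCDate() + FOLLOW_UP_DAYS);
  return d.toISOString().slice(0, 10); // YYYY-MM-DD
}

function buildFollowUpEmail(session) {
  return {
    to: session.email,
    subject: 'HumanExodus check-in: what actually happened?',
    sendOn: followUpDate(session.createdAt),
  };
}
```

Keeping the date arithmetic separate from the email client makes the one behaviour that matters (exactly 30 days later) trivially testable.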

The moat

The code is simple. The data is not.

If this works, HumanExodus becomes the largest structured dataset of how engineers adapt to AI — role by role, stack by stack, over time.

What's next

  • v0.5: Predictive layer — "people like you tend to move into X within 30-60 days"
  • But only after real follow-up data exists
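
One way the predictive layer could work, once follow-up records exist, is a plain cohort lookup: rank the reported outcomes of past engineers with the same role and exposure. This is a hypothetical sketch of v0.5, and every field name in it is an assumption:

```javascript
// Hypothetical v0.5 predictive layer: most common 30-day outcome among
// follow-ups matching the same role + exposure cohort.
function predictMove(followUps, { role, exposure }) {
  const counts = {};
  for (const f of followUps) {
    if (f.role !== role || f.exposure !== exposure) continue;
    counts[f.outcome] = (counts[f.outcome] ?? 0) + 1;
  }
  const ranked = Object.entries(counts).sort((a, b) => b[1] - a[1]);
  return ranked.length ? ranked[0][0] : null; // no prediction until data exists
}
```

Returning `null` for an empty cohort encodes the constraint above: no prediction until real follow-up data exists.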

Try it / contribute