<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Brian Shen</title>
    <description>The latest articles on DEV Community by Brian Shen (@shenbrian).</description>
    <link>https://dev.to/shenbrian</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3846944%2F8646b7cb-b416-47a4-963c-4b79470cab8b.jpeg</url>
      <title>DEV Community: Brian Shen</title>
      <link>https://dev.to/shenbrian</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/shenbrian"/>
    <language>en</language>
    <item>
      <title>HumanExodus: Why I'm Building Measurement Infrastructure for the Largest Labour Transition in History</title>
      <dc:creator>Brian Shen</dc:creator>
      <pubDate>Sun, 29 Mar 2026 02:47:57 +0000</pubDate>
      <link>https://dev.to/shenbrian/humanexodus-why-im-building-measurement-infrastructure-for-the-largest-labour-transition-in-j0m</link>
      <guid>https://dev.to/shenbrian/humanexodus-why-im-building-measurement-infrastructure-for-the-largest-labour-transition-in-j0m</guid>
      <description>&lt;p&gt;In 2021, a young man showed me the DALL-E pilot. He's the son of close friends, and he'd just taken an internship at OpenAI.&lt;br&gt;
After the demo, I told him straight: based on what I'd just seen, CS students wouldn't have anywhere to work. Their jobs would be done by this.&lt;br&gt;
He didn't answer right away. Then he smiled a little and said: "Well, machines still need someone to operate them."&lt;br&gt;
Fair point. Machines do need people. What he didn't convince me of was that we'd need anywhere near as many.&lt;br&gt;
That conversation stuck. Not because he was wrong, exactly, but because neither of us had any way to measure what came next.&lt;/p&gt;

&lt;p&gt;What annoys me about how people talk about this.&lt;br&gt;
It's not denial. Most people know the storm is coming. What they keep getting wrong is the pace. They see the direction, assume they have time, and then get surprised anyway. The velocity is the thing. It keeps catching people off guard, including people who really should know better.&lt;/p&gt;

&lt;p&gt;The irony that started this.&lt;br&gt;
Software engineers are building AI. They're also the people closest to the replacement wave. The same hands writing the code are the most exposed to what that code eventually does.&lt;br&gt;
That felt like a signal. If I wanted to understand how people reposition under AI pressure, engineers were the right place to start — not because they're the only ones affected, but because the pressure on them is earliest, most visible, and most measurable.&lt;/p&gt;

&lt;p&gt;What I'm actually building.&lt;br&gt;
HumanExodus is not career advice. Not a prediction engine. Not a dashboard.&lt;br&gt;
It's measurement infrastructure.&lt;br&gt;
The atomic unit is:&lt;br&gt;
AI-Induced Pressure → Human Repositioning → Observed Outcome&lt;br&gt;
The goal is an open, longitudinal dataset of how people actually reposition — not what they say they'll do, but what they do, and what happens after.&lt;br&gt;
Here's the honest truth: nobody really knows where AI leads. Not for the economy, not for society, not for culture or ethics. The destination is still forming. But longitudinal data gives us something useful: it lets us see which repositioning patterns are starting to dominate. That's not prediction, but it is navigation.&lt;/p&gt;
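&lt;p&gt;As a concrete sketch, one record of that atomic unit could look like the object below. Every field name here is an illustrative assumption, not the project's published schema:&lt;/p&gt;

```javascript
// Illustrative shape of one pressure / repositioning / outcome record.
// All field names are assumptions for illustration, not the real schema.
var exampleRecord = {
  pressure: {
    source: "ai_code_generation",  // what applied the pressure
    severity: "HIGH"               // estimated exposure level
  },
  repositioning: {
    action: "shifted_focus",       // what the person actually did
    detail: "moved toward platform and infrastructure work"
  },
  outcome: {
    observed_at: "2026-04-28",     // the 30-day follow-up date
    status: "still_in_field",
    note: "same employer, new team"
  }
};
```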

&lt;p&gt;Why open. Why now.&lt;br&gt;
I'm not an engineer. This project exists because I kept watching this shift happen and couldn't find anyone measuring it properly.&lt;br&gt;
The schema is open. The methodology is open. The data will be open as it accumulates.&lt;br&gt;
If you've repositioned because of AI pressure — changed roles, picked up new tools, shifted focus, left the field — your record belongs here. Not as a data point. As evidence.&lt;/p&gt;

&lt;p&gt;HumanExodus: &lt;a href="https://github.com/shenbrian/humanexodus" rel="noopener noreferrer"&gt;github.com/shenbrian/humanexodus&lt;/a&gt;&lt;/p&gt;

</description>
      <category>ai</category>
      <category>career</category>
      <category>opensource</category>
      <category>database</category>
    </item>
    <item>
      <title>I built an open-source system to track how engineers actually adapt to AI</title>
      <dc:creator>Brian Shen</dc:creator>
      <pubDate>Sat, 28 Mar 2026 03:20:39 +0000</pubDate>
      <link>https://dev.to/shenbrian/i-built-an-open-source-system-to-track-how-engineers-actually-adapt-to-ai-2k4p</link>
      <guid>https://dev.to/shenbrian/i-built-an-open-source-system-to-track-how-engineers-actually-adapt-to-ai-2k4p</guid>
      <description>&lt;h2&gt;The problem&lt;/h2&gt;

&lt;p&gt;Everyone has opinions about what engineers should do in response to AI. Almost no one has data about what they actually do.&lt;/p&gt;

&lt;p&gt;I wanted data.&lt;/p&gt;

&lt;h2&gt;What I built&lt;/h2&gt;

&lt;p&gt;HumanExodus is a longitudinal observation system. It captures how engineers respond to AI pressure at the moment it's happening, then follows up 30 days later to find out what actually happened.&lt;/p&gt;

&lt;p&gt;The gap between intention and reality is the dataset.&lt;/p&gt;

&lt;h2&gt;How it works&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Engineer enters role, experience, and tech stack&lt;/li&gt;
&lt;li&gt;Rule-based engine estimates AI exposure level (HIGH / MEDIUM / LOW)&lt;/li&gt;
&lt;li&gt;Claude API generates personalised adjacent moves based on their profile&lt;/li&gt;
&lt;li&gt;Engineer selects their intended next step&lt;/li&gt;
&lt;li&gt;Session saved to Supabase&lt;/li&gt;
&lt;li&gt;30 days later: automated email via Resend asks what actually happened&lt;/li&gt;
&lt;li&gt;Outcome saved as a follow-up record&lt;/li&gt;
&lt;/ol&gt;
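&lt;p&gt;To make steps 2 and 5 concrete, here is a minimal sketch of a rule-based exposure estimate and the session record it feeds. The keyword lists and field names are assumptions for illustration; the actual rules live in the repo:&lt;/p&gt;

```javascript
// Step 2 sketch: a rough keyword-matching stand-in for the rule engine.
// The keyword lists below are assumptions, not HumanExodus's real rules.
var HIGH_RISK_KEYWORDS = ["crud", "frontend", "qa"];
var LOW_RISK_KEYWORDS = ["embedded", "sre"];

function estimateExposure(role, stack) {
  var s = (role + " " + stack).toLowerCase();
  for (var i = 0; i !== HIGH_RISK_KEYWORDS.length; i++) {
    if (s.includes(HIGH_RISK_KEYWORDS[i])) { return "HIGH"; }
  }
  for (var j = 0; j !== LOW_RISK_KEYWORDS.length; j++) {
    if (s.includes(LOW_RISK_KEYWORDS[j])) { return "LOW"; }
  }
  return "MEDIUM";
}

// Step 5 sketch: the session record persisted to Supabase.
// Field names are assumptions, not the published table schema.
function buildSession(role, years, stack, intendedMove) {
  return {
    role: role,
    years_experience: years,
    stack: stack,
    exposure: estimateExposure(role, stack),
    intended_move: intendedMove,
    created_at: new Date().toISOString(),
    followup_due: null // set by the backend to created_at plus 30 days
  };
}
```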

&lt;h2&gt;What we're seeing so far&lt;/h2&gt;

&lt;p&gt;With just 11 sessions, a pattern is already emerging:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;HIGH exposure engineers: high uncertainty (40% "not sure")&lt;/li&gt;
&lt;li&gt;MEDIUM exposure engineers: mostly staying put (80% "stay same")&lt;/li&gt;
&lt;li&gt;LOW exposure engineers: 100% staying same&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This is early and small. But it's real behavioural data, not survey responses.&lt;/p&gt;

&lt;h2&gt;The tech stack&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Frontend: single HTML file, no build step, no framework&lt;/li&gt;
&lt;li&gt;Database: Supabase (Postgres)&lt;/li&gt;
&lt;li&gt;AI: Claude API for move generation&lt;/li&gt;
&lt;li&gt;Email: Resend&lt;/li&gt;
&lt;li&gt;Hosting: GitHub Pages&lt;/li&gt;
&lt;/ul&gt;
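&lt;p&gt;For the follow-up step, the payload handed to Resend's &lt;code&gt;emails.send()&lt;/code&gt; can be built as a plain object. The sender address, subject line, and copy below are assumptions, not the project's actual email:&lt;/p&gt;

```javascript
// Sketch of the payload for Resend's emails.send() (step 6 of the pipeline).
// Sender domain, subject, and body copy are all assumptions.
function buildFollowupEmail(toAddress, intendedMove) {
  return {
    from: "followup@humanexodus.example", // placeholder sender domain
    to: toAddress,
    subject: "30 days ago you planned: " + intendedMove,
    text: "Did it happen? Reply with what you actually did, " +
          "and your outcome gets recorded as a follow-up."
  };
}

// Usage with the Resend Node SDK (requires RESEND_API_KEY):
//   const { Resend } = require("resend");
//   const resend = new Resend(process.env.RESEND_API_KEY);
//   resend.emails.send(buildFollowupEmail("user@example.com", "learn ML infra"));
```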

&lt;h2&gt;The moat&lt;/h2&gt;

&lt;p&gt;The code is simple. The data is not.&lt;/p&gt;

&lt;p&gt;If this works, HumanExodus becomes the largest structured dataset of how engineers adapt to AI — role by role, stack by stack, over time.&lt;/p&gt;

&lt;h2&gt;What's next&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;v0.5: a predictive layer ("people like you tend to move into X within 30-60 days"), but only once real follow-up data exists&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Try it / contribute&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Tool: &lt;a href="https://shenbrian.github.io/humanexodus/humanexodus-v01.html" rel="noopener noreferrer"&gt;https://shenbrian.github.io/humanexodus/humanexodus-v01.html&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Patterns: &lt;a href="https://shenbrian.github.io/humanexodus/patterns.html" rel="noopener noreferrer"&gt;https://shenbrian.github.io/humanexodus/patterns.html&lt;/a&gt;
&lt;/li&gt;
&lt;/ul&gt;

</description>
      <category>opensource</category>
      <category>ai</category>
      <category>career</category>
      <category>buildinpublic</category>
    </item>
  </channel>
</rss>
