I've been watching the AI narrative closely. Building with it. Learning in public. Talking to developers, founders, and regular people trying to figure out what's real and what's noise.
And I keep running into the same story, told the same way, by the same people:
"AI is going to replace developers. No one will have a job. AGI is around the corner."
And every single time, the person saying it is trying to raise money.
Let's talk about what's actually happening
OpenAI, Anthropic, and the other big labs are in an arms race. Not just for talent or compute. For capital. We're talking about rounds measured in billions. And to justify those valuations, they need a story that's big enough.
So what's the biggest story you can tell?
"Our product replaces high-value white collar workers."
That's the pitch. Not to you. To investors.
Here's how the math works in every pitch deck you'll never see: "Our AI replaces ten people making $150K each. That's $1.5M in value per customer. There are 500,000 companies that fit our ICP. That's a $750B TAM."
Cue the standing ovation from Sand Hill Road.
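The arithmetic in that deck does check out, which is part of why the pitch lands. Here's the same back-of-napkin calculation, using the article's illustrative numbers (nothing here is real market data):

```python
# Back-of-napkin TAM math from the hypothetical pitch deck above.
# Every number is the article's illustrative figure, not real data.

workers_replaced = 10
salary = 150_000                                  # $150K per worker
value_per_customer = workers_replaced * salary    # $1.5M per customer
companies_in_icp = 500_000                        # companies fitting the ICP
tam = value_per_customer * companies_in_icp       # total addressable market

print(f"Value per customer: ${value_per_customer:,}")  # $1,500,000
print(f"Claimed TAM: ${tam:,}")                        # $750,000,000,000
```

Notice that all of the work is being done by `workers_replaced`. The multiplication is fine; the assumption that the number ever actually hits ten is the whole ballgame.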
But here's the thing. That math only works if you believe the people disappear.
And I don't. (Spoiler: neither does 200 years of economic history.)
Enter Jevons Paradox
In 1865, an economist named William Stanley Jevons noticed something weird. England had just made steam engines way more efficient at burning coal. Everyone assumed coal usage would go down.
It went up. Way up.
Because when something gets cheaper and more efficient, people don't just do the same amount of it. They do more. Way more. New use cases emerge. New industries form. Demand explodes.
This isn't some obscure footnote. This is one of the most well-documented patterns in economic history. And it applies directly to what's happening with AI right now.
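You can sketch the mechanism with a toy constant-elasticity demand model (a standard textbook simplification, not anything Jevons himself wrote down). The key condition is elastic demand: when elasticity is greater than 1, making something cheaper increases not just usage but total spending on it.

```python
# Toy constant-elasticity demand model: usage = k * price**(-elasticity).
# Jevons' observation corresponds to elasticity > 1: a drop in effective
# cost raises total spend (price * usage), not just usage.

def usage(price: float, elasticity: float, k: float = 100.0) -> float:
    """Units demanded at a given effective price."""
    return k * price ** (-elasticity)

old_price, new_price = 1.0, 0.5  # efficiency gain halves the effective cost
elasticity = 1.5                 # assumed: demand is elastic (> 1)

old_use, new_use = usage(old_price, elasticity), usage(new_price, elasticity)

print(new_use / old_use)  # ~2.83: usage more than doubles
print((new_price * new_use) / (old_price * old_use))  # ~1.41: total spend rises
```

With elasticity below 1 the second ratio drops below 1 and the efficiency gain really does shrink the market. The whole AI-jobs argument turns on which regime we're in, and history keeps landing on the elastic side.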
This has happened before. Every single time.
Let me give you a few examples that should feel familiar.
ATMs were supposed to kill bank tellers.
When ATMs rolled out in the 1970s and 80s, everyone assumed bank tellers were done. A machine that dispenses cash? Pack it up, Karen from the third window.
What actually happened: the number of bank tellers went up. ATMs made it cheaper to open bank branches, so banks opened more of them. And those branches needed people. The role shifted from counting cash to advising customers and selling financial products. The job didn't disappear. It evolved and expanded.
Spreadsheets were supposed to kill accountants.
VisiCalc and then Excel automated calculations that used to take teams of people days to complete. The fear was real. Why hire an accountant when a spreadsheet does it faster? (Turns out, because someone still needs to explain to the CEO why the spreadsheet says they're broke.)
What actually happened: the number of accountants exploded. Suddenly every small business could afford to do serious financial analysis. The demand for people who could interpret, strategize, and advise around those numbers grew far beyond what existed before. The tool didn't replace the person. It created a bigger market for the person.
Cloud computing was supposed to kill ops engineers.
"You don't need a server room anymore. You don't need sysadmins. Just put it in the cloud." That was the pitch. Somewhere, a sysadmin reading this just felt a chill.
What actually happened: DevOps became one of the fastest growing roles in tech. The infrastructure got more complex, not less. Someone still needs to architect it, secure it, optimize it, and keep it running at 3am when the pager goes off. The tools got better. The demand for people who understand them got bigger.
The internet was supposed to kill retail jobs.
E-commerce was going to make stores irrelevant. No more cashiers. No more salespeople.
What actually happened: the internet created an entirely new category of retail jobs. Fulfillment centers, logistics, customer experience, digital marketing, content creation, social media management. The U.S. has more retail-adjacent jobs now than before Amazon existed.
The pattern is always the same. The technology makes something cheaper. Cheaper means more people use it. More usage means more demand. More demand means more jobs. Different jobs, sometimes. But more of them.
Every. Single. Time.
So why does the "jobs are going away" narrative persist?
Because it's useful. Not to you. To the people raising money.
If you're an AI lab trying to justify a $100B+ valuation, the story has to be enormous. "We help people be a bit more productive" doesn't exactly make a venture capitalist reach for their checkbook. "We replace entire categories of workers" does.
It's not even that they're lying exactly. It's that the framing is self-serving. When the CEO of an AI company talks about pricing their product based on "the cost of the worker it replaces," that's not an economic insight. That's a sales pitch wearing a lab coat.
And look, I get it. VCs need big narratives to deploy big checks. Founders need those checks to build. It's how the game works. I'm not mad at it.
But we don't have to internalize their fundraising deck as our worldview. You wouldn't take career advice from a company whose business model depends on you not having a career.
The real opportunity is expansion, not replacement
Here's what I think is actually happening, and it's way more exciting than the doom narrative:
AI is about to make millions of people capable of things they couldn't do before.
Not because it replaces their skills. Because it augments them.
A marketer who couldn't write code can now build internal tools. A small business owner who couldn't afford a legal review can now get a solid first pass. A student who couldn't afford a tutor can now get one-on-one help at 2am. A solo founder who couldn't afford a team of ten can now ship like they have one.
That's not replacement. That's expansion. That's Jevons Paradox playing out in real time.
And when you expand what's possible, you don't get fewer jobs. You get new ones. Ones that don't have names yet. Ones we can't predict because they'll be created by the very people we're currently telling to be afraid.
The self-fulfilling prophecy problem
Here's what actually scares me. Not AI. The narrative around AI.
Because narratives shape behavior. If every developer believes their job is going away, they stop investing in their craft. Companies freeze hiring because "AI will handle it." Students pivot away from computer science. Organizations delay projects because they're "waiting for AI to get better."
And then what happens? A slowdown. Not because the technology demanded it, but because we collectively talked ourselves into it. Congratulations: we created a recession with vibes.
That's the real danger. Not that AI takes our jobs. That we give them away because we believed someone's Series C deck.
Techno-optimism isn't naive. Defeatism is.
I know "techno-optimism" gets a bad rap sometimes. People think it means ignoring problems or being blindly cheerful about technology.
That's not what I'm talking about.
I'm talking about looking at 200 years of economic history and recognizing a pattern. Every major technology wave has created more prosperity, more jobs, and more opportunity than it displaced. Not without pain. Not without transition. But the net effect has always been expansion.
The printing press didn't kill scribes and create nothing. It created an entire publishing industry, literacy movement, and eventually the modern knowledge economy. (Sorry, scribes. But also, you're welcome, everyone who can read.)
The automobile didn't just kill horse-related jobs. It created suburbs, supply chains, tourism, and an entire middle class built around manufacturing and infrastructure.
The internet didn't just kill some jobs. It created millions more. Including "influencer," which honestly no one saw coming.
AI will be the same. If we let it.
The key phrase being: if we let it.
We create the world we choose to see
This is the part I feel most strongly about.
Right now, we're at a crossroads. The technology is powerful. The potential is enormous. But the direction it goes depends on the story we tell ourselves about it.
If we collectively decide that AI is a tool for replacement, that's what it'll become. Companies will use it to cut headcount. Workers will be treated as costs to eliminate. And we'll build a smaller, meaner version of the future.
But if we collectively decide that AI is a tool for expansion, the math changes completely.
More people building. More problems being solved. More small businesses competing with big ones. More individuals with capabilities that used to require entire teams. More creativity, more experimentation, more shots on goal.
That's not wishful thinking. That's what happens every single time we make a powerful capability cheaper and more accessible. The demand curve does what it always does. It goes up.
My ask to developers
If you're reading this on dev.to, you're probably someone who builds things. Someone who has influence over how technology gets used and talked about.
So here's my ask:
Stop repeating the AI doom talking points as if they're settled science. They're not. They're marketing.
When someone at your company says "should we even hire for this role, won't AI handle it?" push back. The answer is almost always that AI will make that person more productive, not unnecessary.
When you see a headline about AGI replacing all developers, ask yourself: who benefits from me believing this? Follow the money. It usually leads to someone with a cap table, a pitch deck, and a very specific number they need you to be scared of.
And when you're building with AI, build for expansion. Build tools that make more people capable. Build products that create new possibilities instead of just automating old ones.
Because the builders who define this era won't be the ones who used AI to cut costs. They'll be the ones who used it to create things that didn't exist before.
The jobs aren't going away. They're going to multiply in ways we can't yet imagine. But only if we choose to believe that and build accordingly.
What do you think? Am I being too optimistic, or is the doom narrative really just a fundraising strategy that we've all accidentally internalized? I'd love to hear from people who are actually building with AI every day.




