Lately, I’ve been watching something happen in our industry that feels familiar in a way I can’t quite shake.
Companies are talking about “efficiency gains” from AI. Numbers get thrown around — 20%, 30%, sometimes more. And alongside those numbers come decisions: smaller teams, fewer engineers, faster timelines.
I’ve even heard stories, apocryphal or not, of companies laying off the majority of their engineering teams entirely. Product steps in, increasingly armed with AI tools, generating code directly. The few engineers who remain are no longer building systems so much as reviewing them, deploying them, and keeping the lights on.
I understand the appeal. I really do. If I’m being honest, part of me wants it to be true. But I can’t help thinking about Don Quixote. In the novel, there’s a potion called the Balsam of Fierabras. It’s supposed to be a miracle cure — a remedy capable of healing any wound.
Quixote believes in it completely.
When he finally prepares it and drinks it, the result is… less miraculous. He becomes violently ill. It does not heal him. It does not restore him. It very nearly breaks him.
And yet, he insists it worked.
What stuck with me about that scene isn’t that the balsam failed. It’s that the belief didn’t. The promise of a cure was so powerful that the evidence didn’t matter.
Right now, AI is starting to feel like that kind of promise. Not as a tool — but as a cure. A way to move faster, reduce cost, replace effort, and somehow come out ahead without trade-offs.
To be clear, I use AI. I think it’s useful. In some cases, it’s incredibly useful. It can accelerate workflows, help explore ideas, and remove friction from parts of the development process that used to take longer than they should have.
I do worry about engineers being replaced. Not completely, not overnight — but enough to matter. Enough to change who gets to stay, who gets to grow, and who never gets a chance to start.
But what worries me more is something harder to see. I worry about engineering being redefined as something smaller than it actually is. Because software development isn’t just output. It’s judgment. It’s trade-offs. It’s understanding why something should exist, not just how to build it.
When we reduce engineering to output, tools that accelerate output start to look like complete solutions.
I’ve already started seeing hints of this in the wild.
Engineers running into problems with AI that aren’t really code problems at all — but still burning time and credits trying to solve them like they are.
One example I saw recently stuck with me. An engineer had an AI stuck in a loop, trying to fix the same issue over and over again. Different approaches, different variations — sometimes even repeating the same solution.
The problem wasn’t the code.
It was the environment.
The development machine used a case-insensitive file system. The deployed environment was case-sensitive. A single character’s difference in a filename was enough to break everything.
The AI never caught it.
And that’s not a knock on AI — it’s just outside the kind of context it naturally understands.
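That kind of mismatch is easy to reproduce. Here’s a minimal Python sketch — the helper name and probe filename are mine, purely illustrative — that checks whether a directory sits on a case-sensitive file system by writing a file and then looking it up under a different casing:

```python
import os
import tempfile

def filesystem_is_case_sensitive(directory: str) -> bool:
    """Probe whether `directory` lives on a case-sensitive file system
    by creating a file and checking for it under different casing."""
    probe = os.path.join(directory, "CaseProbe.tmp")
    with open(probe, "w"):
        pass
    try:
        # On a case-insensitive file system (the macOS and Windows
        # defaults), the lowercased name resolves to the same file.
        # On a case-sensitive one (typical Linux servers), it doesn't.
        return not os.path.exists(os.path.join(directory, "caseprobe.tmp"))
    finally:
        os.remove(probe)

with tempfile.TemporaryDirectory() as tmp:
    sensitive = filesystem_is_case_sensitive(tmp)
    print("case-sensitive" if sensitive else "case-insensitive")
```

Run on a typical Linux server, this reports case-sensitive; on a default macOS or Windows volume, case-insensitive. That gap is exactly what the AI kept walking past — the code it was rewriting was never the problem.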
I’ve run into similar issues myself over the years. The kind of problems where experience matters. Where you’re not just reading code — you’re accounting for the messy, inconsistent, very human systems that code runs inside of.
That’s the part that doesn’t translate cleanly.
If we start removing experienced engineers from the equation — or replacing them with people who rely heavily on AI without that depth of experience — how often do we end up stuck in that loop?
Burning time. Burning tokens. Trying solution after solution… when someone with the right context could spot the issue in minutes.
Efficiency doesn’t just come from speed. It comes from knowing where to look. And that’s where I start to get uneasy. Because the danger isn’t that AI doesn’t work.
The danger is believing it works without consequence.
There’s another piece of this that I don’t hear talked about enough: what happens to the next generation of engineers.
A lot of us didn’t learn this craft by writing perfect code the first time. We learned by struggling through problems. By debugging things that didn’t make sense. By following threads deep into systems until we finally understood what was actually happening.
That process is slow. Sometimes frustrating. Occasionally painful. But it’s where the real skill comes from.
If the day-to-day work of engineering shifts toward writing prompts and reviewing generated code, the skill curve changes. The barrier to producing code gets lower — which sounds like a win — but the opportunity to develop deeper understanding starts to shrink.
And over time, that matters.
Because troubleshooting is not a surface-level skill. It’s not something you pick up by reviewing code that already works. It’s something you earn by being lost, over and over again, until you aren’t anymore.
If we reduce the number of engineers building systems from the ground up — if fewer people are forced to wrestle with complexity directly — we shouldn’t be surprised when deep troubleshooting ability becomes rare.
Not gone. But smaller. Harder to find. Concentrated in fewer people.
And that creates its own kind of fragility.
Once decisions are made on the belief that we can do more with less — smaller teams, fewer experienced engineers, more reliance on generated output — the cost doesn’t show up immediately.
It shows up later.
In complexity. In fragility. In systems that no one fully understands anymore.
Systems don’t stay healthy because they were generated quickly. They stay healthy because someone understands them deeply enough to take care of them.
I don’t think this ends with engineering disappearing. But I do think it’s possible we create a gap. Companies optimize for smaller and smaller teams. More output per person. More reliance on generated code. On paper, it looks efficient. Maybe it even is — for a while.
But over time, I can’t shake the feeling that we’re pushing toward a kind of critical mass. A point where the cost of all that generated output — in tokens, in credits, in complexity — starts to rival what it would have cost to simply have experienced engineers in the first place.
And by the time that realization sets in, the landscape may have changed.
Some of those engineers will have moved on. Not out of fear, but out of frustration. Burnout. Or simply because they saw the writing on the wall and chose something more stable.
Others will still be here, but fewer. More spread out. Carrying more of the load.
And the ones coming up behind them may not have had the same opportunities to learn the craft deeply. Not because they lacked ability, but because the environment around them changed.
Fewer chances to struggle through problems. Fewer chances to build systems from the ground up. Fewer mentors with the time, the mandate, the desire — or the instinct — to teach.
That’s the part I keep coming back to. Not the tools. Not the efficiency. But the long-term shape of the profession.
Don Quixote didn’t just believe in the Balsam of Fierabras — he doubled down on it. Even when the results didn’t match the promise.
I don’t think we’re there. Not yet. But I do think we’re at a point where it’s worth asking harder questions.
Not just “can we do this faster?”
But “what are we losing in the process?”
Because if we’re not careful, we may eventually find ourselves trying to rebuild something we quietly let slip away — and realizing the people who knew how to do it are no longer around to ask.
I’m not charging windmills here. Just trying to call it how I see it — and maybe ask a few questions before we all start drinking the balsam.