It's February 2026 and I've got a fractured leg, which turns out to create exactly the right conditions for looking back at a year that moved too fast to examine properly at the time. Laptop on the sofa, nowhere to be, no standups to attend, no Slack pings filling the gaps between thoughts. Just the kind of enforced pause that doesn't happen in a normal engineering career.
And what keeps coming back is 2025. Specifically, how different the work felt, and why.
David Farley defines software engineering as "the application of an empirical, scientific approach to finding efficient, economic solutions to practical problems in software. It requires practitioners to become experts at both learning effectively and managing complexity sustainably."
Sitting here with time to actually think, I keep returning to that definition. Not as a textbook quote — as a recognition. It describes exactly what 2025 forced us to return to.
We forgot this. Or rather, we let others forget it for us. And 2025 was the year the consequences became impossible to ignore.
How Software Engineers Became the Least Important People in Software Engineering
Somewhere in the 2010s, we collectively decided that software engineering was a coding problem. The bootcamp explosion promised that anyone could learn to code in twelve weeks and immediately qualify for a six-figure job. General Assembly, Lambda School, Flatiron School — they all taught variations of the same curriculum: React, Rails, JavaScript fundamentals, maybe some basic database work. The implicit promise was that coding was the skill that mattered. Learn the syntax, learn the frameworks, and you were an engineer.
This was never true, but it was convenient. Convenient for the industry, which needed implementers faster than universities could produce them. Convenient for business people, who wanted to believe that the "vision" and the "strategy" were the hard parts, while the coding was just execution. Convenient for the bootcamps themselves, which could sell a transformational experience in a timeframe that fit between unemployment benefits and desperation.
What we actually produced was a generation of developers who knew how to build components but not how to decide what components to build. Developers who could write React but couldn't sit with a stakeholder and understand why a feature mattered. Developers who knew the technical implementation of a user story but had no context for the business problem that story was supposed to solve.
The "full-stack" developer became the industry's darling — not because full-stack represented deep competence, but because it represented flexibility. One person who could do everything, which really meant one person who could be assigned to any ticket without complaining. The stack didn't matter; what mattered was that we'd found a way to make engineers interchangeable.
Meanwhile, the complexity didn't go away. It just got managed by people who weren't trained to manage it. Business founders and product managers took ownership of "the vision" — often without understanding what was technically possible. Designers took ownership of "the experience" — frequently without understanding the data models that would have to support their interfaces. Architects emerged to "guide" technical decisions, often from positions where they no longer wrote production code. Engineering managers optimised for velocity metrics that measured activity rather than outcomes.
Every new role that appeared was built on the same assumption: engineers couldn't be trusted with the full picture. They needed translation, guidance, oversight. The person who actually understood the system — who knew where the complexity lived, which assumptions were fragile, what would break under load — had the least authority to influence decisions.
Technical debt became a "developer problem" rather than a business reality. Refactoring became something you did "when you had time" between feature deliveries. The complexity engineers managed was invisible to business stakeholders, which meant it was unvalued. When engineers tried to explain why a simple-sounding feature would take weeks, they were seen as making excuses rather than describing constraints.
The bootcamp model reinforced this dynamic by design. They taught React not because React was the right tool for every problem, but because React was what employers wanted. They taught CRUD applications because CRUD applications were easy to teach and easy to evaluate. They produced developers who could follow tutorials, copy patterns, and implement specifications — but not developers who could define problems, evaluate tradeoffs, or own outcomes.
This wasn't the fault of the bootcamp graduates. They were doing exactly what the system asked of them. The fault was with an industry that had convinced itself that coding was the valuable part, and that everything else — understanding context, designing approaches, managing complexity sustainably — was someone else's job.
You could say I'm being elitist about bootcamps — that they gave access to people who couldn't afford CS degrees. But this isn't about credentials. It's about what we taught. Bootcamps taught coding as a commodity skill because that's what the industry demanded. The critique is of an industry that wanted implementers, not engineers. Access matters. What we give people access to matters more.
By the early 2020s, software engineers had become the least important people in software engineering. We were the ones who actually built the systems, who understood how they worked, who managed the complexity that everyone else ignored. But we weren't the ones who decided what to build, or why, or for whom. We'd been reduced to expensive typists, implementing decisions made by people who didn't understand their implications.
We called it "collaboration." It was mostly translation — endless meetings where engineers tried to explain technical constraints to business people who didn't want to hear them, and business people tried to explain user needs to engineers who weren't allowed to talk to users directly. The boundary between roles wasn't about efficiency; it was about control. And engineers had lost it.
Then 2025 happened.
When AI Exposed the Gap
The first time I used Copilot to generate a React component in early 2025, I felt a strange mix of exhilaration and dread. The exhilaration was obvious — I'd just written a complex form handler in seconds instead of minutes. The dread took longer to identify. It wasn't that the AI was going to replace me. It was that the AI was making visible something I'd been trying not to see: the coding was never the hard part.
I'd spent the previous decade optimising for coding speed. Learning new frameworks, mastering type systems, keeping up with the JavaScript ecosystem's relentless churn. All of that became nearly worthless overnight — not because the AI could do it better, but because the AI could do it fast enough that the difference between "good at coding" and "competent at coding" stopped mattering.
What the AI couldn't do was understand why we were building something. It couldn't sit with risk engineers and learn how they actually processed reports. It couldn't evaluate whether a technical approach would scale with the business, or whether we were solving the right problem, or what would happen when the edge cases we hadn't considered inevitably appeared.
Those weren't coding problems. They were engineering problems. And they'd been my problems all along — I just hadn't been allowed to own them.
In January 2025, I started building a risk assessment platform for insurance underwriters. LightRAG would process their reports and generate standardised grading according to company guidelines. In 2024, this would have triggered the full organisational machinery: product manager for discovery, designer for workflows, architect for technical approach, probably three engineers for six months of implementation.
Instead, I started with the SDK. Not because someone prioritised it in a roadmap, but because my data scientist partner needed something to test their prompt engineering against. They needed real inputs and real outputs, a way to iterate on scoring guidelines without waiting for a full platform. So I built a simple Python library — ingest reports, run them through LightRAG, return structured grading. A few days' work.
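For a sense of scale, the whole SDK surface was roughly this shape. Everything below is illustrative: the names, the grading schema, and the body of `grade_report`, which is a runnable stand-in for the real step of retrieving guideline context through LightRAG and prompting the model.

```python
from dataclasses import dataclass

@dataclass
class Grading:
    report_id: str
    grade: str      # e.g. a letter grade per company guidelines
    rationale: str  # model-generated justification, kept for human review

def ingest_report(report_id: str, text: str) -> dict:
    """Normalise a raw risk report into the fields the pipeline expects."""
    return {"id": report_id, "text": text.strip()}

def grade_report(report: dict) -> Grading:
    """Stand-in for the LightRAG call: in the real library this would
    retrieve the scoring guidelines, prompt the model with the report,
    and parse a structured grade out of the response."""
    # Hypothetical logic so the sketch runs end to end without a model.
    grade = "A" if "sprinklers" in report["text"].lower() else "C"
    return Grading(report_id=report["id"], grade=grade,
                   rationale="stub rationale, replaced by model output")

def grade_reports(reports: list[tuple[str, str]]) -> list[Grading]:
    """The entire SDK surface the data scientist iterated against."""
    return [grade_report(ingest_report(rid, text)) for rid, text in reports]
```

That small surface was enough for the prompt-engineering loop: feed in real reports, inspect the structured grades, adjust the guidelines, repeat.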
While they tested prompts, I built the server backend. No handoff documents. No estimation rituals. Just parallel work coordinated through conversation. The speed was disorienting. I'd spent years waiting — for requirements, for designs, for approvals — and now there was nothing to wait for. The work was just... done. Then I could do more work.
But the real shift came when I did something I'd never done before. I sat down with the risk engineers themselves. Not through a product manager who would translate. Not by reviewing personas someone else created. I sat in their workspace, watched them process reports, understood the actual pain of their current workflow. Then I sketched UI flows on a whiteboard whilst they told me what would work.
This would have been impossible in 2024. Not because I lacked the skills — I could always sketch, always ask questions. But because the structure prevented it. The structure said that was product's job, or design's job. The structure said engineers implement, they don't discover. The structure said you wait for specifications, you don't create them.
We refined those mockups over a week. I'd sketch something, they'd try it in their actual workflow, we'd identify what didn't work, I'd iterate. When we landed on something that felt right, I wrote the requirements documentation myself — translating their domain knowledge into technical specs whilst I could still ask clarifying questions.
The platform grew organically. SDK became foundation. Backend took shape. UI emerged from those collaborative sessions. When the first business line was stable, I started conversations with the second business line directly — understanding their variations, adapting what we'd built. In 2024 this would have required a product manager, a designer, multiple engineers, six months to get started. I delivered the complete platform (SDK, backend, UI, two business lines, end-to-end stakeholder management) within a year, alone.
What made this possible wasn't "10x coding." The AI helped with boilerplate, sure. But what actually made it possible was exercising the full engineering competence that Farley defined: understanding the problem deeply enough to design an efficient, economic solution. The coding was trivial. The engineering was not.
When coding becomes fast, engineering judgement becomes visible. When implementation is cheap, understanding the problem becomes valuable. The things we'd offloaded to PMs and designers — understanding stakeholders, designing workflows, making tradeoffs — turned out to be engineering work after all. We'd been solving problems all along. We just weren't allowed to own the solutions.
What the Acceleration Broke
The transition wasn't clean. Three things broke, and they all point to the same root cause: treating engineering as coding for so long meant that when the full competence was suddenly required, neither organisations nor people were ready for it.
My engineering manager said something like: "Now that AI handles the coding, you've got bandwidth for more." The scope expanded. The timeline didn't. Headcount stayed static. The assumption was that coding had become solved, freeing up capacity for "higher value work."
But what he called "higher value work" was actually the engineering I'd been doing all along — understanding context, designing approaches, owning outcomes. The cognitive load increased whilst the recognition of that load didn't. The organisation saw efficiency gains and demanded more capability from the same people, without acknowledging that "doing it all" requires different skills, different energy, different support than doing one well-defined piece.
In mid-2025, an intern joined our team. Bright, eager, armed with the same AI tools I used daily. I gave them a CI pipeline to migrate — a straightforward task based on a template and documented decisions. They stared at it, then reached for the AI. The AI completed the job, but not the whole job: there were environments, team standards, and internals the AI didn't understand. The deployment failed; the tool suggested a fix, and they applied it without understanding it. The result was a deployment that succeeded with an application that didn't work. In this particular case, the error was trivial enough that even we missed it in PR review.
This isn't their fault. They were doing exactly what the system taught them: code fast, use tools, deliver features. The system just never taught them that understanding matters more than output. AI accelerates experienced engineers because we already have patterns in our heads. For juniors, the same tools prevent those patterns from forming. We're creating a generation who can generate but can't understand.
Then there was the production incident. The platform shipped fast because implementation had "no delay." There was no pause to learn Datadog properly, to understand observability best practices, to set up meaningful dashboards and alerts. I knew how to build the thing. I hadn't given myself time to learn how to operate it.
When something broke at 2am, I was debugging blind. The dashboards were there — I'd set them up quickly, checking boxes without understanding what I was looking at. The metrics didn't tell me what I needed to know because I hadn't learned which metrics mattered. I fixed the immediate issue, but I didn't understand why it had happened, and that meant I couldn't be sure it wouldn't happen again.
All three incidents stem from the same source. When coding is all you value, everything else becomes invisible. The organisation saw speed and demanded more of it. The junior saw tools and skipped the work. The senior saw a delivery target and missed the operational depth. Same mistake, three levels.
The Return to Form
The concrete problem isn't philosophical. The tools that accelerate experienced engineers actively damage junior formation. AI removes the struggle that builds intuition. Bootcamps already taught implementation over understanding. Combine them and you get developers who can produce but can't evaluate — who can generate code but can't hold a mental model of what that code actually does.
The industry will need to deliberately rebuild how engineers are trained and mentored. Not by going back to the old gatekeeping — access matters — but by designing structures that force understanding. Assign problems where AI can't be the first answer. Require explanation, not just generation. Build in the pause that acceleration removes. Create the friction that learning requires.
This is structural, not personal. New graduates aren't doomed. The structure that trained them needs to change — from teaching coding as a commodity skill to teaching engineering as problem-solving. From producing implementers who follow specifications to producing engineers who can sit with stakeholders, understand constraints, and own outcomes.
"Finding efficient, economic solutions to practical problems." That's the job. It always was.
The rest — the tickets, the ceremonies, the handoffs, the theatre of process that let everyone feel important whilst obscuring who was actually responsible — that was the deviation. We built an industry around the assumption that engineers couldn't handle complexity, then wondered why the complexity kept overwhelming us. We optimised for coding speed and forgot that understanding matters more than typing.
AI didn't create a new kind of engineer. It revealed that the old definition was always the right one. We let business people convince us that coding was the valuable part. In doing so, we let ourselves become the least important people in the room. 2025 was the year that stopped working.
The engineers who thrive in 2026 won't be the ones who prompt best. They'll be the ones who can understand a problem, design a solution, implement it reliably, and learn from it properly. The ones who never forgot — or who are now remembering — what software engineering actually means.