DEV Community


Why We Suddenly Have Developers Who Can't Think in Systems

Narnaiezzsshaa Truong on January 25, 2026

And why it's not their fault. This article is a response to @itsugo's "Learning Starts After Graduation"—which makes a valid point but stops sh...
Aryan Choudhary

This is a powerful reframing. My post came from a personal experience of feeling that gap between theory and real systems, but what you’re describing names the structural issue much more clearly: the disappearance of the environments where systems thinking is actually formed.

The idea that we’ve replaced apprenticeship and architectural exposure with production pressure + AI shortcuts really resonates. It explains why people can ship but struggle to reason.

Thanks for extending this, it feels like you’re diagnosing the pipeline failure, not just the symptoms.

Narnaiezzsshaa Truong

Thank you for the thoughtful read. Your original post surfaced an important signal: developers are experiencing a gap they can feel but can’t quite name. My aim wasn’t to add another anecdote to the pile, but to map the structural conditions that produce that gap in the first place.

When apprenticeship, architectural exposure, and reasoning environments disappear, the system selects for output rather than understanding. Production pressure and AI shortcuts aren’t the cause—they’re the accelerants. The underlying issue is the collapse of the environments that once formed systems thinkers.

Symptoms tell us something is failing. The structure tells us why. My interest is in the latter.

Aryan Choudhary

That distinction between symptoms and structure is exactly what I’ve been trying to sharpen in my own thinking lately. Appreciate you laying it out so clearly.

A E

I've been saying ever since the AI craze started that we're moving toward a point where the hardcore engineers are retired and all we're left with are engineers who vibe coded themselves into seniority.

As good as AI models are, you still need competency to provide them with proper instructions, good principles, architecture patterns, etc. And when one starts spewing output, if you lack foundational knowledge, how would you know it's correct, stable, and sound?

I can't count how many times AI has generated fully working code, but when I look at it my spidey senses start tingling; I look deeper and realise it's slow, inefficient, leaks memory, or just isn't as secure as it should be. Or it's based on a deprecated version of whatever library it used. Someone who's vibe coded their whole life and never had to bash their head on a keyboard to solve a problem, someone who doesn't understand the underlying mechanisms of the language they're working in, won't spot these things, and it's going to come back and bite them. And then nobody will know how to fix it (quickly).
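A hedged, hypothetical illustration of that point (the code below is mine, not from any AI transcript): both functions are "fully working" and return identical results, but the first hides a quadratic cost that a reviewer without foundational knowledge would likely wave through.

```python
def dedupe_slow(items):
    """Correct output, but `item not in out` scans the list every
    iteration, so the whole function is O(n^2)."""
    out = []
    for item in items:
        if item not in out:  # linear scan per element
            out.append(item)
    return out


def dedupe_fast(items):
    """Same output; a set makes each membership test O(1) on average,
    so the whole function is O(n)."""
    seen = set()
    out = []
    for item in items:
        if item not in seen:
            seen.add(item)
            out.append(item)
    return out


# Identical behaviour on any input; only profiling (or experience)
# reveals that one of them falls over at scale.
print(dedupe_slow([3, 1, 3, 2, 1]))  # → [3, 1, 2]
print(dedupe_fast([3, 1, 3, 2, 1]))  # → [3, 1, 2]
```

Nothing here is "broken", which is exactly the trap: the tingle comes from knowing the mechanism, not from reading the diff.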

I know teams who've vibe coded a product quickly and then suddenly run into a bug or issue: no matter how many times they ask Claude, Gemini, etc. to fix it, it goes into a death spiral. Then you need someone to go in there and dig... and nobody can.

AI is a great tool, but there is a line to be drawn, both in how far you push it and in the degree to which you let teams use it without the core foundational knowledge of software engineering. As things stand now, we're looking at a massive gap in a few years. I'm hoping people still put some emphasis on proper upskilling.

The most dangerous programmer is the one who doesn't understand the tech they're working in.

Narnaiezzsshaa Truong

I appreciate the concern—it’s valid, urgent, and echoes what many of us feel when we see teams spiral into debugging hell with no one left who can trace the architecture.

But let’s be clear: that’s a symptom, not the disease.

The real collapse isn’t just about bad code or missing instincts. It’s about the disappearance of systems thinking environments—places where engineers were trained to see the whole, not just the part. We’ve lost the scaffolding that used to catch people mid-fall. Apprenticeship structures, transmission rituals, lineage-aware debugging—all gone or fragmented.

AI isn’t the villain here. It’s just exposing the void.

When teams hit a wall, and no one can fix the bug, it’s not just because they lack “hardcore engineering” chops. It’s because they were never taught to think in systems. They were onboarded into fragmented toolchains, not disciplines. They were given tasks, not architecture. And now, when the AI spirals, they have no map.

So yes, we need foundational knowledge. But we also need environments that teach systems literacy, restore transmission, and dignify the operator. Otherwise, we’re just yelling at the symptoms while the disease spreads.

A E

Brilliantly articulated. It was what I was alluding to, but alas, you have done a much better job than me.

phåńtøm šłîçk

Actually, it feels like most juniors are not even ready to learn: they hit a bug, then paste AI code without reading it or even trying to understand where the bug came from.
This is basically introducing bugs into already buggy code.

phåńtøm šłîçk

AI is a great tool, and to me it's not a villain.

It's the "juniors":

  • Everyone wants to ship code
  • wants to do this, do that (SaaS)
  • without understanding how it works or what makes it work

And then they end up going to Claude thinking that's all there is to software development. I'm not saying it's bad; I do vibe code for fun. I vibe coded my portfolio, but ended up deleting it because it's trash 👀
Alex Trollip

What an eloquent article.

I've felt something similar to this. I've even seen some companies that recently had internships remove them.

In the short term it's obviously beneficial: spending time on a handful of not-yet-productive individuals, plus the time of people who could otherwise be shipping revenue-generating code, is hard to justify.

But long term, we'd have to hire only experienced people. If we're lucky they have experience with systems similar to ours; otherwise there's an upskilling gap, even though the pay is high (as per the required experience).

While the intern might only be useful in a few years, the ability to train and mould them is invaluable.
...
Even discarding that, my first job was basically: here's an ancient project that ideally needs to be upgraded soon. So I broke tons of things and had to figure out more than half of the project on my own.
Which sucked at the time, but it gave me invaluable debugging skills that save me immense time (especially when prod is down, since time is literally money then).
...
I do find it difficult to learn some newer things with a resource like AI providing such quick answers and solutions (of sometimes dubious quality, but that's another topic).
It makes learning the standard way feel inefficient, even though I feel I know better.

Narnaiezzsshaa Truong

Alex, this comment is quietly brilliant. You’ve mapped the emotional-operational arc of real systems learning—debugging chaos, breaking things, surviving prod outages—and you’ve done it without romanticizing the pain or shaming the learner. That’s rare.

Your reflection on internships being cut for short-term gains is a textbook example of systems erosion: optimizing for immediate throughput while cannibalizing the regenerative layer that trains future operators. It’s not just a hiring problem—it’s a governance failure.

And your note about AI answers feeling “inefficient” even when you know better? That’s the edge of epistemic drift. The system rewards speed, but your body remembers the cost of brittle knowledge. That tension—between seductive shortcuts and embodied literacy—is the frontier we’re all navigating.

I work in Restoration-era systems architecture, where we formalize emotional-operational cycles like the one you lived through: curiosity, chaos, debugging, mastery. Your story is a case study in how real operators are forged—not trained, not credentialed, but forged.

Vasu Ghanta

Spot on diagnosis, Narnaiezzsshaa! The collapse of apprenticeship pipelines and AI's "perform without competence" shortcut perfectly explain why we're seeing output over insight.

Your EDC framework is gold—enthusiasm for velocity has gutted the scaffolding for real systems literacy, from debugging flows to governance.

The fix starts with protected spaces for juniors to break things and trace root causes, not just ship tickets. Spot on that it's a trained skill we stopped training.

Narnaiezzsshaa Truong

Thank you, Vasu.

David Hastings

When I worked as a Network Administrator and interviewed college students for co-op jobs, I always questioned them about their knowledge of the OSI Reference Model. Not one of them had any idea what I was talking about. Harkening back to my undergraduate college days, I was in the same boat. In 1988 I graduated with a BS in electrical engineering. Looking back, in those days I was filled with theory but lacked the 50-foot view that I later developed when I pursued a Master's Degree in IT Management and through my independent studies.

The OSI Reference Model is something I learned to depend on to get me "out of the forest". As a project manager, I was able to stop a project because the vendor was going to use a protocol that was incompatible with our phone system. So many vendors feel that when questioned they need to give a quick answer. It's an ego thing. I would rather have a well-thought-out answer than a quick wrong one. That's how teams get stuck down a rabbit hole. I recently picked up a book called Business Dynamics: Systems Thinking and Modeling for a Complex World by John D. Sterman. It's one of those books that can help IT folks "get out of the forest" and into the real, productive world.
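For readers who haven't met it, the OSI Reference Model David is describing is a seven-layer stack. A minimal sketch (my own illustration, not from the comment) of how it's used as a bottom-up fault-isolation checklist:

```python
# The seven OSI layers, bottom (physical media) to top (applications).
OSI_LAYERS = [
    (1, "Physical",     "cables, radio, signaling"),
    (2, "Data Link",    "Ethernet frames, MAC addresses, switches"),
    (3, "Network",      "IP addressing and routing"),
    (4, "Transport",    "TCP/UDP ports, reliability, flow control"),
    (5, "Session",      "dialog control between endpoints"),
    (6, "Presentation", "encoding, serialization, encryption"),
    (7, "Application",  "HTTP, DNS, SMTP, and other app protocols"),
]


def checklist(from_layer=1):
    """Yield the layers to verify, bottom-up, starting at from_layer.

    Classic usage when a network problem has you lost in the forest:
    confirm each layer works before suspecting the one above it.
    """
    for num, name, examples in OSI_LAYERS:
        if num >= from_layer:
            yield f"L{num} {name}: {examples}"


for step in checklist():
    print(step)
```

The model's practical value, as the comment notes, isn't reciting the layers; it's using the boundaries between them to isolate where a failure (or an incompatible vendor protocol) actually lives.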

Narnaiezzsshaa Truong

You're naming the distinction I didn't make explicit enough: the OSI model only becomes systems thinking when it's wielded as a sovereignty tool—boundary enforcement, failure anticipation, governance fluency. You used it to veto a protocol. That's recognition, not recall. Most engineers know the layers but never make that leap.

ANIRUDDHA ADAK

Well said

Dominik Michelitsch

This hits uncomfortably close to home — and I think the framing around a broken developmental pipeline is exactly right.

The part that resonates most is the loss of the protected middle layer: the space where people were allowed to observe systems misbehave, form mental models, and learn stewardship without production pressure. That’s where systems thinking was actually forged, not in lectures or tutorials.

AI doesn’t cause the problem, but it absolutely masks the absence of that literacy layer. It lets people perform output without ever confronting state, constraints, or failure modes — which used to be unavoidable teachers.

I also like how you extend EDC beyond tools. Framing talent development itself as a governance failure explains a lot of what we’re seeing: velocity optimized, stewardship externalized, cognition deferred.

Systems thinkers aren’t disappearing — we just stopped giving people the environments where they’re allowed to become one.

Nilesh A.

This explains the gap perfectly: we didn’t lose thinkers - we removed the environments that created them.

Manoj Sharma

We didn’t lose systems thinkers - we removed the conditions that created them.

L. Cordero

Great read. I think this analysis can be applied to most other industries, too. There's a lack of mentorship and infrastructure for new workforce entrants that help them develop systems and critical thinking skills.

Narnaiezzsshaa Truong

Thanks, L. Cordero—really appreciate the engagement and the cross-industry lens. You're right that mentorship and infrastructure matter. What I keep circling back to is something specific within that: the industry used to have rituals that metabolized knowledge into operational fluency—apprenticeships, architectural walkthroughs, debugging under real conditions. That scaffolding is what's collapsed. Smart devs aren't lacking exposure to the concepts; they're lacking the environments that make those concepts usable.

Shitij Bhatnagar

Thanks for your comment. I feel it comes down to us (existing engineers and leads) to ensure the process is inculcated. It's a grave injustice to the next generation if we do not pass on the mentoring we received; this will haunt everyone in the future.

Marcelo Lopez

Suddenly?

We've been pushing this "anyone can code" narrative for the past 20 years or so, when that was never really the point. People coming in mostly learn the hand-cranking, not the discipline of CRITICAL THINKING, and now someone's wondering why software developers these days can't think in systems.

We allowed it to happen to ourselves...

Time to push back and return to learning fundamentals without the "thinking buddy" crutch.

Narnaiezzsshaa Truong

Appreciate the passion, Marcelo. But this piece isn’t about who “deserves” to code—it’s about the disappearance of environments that taught systems thinking as a discipline. The collapse isn’t sudden; the recognition is. And it’s not about crutches—it’s about scaffolding. Apprenticeship, not hand-holding. Discipline, not gatekeeping.

Shitij Bhatnagar

Agree that it is not sudden.

Shitij Bhatnagar

This is a very good explanation of the systemic problem that has crept into software firms, irrespective of their type (services, captive/GCCs, product, consulting, startups, and more). Thanks for sharing your analysis.

There is another worrisome problem that is emerging and I have tried to capture it in this article here - dev.to/shitij_bhatnagar_b6d1be72/a... and I feel it is a fundamental aspect that is already impacting the next generation of engineers.

I see the onus of correction on senior engineers: asking juniors more questions, creating a path for them, and letting them fail safely (even if it slows velocity). Otherwise the long-term effects are unimaginable, coupled with the fact that in this climate many above the age of 40 are losing jobs while those in their 20s are not mentored enough.

Not trying to take away the responsibility of the system; however, there is something within our control that we all should do :-)

Simon (Inyoka on SO)

The problems begin in early education. A lack of qualified Computer Science teachers, overly locked-down PCs, and a reliance on touchscreens in primary schools mean that most students leave mainstream education without any substantial computing knowledge, particularly in programming and computational thinking.

Narnaiezzsshaa Truong

Appreciate the insight, Simon. I think you’ve named a real fracture point—but I believe the deeper issue is the structural rot in the transmission chain. Even if we had brilliant CS teachers and open machines, what scaffolds would have preserved systems thinking across adolescence, into adulthood, through industry onboarding, and into production environments?

The real question isn’t just where the fracture began—it’s why it was allowed to persist uncorrected. What governance structures failed to detect, restore, or reintroduce systems literacy at later stages? What incentives rewarded fragmented thinking over architectural clarity?

Joe

I stopped reading after the first few lines, when I spotted that this whole article, blog, whatever, is AI.

DeondR

It is not just systems thinking that is never learned; the basics of analytical thinking are never learned, nor applied.

AI now further contributes to this problem, created by the agile movement, where the concept of design thinking had to be introduced because nobody was thinking beyond their PC screen. There is a big difference between the cognitive processes of coding and designing, and AI is just adding to the loss of analytical and systems thinking.

Narnaiezzsshaa Truong

A quick clarification, since the thread seems to be drifting: in the context of this piece, design and development aren’t separate cognitive acts. Systems thinking is the stance that holds them together. It’s the 360‑degree orientation from the center of the system—seeing interactions, constraints, failure paths, and consequences—while also monitoring one’s own assumptions as part of that system.

When that stance is missing, everything collapses back into silos: “coders vs designers,” “process vs thinking,” “AI vs analysis.” The article is pointing to that fracture, not reinforcing it.

Juun Roh

Industries are worshiping business indicators such as MAU, screen time, and velocity, optimizing every process.
The gaming industry shows this explicitly...