Why IT Is an Expensive Mess — And Why Nobody Inside IT Notices

Software engineering is the only discipline where you can build the equivalent of a nuclear reactor using cardboard, glue, and emotional confidence — and nobody will notice until five years later when the intern presses the wrong button and the entire thing collapses into a flaming monolith of tech debt.

In real engineering, failure is visible.

In software, failure is deniable.

And this is why approaches to building software matter far more than most realize.

Let’s talk about why.


Most Teams Jump Straight to Functionality

Most development approaches jump straight to implementing functionality.

The best ones start by understanding and structuring the problem deeply before coding.

This single difference changes everything — yet most teams have never experienced it. Debating tools or techniques is like discussing wine with someone who’s only tasted Capri-Sun.

Part of the tragedy is that most developers genuinely cannot perceive the gap.

Because software hides its failure modes so effectively, a bad design and a good design both:

  • compile

  • run

  • pass tests

  • ship

Nothing in the daily work reveals the qualitative difference. Without contrast, people assume their way is fine — not because it is fine, but because nothing pushes back the way physical reality does in traditional engineering.


The Dunning–Kruger Amplifier in Software

Here’s where the industry gets hit with a double curse:

The Dunning–Kruger effect thrives in environments with weak or absent feedback loops.

If a developer:

  • ships features

  • closes tickets

  • sees green tests

  • gets no immediate explosions

…it becomes extremely easy to mistake the absence of visible failure for the presence of actual skill.

When software hides its mistakes, the people making them become more certain they aren’t making any.

Confidence rises while insight stays flat.

And because no one ever sees a contrast — a genuinely well-designed system — there’s nothing to trigger the realization that they’re missing entire dimensions of quality.

This is why so many people feel like they’re “doing great” while unknowingly constructing multi-million-dollar cardboard reactors.


Nuclear Reactors, Painters, and Software Developers Walk Into a Bar…

Imagine two people are asked to design a nuclear reactor:

Actual nuclear engineer:

Works from principles, physics, constraints, and deep understanding.

Software developer who believes “just write functions, bro”:

“Yeah so uhhh… I’ve got this YAML file, a queue, some Lambda functions… should be fine.”

In the real world, one of these people goes to jail if they’re wrong.

In software? We reward them with a conference talk and a Netflix documentary.

Likewise, imagine giving the world’s most advanced, AI-powered, aerospace-grade paintbrush to an amateur painter. No matter how fancy the tool, they can’t paint a happy little tree better than Bob Ross with a dollar-store brush and 10 minutes.

Skill beats tools.

Tools amplify skill — or amplify the lack of it.

The industry keeps confusing the two.


The Core Difference: Structure the Functionality, Don’t Just Implement It

Most programmers argue in terms of:

  • purity

  • data structures

  • functions

  • transformations

  • avoiding mutation

  • side effects

  • boilerplate

  • frameworks

These are implementation-first concerns.

A better approach starts at an entirely different altitude:

  • What does the problem really mean?

  • What are the conceptual boundaries?

  • What are the unbreakable rules?

  • What might change or break?

  • What behavior belongs where?

  • How do you structure it so the meaning remains clear for years?

  • How can we test or stress this understanding early?

Tools-first starts with coding.

Structure-first starts with comprehension.

Great software starts with a system that doesn’t just “do” the functionality but naturally encompasses it — a machine shaped around the domain’s essence.

You can chain functions forever and still end up with logic scattered across a thousand files and no big-picture cohesion.

Structuring-first says:

“One concept. One place. One responsibility.”

Not for tech purity — but for human navigability and domain fidelity.

This isn’t coding.

This is thinking.

And thinking techniques scale.

Tools alone don’t.


A Practical Example: The Library Application

A simple example:

Most developers start by modelling Books and Lending, with lending tied directly to the Book entity.

It works — until the moment you add CDs. Or DVDs. Or anything that isn’t a book.

Suddenly the whole model cracks.

Why?

Because the domain was never about books.

The real domain is lending items, where the nature of the item is mostly irrelevant to the lending process.

What matters are things like:

  • who borrowed it,

  • when it’s due,

  • whether it can be renewed,

  • and what its availability rules are.

Whether the item is a book, a CD, a board game, or a taxidermied penguin is just metadata — a description, not a core type.

So structure-first modelling reframes the domain:

“The domain is Lending — the item is just a describable thing being lent.”

This produces a dramatically more robust model:

  • The application needs far less code.

  • UI logic becomes simpler and more uniform.

  • Tests shrink, because you’re testing lending rules, not dozens of item-dependent variations.

  • Sonar and coverage metrics look identical — but the model underneath is cleaner, more general, and future-proof.

  • Adding new item types no longer requires redesigning half the system.
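To make the contrast concrete, here is a minimal sketch of the reframed model in Java. The names (LendableItem, Loan) and the renewal policy are illustrative assumptions, not a prescription:

```java
import java.time.LocalDate;

// The item is deliberately thin: a describable thing, not a type hierarchy.
// Book, CD, or taxidermied penguin is just the description field.
record LendableItem(String id, String description) {}

// The lending rules live where the domain puts them: on the Loan itself.
class Loan {
    private static final int MAX_RENEWALS = 2; // assumed policy, purely illustrative

    private final LendableItem item;
    private final String borrowerId;
    private LocalDate dueDate;
    private int renewals;

    Loan(LendableItem item, String borrowerId, LocalDate dueDate) {
        this.item = item;
        this.borrowerId = borrowerId;
        this.dueDate = dueDate;
    }

    boolean isOverdue(LocalDate today) {
        return today.isAfter(dueDate);
    }

    // One invariant, one place: renewals are capped and never allowed while
    // overdue, no matter what kind of item is being lent.
    void renew(LocalDate today) {
        if (isOverdue(today) || renewals >= MAX_RENEWALS) {
            throw new IllegalStateException("loan cannot be renewed");
        }
        renewals++;
        dueDate = dueDate.plusWeeks(3);
    }
}
```

Adding DVDs or board games here is a data change, not a model change: nothing about Loan needs to know what the item is.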

Most teams completely miss this structuring step.

They believe they understood the requirements — but they didn’t understand the domain.

They modelled the nouns they saw instead of the actual functional concept that drives the system.


Why Even Imperfect Structuring Beats Polished Implementation

Almost no one admits this:

Even mediocre domain structuring often beats excellent code without it.

Structured thinking naturally:

  • groups related logic

  • centralizes invariant rules

  • uses real-world language

  • gives clear places for truth

  • reduces change ripples

  • grows more correct as the domain evolves

Meanwhile, polished but structureless code:

  • scatters meaning

  • hides rules

  • requires glue and orchestration

  • buries logic in incidental steps

  • becomes fragile over time

  • breaks under scale

  • becomes mentally exhausting to navigate

Implementation-first is easy to start, hard to sustain.

Structure-first is harder to start, easy to sustain.


Why Deep Structuring Is Hard (And Why Most Skip It)

Deep structuring requires:

  • abstract thinking

  • spotting key ideas

  • intuition for invariants

  • sense for boundaries

  • predicting change

  • metaphorical reasoning

  • naming skill

  • responsibility identification

These are engineering skills, not coding fluency.

Not everyone has them honed.

No shame — just reality.

But software pretends everyone does.

So naturally:

  • Conceptual strugglers dismiss structuring as unnecessary

  • People weak in design call quick code “cleaner”

  • Abstraction-averse developers label it “overkill”

And for small tasks, they’re right.

But hard doesn’t mean optional.

A novice nuclear reactor is obviously dangerous.

A novice software foundation is quietly dangerous.


Why This Effect Is Invisible: The Missing Reference Problem

Here’s the bombshell:

Software lacks objective comparisons.

Civil engineers measure bridges by:

  • cost

  • durability

  • capacity

  • lifespan

  • safety

Painters? You see the quality.

Nuclear engineering? Physics enforces competence.

Software? There is never a second version to compare against.

Bad, good, mediocre — all ship if:

  • UI works

  • tests pass

  • deadline hit

  • no loud crashes

Success declared.

This illusion hides incompetence.

Bad outcomes get blamed on:

  • business complexity

  • growth pains

  • “normal” tech debt

  • tools

  • legacy

  • “just software”

Fix attempts?

Not better structuring.

More layers. More frameworks. More ceremony.

Microservices. Containers. Pipelines. YAML. Orchestration. Chaotic layers to compensate for weak fundamentals.

The mental model stays weak — only the overhead increases.

Industry-wide Dunning–Kruger: total blindness to what “good” even is.


The Silent Cost: An Industry Without Quality Filters

And this is the most damaging aspect:

There is no way to spot real skill.

No exemplars.

No shared reference designs.

No structural standards.

No mechanism to distinguish thinkers from tool-stackers.

The result:

  • Inflated talent pools

  • Declining architectural skill

  • Quality drain hidden by money and hardware

  • Frameworks compensating for missing engineering

  • Complexity perceived as sophistication

  • Rewrites costing millions

  • Tech debt costing trillions (literally)

Conceptual modelling skill pays back tenfold.

But because it’s invisible, it’s undervalued.

Paying three times as much for one strong modeller would save:

  • years of development

  • mountains of bugs

  • massive rewrites

  • millions in tech debt

But this will never be visible in the short term.

And so the cycle continues.


The Bob Ross × Nuclear Reactor × Software Engineering Conclusion

Bob Ross achieves mastery with basics because he understands art.

Tools amplify understanding.

Nuclear engineers succeed because physics forces them to understand.

Software “works” because hidden failures allow misunderstanding.

Understanding feels optional — until it isn’t.

Great building isn’t frameworks or classes.

It’s:

  • clarity

  • structure

  • boundaries

  • meaning

  • stability

  • evolution

Code-first writes stuff.

Structure-first models the business.

The uncomfortable truth:

The world is complex.

Your problems too.

Grasping and structuring them isn’t optional — just unnoticed until year five.

Then cardboard vs. reactor?

Obvious.


Appendix: Practical Red Flags That You’re Building a Cardboard Reactor

These signs often indicate a tool-heavy, implementation-first culture that skips deep domain structuring. Not absolute rules — just patterns that tend to show where the thinking stopped and the tooling started.

1. Spring as the Architectural Centerpiece

If Spring effectively is the architecture, odds are high that behavior is pushed into services and entities are hollow data buckets.

Using Spring for domain-centric design is like using a Ferrari F40 to plow a field — possible, but nobody does it.
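A deliberately exaggerated sketch of that shape, reusing the lending example from earlier (the Spring Boot 3 / jakarta.persistence setup and all names are assumptions for illustration):

```java
import java.time.LocalDate;

import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.jpa.repository.JpaRepository;
import org.springframework.stereotype.Service;

// The "domain model": a hollow data bucket that only knows how to hold fields.
@Entity
class LoanEntity {
    @Id Long id;
    String borrowerId;
    LocalDate dueDate;
    int renewals;
}

interface LoanRepository extends JpaRepository<LoanEntity, Long> {}

// The actual lending rules end up in a service that reaches into every field.
// The entity is cargo; Spring wiring is the only structure holding it together.
@Service
class LoanService {
    @Autowired
    private LoanRepository loans;

    void renew(Long loanId) {
        LoanEntity loan = loans.findById(loanId).orElseThrow();
        if (LocalDate.now().isAfter(loan.dueDate) || loan.renewals >= 2) {
            throw new IllegalStateException("loan cannot be renewed");
        }
        loan.renewals++;
        loan.dueDate = loan.dueDate.plusWeeks(3);
        loans.save(loan);
    }
}
```

Same behavior as the structure-first sketch above, but the invariant now lives wherever a service happens to touch the fields, and every new item type or rule tends to grow another service method.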

2. CQRS or Microservices on Top of a Single Relational Database

When everything ultimately writes to one ACID RDBMS, patterns like CQRS — and even microservices — become the same architectural red flag. You’re pretending to split the system, but all paths still converge on one shared schema. You solve an imaginary scale problem while creating real complexity. A single normalized database is already a unified read/write model. Trying to layer CQRS or microservices on top of that isn’t architecture — it’s cosplay.

3. Microservices or Big Data Tooling Without Actual Scale

Microservices without thousands of RPS per service, or Spark/Hadoop without hundreds of millions of rows, usually signal résumé-driven development rather than domain-driven engineering.

You’re paying operational overhead for a scale that does not exist.


These patterns don’t reveal incompetence — they reveal an industry conditioned to prioritize tooling over thinking.

If the appendix makes you nervous, good.

It’s meant to.
