Modern IT has a strange problem: it talks incessantly about tools, yet increasingly struggles with engineering.
Frameworks, platforms, stacks, clouds, orchestrators, patterns, and “best practices” dominate the conversation. Success is measured by adoption curves, ecosystem size, and résumé keywords. Meanwhile, systems grow more complex, harder to reason about, more fragile under failure, and shorter-lived than ever.
This is not progress. It is a cultural regression from engineering to consumerism.
Engineering Is Not Tool Application
In every mature engineering discipline, tools are subordinate.
Civil engineers are not defined by AutoCAD.
Mechanical engineers are not defined by CNC machines.
Aerospace engineers are not defined by CFD software.
Tools assist calculation and execution. They do not define the solution.
Engineering begins before tooling, with questions such as:
What problem exists?
What constraints are real?
What failure modes are acceptable?
What is the simplest structure that satisfies correctness?
How long must this system live?
In IT, this order is often reversed: “We are using [X Framework] — now what problem can we solve with it?”
That inversion is the root of much modern architectural dysfunction.
The Engineering Definition We Forgot
Across disciplines, the common ground is clear. Engineering is the discipline of creating systems that:
Work correctly under defined constraints.
Fail predictably.
Are understandable by future engineers.
Are robust under change.
Notice what is not in that definition:
Novelty.
Flexibility for its own sake.
Tool sophistication.
Engineering optimizes for correctness, clarity, and longevity, not trend alignment.
Simplicity Is Risk Control
“Simplest solution” is often misunderstood. It does not mean the least code, the fastest prototype, or the fewest files.
It means the fewest concepts required to reason about correctness.
In software, simplicity is achieved by the following (a short code sketch appears after the list):
Modeling the business domain directly.
Enforcing invariants locally.
Keeping control flow explicit.
Avoiding indirection unless it pays for itself.
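To make this concrete, here is a minimal sketch of locally enforced invariants and explicit control flow. The Money type and its rules are illustrative assumptions, not a prescribed design:

```java
// A hypothetical Money value type that enforces its own invariant.
import java.math.BigDecimal;
import java.util.Currency;
import java.util.Objects;

public final class Money {
    private final BigDecimal amount;
    private final Currency currency;

    public Money(BigDecimal amount, Currency currency) {
        // Invariant enforced locally, at construction time:
        // no layer above this class can ever create a negative Money.
        Objects.requireNonNull(amount, "amount");
        Objects.requireNonNull(currency, "currency");
        if (amount.signum() < 0) {
            throw new IllegalArgumentException("amount must be non-negative: " + amount);
        }
        this.amount = amount;
        this.currency = currency;
    }

    public Money add(Money other) {
        // Explicit control flow: the caller sees exactly when and why this fails.
        if (!currency.equals(other.currency)) {
            throw new IllegalArgumentException(
                "currency mismatch: " + currency + " vs " + other.currency);
        }
        return new Money(amount.add(other.amount), currency);
    }

    public BigDecimal amount() { return amount; }
    public Currency currency() { return currency; }
}
```

Nothing here depends on a framework. The number of concepts needed to reason about correctness is exactly what is visible in the class itself.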
Complexity is not neutral. Every abstraction introduces failure modes. Every framework adds behavior you did not design. Unless you are solving a distributed systems problem, a distributed architecture is not a feature; it is a bug.
Tooling Masks Responsibility
Tool-centric IT culture creates a dangerous illusion: “If we follow the framework correctly, correctness will emerge.”
It will not. Frameworks do not understand your domain, your invariants, or your economic constraints. They encode generic assumptions optimized for wide applicability, not specific correctness.
When failures occur, tooling-first systems tend to fail ambiguously: timeouts without state certainty, partial successes without visibility, and compensations without guarantees.
True engineering systems fail loudly and deterministically.
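As one illustration of the difference, here is a minimal sketch in which the unknown state after a timeout is modeled explicitly rather than swallowed. ChargeOutcome and PaymentGateway are hypothetical names, not a real API:

```java
// Sketch of "fail loudly and deterministically": the outcome type forces
// callers to confront the unknown state instead of papering over a timeout.
import java.util.concurrent.TimeoutException;

public final class PaymentExample {

    // Every possible outcome is named; "unknown" is a first-class state,
    // not an exception swallowed somewhere in a framework interceptor.
    enum ChargeOutcome { CONFIRMED, DECLINED, UNKNOWN_MUST_RECONCILE }

    interface PaymentGateway {
        boolean charge(String orderId, long amountCents) throws TimeoutException;
    }

    static ChargeOutcome chargeExplicitly(PaymentGateway gateway,
                                          String orderId, long amountCents) {
        try {
            return gateway.charge(orderId, amountCents)
                    ? ChargeOutcome.CONFIRMED
                    : ChargeOutcome.DECLINED;
        } catch (TimeoutException e) {
            // A timeout means we do NOT know whether money moved.
            // That fact is recorded explicitly instead of retried blindly.
            return ChargeOutcome.UNKNOWN_MUST_RECONCILE;
        }
    }

    public static void main(String[] args) {
        PaymentGateway flaky = (orderId, amount) -> { throw new TimeoutException(); };
        System.out.println(chargeExplicitly(flaky, "order-42", 1999));
        // prints: UNKNOWN_MUST_RECONCILE
    }
}
```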
The Singleton Reality
One crucial observation is often ignored: Every IT system is a singleton.
There is no true A/B comparison—no identical traffic, users, or operational context to compare "Tool A" vs "Tool B." Because there is no control group, tools are almost never falsified. They are adopted by association, not by proof.
If a system works, the tools get credit. If it fails, the domain or execution gets blamed.
Mature engineering disciplines mitigate this lack of falsifiability through formal reasoning, conservative design, and explicit assumptions. IT often replaces these with anecdote and popularity.
The Economic Cost: Accidental Complexity
The cost of losing engineering discipline is not abstract; it is economic.
In previous decades, enterprise systems of substantial value were often built and maintained by small teams (5–15 engineers) and remained operational for decades. Today, functionally comparable systems—bloated by orchestration layers, microservice sprawl, and continuous re-platforming—often require teams three to four times that size merely to "keep the lights on."
This is the explosion of Accidental Complexity.
We have traded hardware constraints for cognitive constraints. We are not getting several times the business value; we are paying several times the price to manage complexity we introduced ourselves through the abandonment of fundamentals.
Why Tooling Dominates Anyway
Tool obsession did not arise accidentally. It solves organizational problems, not engineering ones:
Onboarding speed.
Developer interchangeability.
Vendor alignment.
Hiring pipelines.
Companies often trade operational efficiency (system stability and low maintenance costs) for hiring efficiency (the ability to plug in a "Spring Developer").
However, this is a false economy. The efficiency gained in hiring is lost multiple times over in the crushing weight of maintenance and the cyclical cost of total re-builds. The net return is deeply negative. Yet, the Singleton illusion creates a blind spot: without a control group to demonstrate how stable the system could have been, the organization assumes the chaos is normal, mistaking self-inflicted complexity for the inherent difficulty of software.
Getting Back to Basics
"Back to basics" does not mean rejecting tools. It means:
Subtractive Engineering: Mature engineering progresses by removing unnecessary parts and speculative abstractions.
Sovereign Domain Models: The business logic dictates the tool, not the other way around.
Explicit State: Prefer determinism over magical orchestration.
Design for Failure: Assume parts of the system will fail, and handle those failures explicitly in the code (a sketch follows this list).
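A minimal sketch of the last two points, assuming a hypothetical order lifecycle: the legal state transitions are spelled out in plain code, and an illegal transition fails immediately and deterministically rather than drifting into an undefined state:

```java
// Explicit state: an order lifecycle as a plain state machine.
// OrderState and its transitions are illustrative assumptions.
import java.util.EnumMap;
import java.util.EnumSet;
import java.util.Map;
import java.util.Set;

public final class OrderStateMachine {

    enum OrderState { CREATED, PAID, SHIPPED, DELIVERED, FAILED }

    // Every legal transition is visible here; anything else is rejected loudly.
    private static final Map<OrderState, Set<OrderState>> LEGAL =
            new EnumMap<>(OrderState.class);
    static {
        LEGAL.put(OrderState.CREATED,   EnumSet.of(OrderState.PAID, OrderState.FAILED));
        LEGAL.put(OrderState.PAID,      EnumSet.of(OrderState.SHIPPED, OrderState.FAILED));
        LEGAL.put(OrderState.SHIPPED,   EnumSet.of(OrderState.DELIVERED, OrderState.FAILED));
        LEGAL.put(OrderState.DELIVERED, EnumSet.noneOf(OrderState.class));
        LEGAL.put(OrderState.FAILED,    EnumSet.noneOf(OrderState.class)); // terminal, handled, not hidden
    }

    private OrderState state = OrderState.CREATED;

    public void transitionTo(OrderState next) {
        // Design for failure: an illegal transition is surfaced immediately.
        if (!LEGAL.get(state).contains(next)) {
            throw new IllegalStateException(state + " -> " + next + " is not a legal transition");
        }
        state = next;
    }

    public OrderState state() { return state; }
}
```

The EnumMap is the entire "orchestrator": a future engineer can read the system's full behavior in a dozen lines, with no magic to reverse-engineer.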
Conclusion
IT does not suffer from a lack of tools. It suffers from a lack of engineering restraint.
Until software engineers stop cork-sniffing tools and return to engineering fundamentals—simplicity, correctness, explicitness, and responsibility—systems will continue to grow more complex while becoming less trustworthy.
The solution is not another framework.
It is judgment.