
Leon Pennings

Posted on • Originally published at blog.leonpennings.com

Software Engineering Forgot About KISS

For decades, the software industry has repeated the mantra KISS: Keep It Simple, Stupid.

But somewhere along the way, we stopped practicing it.

Today, we design systems that are far more complex than the problems they try to solve.

We pile abstraction upon abstraction, create layers upon layers, and split systems into dozens of microservices long before the business even knows what it needs.

And then we wonder why software is difficult to change, expensive to maintain, and resistant to adaptation.

This article argues something uncomfortable but important:

Modern software engineering has drifted away from simplicity —

and the price we pay is adaptability.

Let’s bring KISS back into focus, not as a nostalgic slogan, but as a practical strategy for building resilient, maintainable systems.


1. Simplicity Is a Design Goal — Not an Afterthought

KISS is not about avoiding complexity at all costs.

It’s about minimizing accidental complexity — the kind that provides no business value but increases the cost of everything:

  • understanding

  • debugging

  • onboarding

  • testing

  • refactoring

  • adapting

Simplicity is not the absence of features.

It is the deliberate removal of friction.

This requires discipline:

saying no to unnecessary generalizations, unnecessary abstractions, and unnecessary layers.

True simplicity is intentional.


2. Simplicity Is a Cost-Saving Strategy

Every engineer intuitively knows this: complex systems cost more.

But in practice, teams often optimize for "engineering elegance" rather than total cost of ownership.

The long-term cost is hidden — until it isn’t.

Complexity increases:

  • the number of failure points

  • the integration surface

  • the testing burden

  • operational overhead

  • cognitive load for every future developer

And most dangerously:

Complex systems become rigid.

They resist change because every change feels risky.

Simplicity isn’t just good taste.

It’s economics.


3. The Only Certainty Is That Requirements Will Change

This is one of the biggest blind spots in software engineering:

Software is almost never fully understood before it is built.

Not the requirements.

Not the business rules.

Not the workflows.

And definitely not the technical structure needed to support them.

Why?

Because most knowledge emerges during development, through discovery:

  • Stakeholders refine their ideas when they see the system in use.

  • Automation reveals bottlenecks and new opportunities.

  • Business priorities shift.

  • The environment changes.

  • Constraints become visible only in hindsight.

The idea that we can “design it all upfront” is fiction.

And if change is inevitable, then simplicity becomes a kind of insurance policy:

  • fewer assumptions hard-coded into the system

  • less architectural rigidity

  • easier refactoring

  • faster team alignment

  • lower risk when adapting

Simplicity keeps systems changeable.


4. The Service Layer Trap: A Pile of Uncontextualized Logic

Many applications today rely heavily on “service layers” — procedural classes full of business operations.

But this style has a hidden cost:

Service classes are context-free zones.

They don’t represent a business concept.

They don’t own invariants.

They don’t encapsulate behavior.

They just orchestrate operations.

And because of that, business rules lose their home.

This inevitably leads to:

  • duplicated logic

  • inconsistent validation

  • fragile workflows

  • procedural soup

No matter how hard a team tries to remain disciplined, service-heavy architectures guarantee that business rules will be scattered across the codebase.

Why?

Because a rule with no home will be reinvented wherever it's needed.

Simplicity demands localization of business rules — usually through a rich domain model or other forms of contextual encapsulation.
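To make the contrast concrete, here is a minimal Python sketch. The `Order` concept, its price limit, and both code shapes are hypothetical illustrations, not from a real codebase: first the service-layer style, where the rule floats free and every caller must remember it, then a domain model where the invariant lives on the object that owns it.

```python
from dataclasses import dataclass, field

# --- Service-layer style: the rule has no home ---------------------------
# Every caller that adds a line must remember to re-check the limit;
# forget it once, and the invariant is silently violated.
def add_line_service_style(lines: list[int], price: int, limit: int = 10_000) -> list[int]:
    if sum(lines) + price > limit:
        raise ValueError("order total exceeds limit")
    return lines + [price]

# --- Rich domain model: the invariant is owned by the concept ------------
@dataclass
class Order:
    limit: int = 10_000
    _lines: list[int] = field(default_factory=list)

    def add_line(self, price: int) -> None:
        # The rule lives here and only here; callers cannot bypass it
        # without deliberately going around the model.
        if self.total + price > self.limit:
            raise ValueError("order total exceeds limit")
        self._lines.append(price)

    @property
    def total(self) -> int:
        return sum(self._lines)
```

The behavior is identical; the difference is where the rule lives, and therefore how many places must change when the rule changes.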


5. Microservices Make Duplication Mandatory

Here is the hard truth many organizations discover too late:

Microservices do not eliminate complexity — they multiply it and distribute it.

And they guarantee duplication of logic.

Not because teams are poorly coordinated,

but because distributed systems force duplication.

If business rules touch multiple domain areas (they always do),

then splitting those areas into separate services means:

  • each service must enforce its own slice of the rule

  • each service must store its own related data

  • each service must validate independently

  • rule changes must be applied in multiple places

This is not optional.

It is inherent in the architecture.

Distributed systems cannot centralize business logic without destroying independence — which defeats the point of microservices.

Microservices therefore create:

  • coordination overhead

  • divergent interpretations of rules

  • delayed deployment pipelines

  • friction for every cross-domain change

Observability helps you see the complexity,

but it cannot reduce it.

Microservices are an optimization for mature, stable domains — not emerging or evolving ones.
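As an illustration of the duplication above (the discount rule and the two services are hypothetical), here is how one cross-domain rule ends up living in two places once, say, checkout and invoicing become separate deployables. Each service carries its own copy, and a change to the rule must be applied in both:

```python
# Hypothetical rule: orders of 100 or more get a 10% discount.
# In one process this would be a single function. Across two
# independently deployed services, each must carry its own copy.

# checkout-service
def checkout_total(amount: float) -> float:
    if amount >= 100:          # copy #1 of the rule
        return amount * 0.9
    return amount

# invoicing-service
def invoice_total(amount: float) -> float:
    if amount >= 100:          # copy #2 of the rule
        return round(amount * 0.9, 2)
    return amount

# Changing the rule (say, threshold 100 -> 150) now means two edits,
# two reviews, and two coordinated deployments -- or the services
# silently disagree about what a customer owes.
```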


6. Premature Distribution Kills Adaptability

Teams often adopt microservices early to "prepare for scale" or "get better oversight".

But here’s the paradox:

Microservices make it harder to understand the system as a whole.

You increase:

  • operational overhead

  • deployment complexity

  • synchronization requirements

  • partial failures

  • versioning problems

  • debugging difficulty

And at the same time,

you reduce:

  • coherence

  • consistency

  • correctness

  • adaptability

This is not simplicity.

This is architectural debt disguised as modernity.

The simplest architectures are often:

  • monolithic

  • well-modularized internally

  • rich in domain modeling

  • designed for change

  • capable of evolving into microservices later, when truly necessary

Not because monoliths are inherently superior — but because cohesion is superior to fragmentation during discovery phases.
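One way to get that internal modularization is to express module boundaries as explicit interfaces inside a single deployable. The sketch below (module names and the `Protocol`-based seam are illustrative assumptions, not a prescription) shows a `Checkout` module that depends only on an `Inventory` interface: today the seam is a method call; if the domain stabilizes, the same seam can become a service boundary.

```python
from typing import Protocol

# Each module exposes a narrow interface; other modules depend on the
# interface, never on internals. In one process this is a method call.
# If extraction ever becomes necessary, it follows a seam that already exists.

class Inventory(Protocol):
    def reserve(self, sku: str, qty: int) -> bool: ...

class InMemoryInventory:
    def __init__(self) -> None:
        self._stock = {"widget": 5}

    def reserve(self, sku: str, qty: int) -> bool:
        if self._stock.get(sku, 0) >= qty:
            self._stock[sku] -= qty
            return True
        return False

class Checkout:
    # Depends only on the Inventory interface, not its implementation.
    def __init__(self, inventory: Inventory) -> None:
        self._inventory = inventory

    def place_order(self, sku: str, qty: int) -> str:
        return "confirmed" if self._inventory.reserve(sku, qty) else "rejected"
```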


7. Simplicity = Adaptability

When we combine all the points above, the message becomes clear:

  • Simplicity reduces accidental complexity.

  • Reduced complexity lowers the cost of change.

  • Lower cost of change increases adaptability.

  • Adaptability is the most important property of any long-lived software system.

The goal isn’t to avoid complexity entirely —

but to introduce it only when the domain demands it.

And only in places where it clearly carries its weight.

This is the essence of KISS in modern engineering.


8. Bringing KISS Back into Practice

To re-anchor simplicity in our engineering culture, we need to embrace a few clear principles:

1. Don’t implement more than you need right now.

Future-proofing via speculative abstraction is a trap.

2. Model business concepts explicitly, not procedurally.

Give business rules a home.

3. Prefer modular monoliths over microservices until the domain is stable.

Distribution is not free.

4. Delay irreversible decisions.

Choose flexibility over premature structure.

5. Optimize for clarity first.

A team that understands the system can improve it safely.
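Principle 1 in code form, as a hypothetical sketch: a speculative plugin abstraction built "in case we ever need more formats", next to the simple function that covers the only requirement that actually exists today (exporting rows as CSV).

```python
import csv
import io

# Speculative version: a registry and base class to support formats
# nobody has asked for yet. Three concepts to serve one real format.
class Exporter:
    registry: dict[str, type["Exporter"]] = {}

    def __init_subclass__(cls, fmt: str, **kwargs):
        super().__init_subclass__(**kwargs)
        Exporter.registry[fmt] = cls

    def export(self, rows: list[dict]) -> str:
        raise NotImplementedError

class CsvExporter(Exporter, fmt="csv"):
    def export(self, rows: list[dict]) -> str:
        buf = io.StringIO()
        writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
        return buf.getvalue()

# KISS version: the one function the requirement asks for. If a second
# format ever arrives, introducing the abstraction then is a small,
# well-informed refactoring instead of an upfront guess.
def export_csv(rows: list[dict]) -> str:
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=rows[0].keys())
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```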

If we do this, systems will become easier to work with, not harder.

And engineering will become more focused on solving real problems, not managing complexity we created ourselves.


Conclusion: Bringing KISS Back

Software engineering didn’t intentionally abandon KISS.

We simply drowned it in tools, patterns, and architectural trends that promised scalability and elegance — but delivered complexity and brittleness.

It’s time to return to fundamentals.

Simplicity is not naive.

It’s not junior.

It’s not a lack of ambition.

Simplicity is strategic.

Simplicity is sustainable.

Simplicity is what keeps software alive.

Because in the end:

Complex systems break.

Simple systems adapt.

And adaptability is the true power of software.
