We started with machine code.
Actual instructions to a computer. Zeros and ones. The machine did not care about human readability, developer experience, onboarding, documentation, architecture diagrams, or whether the person writing it had slept properly.
It was pure instruction.
Then we made the first abstraction layer.
Assembly language did not remove complexity, but it made the relationship between human intent and machine execution slightly less hostile. Instead of writing raw binary, we could use mnemonics.
Still close to the machine.
Still painful.
But easier.
Then came higher-level languages. FORTRAN, COBOL, C, Pascal, Java, Python, JavaScript, and the long chain of tools that followed. Each layer moved us further away from the physical machine and closer to human intent.
The pattern is obvious in hindsight:
Machine code → Assembly → High-level languages → Frameworks → IDEs → Cloud platforms → AI coding agents.
Every step compresses the distance between what we want and what the machine executes.
That is the part I think we are underestimating.
The sentence is becoming the interface
For most of software history, the bottleneck was translation.
A business person had a need.
A developer translated that need into architecture, tickets, code, database schema, API contracts, tests, deployment configuration, infrastructure, and eventually something the computer could execute.
The developer was the translator between business intent and machine behavior.
That role is not disappearing.
But the translation layer is getting thinner.
When someone can write:
Build me a small internal calculator for pricing orchard export margins.
and an agent can generate the UI, database structure, validation, authentication, deployment config, and a first working version, something important has changed.
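To make that concrete, here is a hedged sketch of the kind of core logic such an agent might produce for that prompt. Every field name and number below is a made-up illustration, not real business data or any specific agent's output:

```python
# Illustrative sketch of the core calculation an agent might generate
# for an orchard export margin tool. All rates and names are hypothetical.

def export_margin(price_per_kg: float, kilograms: float,
                  freight_per_kg: float, packing_per_kg: float,
                  duty_rate: float) -> dict:
    """Return revenue, total costs, and margin for one export line."""
    revenue = price_per_kg * kilograms
    handling = (freight_per_kg + packing_per_kg) * kilograms
    duty = revenue * duty_rate
    margin = revenue - handling - duty
    return {
        "revenue": round(revenue, 2),
        "costs": round(handling + duty, 2),
        "margin": round(margin, 2),
        "margin_pct": round(margin / revenue * 100, 1) if revenue else 0.0,
    }

# Example run with invented numbers.
result = export_margin(price_per_kg=2.50, kilograms=10_000,
                       freight_per_kg=0.40, packing_per_kg=0.20,
                       duty_rate=0.05)
print(result)
```

The point is not that this function is hard. The point is that it is trivial, and the UI, validation, and deployment around it are becoming nearly as cheap.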
Not because the generated software is perfect.
It usually is not.
Not because companies no longer need developers.
They absolutely do.
But because the cost of creating simple software is collapsing.
And when the cost of something collapses, the economics around it change.
This is related to software commoditization
I wrote earlier about What Actually Happens When Software Gets Easy.
This post is more of a philosophical rant than a structured technical argument, but I think it sits on the same line of thinking.
The uncomfortable idea is this:
A lot of software companies are not selling deep technology.
They are selling workflows.
A CRM for this niche.
A dashboard for that niche.
An approval system for this industry.
A reporting tool for that department.
A quoting system for that business type.
A booking tool, inventory tool, compliance tracker, invoice helper, HR portal, shift planner, document workflow, admin dashboard.
Individually, these products can be useful.
But structurally, many of them are not defensible if software creation becomes cheap enough.
If a company is paying thousands per month for a simple internal tool, and a competent sysadmin or operations person can recreate 70% of it with an agent in a week, the pricing power of that tool weakens.
Not immediately.
Not everywhere.
But directionally.
That is the pressure.
The middle layer gets crowded
The obvious reaction is to say:
But generated software will be buggy.
Yes. So is a lot of normal software.
But companies need support.
Yes. They do.
But integrations are hard.
Correct. That is one of the places where value survives.
But compliance, security, hosting, audit logs, permissions, data migration, and reliability are not trivial.
Exactly. That is the point. The value does not disappear.
It moves.
Raw implementation becomes cheaper. Context becomes more valuable.
The ability to produce a generic dashboard becomes less impressive. The ability to understand the business, the data model, the regulatory environment, the security, and the long-term maintenance burden becomes more important.
The floor rises.
More people can build.
But the middle gets squeezed.
Generic software is the next abstraction
My guess is that we are moving toward something I would call generic software.
Not generic as in bad.
Generic as in baseline.
A generalized finance system.
A generalized HR system.
A generalized CRM.
A generalized ERP.
A generalized compliance workflow.
A generalized inventory and operations system.
Then instead of buying a narrow SaaS product for every slightly different business process, a company says:
We are this type of company. Use these baseline modules. Adapt them to our workflows.
That sounds simple, but the implication is quite large.
Today, a software company often packages one workflow and sells it repeatedly.
In a generic-software world, the baseline workflow becomes open, shared, standardized, or extremely cheap.
The expensive part becomes adaptation, integration, governance, and trust.
Software stops being a fixed product and becomes more like clay.
You start with a known base.
Then you shape it.
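As a hedged sketch of what "shaping a known base" could look like in code: a generic baseline module with company-specific rules plugged in, instead of a niche SaaS product per workflow. The class and field names here are invented for illustration, not a real library:

```python
# Hypothetical sketch: a generic approval-workflow baseline that a
# company adapts by supplying its own rule, rather than buying a
# niche SaaS tool. All names are illustrative.

from dataclasses import dataclass, field
from typing import Callable

@dataclass
class ApprovalWorkflow:
    """Generic baseline: submit an item, get a decision, keep a log."""
    approvers: list[str]
    rule: Callable[[dict], bool] = lambda item: True  # default: approve all
    log: list[str] = field(default_factory=list)

    def submit(self, item: dict) -> str:
        decision = "approved" if self.rule(item) else "needs_review"
        self.log.append(f"{item['id']}: {decision}")
        return decision

# Company-specific shaping: invoices over 10,000 need human review.
workflow = ApprovalWorkflow(
    approvers=["finance@example.com"],
    rule=lambda item: item["amount"] <= 10_000,
)

print(workflow.submit({"id": "INV-1", "amount": 4_200}))   # approved
print(workflow.submit({"id": "INV-2", "amount": 25_000}))  # needs_review
```

The baseline is boring and shared; the lambda is the only part that is actually this company. That ratio is the whole argument.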
This is where open source probably comes back harder
I might be naive here, but I can imagine a future where many of these baselines are either government-owned, public infrastructure, or heavily open source.
Not every system.
Not every industry.
Not everything should be run by government, and not every open source project becomes maintainable just because people like the idea.
But think about the duplication.
How many companies need the same basic HR workflows?
How many need the same invoice approval process?
How many need the same supplier onboarding forms?
How many need the same audit trail?
How many need the same CRM objects?
How many need the same permission model with slightly different labels?
The amount of repeated software in the world is ridiculous.
A lot of it exists because, historically, customization was expensive.
So vendors packaged workflows.
Companies adapted themselves to the vendor.
Agents may reverse that.
Companies may increasingly expect software to adapt to them.
Standardization usually follows chaos
This is not new.
Software has gone through this cycle before.
First there is scarcity.
Then tools improve.
Then everyone builds their own version.
Then the ecosystem gets chaotic.
Then standards emerge.
Then the value moves upward.
Websites went through this.
Mobile apps went through this.
Cloud infrastructure went through this.
Now internal business software may go through the same compression cycle.
The difference is speed.
AI agents are not just another framework. They are a compression layer across the whole pipeline.
Requirements, code, tests, infrastructure, deployment, documentation, refactoring, debugging.
Not perfectly.
But enough.
And "enough" is usually where markets start moving.
The part companies should worry about
If you are selling a narrow SaaS tool, the question is no longer:
Can someone build this?
The question is:
Can someone build enough of this?
That distinction matters.
Most customers do not need every feature.
They need the 20% that fits their process.
If an internal team can build that 20%, connect it to their existing systems, and avoid another monthly SaaS subscription, the vendor has a problem.
The defensible parts become:
- proprietary data
- deep domain expertise
- regulatory trust
- network effects
- hard integrations
- operational reliability
- security posture
- distribution
- brand trust
- support quality
The non-defensible part is "we have forms and tables".
Forms and tables are not a moat.
The optimistic version
There is a positive version of this.
Smaller companies get better software.
Internal teams stop waiting six months for some vendor roadmap.
People who understand the business can prototype their own tools.
Governments and industries can standardize boring workflows.
Open source business software can become much more usable.
Developers can spend less time wiring yet another CRUD screen and more time solving actual hard problems.
That is the good outcome.
Generic software does not mean software becomes worthless.
It means repetitive software becomes less special.
And honestly, a lot of repetitive software probably should become less special.
The uncomfortable version
The uncomfortable version is that thousands of small SaaS products become difficult to justify.
Not because they are bad.
Because they are too easy to approximate.
The market does not always punish bad products first.
Sometimes it punishes products whose value proposition becomes too easy to copy.
That is where agentic development changes the equation.
A competitor does not need to clone the entire product.
A customer does not need to replace the entire product.
An internal admin does not need to build the perfect system.
They only need to build enough.
Enough is dangerous.
Where I currently land
I do not think software disappears.
I do not think developers disappear.
I do not think every company becomes its own software company in a clean, magical way.
But I do think we are moving toward a world where generic business software becomes much easier to create, adapt, and replace.
That means the value of software shifts away from implementation and toward judgment.
What should exist?
What should not exist?
What should be standardized?
What should be custom?
What needs security review?
What needs auditability?
What needs to survive for ten years?
What should be deleted after three months?
Those questions become more important than the ability to generate the first version.
The sentence is becoming the interface.
The baseline is becoming generic.
And the moat is moving somewhere else.
I may be wrong on the timeline.
But I do not think I am wrong on the direction.