Generative AI is driving the cost of producing software toward zero. The common conclusion is that software itself is becoming less valuable.
This is wrong. It confuses production cost with economic value.
There is a simple formula for software value in the age of AI. It explains why some software becomes worthless while other software becomes more valuable than ever.
The Floor On Replacement Cost
When people say "AI makes software free," they mean AI reduces the tokens required to generate a working program. But "toward zero" does not mean "to zero." There is a floor.
In information theory, the true complexity of any object is the length of its shortest possible complete description — its Kolmogorov complexity. For software, this is the total information consumed in generating it correctly: the spec, the output and all the intermediate reasoning, debugging and verification along the way.
As AI models improve, they internalize more patterns. The number of tokens needed to produce a given program decreases. But they cannot decrease below the program's Kolmogorov complexity.
That is the floor. The replacement cost of any software is converging toward a well-defined quantity: the irreducible token count times the price per token.
The Formula: K x P x N
Three variables determine software value:
- K (Kolmogorov Complexity): The minimum number of tokens required to correctly generate the software
- P (Price Per Token): The cost of compute for generation
- N (Reuse Number): How many independent systems need this functionality
If N systems each need software with complexity K, and it doesn't exist, each pays K x P to generate it. Total cost to the economy: K x P x N.
If the software exists, that cost is eliminated. The economic value is the total cost saved:
Value = K x P x N
The Race Between P And N
K is the floor: actual token consumption converges toward it but cannot shrink below the irreducible minimum. The interesting dynamics are between P and N.
P is dropping fast. Token prices have fallen by orders of magnitude and will continue falling toward marginal compute cost.
For software value to grow, N must grow faster than P shrinks. This is the key insight: Value concentrates in software where reuse is expanding faster than generation cost is collapsing.
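The race can be sketched numerically. This is a minimal model assuming exponential trends for both P and N; the half-life and doubling-time figures are illustrative assumptions, not forecasts:

```python
import math

def value_at(t_years: float, k: float, p0: float, n0: float,
             p_decline_rate: float, n_growth_rate: float) -> float:
    """Value = K x P(t) x N(t), assuming exponential P decline and N growth."""
    p = p0 * math.exp(-p_decline_rate * t_years)  # token price falls over time
    n = n0 * math.exp(n_growth_rate * t_years)    # reuse grows over time
    return k * p * n

# Hypothetical parameters: P halves every year, N doubles every 9 months.
decline = math.log(2)            # P half-life of 1 year
growth = math.log(2) / 0.75      # N doubling time of 0.75 years
v0 = value_at(0, k=1e7, p0=1e-5, n0=100, p_decline_rate=decline, n_growth_rate=growth)
v3 = value_at(3, k=1e7, p0=1e-5, n0=100, p_decline_rate=decline, n_growth_rate=growth)
print(v3 > v0)  # True: N's growth rate exceeds P's decline rate, so value rises
```

Under this model the sign of the net exponent (N's growth rate minus P's decline rate) alone determines whether value grows or decays.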
What This Predicts
- Low K, Any N (Commoditized): CRUD apps, standard UI patterns, basic integrations. Low complexity means low replacement cost, even at today's token prices. As P drops further, the value approaches zero.
- High K, Low N (Niche): Bespoke simulation tools, specialized compliance logic. High replacement cost, but few systems need it. Value is real but limited.
- High K, High N (Most Valuable): Operating systems. Database engines. Irreducibly complex, and N keeps growing as more systems depend on them.
Venture investors are increasingly betting that infrastructure layers will capture disproportionate value in the AI cycle. The formula explains why: high K, along with N that grows faster than P shrinks.
What Makes K High
It's not the length of the spec. It's not the lines of code. It's the length of the generation process.
Two programs, each 10,000 lines. The first: REST endpoints for 200 database tables. The model reads a short spec, generates the code in one pass and it's done.
The second: a distributed consensus protocol. The model generates an attempt, tests it, discovers a race condition, reasons through the failure and tries again. Another edge case appears. It debugs, refactors and generates again.
Same output length, but vastly different total token consumption. The complexity lives in the generation path, not in the final artifact.
This distinction doesn't shrink as models improve. For genuinely complex software, there is a minimum computation required regardless of how intelligent the solver is.
The Agent Economy Multiplier
As autonomous agents proliferate, N explodes. Every agent needing shared context or coordinated decision-making is a consumer of infrastructure.
Meanwhile, K for that infrastructure is irreducibly high. Providing temporally consistent snapshots across analytical, transactional and semantic queries is genuinely hard.
The result: N is growing faster than P is shrinking. Agent infrastructure gains value even as token prices collapse.
Strategic Implications
- For Founders: Ask two questions. First, is your K irreducibly high? Second, is your N growing faster than P is falling? If both answers are yes, you have a durable business. If either is no, you're in a race against commoditization.
- For Investors: K x P x N is a valuation heuristic. The key metric isn't current N; it's the growth rate of N relative to the decline rate of P.
- For Technical Leaders: Build-versus-buy has a precise answer. Estimate K x P for regeneration. Multiply by how often you'd need to do it. If buying is cheaper, buy, and expect the calculus to shift as P drops.
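The build-versus-buy heuristic above reduces to one comparison. A minimal sketch, where K, P, and the regeneration count are all estimates the reader supplies:

```python
def build_vs_buy(k_tokens: float, price_per_token: float,
                 regenerations: int, buy_price: float) -> str:
    """Compare total regeneration cost (K x P x times regenerated) to purchase price.

    All inputs are estimates: K and P are assumptions about generation cost,
    not observed quantities.
    """
    build_cost = k_tokens * price_per_token * regenerations
    return "buy" if buy_price < build_cost else "build"

# Hypothetical: a 20M-token system at $10 per 1M tokens, regenerated 4 times
# over its lifetime ($800 total), versus a $500 license.
print(build_vs_buy(20_000_000, 10 / 1_000_000, 4, 500.0))  # "buy"
```

Note that the answer is sensitive to P: rerunning the same comparison after a large drop in token prices can flip "buy" back to "build," which is exactly the shifting calculus the article describes.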
The Bottom Line
Generative AI changes how software is produced. It does not eliminate the need for structure, correctness or coordination.
Value is K x P x N. K is the floor. P is falling. The winners are those whose N grows faster than P shrinks.
The strategic question isn't "what can we automate?" It's "what irreducible complexity should we own, and can reuse grow faster than generation costs fall?"
Originally published at Forbes Technology Council