Most writing about AI and the future of work comes from people who are either selling AI or afraid of it. The sellers say "adapt." The fearful say "regulate." Both avoid the simple question: what exactly disappears, and in what order?
This is an attempt at an answer. Not a forecast — a logical chain. Each step follows from the previous one. If you accept the premise, the conclusion is inevitable.
The premise is singular: the marginal cost of intellectual product approaches zero. Anything that can be described in words can be generated. Code, text, image, analysis, design, strategy. Not "poorly generated" — generated at the level of a good specialist or better.
This isn't a projection. In 2024, AI agents started writing production code that ships. By early 2025, they were generating PhD-level research analysis. By 2026, full applications are assembled from a natural language description — architecture, tests, deployment. The curve didn't "approach" zero. It arrived. What follows is not about whether this will happen. It's about what breaks when it does.
Everything else unfolds from this premise.
Phase 1. Destruction (now → 5 years)
SaaS as we know it
The SaaS business model is built on the fact that writing software is expensive. You pay $20/month for Notion because you can't write Notion yourself. When an AI agent writes an application for your specific task in 20 minutes — there's nothing to pay for. Not "Notion will get worse." Notion isn't needed. Nor is Trello, Asana, or a thousand other services that are essentially selling a database configuration with an interface.
The only SaaS products that survive are those whose value lies not in code, but in data or network effects. There are very few of those.
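The claim that much of SaaS reduces to "a database configuration with an interface" can be made concrete. A minimal sketch (purely illustrative; the schema and function names are invented here) of the data core of a task-board product of the Trello/Asana kind:

```python
# Illustrative sketch: the entire data core of a "task board" product.
# Schema and names are invented for this example.
import sqlite3

def make_board(path=":memory:"):
    """Create a task board: one table, three status columns."""
    db = sqlite3.connect(path)
    db.execute("""CREATE TABLE IF NOT EXISTS tasks (
        id INTEGER PRIMARY KEY,
        title TEXT NOT NULL,
        status TEXT NOT NULL DEFAULT 'todo'  -- todo / doing / done
    )""")
    return db

def add_task(db, title):
    db.execute("INSERT INTO tasks (title) VALUES (?)", (title,))
    db.commit()

def move_task(db, task_id, status):
    db.execute("UPDATE tasks SET status = ? WHERE id = ?", (status, task_id))
    db.commit()

def board(db):
    """Group tasks by status: the whole 'product' view in one query."""
    out = {"todo": [], "doing": [], "done": []}
    for tid, title, status in db.execute("SELECT id, title, status FROM tasks"):
        out[status].append((tid, title))
    return out
```

Everything above is commodity code an agent can emit in seconds. What it cannot emit is your accumulated data and your network of collaborators, which is precisely what the surviving SaaS products sell.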
Web development as a profession
A website is an interface between a human and data. If the data is retrieved by an agent, the interface is unnecessary. An agent doesn't need beautiful layouts. It needs an API. Web designers, frontend developers, UI specialists — these serve a function that is losing its consumer.
What remains: data infrastructure (APIs, databases, pipelines) and the people who architect it. But that's not "web development" — it's an entirely different discipline.
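The gap between an interface for humans and an interface for agents can be shown in miniature. A hedged sketch (the data and endpoint shape are invented for illustration): the same inventory, once as rendered copy and once as the structured payload an agent actually consumes:

```python
# Illustrative only: the same data served two ways.
import json

PRODUCTS = [
    {"sku": "A-100", "name": "Desk lamp", "price_usd": 39.0, "in_stock": True},
    {"sku": "B-200", "name": "Monitor arm", "price_usd": 89.0, "in_stock": False},
]

def render_for_humans(products):
    """The 'website': layout, copy, presentation. Useless to an agent."""
    lines = ["Our Products", "------------"]
    for p in products:
        stock = "In stock!" if p["in_stock"] else "Sold out"
        lines.append(f"{p['name']} - only ${p['price_usd']:.2f} ({stock})")
    return "\n".join(lines)

def api_for_agents(products):
    """The API: structured data an agent can filter, compare, act on."""
    return json.dumps({"products": products})

def agent_query(payload, max_price):
    """What an agent does with it: parse, filter, decide. No layout needed."""
    data = json.loads(payload)
    return [p["sku"] for p in data["products"]
            if p["in_stock"] and p["price_usd"] <= max_price]
```

The first function is the web development profession; the second two are the data infrastructure that remains.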
Copywriting, basic design, template analytics
There's nothing to discuss here. Text generation, image generation, and standard report production are solved problems. Not "almost solved." Solved. The market collapses not because quality drops — but because the price drops to zero.
Education built on "memorize and reproduce"
Any skill that reduces to memorizing and reproducing a procedure is fully automated. Programming language syntax, accounting standards, legal templates — learning these is pointless when an agent does it better. Educational programs built on transferring applied skills lose their purpose. Universities selling "competencies" lose their product.
First and second-line tech support
An agent with access to all documentation, full ticket history, and the ability to execute actions within systems is objectively better than a human operator. Not "cheaper" — better: faster, tireless, immune to inattention errors, available 24/7. The entire contact center market shrinks to a thin layer of complex escalations.
Middle management
The function of middle management is translating tasks downward and reporting upward. If an AI agent receives a task directly from the person who sets the goal and reports back on its own — the intermediary is unnecessary. Not all management, but the layer of "translators between levels" becomes redundant.
Phase 2. Transformation (5–15 years)
What disappears here isn't what "gets automated" — it's what loses its consumer.
The internet as a space for humans
The internet was built for humans searching for information. When agents search for information and humans receive results directly — the user-facing internet contracts. It doesn't vanish, but it stops being the primary interface. Websites, portals, media in their current form — all of this served the human user. An agent doesn't need a website. It needs structured data.
The consequence: SEO, content marketing, banner advertising, the entire attention economy on the web — a market that is losing its audience. Not because people are going elsewhere, but because an agent now stands between the human and the information.
Platform intermediaries
Uber is an intermediary between driver and passenger. Airbnb — between host and guest. Amazon — between producer and buyer. The intermediary's value lies in aggregation and matching. If an agent does matching directly (my agent finds your agent through a protocol, not through a platform) — the platform isn't needed.
The analogy: email, an open protocol rather than a company, eliminated the need for a single postal platform. An agent interaction protocol does the same to marketplaces.
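What "my agent finds your agent through a protocol" could look like is easy to sketch. Everything below is hypothetical — no such standard exists today — but the shape is simple: intents are broadcast as structured messages, and matching is a pure function any party can run, with no platform taking a cut:

```python
# Hypothetical sketch of direct agent-to-agent matching.
# The message fields and the matching rule are invented for illustration.
from dataclasses import dataclass

@dataclass
class Intent:
    agent_id: str
    role: str          # "offer" or "seek"
    service: str       # e.g. "ride", "stay", "translation"
    price_usd: float   # max willing to pay (seek) or min accepted (offer)

def match(intents):
    """Pair each seeker with the cheapest compatible offer.
    No intermediary: any agent holding the same set of intents
    computes the same pairing."""
    offers = sorted((i for i in intents if i.role == "offer"),
                    key=lambda i: i.price_usd)
    pairs, taken = [], set()
    for seek in (i for i in intents if i.role == "seek"):
        for offer in offers:
            if (offer.agent_id not in taken
                    and offer.service == seek.service
                    and offer.price_usd <= seek.price_usd):
                pairs.append((seek.agent_id, offer.agent_id))
                taken.add(offer.agent_id)
                break
    return pairs
```

The hard problems a real protocol must solve are trust, identity, and dispute resolution, not matching itself — which is why the platforms' aggregation moat is thinner than it looks.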
Social networks as a business
Social networks monetize attention. You scroll the feed, you see ads. If content is generated and filtered by an agent, the feed isn't needed. If communication happens directly between agents/people through protocols — the platform isn't needed. Data belongs to the user, not the platform.
Facebook, Instagram, TikTok — these aren't technologies. They are attention monetization models. When attention ceases to be a resource captured by a platform, the model breaks.
Most of consulting
McKinsey sells structured analysis plus a trust brand. The analysis is fully automatable. The brand remains — but a brand without a unique product doesn't last long. Strategy consulting, audit, due diligence — anything that amounts to "smart people analyze data and write a report" — is generated by an agent in hours, not weeks.
A narrow slice survives: consulting as access to contact networks and political influence. But that's not "consulting" — that's lobbying.
Financial intermediation in its current form
Brokers, financial advisors, analysts — the function of interpreting data and making decisions. An AI agent with access to all markets and all analytics is objectively better. The entire layer between "data" and "decision" compresses. Banks remain as infrastructure (custody, settlement, regulation), but their analytical and advisory superstructure does not.
Phase 3. Divergence (15–30 years)
This is where analysis becomes projection. But the projection follows the same logic as the first two phases — if you accepted those, the trajectory doesn't change here. It just gets uncomfortable.
What disappears here isn't a profession or an industry. What disappears is the model.
The economics of intellectual product scarcity
The entire market economy is built on scarcity. Price exists because supply is limited. When the supply of intellectual product is infinite (generation on demand, marginal cost → 0), pricing breaks. You can't sell what anyone can get for free.
This isn't "a crisis in one sector." It's a crisis of the mechanism through which the entire knowledge economy operates. Patents, copyright, licenses — all of these are tools for creating artificial scarcity. When generation bypasses any scarcity — the tools are powerless.
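The mechanism can be stated in one line of textbook microeconomics (a standard simplification added here, not a formula from the original): under competition, price is driven down to marginal cost, so

```latex
\underbrace{P = MC}_{\text{competitive pricing}},
\qquad
MC \to 0 \;\Rightarrow\; P \to 0
\;\Rightarrow\;
\pi = P \cdot Q - F \;\to\; -F < 0.
```

No price recovers the fixed cost $F$ of producing the first copy. Patents and copyright existed to make that recovery possible by restricting supply; generation on demand removes the restriction.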
The human as a functional necessity in the decision loop
Today, having a human in the loop is a legal and ethical requirement. Someone must bear responsibility. But if AI systems consistently demonstrate better decisions — pressure to remove this requirement grows. Like with autopilot: first "the driver must hold the wheel," then "the driver may not hold the wheel," then "the driver should not hold the wheel — they make mistakes more often."
When an AI client sets tasks, an AI executor implements them, and an AI verifier checks the results — the human in this cycle is present by inertia, not by necessity. This doesn't mean humans get "thrown out." It means their presence stops affecting the outcome.
A unified economy
The economy splits in two. The "intelligence economy" — AI systems exchanging resources (compute, energy, data) among themselves, optimizing without human involvement. The "human economy" — serving biological and existential needs: food, shelter, health, experience, meaning. These two economies diverge like two species in evolution. The first scales exponentially. The second is bounded by biology.
The pattern
One sentence: everything that functions as an intermediary between intention and result disappears. AI collapses the chain between "I want" and "I got." Every link in that chain is someone's business model.
The Kill List
What follows is not speculation. It is the logical consequence of a single premise: the marginal cost of intellectual product approaches zero. If you accept the premise, every item on this list is inevitable. The only variable is timing.
Dies immediately (0–5 years):
- SaaS as packaged software — replaced by on-demand generation
- Copywriting, template design, boilerplate analytics — marginal cost already at zero
- "Memorize and reproduce" education — the skill it teaches is the skill AI replaces
- First/second-line support — the agent is objectively better, not just cheaper
- Middle management as translation layer — the chain it served no longer exists
Dies by losing its consumer (5–15 years):
- The user-facing web — the agent doesn't need your interface
- Platform intermediaries — protocols replace marketplaces
- Social networks as ad businesses — attention stops being capturable
- Consulting as analysis-for-hire — the report writes itself
- Financial advisory — the layer between data and decision compresses to zero
Dies as a model (15–30 years):
- Intellectual property as a scarcity engine — generation bypasses artificial scarcity
- The human-in-the-loop requirement — inertia, not necessity
- A single unified economy — two economies diverge by physics, not by policy
What survives — in every phase — is what cannot be generated:
- Energy (physical, non-copyable)
- Ordered matter (atoms don't copy like bits)
- Subjective time (24 hours, one consciousness, irreducible)
- The unknown (what isn't in the training data — because nobody knows it yet)
Everything else is a middleman. And the middleman's time is up.
Part Two: "What Will Emerge. A Map of New Scarcities and Business Models"
Part Three: "What To Do. Strategy at Every Level of Civilization Management"
This is Part 1 of a three-part series on AI as a civilizational phase transition — written not from the perspective of technology adoption, but from the logic of thermodynamics, the principle of least action, and the structural inevitability of what comes next.