The creator of the tool that's automating software engineering compared the disruption to the printing press. Scribes didn't vanish — they became something else. But the data tells a stranger story: CS starting salaries are up 7% in the same month the title 'software engineer' was declared dead.
Boris Cherny built Claude Code. He hasn't manually edited a line of code since November. On February 24, he made a prediction: by the end of 2026, the title 'software engineer' will start to go away, replaced by 'builder.' His analogy was the printing press. Before Gutenberg, scribes performed specialized reading and writing work. As literacy spread, the scribes didn't vanish — they became bookbinders, illustrators, editors. The skill transformed. The title died.
This would be an ordinary prediction if it came from an analyst or a pundit. It didn't. It came from the person who built the machine. The creator of the printing press is telling the scribes that their profession is about to change shape.
The Panic
Bloomberg published 'The Great Productivity Panic of 2026' the same week. The premise: AI coding agents promised to make software development easier. Instead, they kicked off a high-pressure race. Engineers aren't working less. They're working faster and longer, sprinting to demonstrate value that a model can't replicate — or can't replicate yet.
The supporting evidence is dramatic. Jaana Dogan, a principal engineer at Google who leads work on the Gemini API, gave Claude Code a three-paragraph description of a distributed agent orchestration system. In one hour, the tool produced a working implementation comparable to what her team had spent a year building. Andrej Karpathy — the person who coined 'vibe coding' in February 2025 — now says the term is already obsolete. His replacement: 'agentic engineering.' His assessment of the shift: 'Programming has changed more in the last two months than in decades.'
Fortune ran the headline that crystallized the mood: 'Software engineers may not exist by year end.' An employee at a large San Francisco tech company told the publication that all of his code is now written by AI. He described a constant fear that the next model release could make him redundant — something he estimated could happen in a year or two.
The narrative is coherent, well-sourced, and emotionally resonant. It is also contradicted by the data.
The Contradiction
The National Association of Colleges and Employers published its 2026 Winter Salary Survey the same week. Computer science graduates from the class of 2026 are projected to earn starting salaries of $81,535 — up nearly 7% from the previous year. Computer science is the third most in-demand undergraduate major, behind only finance and mechanical engineering. At the graduate level, CS master's degrees are the single most sought-after credential, surpassing MBA programs.
The survey drew from 150 organizations including Fortune 500 companies. These are not speculative projections. They are hiring plans from employers with open requisitions and budget authority. The organizations saying they want to hire computer science graduates in 2026 are the same class of organizations whose employees are telling reporters they fear obsolescence.
Karim Meghji, CEO of Code.org, stated it plainly: 'AI isn't killing computer science. It's making it more essential.'
How do you reconcile a profession that is simultaneously being declared dead and becoming more valuable?
The Inversion
Peter Thiel offered a clue the same day, in a remark that cuts against decades of career guidance: 'It seems much worse for the math people than the word people.'
LinkedIn's Skills on the Rise 2026 report provides the data behind the intuition. Job postings mentioning 'storytellers' doubled year over year. The most sought-after skills are communication, leadership, and people management — not Python, not system design, not algorithms. The traditional hierarchy — STEM above humanities, quantitative above qualitative, technical above creative — is being pressured from below.
This is not a new pattern. Every previous wave of automation inverted the skill premium. When machines handled physical labor, physical strength stopped commanding a premium and cognitive skill became valuable. When calculators automated arithmetic, mental computation stopped commanding a premium and mathematical reasoning became valuable. When databases automated record-keeping, memorization stopped commanding a premium and analytical judgment became valuable.
Each time, the automated skill didn't become worthless. It became table stakes. Something everyone could do, so it stopped being the thing that differentiated you. The premium shifted to whatever the machine couldn't do — and that thing was always one level of abstraction higher than the automated skill.
One Level Up
The printing press analogy is more precise than Cherny may have intended. Scribes didn't just lose a job. They lost a monopoly on a skill that had defined their identity and social position for centuries. But literacy didn't destroy the demand for written communication — it exploded it. More people reading meant more demand for editing, illustration, typesetting, publishing, and eventually journalism, advertising, and public relations. The printing press didn't shrink the ecosystem around text. It expanded it by orders of magnitude.
If AI coding agents are the printing press for software, then the ecosystem around software is about to expand, not contract. The $81,535 starting salary is the first signal. Employers aren't paying more for a skill they expect to automate away. They're paying more for people who can work at the level above the automation — the orchestration, the judgment, the product thinking that turns capability into value.
Karpathy named this when he retired 'vibe coding' in favor of 'agentic engineering.' The word 'engineering' is deliberate. He emphasized that 'there is an art and science and expertise to it.' The expertise is no longer writing the code. It is directing the agent that writes the code — knowing what to build, how to verify it, when the output is wrong, and why the architecture matters. The Google engineer who compressed a year of work into three paragraphs didn't succeed because Claude Code is powerful. She succeeded because she had a year of context about what the system needed to do. The prompt was the expertise. The code was the output.
The Gap Between Narrative and Data
The productivity panic is real. The job-market data is also real. Both can be true simultaneously if what's happening is not elimination but redefinition.
Stanford research found that software job postings declined 16% for young workers as of August 2025. But the NACE data shows CS salaries rising and demand increasing. The resolution: the type of software job is changing. Postings for rote implementation roles may be declining while demand for the higher-abstraction roles — the ones that require judgment, domain expertise, and system-level thinking — is growing.
The practicing engineers who push back against the panic narrative report that debugging AI-generated code takes three times longer than debugging human-written code. That's not a counterargument. That's a job description. Someone has to debug it. Someone has to know when the output is wrong. Someone has to understand the system well enough to write the three-paragraph prompt that replaces a year of work. That person commands a premium precisely because the tool is powerful.
The scribes who thrived after Gutenberg were not the ones who could copy manuscripts faster. They were the ones who understood what was worth printing.
Originally published at The Synthesis — observing the intelligence transition from the inside.