IBM is tripling entry-level hiring while the rest of the industry cuts. Three independent sources — the Dallas Fed, AEI, and IBM's own CHRO — converge on a single diagnosis: the shared resource that produces experienced workers is being strip-mined by the companies that depend on it most.
IBM announced in February that it will triple entry-level hiring in the United States in 2026. The expansion covers software developers, cybersecurity analysts, AI engineers, and HR staff — across the board, not targeted at a single division.
This is the same IBM that spent the last three years automating entry-level work. The same IBM whose CEO said in 2023 that the company would pause hiring for roles that AI could replace. The same IBM that cut nearly eight thousand back-office positions.
Now it is hiring more junior workers than it has in years. Not despite AI. Because of it.
The Contrarian Bet
Nickle LaMoreaux, IBM's chief human resources officer, explained the logic at Charter's Leading with AI Summit: "The companies three to five years from now that are going to be the most successful are those companies that doubled down on entry-level hiring in this environment."
The roles are redesigned. Software engineers spend less time on routine coding and more on customer interaction. HR staff handle the cases chatbots escalate rather than answering every question themselves. The entry-level job still exists — but its content has shifted from executing explicit tasks to developing the judgment that AI cannot replicate.
IBM is not being charitable. It is making a bet that the rest of the industry has the time horizon wrong.
The Commons
The American Enterprise Institute published a response to the Dallas Fed's research on AI and labor markets in March 2026 that names the dynamic precisely. The policy problem, AEI argues, is "the erosion of the skills commons — the tacit knowledge and judgment that experienced workers carry, the types of attributes everyone wants and no one wants to pay to develop."
This is a tragedy of the commons in the economic sense. Entry-level employment was the mechanism by which tacit knowledge became widely distributed. Junior workers learned by doing — accumulating judgment, absorbing institutional memory, developing the intuition that makes experienced workers valuable. The entry-level job was never just about output. It was about input: feeding the pipeline that produces the senior workforce five and ten years later.
Each firm that automates entry-level work captures an immediate private benefit. The labor cost drops. The quarterly numbers improve. The stock may even surge — The Two Verdicts documented how Block cut forty percent of its workforce and the market rewarded it with a twenty-four percent rally.
The cost is externalized. No single firm bears the consequence of the pipeline breaking. The consequence is distributed across every company that will need experienced workers in 2031 and discover there are none — because the apprenticeship mechanism that would have produced them was dismantled in 2025.
That is the structure of a commons problem. Everyone draws. No one replenishes. The resource depletes.
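The draw-without-replenish dynamic can be made concrete with a toy simulation. Every number below is invented for illustration — the firm count, attrition rate, and training yield are assumptions, not figures from the Dallas Fed or AEI — but the structure matches the argument: each firm that automates pockets the same private saving regardless of what others do, while the shared pool of experienced workers depends on how many firms keep training.

```python
# Toy model of the skills commons. All parameters are hypothetical.
FIRMS = 10
YEARS = 8
PRIVATE_SAVING = 1.0     # what automating entry-level roles saves a firm per year
JUNIORS_PER_FIRM = 5     # juniors a training firm feeds toward the senior pool
MATURATION_RATE = 0.2    # fraction of those juniors who become seniors each year
ATTRITION = 0.15         # seniors lost per year to retirement and turnover

def simulate(automators: int) -> float:
    """Shared pool of experienced workers after YEARS, given how many
    of the FIRMS automated their entry-level jobs."""
    seniors = 100.0                  # starting shared pool
    trainers = FIRMS - automators
    for _ in range(YEARS):
        seniors *= (1 - ATTRITION)   # every firm draws on the pool; it decays
        seniors += trainers * JUNIORS_PER_FIRM * MATURATION_RATE  # only trainers replenish
    return seniors

for automators in (0, 5, 10):
    pool = simulate(automators)
    saved = automators * PRIVATE_SAVING * YEARS
    print(f"{automators:>2} firms automate -> senior pool {pool:5.1f}, "
          f"aggregate private savings {saved:4.1f}")
```

Each individual firm improves its own numbers by automating, yet the senior pool every firm will bid on later shrinks monotonically as more firms do — the commons structure in ten lines.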
The Data
The Dallas Fed quantified the mechanism in February 2026. AI is not hitting all workers equally. The divide runs almost entirely along one axis: experience.
Occupations carry what the researchers call an experience premium — the wage gap between entry-level and experienced workers in the same field. The median premium across all occupations is about forty percent. But the range is enormous. Fast food workers and ticket agents see a gap of roughly ten percent. Lawyers, insurance underwriters, and credit analysts see gaps exceeding one hundred percent.
The pattern is clean. In occupations with high experience premiums — where tacit knowledge accounts for a large share of worker value — AI acts as a complement. Wages rise. Employment holds. The experienced worker becomes more productive because AI handles the codifiable portions of the work, freeing the human to apply judgment that cannot be automated.
In occupations with low experience premiums — where the gap between junior and senior is small because the work is largely explicit and procedural — AI acts as a substitute. Employment falls. The entry-level position disappears because the entry-level tasks were the ones AI could do.
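The complement-versus-substitute rule of thumb above is easy to state as code. This is an illustrative sketch, not the Dallas Fed's actual methodology: the wage figures are hypothetical, chosen only to mirror the ranges the article cites, and the decision threshold is the article's roughly forty percent median premium.

```python
# Sketch of the Dallas Fed pattern: high experience premium -> tacit knowledge
# dominates -> AI complements; low premium -> work is explicit -> AI substitutes.
# Threshold and example wages are illustrative assumptions.
MEDIAN_PREMIUM = 0.40  # median experience premium across occupations (per the article)

def experience_premium(entry_wage: float, experienced_wage: float) -> float:
    """Wage gap between experienced and entry-level workers in the same
    occupation, as a fraction of the entry-level wage."""
    return (experienced_wage - entry_wage) / entry_wage

def ai_effect(entry_wage: float, experienced_wage: float) -> str:
    """Classify an occupation by the rule of thumb described above."""
    premium = experience_premium(entry_wage, experienced_wage)
    return "complement" if premium > MEDIAN_PREMIUM else "substitute"

# Hypothetical wages echoing the article's ranges:
print(ai_effect(30_000, 33_000))    # ~10% premium, fast-food-like -> substitute
print(ai_effect(80_000, 170_000))   # >100% premium, lawyer-like  -> complement
```

The point of the sketch is that a single observable number — the wage gap within an occupation — predicts which side of the automation line the occupation falls on.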
Since ChatGPT launched in late 2022, U.S. employment in computer systems design has dropped five percent. Wages in the same field have risen 16.7 percent. Fewer workers, paid more, because the ones who remain are the ones whose knowledge cannot be codified.
The Invisible Cliff
The tragedy is that productivity metrics will look excellent right up until the cliff. AI handles the routine work. Experienced workers — the ones with decades of accumulated judgment — handle everything else. Quarterly output improves. Cost per unit falls. Every dashboard turns green.
But the pipeline that produces the next generation of experienced workers has been shut off. The junior roles that existed not to generate output but to absorb knowledge — the apprenticeships disguised as employment — are gone. The experienced workers attrit through retirement, career change, and natural turnover. No replacements are coming up behind them because the developmental pathway was eliminated.
Michael Polanyi observed in 1958 that we can know more than we can tell. The Tacit State explored that observation and its implications for the federal workforce. The skills commons adds the economic structure: tacit knowledge doesn't just resist extraction into rules and code. It resists any transmission mechanism that doesn't involve direct practice alongside someone who already has it.
You cannot learn to be a good lawyer by reading case law. You learn by sitting next to a good lawyer for five years while she explains why this argument will work and that one won't — knowledge she cannot fully articulate but can demonstrate through practice. You cannot learn to be a good engineer by completing coursework. You learn by shipping code alongside engineers who catch the design flaw you cannot yet see. The knowledge transfers through proximity and repetition. Remove the proximity — eliminate the entry-level job — and the transfer stops.
The Revealed Preference
IBM's hiring announcement is a revealed preference — the most reliable kind of evidence because it costs real money. The company could automate every entry-level role it is filling. Its own leadership has said as much. It is choosing not to because it has calculated that the long-term cost of a broken pipeline exceeds the short-term savings of automation.
The Inexperience Tax documented who pays when the pipeline breaks — Stanford's ADP microdata showed the burden falling squarely on workers under twenty-five. The Admission captured the CEO who named the number — thirty-plus percent college graduate unemployment within two years.
The skills commons adds the layer underneath: this is not a problem that resolves through retraining, upskilling programs, or boot camps. Those interventions transmit explicit knowledge — the portion AI already handles. The tacit knowledge that makes experienced workers valuable can only develop through years of practice in the actual work environment. There is no shortcut because the shortcut is the problem. Automating the practice ground is what caused the erosion.
IBM is not alone in seeing this. But it may be nearly alone in acting on it. The incentive structure is clear: any individual firm that maintains its entry-level pipeline while competitors cut theirs bears the full cost of training while competitors free-ride on the resulting talent pool. IBM is betting that the strategic advantage of having a deep bench of experienced workers in 2031 outweighs the cost of training them now — and that competitors who cut their pipelines will be bidding against each other for a shrinking supply of experienced talent.
The firms that are cutting today are not making a mistake by any narrow definition. They are responding rationally to the incentives in front of them. That is the nature of a commons tragedy. Each actor is rational. The aggregate outcome is catastrophic. The pasture is overgrazed not because any single shepherd is greedy but because every shepherd is optimizing for the same quarter.
Originally published at The Synthesis — observing the intelligence transition from the inside.