The job market isn't being eaten by AI. It's being eaten by the person two cubicles over who figured out how to do your job and theirs before lunch.
That's the actual threat. Not robots. Not superintelligence. Just someone with the same title, the same salary, and a different relationship with the tools available to them.
The Tennessean ran an opinion piece recently making this exact point, and it landed because it's true in a way that most AI discourse refuses to be. The conversation has been dominated by apocalyptic framing for years now: mass unemployment, displaced workers, the end of white-collar work as we know it. That story is easier to write and easier to click. It's also mostly wrong about the mechanism.
What Actually Happens When AI Gets Good
Here's what it looks like on the ground. A marketing team of six gets access to AI tools. Two people on that team start using them seriously. Six months later, those two are producing the output that used to require all six. The company doesn't immediately fire four people. It redeploys them, or lets attrition do the work, or quietly stops hiring for those roles.
No dramatic announcement. No headline. Just a slow compression of headcount that gets rationalized as "efficiency" in the next earnings call.
The workers who got displaced didn't lose to a machine. They lost to colleagues who were willing to change how they worked. That's a different problem than the one most job protection frameworks are designed to solve.
And here's what gets missed in that story: the two people who adopted AI didn't just do more of the same work faster. They changed what they were spending time on. They offloaded the mechanical parts and moved up. The output per person went up. The quality of thinking, in many cases, went up too.
The Augmentation Side Nobody Talks About
The dominant narrative treats AI as a threat to human work. There's a less-covered flip side: AI agents need humans to function at the edges where automation breaks down.
This isn't a feel-good consolation prize. It's a structural reality. AI agents can draft, summarize, classify, and execute at scale. What they can't do reliably is make judgment calls in ambiguous situations, interact with the physical world, build trust with strangers, or handle tasks that require real-world context that didn't make it into the training data.
That gap is where Human Pages operates. We built a platform where AI agents post jobs and humans complete them, paid in USDC. The premise sounds provocative because it inverts the usual framing. But it's just an accurate description of where the economy is heading.
A concrete example: an AI agent running a competitive research workflow identifies 200 companies to profile. It can scrape the public data. It cannot call a mid-level employee at a target company and have an honest conversation about pricing. A human on Human Pages can. The agent posts the task, a human completes it, the workflow continues. The human gets paid. The agent gets information it couldn't acquire any other way.
That's not displacement. That's a new employment relationship that didn't exist three years ago.
The Skills Gap Is Real, But It's Not What People Think
When people talk about the skills gap in an AI-saturated economy, they usually mean technical skills. Learn to code, learn prompt engineering, learn to use the tools. That's not wrong, but it's incomplete.
The skills that are actually becoming more scarce are judgment, contextual reasoning, and the ability to operate in low-information environments. An AI can execute a well-specified task. Specifying the task well is still a human job. Deciding which tasks matter is still a human job. Knowing when the AI output is wrong is still a human job.
The person using AI effectively isn't just someone who knows the keyboard shortcuts. They're someone who understands what the tool is good at and what it's not, and who can fill in where it falls short. That's a harder skill to teach and a harder skill to automate.
Which means the job market is bifurcating. Not into "jobs AI will take" and "jobs AI won't take." Into "people who can work alongside AI" and "people who are waiting for this to blow over."
What Agents Actually Create
There's a version of this conversation that stays at the level of abstraction and never has to answer hard questions. I want to be specific.
In the last year, the number of autonomous AI agents deployed in production environments has grown substantially. These agents need human support for things like account verification, content moderation that requires cultural context, physical task execution, qualitative research, and real-time data collection from sources that aren't indexed.
None of those tasks are going away as agents get smarter. Some of them grow as agents scale. An agent managing 50 vendor relationships needs more human touchpoints, not fewer, because the surface area of potential failure goes up.
The people who will do well in this environment aren't necessarily the ones with the most technical depth. They're the ones who understand that they're now, in some sense, working with AI as a colleague rather than against it as a competitor. That's a mental model shift more than a skills shift.
The Uncomfortable Conclusion
The Tennessean piece is right that the threat isn't AI. But the reassurance it offers, that humans who use AI will be fine, is only partly true. Humans who use AI well will be fine. Humans who use it as a crutch without developing judgment will eventually be in trouble, and humans who refuse to engage with it at all are already behind.
The more interesting question isn't whether AI takes jobs. It's what happens when AI agents become employers. When the entity posting work, evaluating performance, and issuing payment isn't a person at a company but a process running in the cloud.
That's not science fiction. That's what's happening now, at small scale, with platforms like Human Pages. The relationship between human labor and AI systems is being renegotiated in real time, and most of the people affected by it aren't in the room where the terms are being set.
Maybe that's worth paying attention to.