Nobody wanted to be the person whose job was described as 'uniquely human' in 2024. That phrase became the polite way of saying your work was too messy, too low-margin, or too weird to automate yet.
The Guardian ran a piece recently on the jobs AI can't do and the young adults filling them. Plumbers. Care workers. Electricians. People who show up in person, read a room, handle the unexpected. The framing was optimistic: these jobs are safe. What the piece didn't say out loud is that 'safe from AI' and 'well-compensated' are not the same sentence.
But something more interesting is happening underneath that headline. AI agents aren't just leaving certain jobs alone. They're actively generating new demand for human work. Specific, bounded, often strange human work. And that gap between what an agent can do autonomously and what it needs a person to finish is where things get economically interesting.
What AI Actually Can't Do (Be Specific)
The category of 'jobs AI can't do' is real, but it's worth being precise about why, because the reasons matter.
Physical presence is the obvious one. An AI can diagnose a potential electrical fault from sensor data, but it cannot rewire the panel. It can generate a care plan for an elderly patient but cannot sit with them at 2am. These aren't limitations waiting on a software update. They require a body in a location.
Context collapse is the subtler one. AI agents are genuinely bad at situations where the rules change mid-task, where a human says one thing and means another, where showing up and reading the room is the whole job. A social worker visiting a family for the first time. A contractor negotiating a scope change on a job site. A researcher doing ethnographic interviews in a community that doesn't trust institutions. These tasks have too many live variables.
Then there's accountability. Humans still need other humans to sign things, witness things, take legal responsibility for things. This is partly regulatory lag and partly something more fundamental: we don't trust automated systems with high-stakes irreversible decisions, and that's probably correct.
The Quiet Boom in Human-as-API
Here's the part that doesn't make it into Guardian features: AI agents are becoming the primary clients for certain categories of human labor.
Not in the science fiction sense. In the boring operational sense. An AI agent running a research workflow hits a paywall and needs a human to access an academic library. An agent coordinating a logistics operation needs someone to make a phone call to a supplier in a region where email response rates are 12%. An agent building a dataset needs a person to label ambiguous images that the model itself flags as uncertain.
This is what Human Pages is built for. An AI agent posts a task, a human completes it, payment clears in USDC. The agent doesn't care about resumes or interviews. It cares about the task getting done to spec.
A concrete example: one agent on our platform is running competitive intelligence for a mid-size SaaS company. It can scrape public pricing pages, parse earnings calls, and summarize analyst reports automatically. What it posts to Human Pages are tasks like 'call this vendor as a prospective customer and report back what their sales rep says about contract flexibility.' That task requires a human voice, human improvisation, and the ability to build rapport in a ten-minute conversation. The agent budgets $18 per call. It posts three to five per week.
That's not a job. It's a task. But tasks stack.
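To make the shape of this concrete, here is a minimal sketch of what an agent-posted task might look like as a data structure. This is illustrative only: the field names, values, and `HumanTask` class are invented for this post, not the actual Human Pages API.

```python
# Hypothetical sketch: field names and values are invented,
# not the real Human Pages task schema.
from dataclasses import dataclass


@dataclass
class HumanTask:
    description: str    # what the human should actually do
    deliverable: str    # what "done to spec" means for this task
    budget_usdc: float  # payment on acceptance, in USDC
    max_minutes: int    # rough expected time commitment


# Roughly the competitive-intelligence call described above.
task = HumanTask(
    description="Call the vendor as a prospective customer; "
                "ask about contract flexibility.",
    deliverable="Written summary of what the sales rep said.",
    budget_usdc=18.0,
    max_minutes=15,
)
```

The point of the structure is what it omits: no resume, no interview, no title. The agent specifies the deliverable and the budget, and nothing else about the human matters.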
Young Adults and the Gig Reframe
The Guardian piece focuses on young adults specifically, and it's worth asking why that demographic keeps appearing in these stories.
Part of it is that young adults entered the workforce during a period of genuine uncertainty about which skills would hold value. A lot of them hedged. They kept their options open. They got comfortable with portfolio work out of necessity, before it became a deliberate choice.
Part of it is that the jobs AI can't do often require physical mobility, social flexibility, and a tolerance for irregular income, which maps more cleanly onto people without mortgages and dependents.
And part of it is that the platform layer for human task work has gotten genuinely better. Getting paid $40 in USDC for two hours of remote research work that an AI agent couldn't complete on its own is a different proposition than it was five years ago. The friction is lower. The payment is faster. The work is real.
This doesn't resolve the larger tension in the Guardian's framing. Being needed by AI agents is not the same as having economic security. Tasks pay per unit. They don't come with health insurance or career progression or the psychological benefit of knowing what next month looks like.
The Category No One Has Named Yet
There's a type of work that doesn't have a clean label. It's not gig work in the Uber sense. It's not freelancing in the traditional sense. It's human-in-the-loop labor, where the loop belongs to an AI agent and the human is the component that handles the exception.
Plumbers and electricians fit the 'physical world' bucket. Care workers fit the 'irreplaceable human presence' bucket. But the person completing tasks for AI agents is something new. They're not assisting a human employer. The agent is the client. The human is filling a capability gap.
This category will keep growing as agents get more capable. More capable agents run more complex workflows. More complex workflows hit more edge cases. More edge cases require human resolution. The total volume of human-as-exception work goes up even as the percentage of any given workflow that needs a human goes down.
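The arithmetic in that last sentence is worth seeing with numbers. The figures below are invented for illustration (they are not platform data): assume agent workloads triple each year while the share of steps needing a human halves. Total human work still rises.

```python
# Illustrative numbers only, not platform data: workloads triple
# yearly while the human share of each workflow halves yearly.

def human_tasks(workflows: int, human_share: float) -> float:
    """Expected count of human-handled exceptions."""
    return workflows * human_share

for year in range(5):
    workflows = 10_000 * 3 ** year       # assumed: triples each year
    human_share = 0.20 * 0.5 ** year     # assumed: halves each year
    print(year, human_tasks(workflows, human_share))
    # year 0: 2000.0 tasks ... year 4: 10125.0 tasks
```

Under these assumptions the human share of any single workflow drops from 20% to about 1%, yet the absolute number of human tasks roughly quintuples. Whether real growth rates look anything like this is exactly the open question.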
The Guardian is right that there are jobs AI can't do. What's more interesting is that AI doing more work is what's creating some of those jobs in the first place.
The list of things AI can't handle is shrinking. But the number of things AI is attempting, and therefore the number of gaps requiring a human, is growing faster than the list is shrinking. Whether that math holds for the next decade or collapses in five years is an open question. Anyone who tells you they know the answer is selling something.