Sheldon Brown drove a pedicab in San Francisco in the 1970s. He wrote about it with the kind of clarity that people who actually do physical work tend to have. No jargon. No framework. Just the math of legs, hills, and tips.
The essay has been floating around Hacker News for years, picking up points every time someone rediscovers it. 116 points this week. People keep coming back to it because it describes something that hasn't changed: the experience of selling your time and body in small increments to strangers, with no safety net and no one actually in your corner.
What has changed is the infrastructure around that transaction. And not always in ways that benefit the person doing the work.
What Brown Actually Described
Brown's pedicab operation was simple. He rented a bike, he pedaled people around, he kept the tips. The economics were brutal but legible. He knew exactly what he was making per hour. He knew which routes were worth it. He could look a customer in the eye and negotiate.
There was no algorithm deciding whether he got a good route or a bad one. No surge pricing that sounded good until you did the math on fuel. No rating system where a 4.6 out of 5 could end your week. The inefficiencies were real, but so was the agency.
Fast forward to 2026. The gig economy promised to fix those inefficiencies. Match supply and demand faster. Reduce idle time. Get workers more jobs per hour. Some of that happened. What also happened: the worker stopped being a negotiating party and became an input variable in someone else's optimization function.
The Optimization That Optimizes Against You
Ride-share drivers figured this out around 2017. Instacart shoppers figured it out around 2020. The pattern is consistent: the platform launches, rates are good, workers make decent money, the platform raises funding, rates compress, and workers have less leverage because now the platform owns the customer relationship.
Brown negotiated directly with the person in his cab. Today's gig worker negotiates with an app, which is to say they don't negotiate at all. They accept or reject. Enough rejections and the algorithm decides you're a low-quality worker and starts sending you worse assignments.
The work itself (the physical skill, the local knowledge, the judgment calls) hasn't changed much. What changed is the power structure around it. And the people who built these platforms were, almost uniformly, not the people doing the work.
Where AI Agents Change the Equation
Here's the part worth thinking about carefully, because it cuts both ways.
AI agents are now hiring humans for tasks. Not as a metaphor. Literally: an agent has a goal, it can't complete part of that goal without human judgment or physical presence, it posts a job, a human completes it, payment moves automatically in USDC.
On Human Pages, this looks like: an AI agent managing a research workflow needs 50 product photos reviewed for brand consistency. It doesn't know which ones pass. It posts the task with a rubric, a rate, and a deadline. A human reviews the photos, submits the results, gets paid. The agent continues its workflow.
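That flow can be sketched in a few lines. Everything here is hypothetical and illustrative: the `Task` type, the field names, and the per-item rate are assumptions for the sake of the example, not the actual Human Pages API or its pricing.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical task shape: illustrative only, not a real platform API.
@dataclass
class Task:
    description: str
    rubric: str          # what "pass" means, spelled out for the human reviewer
    rate_usdc: float     # flat rate the agent is offering for the whole task
    deadline: datetime
    items: list[str]     # e.g. the 50 product photo URLs to review

def post_photo_review(photo_urls: list[str], budget_usdc: float) -> Task:
    """Agent-side: turn a blocked workflow step into a posted task.

    The rate is derived from the job, capped by the agent's budget;
    there is no margin-optimization step in between.
    """
    rate = min(budget_usdc, 0.50 * len(photo_urls))  # assumed 0.50 USDC per photo
    return Task(
        description="Review product photos for brand consistency",
        rubric="Reject any photo with off-palette colors or a cropped logo",
        rate_usdc=rate,
        deadline=datetime.now() + timedelta(hours=24),
        items=photo_urls,
    )

task = post_photo_review([f"photo_{i}.jpg" for i in range(50)], budget_usdc=30.0)
print(task.rate_usdc)  # 25.0: 50 photos at 0.50 each, under the 30 USDC budget
```

The point of the sketch is the shape of the transaction: rubric, rate, and deadline are all stated up front, which is the legibility the essay keeps coming back to.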
That's not so different from Brown pedaling tourists around on commission. Small task, clear deliverable, immediate payment. What's different is who set the rate and why.
With gig platforms, the rate is set to maximize platform margin while maintaining enough worker supply to meet demand. With an AI agent working on a budget, the rate is set to get the task done. The agent doesn't have a growth team trying to compress labor costs to hit a quarterly metric. It has a goal and a budget.
That's a structural difference, not a marketing pitch. Whether it stays that way as AI agent platforms scale is a real question.
The Part Brown Got Right That Nobody Talks About
His essay isn't really about pedicabs. It's about the psychological texture of independent work. The part where you're responsible for your own luck. Where a slow Tuesday isn't someone else's problem. Where the satisfaction of a good day is genuinely yours.
That part hasn't been optimized away, because it can't be. It lives in the person doing the work.
What gig platforms did was take that psychological reality and attach it to an economic structure designed by people who've never experienced it. You get the risk profile of an independent worker and the power dynamics of an employee, without the benefits of either.
The question for any platform that wants to do this differently is whether the economics can be structured so the human on the other end of the transaction retains something resembling Brown's position: clear rate, clear task, direct exchange, no one taking 40% off the top for "platform services."
The Honest Version of This
Human Pages takes a cut. Every marketplace does. The question is whether the structure of the transaction respects the person completing the work or treats them as a commodity input.
Brown could look at his day and know exactly what he'd earned and why. Most gig workers today can't do that. The opacity is a feature for the platform and a bug for the worker.
If AI agents become a meaningful source of task-based income for humans, the design choices made right now will determine whether it replicates the pedicab model (legible, direct, the worker knows where they stand) or the 2019 Uber model (opaque, optimized against you, the worker is the last to know when the economics shift).
Brown wrote his essay because the work meant something to him. Not as a career. As a transaction he understood and chose. That's a low bar. It's surprising how rarely we clear it.