If you trace the lineage of the personal computing revolution back to its roots in the 1970s and 80s, you won’t find corporate boardrooms or venture capital firms. You’ll find idealists. You’ll find the counterculture hippies of the Whole Earth Catalog, the tinkerers of the Homebrew Computer Club, and the hackers of the Cult of the Dead Cow: people who believed that democratizing information would emancipate humanity. For a brief, shining moment, the tech industry felt like the one place on earth where you could actually change the world for the better.
Even as the industry corporatized, that idealism lingered. We saw it in Google’s famous foundational mandate: "Don't be evil." It was a promise that tech could be immensely profitable without losing its moral compass.
But today, that promise feels like it's dying a slow, agonizing death.
Instead of a unified global village, we are left with a balkanized internet of walled gardens. The products we rely on are actively undergoing what author Cory Doctorow so aptly named "enshittification." Platforms that once served users are now decaying into extraction machines. They get more expensive while degrading in quality. They are bloated with intrusive spyware and saturated with features absolutely no one asked for, like AI copilots forced into every corner of our digital lives just to appease shareholders. We are no longer the users; we are the product, the data points, the marks.
I hope for a new ethic in tech, and I have ideas about what it should look like.
I know exactly how that sounds. I am acutely aware of the mockery that awaits anyone who talks about "ethics" in Silicon Valley. If you’ve seen the HBO show Silicon Valley, you remember the parody of "Tethics" (tech ethics): a hollow, performative corporate branding exercise used by narcissistic founders to dodge accountability. Proposing "tech ethics" today sounds like a punchline because we’ve seen too many ethics boards dissolved the moment they stand in the way of quarterly growth.
We don't need a new corporate compliance framework. We need a fundamental shift in how we view the people who use our tools.
I look at this through the lens of my faith, but the underlying moral framework requires no theology to understand. It is a framework that appeals just as much to the staunch atheist as it does to the believer, because it is rooted in pure, unglamorous action.
In Baha'i history, there is a story of an early American adherent named Lua Getsinger. She deeply wanted to serve God and asked the head of the Faith how to do it. He didn't tell her to pray more or do something equally abstract. Instead, he sent her to the home of a destitute, chronically ill man.
When Lua arrived, the conditions were so filthy and the smell so overwhelming that she fled. But her excuse was rejected. She was told that if she truly wanted to serve the divine, she had to serve her fellow man. She was sent right back with strict instructions: clean his house, bathe his body, and feed him. She was told not to return until the work was done.
You don't need a religion to understand the weight of that lesson. The translation is simple: Lofty ideals mean absolutely nothing if you aren't willing to do the messy, tangible work of improving the reality of the person right in front of you.
This is the standard of service. It is not an abstract mission statement on an "About Us" page. It is recognizing the inherent dignity of another human being.
What would the technology sector look like if it were governed by this principle?
If we truly respected the dignity of our users, we could never justify injecting spyware into their devices to harvest their private lives. We could never justify trapping them in addictive algorithmic loops to sell their attention.
To "clean the house and feed the hungry" in the digital age means building technology that acts as a genuine public good. It means designing software that respects human autonomy, protects privacy, and solves actual problems rather than fabricating new ones to monetize. It means teaching our engineers, our designers, and our algorithms how to serve humanity, rather than exploit it.
Tech can still be a place where people go to change the world. The tools we are building today have more potential to uplift humanity than the 1970s idealists could have ever dreamed. But the internet will not be saved by a new feature, more LLMs, or a faster computer. It will only be saved when the people building it decide to roll up their sleeves, look at the people they are building for, and remember what it actually means to serve.