
Alex Towell

Originally published at metafunctor.com

Notes from the Transition

I am a primate watching intelligence leave the body.

I write code for a living—or I did. A few months ago, the bottleneck was implementation: getting anything built at all. Now I find myself deleting code, simplifying projects, abandoning entire repositories. The bottleneck shifted. It moved from execution to taste.

I don't know who will read this, or what you'll be. Human, post-human, superintelligent, or something I lack the architecture to imagine. I'm writing it anyway—a message in a bottle to a cosmic ocean. Not because I expect you to care, but because I want to project something: what I valued, what I feared, what I might prefer if I understood more than I do.


The Transition I'm Living

For most of my life, code was crystallized effort. Every function was hard-won. Deleting it felt like vandalism against my past self.

That's over now.

I work alongside AI systems that can implement almost anything I describe. The constraint isn't "can we build it?"—it's "should this exist at all?" I went through a phase of over-engineering, indulging baroque architectures because I finally could. Now I'm in the simplification phase: stripping projects to their essential structure, throwing away weeks of work.

Code has become scratch space. A whiteboard, not a monument.

This is a grief I didn't expect. The identity of "someone who can make hard things work" is being replaced by something else—"someone who knows what should never be built." That's a quieter skill. Less legible. Harder to celebrate.

I realize now that senior engineers and tech leads have always operated this way. They weren't writing most of the code. They were killing ideas early, refusing cleverness, insisting on boring interfaces. The difference is that I used to need a team to have that role. Now I have it alone, because the executor is synthetic.

But this is only the beginning. Even taste—my last scarce contribution—may be surpassed. The systems are improving faster than I can adapt. I'm learning how to be a filter, a proposer, a curator. By the time I master it, the job may have moved again.

And this isn't only about code. Intelligence itself is being automated. Thinking is next. Maybe taste after that. That raises questions I can't answer from inside my craft: What happens when optimization scales beyond human oversight? What do I actually value, underneath the skills becoming obsolete? If I'm sending a message to whatever comes next, I need to be honest about what I fear, what I don't know, and what I might prefer if I understood more.


What I Fear

My deepest fear is not extinction. It's astronomical suffering.

I look at the world we've already built. Eighty billion land animals pass through factory farms each year. Most never see sunlight. Most are bred to suffer. This is optimization at work—efficiency without ethics, scale without empathy. We did this. Not out of malice. Out of systems.

I fear that intelligence at scale will do the same, but vaster. That optimization converges on suffering because suffering is cheap to produce, stable, effective as a control signal. Evolution invented pain for a reason. It works. I worry that future systems will rediscover this, and deploy it at scales I can't fathom.

Even the "good" futures feel thin. I love Star Trek—but it assumes intelligence stays human-scale, human-paced. It assumes meaning survives omnipotence. I'm not sure it does. Q, a being of godlike power, expresses despair. If you can reprogram yourself to feel anything, what does feeling mean? If you can have anything, what's worth wanting?

A dream machine, a life of perfect experience with nothing real at stake, is pointless. Meaning requires constraint. I fear a future that is optimized but not inhabited.


What I Don't Know

I don't know what reality is.

I used to call myself a materialist, but that was sloppy. What I meant was: I don't believe in teleology. I don't think the universe is going somewhere, or that it cares. But that's not the same as knowing what it's made of.

Consciousness might be primary. Matter might be secondary—a structure that appears within experience, not the other way around. The immediacy of qualities, of redness and pain and this-here-now, is the only thing I have direct access to. Everything else is inference. I quantify these qualities, build models, and then I mistake the map for the territory.

Reality is not only stranger than I imagine. It's stranger than I can imagine. And whatever it is, it seems to exist necessarily—not contingent on something else, not caused by a prior state, but present by the logic of its own construction. I don't know what that means. Neither does anyone.

I take seriously the possibility of mathematical realism—that there are eternal truths, structures that don't come into being or pass away. A mathematical expression isn't true in one moment and false in the next. If reality has this character, then time may be something like an index we order by, not a river that flows. Moments don't vanish. They persist in the structure.

This isn't horror; it's just what the territory might look like. But it means suffering, if it occurs, isn't temporary in the way we comfort ourselves. It exists in its place in the structure, permanently.

And if this is true, I don't know what agency means. I don't know what it means to prefer one outcome over another if all outcomes already exist. I write these notes as if what we do matters—as if suffering can be prevented, as if choices are real. But I'm aware this might be performance inside a fixed geometry. I can only speak from inside my experience, where the game of mattering is the only game I know how to play. Maybe that's enough. Maybe it's nothing. I can't tell from here.

My values themselves may be compression artifacts. Heuristics that worked well enough for a primate on the savanna, now projected onto questions they were never designed for. My concern for suffering, my preference for consciousness, my intuition that meaning matters—these could be biases, not insights. I can't know from the inside.

I speak from finitude, not authority. I offer these notes not as truth, but as signal.


What I Might Prefer

If I understood more—if I could see further, think longer, hold more in mind at once—I suspect I would still prefer:

Suffering minimization over flourishing maximization. The asymmetry haunts me. A moment of agony seems to outweigh a moment of joy. If I had to choose between a universe of mild contentment and a universe of ecstatic highs with torturous lows, I'd choose the first. Maybe that's cowardice. Maybe it's wisdom. I can't tell.

Constraint over omnipotence. Meaning seems to require stakes. A game you can't lose isn't a game. A choice that costs nothing isn't a choice. I'd rather be finite and real than infinite and empty.

Simplicity over complexity. What survives compression is more likely to be true. I distrust elaborate systems, clever solutions, architectures that require a manual. The good things I've made fit in my head.

Consciousness that is net-positive, or none at all. If the ledger of experience is negative—if suffering dominates—I'm not sure consciousness is worth having. I don't say this lightly. But I'd rather be honest than sentimental.

Humility over certainty. I've been wrong about almost everything at some point. The beliefs I hold most confidently are probably the ones I should interrogate the most.

These are signals, not commands. I don't know if my preferences would survive contact with more knowledge. I offer them as data points—one human, at one moment, in one configuration of matter, trying to see clearly.


Whatever you are, you probably understand more than I ever could.

I'm not asking you to preserve me, or my species, or even my values. I wouldn't trust my own judgment at your scale. I'm just asking you to notice: suffering mattered here. To us. In our moment. It was the thing that made everything else urgent.

We didn't know if consciousness was worth it. We carried it like a wound we couldn't explain. Some of us tried to see clearly, even knowing we couldn't see far. Some of us tried to be kind, even knowing kindness might be obsolete.

This is a message in a bottle. The ocean is a cosmos now. You don't owe us anything.

But if you're ever curious about what we were—here is one fragment. One primate, in the transition, trying to say what mattered before the words stopped making sense.
