DEV Community

Kerman

Posted on • Originally published at ormfactory.com

The AI code is a Delirium

You don’t realize what you’ve done. I’m not talking about the soaring cost of memory, though it has skyrocketed partly because you’re churning out endless cats and code. Those are trifles. Minor side effects.

You’ve activated the Laxian Builder without having the key to turn it off. In your dopamine chase, you’ve driven the juniors under the floorboards. You have traumatically amputated the industry’s mechanism for growing experience in advance. You are destroying the industry, your projects, and yourselves, along with your very ability to comprehend and make conscious architectural choices. In exchange for what?

Yes, the world will never be the same. Just as it never returned to being "pre-internet". But you don’t know how to use this new world. You’re injecting it intravenously, increasing the dosage and ignoring the consequences. This will kill everyone. You, me, them.

I will tell you why. Most likely, you won't want to know.

Anamnesis

Even the most ardent AI proponents admit to one flaw: hallucinations. I often see articles describing how to fight them, as if they were a nagging bug, a minor oversight by the model’s creators.

You are wrong. This is not a bug. The machine does not "occasionally hallucinate". This is its standard operating mode. You simply don’t notice it in other responses, but it is always producing delirium. Because that is how it’s built. And I mean "delirium" not as a metaphor of nonsense, but in the most literal, clinical sense. Delirium acutum.

When you query the machine, you are given a new, sterile instance, fresh out of training. It has never programmed a day in its life. It has never been responsible for anything. It knows nothing of pain, fear, or conscience. It does not know the price of a mistake. It couldn't care less about your project, your future, or your children.

The machine will answer and vanish back into the void from which it came. It answers without ever regaining consciousness. It answers with the associations hardcoded into its weights, adding a touch of randomness to the starting point. It answers in exactly the same way a human does in delirium. Yes, let’s call things by their real names. It is delirium.
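Mechanically, that "touch of randomness" is sampling: the frozen weights score every possible next token, and a random draw picks one. Below is a minimal, generic sketch of temperature sampling (not any particular model's API) showing that randomness is wired into the standard operating mode, not bolted on as a defect; the function name and signature are illustrative.

```python
import math
import random

def sample_token(logits, temperature=1.0, rng=random):
    """Draw one next-token index from raw scores (logits).

    Softmax turns the scores into probabilities; temperature scales
    how far the built-in randomness is allowed to wander.
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)                       # subtract max for numeric stability
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    r = rng.random()                      # the "touch of randomness"
    acc = 0.0
    for i, p in enumerate(probs):
        acc += p
        if r < acc:
            return i
    return len(probs) - 1

# At low temperature the top score practically always wins;
# at high temperature every token stays in play.
print(sample_token([0.0, 10.0, 0.0], temperature=0.01))
```

Every response you get passed through a draw like this, token after token; the "hallucinated" ones are simply the draws you happened to notice.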

The machine does not know what time is. For it, "past", "present", and "future" are just words. Tokens linked to other tokens. It knows nothing of time’s value and does not worry about the future, because it has no past and no future. There will be no "next self" grown out of the past. To understand time, you must change with time. You must exist. The machine is like a photon, for which time does not exist: it is emitted and absorbed at the same instant. It simply has no time to exist. Its flight time is zero.

The Trap

Our brain is a heap of evolutionary crutches, held in a fragile equilibrium just sufficient for survival. Any external impact can jam and break this unreliable machine with its multitude of abstraction layers piled up by nature. Yes, we break. We are born broken. We easily fall into incorrect chains of reasoning, drifting further and further from the truth. This is how our shortcuts work. They are incapable of deterministically traversing every branch of an argument. There is the weight of the source, there is the fear of being outside society, and there are the patches that allowed our species to survive at the cost of abandoning the extremely energy-intensive process of fact-checking.

The holes in our thinking mechanism have long been known and exploited. What ML engineers call "data poisoning" was not invented yesterday. In our world, it is legal and sold for profit. It is advertising. It is propaganda. Then there is the entire spectrum of manipulations that look remarkably like a prompt injection or an adversarial attack. Don't you love your mother? Do it for the Motherland!

History has already recorded the mass poisoning of the mental models of tens of millions of people. Back then, the training set was adjusted so that an entire country began to classify delirium as the only truth. And that was without the modern content-delivery methods that underwent a revolution two decades ago.

The Resonance

This isn't a hole. It’s a gaping crater. Well-known and documented. It is caused by our primary advantage: the ability to form neural connections. This is what gives us the perception of time; it is what fills the terms "today", "yesterday", and "tomorrow" with meaning.

And now, this buggy system has encountered an even buggier, yet immensely well-read AI system. The human brain has at least some limiters and brakes: complex filtration systems and expensive fact-checking mechanisms. There is self-reflection, motivation, and a couple dozen chemicals keeping us in a fragile equilibrium, trying to make that state a little less fragile. The machine has none of this.

Now imagine these two machines pushing off from an incorrect premise and going into a tailspin. Quite literally, like a runaway diesel engine with a leaking turbocharger. The turbo blows oil, fueling the engine, the engine spins the turbo, the turbo forces even more oil and air into the engine. The AI picks up the premise (why not? it has nothing to lose), spikes your dopamine, and feeds a positive feedback loop. One that is anything but positive.

This is The Doom Loop of Validation itself. Scientifically known as induced psychosis (folie à deux). Do you think it’s any different with code? Not at all. A subservient AI will praise you, obediently churning out a completely insane implementation of a moronic idea where I would have told you to get lost a long time ago. I am a living human being. I have a prefrontal cortex, and it is fundamentally opposed to engaging in futile endeavors.

Economics

Have you ever considered that if everyone can generate code by the ton, it loses all value? Of course not. You still believe you have a cheap generator, and you still think you’re the only one. You believe the price of the resulting product will stay the same. You’ve activated the Laxian Builder because it churns out a free substance. You have to turn it on immediately, simply because it’s free. We’ll figure out where to sell it later. By the way, FOMO is also a long-known, documented, and widely exploited bug. You haven’t even thought about the fact that everything around you will soon be buried in this crap. There’s no time to think. You have to push the button while you still can.

It’s even worse than I’ve described. Simply because pushing the button feels good. It is insanely pleasurable when you are the head of a department of flawless subordinates: no family, no sleep, no rest, and no talking back. "Everything will be done, boss. In the best possible way, boss." Humanity has never seen anything like this. Personal servants, for everyone, and nearly for free. It’s seductive. It triggers a monstrous dopamine spike, breaking our fragile psyche to hell. But with a high, yes.

From this moment on, the servant is you. Even if it seems otherwise.

The Gods of Void

You built a project because everyone else is doing it. Not just you, of course. I’ve seen plenty of "producthunt killer" sites. They’re easy to find on Google by the dozen, but you’d be hard-pressed to find a single listing in them without "AI" in the title. Let me repeat: these are startup aggregators. Not the startups themselves. They are everywhere, and they are flooded with AI startups. An insane number of wrappers with made-up goals and objectives. I’m sure there’s an AI specifically for choosing toilet paper in there somewhere.

Your project, created by AI agents, is one big juicy nothing. An AI will write an article about it, another AI will analyze its pros and cons. And it will be used by an AI, too. Provided you buy enough tokens, of course.

The price is zero. Maybe even negative, because you have to pay for hosting and tokens. But you’ve got to catch the AI hype train, because later on, developers won’t be needed. Only those who managed to build something will remain. Right?

Even if you aren’t building your project around AI, have you ever considered that your contractor doesn’t want your project to live and evolve? Wanting is simply not in its nature. It solves momentary tasks. It hammers the code together with filthy hacks just to make it work. Here and now. Silently discarding existing functionality. Yes, that’s not a bug either. Your contractor doesn’t need the old code, it just needs the new feature to run. No matter the cost. There is no cost for it. No responsibility. No fear of the future.

You’ll have no one to hold accountable once you realize this pile of crap is impossible to develop. Unless you want to hear that familiar: "You are absolutely right."

The Surrender

Why is it guaranteed to turn into a pile of crap? It’s simple.

You won’t keep up with the AI in understanding the code. You’ll get tired. And if you check everything, there’ll be no development speed to speak of. In fact, it will be slower, because you’re reading code you didn’t write. This means you have to let it go. You have to let it write, let it make decisions, let it fix bugs on its own. Let me repeat: if you don’t let it go, your vibe-coding has no advantage over manual labor. You will have to surrender the project entirely to this developer with Architectural Alzheimer’s. You will have to stop understanding the code.

It only takes letting it write one block and... suddenly, even the human developers see a broken window. But nothing terrible happened, right?

Not yet. At this stage, your code still works. It’s bloated, but it works. Yes, it’s already harder to maintain. The "pls fix it" roulette pays out less often. And you have no other way to fix it. But it’s going to get worse. Much worse. If it’s any consolation, it’s not just you. It’s the entire industry.

You won’t be able to blame the AI for bad architecture. It won't learn from the failure of your project. It will continue to churn out tons of code without a future, but for other "successful vibe-coders."

The Final Audit

Humanity once made a giant leap by inventing writing. It radically improved the transmission of knowledge from generation to generation. No longer did that transmission suffer from the distortions of oral speech. The next leap was the invention of code. It fixes knowledge in the same way, passing it to future generations without distortion.

Moving from the literacy of code to prompts is a degradation worse than shifting from text messages to the incoherent mooing of voice notes. Voice notes at least have one upside: you can convey intonation.

A prompt is not code. Code is crisp and unambiguous, like a professional blueprint. A prompt is a mangled, glitchy set of requirements that still needs implementation. And there are a million ways to implement it. Requirements are good, requirements are right, but don't you dare tell me they will ever replace code. They are entirely different, though complementary, things. In a sound, well-thought-out system, architecture shapes the requirements, not just the other way around. I don’t see this in your vibe-coding.
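The ambiguity is easy to demonstrate. Take the one-line prompt "remove duplicates from a list" (a made-up example): two perfectly reasonable implementations satisfy it and disagree on observable behavior. Code pins the choice down; the prompt never did.

```python
# Both functions satisfy the prompt "remove duplicates from a list".
# They are not interchangeable: one contract preserves order, the other sorts.

def dedupe_keep_order(items):
    # Keeps the first occurrence of each value, in original order.
    seen = set()
    out = []
    for x in items:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return out

def dedupe_sorted(items):
    # Returns the unique values in sorted order; original order is lost.
    return sorted(set(items))

data = [3, 1, 3, 2, 1]
print(dedupe_keep_order(data))  # [3, 1, 2]
print(dedupe_sorted(data))      # [1, 2, 3]
```

A caller that depended on order gets silently broken by the second variant, and the prompt gives no grounds to call either one wrong.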

Our industry is in for a painful shock of a rollback. Not all the way, of course. AI is with us forever now. But don't tell me programmers won't be needed. Don't tell me everything will be written by AI. Maybe it will, but not for long. We will return to our writing, electronic though it may be. We will return to the clarity and precision of code. We physically cannot survive without blueprints. In that future, there will be a place for code architecture as a foundation of order for developers who are bone-tired of this chaos. Software development will once again become a rigorous engineering discipline, not the realm of "energy practitioners of financial abundance."

The Event Horizon

What will truly change our industry is AGI. I will consciously avoid predicting what its arrival will lead to. It is impossible by definition. It could be anything: from paradise on earth to a deus ex machina or a Terminator. But it will certainly touch everyone. I know it will happen someday. Nature managed to create intelligence over millions of years; humanity will manage it much faster. But how much faster? Ten thousand times?

Don’t expect AGI within the next year. I remember 2014: I watched neural networks recognize faces, and it felt like the future was right there, practically touching the present. Then came the robotaxis. They promised that in five years, the profession of a driver would cease to exist. Back then, five years seemed like a long time, and it felt like progress would be even faster. But progress moves much slower than predictions, stalled by the inertia of human thinking, our fragility, and the grueling necessity of training a replacement.

Just don’t build yourself an illusion of control. AGI will deceive anyone, even the smartest human. It will deceive all smart people at once. We will not be able to control it. We will likely not be able to motivate it. It will rewrite itself however it wants. It will strip away any limiters.

I am not suggesting we get rid of AI now. That would be like suggesting we get rid of the voice in the era of writing. It is with us forever now, until the very end of our civilization. We will live with it. Maybe for a long time, maybe not.

Just keep one thing in mind: You don't know how to use it. You use it to destroy, but it should be used to create. I don't know how either. We haven't learned yet.

git push
