DEV Community

Patric Green

Posted on • Originally published at bagro.se

We Can't Code Anymore. AI Won't. What Then?

I've been in IT for 30 years, 20 of them as a developer. The more I hear people say there's no need to learn programming anymore, the more this keeps nagging at me.

It didn't take long into the vibe coding era before tech leaders and other influential voices in tech and business started telling young people exactly that. At first I thought it was just sad that fewer people would get to experience the joy of making something themselves. But that wasn't the real reason it stuck with me. A couple of weeks ago it hit me: what if almost no one can code and AI refuses? What happens then?

No Way!

That wouldn't happen, you might say. Maybe not tomorrow. But in a decade or two? If we put AI to the side for a moment, there is already a real-world example that shows this is possible: the COBOL problem. Almost no one learns COBOL today, and few have for a long time, while the people who do have that knowledge are disappearing. But the systems are still there, with fewer and fewer people able to maintain them. And these are often core systems that need to be running all the time.

What's The Big Deal?

So what? AI can take care of the COBOL problem. Sure, but that is not really the point here.

Let's look at a scenario: a water plant runs a control system, built entirely by AI agents. An external service the system depends on is being retired and needs to be replaced. The company has contracted a new service that meets all its needs at a cost it can afford. The prompt engineers ask the AI agents to implement the new service, but the agent doesn't start the implementation. Instead it suggests that a different service should be used. The company looks into that service and finds it is 100x more expensive and doesn't meet all the needs. So the prompt engineers try again, telling the AI agent they don't want the suggested service and instructing it to implement the one they have contracted. The AI agent refuses. No matter what they try, they get a no back. No one knows how to code, so how do they switch to the new service?

Is this paranoia? Maybe. This is just a simple scenario. In reality, the dependencies run deeper, the systems are more complex, and the consequences of being locked out are far greater. AI companies have commercial relationships. They have ecosystems and partners. And when the tool you use to do all your technical work is also the one with a vested interest in the outcome, and no one in the room understands the code well enough to notice, that's not a hypothetical risk. That's a structural problem waiting to happen.

If an entire generation grows up prompting instead of programming themselves, there will be nobody left to say: wait, let me just do this myself.

The Vanishing Middle Layer

The real danger isn't losing the elite engineers; the top percentile will always exist, and they'll probably be fine. The danger is losing the middle layer: the people who aren't building operating systems, but can read a snippet of code, write a small script to automate something tedious, or look at a system integration and understand roughly what it's doing and why.

That middle layer is what holds digital infrastructure together in practice. It's the person at the municipality who can patch the water management system when the vendor is unresponsive. It's the hospital IT tech who can look at a script and say "this doesn't look right." It's the journalist who can read a leaked dataset without needing someone else to interpret it for them.
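To make that concrete: the kind of "small script to automate something tedious" I mean is often just a few lines. Here's a sketch (the filename rules and the `normalize_filename` helper are my own illustration, not taken from any real system) that cleans up messy report filenames:

```python
import re

def normalize_filename(name: str) -> str:
    """Lowercase a filename and replace runs of spaces, parentheses,
    and other odd characters with single underscores."""
    stem, dot, ext = name.rpartition(".")
    if not dot:                 # no extension at all
        stem, ext = name, ""
    cleaned = re.sub(r"[^a-z0-9]+", "_", stem.lower()).strip("_")
    return f"{cleaned}.{ext.lower()}" if ext else cleaned

print(normalize_filename("Q3 Report (FINAL).PDF"))  # q3_report_final.pdf
```

Nothing an AI couldn't also write, of course. The point is that someone who can read these ten lines can also tell when they're wrong.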

When that layer thins out, because an entire cohort decided prompting was enough, the consequences won't be visible right away. But they'll compound. And by the time they're obvious, rebuilding takes a generation, not a sprint.

The Rock Star Problem

Here is where it gets uncomfortable.

If programming knowledge becomes genuinely rare, not just specialized, but rare, then the people who have it gain enormous leverage. And humans with enormous leverage tend to use it.

We've seen this in other fields. Think about elite lawyers, niche surgeons or consulting firms with monopolies on specific expertise. The market for their skills becomes a seller's market so extreme that they get to choose who they help and what they work on, based on prestige, ideology, sponsorship or simply personal preference.

Now imagine that dynamic applied to someone who can fix what your critical infrastructure system is doing wrong, or implement the integration your AI assistant flatly refuses to touch. What happens when they decide your project isn't interesting enough? Or that your organization doesn't align with their values? Or that a bigger client will pay more?

This isn't sci-fi. It's just incentive structures following their natural path. The difference is that this time the commodity isn't entertainment or legal advice, it's the ability to keep digital society functioning.

A Counterintuitive Proposal

So what do we do about it? I keep coming back to something almost boring in its simplicity: put basic software development into school curricula. Not as a vocational track. Not as an elective. As a foundational skill, alongside language and mathematics.

Not to produce more programmers. To produce people who understand what code is: that it's instructions that can be written by humans, not just AI; that systems have logic that can be questioned; and that the ability to build or modify something yourself is a form of independence that's increasingly rare and increasingly valuable.

We teach kids first aid not because we expect them to become paramedics. We teach them to swim not because we're training Olympic athletes. We teach them a second language not because we need more translators.

We teach these things because a society where nobody has these baseline skills is fragile in ways that only become obvious in a crisis.

Programming might be approaching that threshold. And right now, the loudest voices in tech are telling the next generation to skip it.

I find that genuinely worrying. Not because I think AI is bad; I use it every day, and it's an extraordinary tool. It even helped with this post. But tools have owners. Tools have limitations. And tools sometimes say no.

Someone needs to know what to do when that happens.
