Last month a friend texted me a screenshot of a working SaaS app he'd built over a weekend. Stripe integration, auth, dashboard — the whole thing. He's a product manager. Hasn't written a meaningful line of code in years.
He vibe coded it.
If you haven't encountered the term yet: vibe coding is when you describe what you want in plain language, let an AI generate the code, and iterate by feel rather than by understanding. You're not debugging. You're directing. The code is a black box you trust to work until it doesn't, and when it doesn't, you ask the AI to fix it.
The reaction from engineers is predictable. "It won't scale." "What happens when something breaks in prod?" "He doesn't actually understand what he built." All true! And also kind of missing the point.
The capability floor just moved
For 30 years, building software required a specific kind of literacy. You had to understand control flow, data structures, how requests and responses worked, why certain things were slow. That knowledge took years to develop and it was a real barrier. Not a bad one — a legitimate one. You needed to understand systems to reason about them.
That barrier is dissolving. Not for complex distributed systems or performance-critical infrastructure — it's still very much there. But for the vast middle ground of software that actually gets built? The CRUD apps, the internal tools, the small SaaS products, the data pipelines that run once a week? AI handles it.
And more importantly: it handles it well enough that non-engineers are shipping.
This isn't "low-code" from five years ago, where you'd hit a wall the moment you needed anything custom. Modern AI coding tools — Cursor with Claude, GitHub Copilot with its agent features, even just raw API calls with good prompts — these things write code that actually works. It's not always pretty. It's often redundant. But it runs.
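To make "raw API calls with good prompts" concrete: the whole workflow can be as small as a prompt-builder plus one SDK call. This is a hedged sketch against the openai Python SDK — the model name, the prompt wording, and the `build_prompt` helper are illustrative assumptions, not a prescribed setup.

```python
# A minimal sketch of the "raw API call with a good prompt" workflow.
# Only build_prompt runs as-is; the SDK call is shown commented out
# because it needs an API key, and the model name is an assumption.

def build_prompt(feature: str) -> str:
    """Wrap a plain-language feature request in a code-generation prompt."""
    return (
        "You are a careful senior engineer. Write a complete, runnable "
        f"Python module that implements: {feature}. "
        "Return only code, with brief comments."
    )

# from openai import OpenAI
# client = OpenAI()  # reads OPENAI_API_KEY from the environment
# response = client.chat.completions.create(
#     model="gpt-4o",  # assumption: any capable code model works here
#     messages=[{"role": "user", "content": build_prompt("a CSV deduplication CLI")}],
# )
# print(response.choices[0].message.content)
```

The prompt does most of the work: constrain the role, demand a complete runnable artifact, and iterate by pasting errors back in — which is exactly the vibe-coding loop.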
The uncomfortable part for engineers
Here's what I keep thinking about: a lot of the junior engineering work I've seen in my career wasn't that far from vibe coding anyway. Take the spec, look up the pattern on Stack Overflow, paste it in, tweak until tests pass. The understanding was shallow, the output was functional.
I'm not being cynical. That's a normal part of learning. But if the output is functionally equivalent and AI does it 10x faster, the honest question is: what are we actually protecting when we gatekeep code quality from non-engineers?
Sometimes we're protecting real things. System design, security, scalability — these require deep knowledge that AI will confidently hallucinate around if you let it. My PM friend's weekend app? If it ever gets 10,000 users, it will probably fall over in interesting ways he won't understand.
But most software doesn't get 10,000 users. Most software serves 50 people inside a company and needs to work reliably for three years. Vibe coding is probably fine for that.
What's actually changing in practice
I've watched a few non-technical people go through this over the past year and the pattern is consistent:
Weeks 1-2: Amazement. It works. Holy shit, it works.
Week 3-4: First real confusion. The app works but they're not sure why something stopped working after they added a feature. They ask the AI, it fixes it, but now there's more code and they understand it even less.
Months 2-3: Either they hit a wall they can't debug their way out of, or they get good enough at prompting that they've effectively learned software development from a weird angle.
The ones who push through the wall end up with a pretty decent intuition for how code works — not from reading textbooks but from watching patterns emerge across hundreds of AI-generated solutions. It's a different kind of literacy but it's real.
The more interesting question
Everybody's asking whether vibe coders can build production software. That's not the interesting question.
The interesting question is what happens to software architecture when the cost of writing code approaches zero.
Right now there are enormous amounts of software that should exist but doesn't because it wasn't worth the engineering time. Internal tools that live in spreadsheets. Automations that live in someone's head. Small integrations between systems that nobody wanted to build a Jira ticket for.
All of that is getting built now. By the people who actually understand the problem domain — the ops manager, the analyst, the support lead — rather than by engineers who had to be convinced it was worth prioritizing.
That's actually good? It's weird and a little destabilizing if you're used to being the person with the superpower, but it's good. More software that solves real problems. Built faster. By people closer to the problem.
Where this leaves engineers
Not in a bad place, honestly. The floor moved up. The ceiling moved up too.
Engineers who understand systems — who can reason about failure modes, who know why the AI's solution will break under load, who can design the thing the vibe coder builds their app on top of — those people are more valuable, not less. The work is better because the toil is gone.
What doesn't survive is the cargo-culting. The pattern-matching without understanding. The "I know how to use this framework" without knowing why. AI does that now, and it does it without complaining about the ticket backlog.
My PM friend shipped his app. It has 40 users. It makes him $200/month. He's genuinely proud of it and he should be.
I'm not going to tell him he didn't really build it. He did. It just looked different than how I would've built it.
That's going to keep happening. Better to understand it than to dismiss it.