Glover

AI Writes the Code Now. So What Are You?

The hardest part of this transition isn't learning new tools. It's letting go of the identity you built around writing code, and figuring out what replaces it.

Last week, Alexey Grigorev, the founder of DataTalks.Club (a platform that teaches data engineering to over 100,000 students), let Claude Code run a Terraform command. Within seconds, it wiped out his entire production infrastructure. The course platform, 2.5 years of student submissions, homework, projects, leaderboard entries, every automated database snapshot he'd counted on as backups. All of it, gone.

Here's how it happened. Grigorev was migrating a side project to AWS and wanted it to share infrastructure with DataTalks.Club to save a few dollars a month. Claude actually advised against this and suggested keeping the setups separate. He overrode the recommendation. Then he ran Terraform from a new computer without the state file, the critical record that tells Terraform what infrastructure already exists. Without it, Terraform thought nothing existed, so applying the plan created duplicate resources. When the state file was finally uploaded, Claude did what Terraform's logic dictated: it ran terraform destroy to bring everything back into alignment. Since the state file described both sites, everything was obliterated.
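The missing safeguard in that sequence is remote state. As a minimal sketch (the bucket name, key, and region below are hypothetical placeholders, not Grigorev's actual setup), a Terraform S3 backend block keeps the state file off any single laptop, so a fresh checkout on a new computer sees the same view of existing infrastructure:

```hcl
# Store Terraform state in S3 instead of on a local machine. Any
# machine running this configuration then reads the same state file,
# so Terraform knows which resources already exist.
# Bucket, key, and region are illustrative placeholders.
terraform {
  backend "s3" {
    bucket  = "example-terraform-state"
    key     = "prod/terraform.tfstate"
    region  = "us-east-1"
    encrypt = true
  }
}
```

This is, notably, one of the exact fixes Grigorev adopted after the incident.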

It took 24 hours and an emergency AWS Business Support upgrade to recover the data. Nearly two million rows, restored from a snapshot that wasn't even visible in the AWS console.

In his post-mortem, Grigorev was remarkably honest. He didn't blame the AI. He wrote: "I over-relied on the AI agent to run Terraform commands." His fixes? Enable deletion protection, move state files to S3, and most importantly, manually review every plan before executing destructive actions.

I keep thinking about this story. Not because it's about a careless developer (Grigorev is clearly not that), but because it captures exactly where we are right now. An experienced engineer, using a powerful AI tool, making reasonable decisions at each step, and still watching 2.5 years of work disappear. The AI did exactly what it was told. The problem was that nobody was doing the thinking about what it should have been told.

That gap between what AI can execute and what a human needs to judge? That's the entire story of the developer career transition we're living through.


The Numbers Behind the Anxiety

If you've been writing code for any amount of time, you've felt it. Every week brings a new tool (Cursor, Copilot, Claude Code, Devin, v0) and with it, a fresh wave of "developers are done" takes on Twitter.

The adoption numbers are staggering. GitHub Copilot now has over 20 million users. Among active Copilot users, roughly 46% of new code is AI-generated. According to the 2025 Stack Overflow Developer Survey, 84% of developers now use or plan to use AI tools in their workflow, up from 76% the year before. Over half of professional developers use AI tools daily.

But here's the number everyone overlooks: in that same survey, positive sentiment toward AI tools dropped from over 70% to just 60%. A full 46% of developers actively distrust AI output. The top frustration, cited by 66% of respondents? AI solutions that are "almost right, but not quite." Close enough to be tempting, wrong enough to be dangerous.

Sound familiar? Grigorev's Terraform plan looked reasonable at each step. The AI wasn't hallucinating. It was following a coherent logic chain. The problem was contextual, the kind of "almost right" that only a human who understood the full picture could catch.

Your career isn't ending. But it is changing shape. And the developers who recognize that shift early will have a massive advantage over those who either panic or pretend nothing's different.


The Honest Assessment: What AI Actually Replaced

Let's get the uncomfortable part out of the way.

AI coding tools have reduced the need for certain kinds of work. If your entire value proposition is "I can write a CRUD endpoint" or "I can convert this Figma design into pixel-perfect CSS," you're now competing directly with tools that do this faster and for a fraction of the cost. As one engineer put it: "Why hire a junior for $90K when GitHub Copilot costs $10 a month?"

But what the doom commentary misses is that those tasks were already being commoditized. Frameworks, low-code platforms, and template ecosystems had been compressing the value of raw code output for years. AI just hit the accelerator on an existing trend.

What AI has not replaced, and likely won't for a long time, is the messy, contextual, deeply human work of building software that actually solves problems. Understanding what a non-technical stakeholder actually needs (not what they say they need). Making architectural decisions that account for team capacity, company runway, and the debt you'll inherit. Debugging production systems where the issue spans three services, a race condition, and a misconfigured environment variable. Deciding what not to build, or in Grigorev's case, deciding that Claude's advice to keep infrastructure separate was worth listening to.

That Stack Overflow survey asked developers where they'd still want human help even in a future where AI handles most coding. The top answer, at 75%, was: "When I don't trust AI's answers." The human developer isn't going away. They're becoming the final quality gate.


If You're a Junior Developer, Read This

I'm not going to sugarcoat it. The entry-level landscape is brutal right now.

Entry-level hiring at the 15 largest tech firms fell 25% between 2023 and 2024. Salesforce announced zero engineering hires for 2025, citing AI agents. CS graduates now face a 6.1% unemployment rate, nearly double the national average. A Harvard study tracking 62 million workers across 285,000 U.S. firms found that junior employment at AI-adopting companies dropped 9 to 10% within six quarters of AI implementation. The decline wasn't from layoffs. Companies simply stopped posting junior roles.

This is real, and it's worth being honest about. The traditional path of "learn to code, land a junior role, learn on the job, move up" has fractured.

But here's the counter-narrative nobody's platforming enough: every senior engineer was once a junior. If companies stop hiring juniors today, they're creating a senior talent crisis in five to seven years. AWS CEO Matt Garman called the idea of replacing junior developers with AI "one of the dumbest things I've ever heard." The pipeline has to be rebuilt. The question is what "junior" looks like now.

If you're early in your career, the bar has risen. The junior of 2026 needs something closer to the system-design fluency of a mid-level engineer from 2020. But here's how you meet that bar:

Ship real things, not tutorials. Deploy end-to-end applications. React frontend, Node backend, a database, CI/CD pipeline. A to-do list app means nothing when AI can generate one in 60 seconds. An app that integrates an LLM to solve a real problem, handles edge cases, and runs in production? That proves something.

Get absurdly good at reviewing AI output. The developers getting hired right now aren't the ones who generate the most code with AI. They're the ones who can look at AI-generated code and immediately spot what's wrong. Think of yourself as a code auditor, not a code writer. If Grigorev's story teaches us anything, it's that the ability to pause, read the plan, and say "wait, this doesn't look right" is the skill that saves entire platforms.

Build in public and contribute to open source. When the market is flooded with applicants, a visible track record of shipped work and meaningful contributions is worth more than a polished resume. Three to five real pull requests on established projects will set you apart.

Target industries that are actually hiring. Consumer tech is contracting, but "boring" sectors are aggressive buyers. AI-related job postings in insurance jumped 74% in 2025, with similar surges in finance, logistics, and healthcare. These industries don't need someone to build pretty landing pages. They need engineers who can automate internal workflows and build the infrastructure that controls them.


The New Developer Skill Stack

Here's what developers who are thriving right now have deliberately built up.

1. System Thinking Over Syntax Fluency

The ability to hold a complex system in your head, to understand how a change in the payment service cascades through the event pipeline to the notification layer, is becoming more valuable, not less. AI can generate any individual component. It cannot reason reliably about the emergent behavior of interconnected systems.

This is exactly what made Grigorev's incident so instructive. Claude Code executed each individual command correctly. The Terraform syntax was valid. The AWS CLI calls worked. But the system-level understanding, that this state file described production infrastructure for a live platform with 100,000+ users, that was the human's job. And at the critical moment, the human had delegated too much of that thinking to the machine.

U.S. Bureau of Labor Statistics data tells the same story at a macro level: overall programmer employment fell 27.5% between 2023 and 2025, but employment for software developers, a more design- and architecture-oriented role, fell only 0.3%. The market isn't punishing people who build systems. It's punishing people who only write code.

If you've been meaning to go deeper into distributed systems, event-driven architecture, or observability, now is the time. These are force-multiplier skills.

2. AI Fluency as a Real Development Skill

I know "prompt engineering" sounds like a buzzword. But the productivity gap between developers who use AI tools well and those who don't is enormous, and measurable. Research involving 4,800 developers showed those using Copilot completed tasks 55% faster, with pull request turnaround dropping from 9.6 days to 2.4 days.

The gap isn't magic. It's discipline. Break problems into well-scoped, testable chunks before involving AI. Provide sufficient context upfront: project constraints, coding conventions, known edge cases. Review AI output with the same rigor you'd apply to a junior developer's PR. And know when to stop prompting and just write the code yourself.

That last point matters more than people admit. In the Stack Overflow survey, 45% of developers said debugging AI-generated code takes longer than writing it themselves. The skill isn't "use AI for everything." It's knowing exactly when AI accelerates you and when it slows you down.

3. Taste and Product Sense

This one surprises people, but it might be the most important shift. When the cost of producing code approaches zero, the bottleneck moves to deciding what code should exist. Developers with strong product sense, the ones who can look at a feature request and say "this won't move the metric we care about, here's what will," become disproportionately valuable.

Build this by getting closer to your users. Read support tickets. Sit in on sales calls. Look at your analytics dashboards. The developer who understands the business will always out-earn the one who only understands the framework.

4. The "Glue Work" That AI Can't Do

Every successful software project runs on a substrate of almost-invisible work. Facilitating technical decisions across teams, writing clear RFC documents, mentoring junior developers, translating between engineering and product language, building consensus around tradeoffs.

This work has historically been undervalued. In an AI-augmented world, it becomes the differentiator. And there's a growing concern that as companies cut juniors, seniors are losing the ability to delegate lower-risk tasks entirely, creating a "delegation vacuum" where they shoulder both the high-level architecture and the grunt work with no pressure valve. If you're the person who helps a team function, lean in hard.


What Not to Do

Before the playbook, a few traps I keep seeing developers fall into:

Don't let AI touch production without guardrails. This is the Grigorev lesson, distilled. He's now requiring manual review of every Terraform plan, enabling deletion protection on all critical resources, and storing state files in S3 instead of locally. These aren't complex changes. They're the kind of boring, five-minute safeguards that separate professionals from people who move fast and break things. AI can draft the infrastructure code. A human should hold the keys to destroy.
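As a sketch of what those guardrails can look like in Terraform itself (the resource type and names here are illustrative, not taken from Grigorev's configuration):

```hcl
# Two layers of deletion protection on a critical resource.
# Terraform refuses to plan a destroy for anything carrying
# prevent_destroy, failing the run instead of deleting it;
# deletion_protection adds the same guard on the AWS side.
# Names are placeholders; other required arguments omitted.
resource "aws_db_instance" "platform" {
  identifier          = "example-production-db"
  deletion_protection = true   # AWS-side guard

  lifecycle {
    prevent_destroy = true     # Terraform-side guard
  }
}
```

Pair this with a reviewed, saved plan (terraform plan -out=plan.tfplan, read by a human, then terraform apply plan.tfplan) and the decision to destroy anything stays in human hands.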

Don't chase every new AI tool. A new coding assistant drops every week. If you're constantly switching between Cursor, Copilot, Claude Code, Windsurf, and whatever launched yesterday, you're optimizing for novelty instead of depth. Pick one or two tools. Master them. Depth beats breadth.

Don't over-rely on AI without reviewing output. This is the most common mistake I see. AI-generated code often looks correct, passes a quick scan, and then fails in production on edge cases the model couldn't anticipate. 72% of developers say "vibe coding," meaning generating software from prompts without deeply understanding what's produced, is not part of their professional work. There's a reason for that.

Don't panic-pivot into management because you think engineering is dead. If you love building, stay building. The demand for engineers who can think at the system level is actually increasing. Switching to management out of fear rather than genuine interest is a recipe for misery.

Don't ignore the fundamentals. When AI handles the surface-level implementation, your understanding of data structures, networking, concurrency, and system design becomes more important, not less. AI is a layer on top of fundamentals, not a replacement for them. As Satya Nadella put it: "Having the ability to think computationally matters a lot."


A Practical Playbook for the Next 12 Months

Theory is nice. Here's what I'd actually do.

Month 1 to 3: Integrate AI deeply into your workflow. Don't just dabble. Use an AI coding tool as your default pair programmer for a full month. Build muscle memory. The goal isn't to use AI for everything. It's to develop an instinct for when to reach for it. What "done" looks like: You can estimate, before starting a task, whether AI will speed it up or slow it down, and you're right 80% of the time.

Month 4 to 6: Go deep in one area AI struggles with. Pick a domain that requires judgment, context, and systems-level thinking. Performance engineering. Security architecture. Data modeling for complex domains. Infrastructure design. Become genuinely excellent at one of these. What "done" looks like: You can explain to a non-technical PM why your system handles 10x traffic spikes without falling over, and they believe you.

Month 7 to 9: Build something end-to-end with AI as your co-pilot. Ship a side project, contribute to open source, or take on an ambitious work project. Document the process. Not just the wins, but the moments where AI led you astray. Where did you override it? Where did you wish you had? Grigorev's post-mortem is a masterclass in this kind of reflective practice. What "done" looks like: You have a shipped project and a written reflection (blog post, internal doc, even a detailed README) that someone else could learn from.

Month 10 to 12: Teach what you've learned. Write about your experience. Give a talk at a meetup. Mentor someone more junior. Teaching forces you to crystallize your understanding, and it positions you publicly as someone navigating this transition with intention. What "done" looks like: You've published at least one piece of content or given one talk that generated real feedback or conversation.


The Mindset Shift That Matters Most

I want to close with something less tactical and more fundamental.

The developers struggling most with this transition aren't the ones with weaker technical skills. They're the ones whose professional identity is built on being the person who writes the code. When AI can write the code, that identity feels threatened.

The developers who are thriving? They see themselves as problem solvers who happen to use code as their primary tool. For them, AI is just another tool in the kit, like when they adopted a new framework or learned a new language. It changes how they work, not who they are.

Grigorev's story lands differently when you see it through this lens. He didn't write a post about how AI is dangerous and should be avoided. He wrote a post about what he could do differently. What guardrails to add, what processes to change, what judgment calls to stop delegating. His identity as an engineer didn't break. It evolved. He's still building. He's just building with clearer boundaries now.

That's the model.

The software industry has always been defined by waves of abstraction. Assembly to C. C to Python. Manual deployment to CI/CD. Every wave eliminated certain jobs and created others that were previously unimaginable. We're in another one of those waves.

But this one has a wrinkle. It's not just changing the tools. It's changing who gets to enter the profession and what the first few years of a career look like. That's worth taking seriously, both as individuals adapting our own paths and as an industry deciding whether to invest in the next generation or abandon them.

The developers who adapt won't just survive. They'll look back in five years and realize this was the inflection point that made their careers.


If this resonated, I'd love to hear how you're adapting your own workflow. Or if you're early career, what the job search actually looks like right now.
