DEV Community

TheBitForge

Are we still engineers… or just really good prompt writers now❓

Last Tuesday I fixed a bug in about four minutes.

Not a small bug either. It was one of those authentication edge cases that only shows up when a user has a certain combination of OAuth scopes and a session that's technically valid but partially expired. The kind of thing that in a previous life would have cost me two hours, a lot of console.log statements, and at least one long stare out the window.
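To make the shape of that bug concrete, here's a hypothetical sketch. The names (`accessTokenExpiresAt`, `refreshExpiresAt`, `scopes`) and the logic are invented for illustration; they aren't the actual code or the actual fix.

```javascript
// Hypothetical session-check sketch. The buggy version only checks the
// access token's expiry, so a session whose refresh window has lapsed
// still counts as "valid" — the "technically valid but partially
// expired" case described above.
function canAccess(session, requiredScopes) {
  const now = Date.now();
  const tokenValid = session.accessTokenExpiresAt > now;
  const scopesOk = requiredScopes.every((s) => session.scopes.includes(s));
  return tokenValid && scopesOk;
}

// Fixed version: also reject sessions whose refresh window has expired.
function canAccessFixed(session, requiredScopes) {
  const now = Date.now();
  const tokenValid = session.accessTokenExpiresAt > now;
  const refreshValid = session.refreshExpiresAt > now;
  const scopesOk = requiredScopes.every((s) => session.scopes.includes(s));
  return tokenValid && refreshValid && scopesOk;
}
```

The nasty part of bugs like this is that the buggy check passes for almost every session, so it only surfaces for the narrow slice of users with the right scope combination and a lapsed refresh window.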

I described it to an AI assistant, pasted some context, and got back a fix that actually worked. I tested it, shipped it, and moved on.

Then I sat back and thought: what just happened?

Not in a bad way. Not in a grateful way either. Just... genuinely, what did I do there? Did I solve that problem? Or did I facilitate someone else solving it?


The way it used to feel

When I started writing code seriously, there was a certain texture to the work that I don't think I fully appreciated until it started changing.

You'd read through something — a library's source, a language spec, a Stack Overflow thread from 2013 — and slowly build up a mental model. You'd get it wrong the first time, sometimes the second. You'd trace execution line by line. Eventually, something would click.

That clicking feeling was the job. That was engineering.

It wasn't fast. But it left a residue. Every hard thing you figured out made the next hard thing slightly easier, because you were slowly building a map of how systems actually worked.

And yeah, most of us also Googled everything. We copied from Stack Overflow constantly. We used frameworks that abstracted away enormous complexity. Nobody pretended otherwise.

But there was still a gap you had to cross on your own. The search result gave you a direction. You still had to understand why.


What's different now

With AI, the gap is smaller. Sometimes it disappears entirely.

That's not a complaint. It's just true, and worth sitting with.

I can describe a problem in plain English and receive something close to a solution. I can paste error messages I don't fully understand and get explanations that are usually correct. I can ask for a working implementation of something I've never built before and get scaffolding that would have taken me a week to research from scratch.

This is genuinely useful. There's no version of this story where I pretend it isn't.

But it changes something about the feedback loop that used to build understanding. When the gap closes too easily, you don't always notice what you didn't learn.


The question I keep dodging

Here's the uncomfortable one: do I fully understand the code I just shipped?

Sometimes yes. Often, mostly. But not always entirely.

And the follow-up, which is worse: would I be able to build this without AI?

Some of it, sure. Parts of it, probably not as quickly. But some of it — if I'm honest — I'm not completely certain. There are pieces I've been trusting more than verifying lately.

That's a new feeling. And I'm not sure it's fine.


But let's be fair to ourselves

Every generation of developers has had this conversation about something.

When IDEs started autocompleting, someone worried we'd forget how to type. When ORMs arrived, someone worried we'd stop understanding SQL. When frameworks abstracted routing and state management, someone worried we'd stop understanding the web itself.

Some of those concerns turned out to be overblown. Some of them turned out to be partially right.

The truth is usually in the middle: tools do raise the floor, and they sometimes lower the ceiling for people who stop at the tool. Both things can be true.

So maybe the question isn't "are we losing skills" but "which skills, and does it matter?"


What I think engineering actually is

When I try to strip it down to what I value most in the developers I respect, it's not typing speed or memorization of syntax. It's something harder to name.

It's the ability to look at a system and understand its shape — where things can go wrong, why certain trade-offs were made, what will break under pressure.

It's being able to debug when the AI's suggestion doesn't work. That moment still requires something real. You have to know enough to recognize when you're being led in the wrong direction, to ask the right follow-up question, to look past the plausible answer to the actual problem.

It's taking responsibility. Not just deploying working code, but understanding what you deployed. Knowing what it does to the data. Knowing what happens if it fails.

That part hasn't been automated. Not really.


Prompting as a skill

Here's a frame that I find genuinely useful sometimes: prompting is just another interface.

We don't think of someone as less of an engineer because they're good at reading documentation. We don't think of someone as less of an engineer because they can quickly find what they need on GitHub or in a package registry. Those are skills too. Knowing where to look, how to evaluate what you find, when to trust it and when to dig deeper.

Prompting well is similar. It requires knowing enough about the problem to describe it precisely. It requires evaluating the output critically. It requires understanding when the answer is subtly wrong.

The people who get the most out of AI tools right now are, in my observation, people who already knew a lot. They use it to go faster, not to avoid understanding. The foundation still matters.


Where the concern is real

But I do think there's a legitimate risk that we're not talking about clearly enough.

If you're newer to this, and AI fills every gap before you have a chance to cross it yourself — you might end up with the outputs of understanding without the understanding itself. Code that works. A mental model that doesn't.

And that's fine until something breaks in a way the AI can't explain. Until you have to debug something novel. Until you have to make a decision without a scaffold.

The skills you don't build in the first years are hard to build later. Not impossible. But harder.

That worries me a little. Not for senior developers who've already built the map. For the people starting now, who might be building on shaky ground without knowing it.


What I'm trying to do about it

I don't have a clean answer. But I've been trying a few things.

When I use AI to solve something, I try to make sure I can explain the solution after. Not just deploy it — actually trace through it and understand what it's doing and why.

When it gives me something that works but I don't fully understand, I mark that as debt. Not technical debt. Understanding debt. And I try to pay it down before it compounds.

I still write some things from scratch when it would be faster not to. Not always. But sometimes, deliberately, to keep the manual memory alive.

I don't know if this is the right balance. I'm figuring it out in real time, same as everyone else.


The title question, honestly answered

Are we still engineers?

Yes. But the word is doing more work than it used to.

We're engineers who direct more than we construct. Who review more than we author. Who evaluate more than we discover.

Whether that's better or worse depends on what you value. It's faster. It's more productive in measurable ways. It might also be producing a generation of developers who are fluent with tools but not fluent with fundamentals — and that gap doesn't always show up until it really matters.

The craft is changing. That's not new. Every new tool changes the craft.

But I think the honest version of this conversation — the one worth having — isn't "AI good or bad." It's about what we're intentionally keeping, and what we're letting go of without noticing.


I'm genuinely curious where others land on this.

Not the theoretical version of the question — the personal one. Think about the last thing you shipped that had significant AI involvement. If that thing broke in production tonight, in a way you hadn't seen before, in a place you hadn't thought to look — would you know where to start?
