Last week, I opened a blank file and froze.
Not because I didn’t know how to solve the problem — but because my hands automatically reached for the AI sidebar.
Somewhere along the way, “thinking” quietly turned into “prompting”, and I didn’t notice when the switch happened.
I’ve been writing code professionally for years. I’ve survived legacy Angular, zoneless migrations, and monorepo chaos. I mentor juniors, I review architecture, I’ve shipped features in codebases where a single bug could cost real money. But today, I catch myself outsourcing not just syntax or boilerplate — I’m outsourcing the actual thinking. Copilot, ChatGPT, even Stack Overflow rewrites: they’re no longer tools I use. They’ve become the place where my thinking starts. And that’s the part that bothers me.
This isn’t an “AI is evil” post. It’s a confession from someone who’s supposed to be the adult in the room.
How I Slid From “Power User” to Prompt Router
The slide didn’t start with anything dramatic.
At first, AI was just a Component Detox for my busy brain.
- “Generate unit tests for this 300‑line component.”
- “Refactor this RxJS pipeline, but keep the behavior.”
- “Give me a quick summary of the Angular 19 migration guide.”
It felt efficient. I was still doing the real work — just delegating the boring parts.
Then I started asking for design sketches:
- “Propose a Signals‑based state structure for this feature.”
- “Draft an Nx project structure for this B2B SaaS.”
- “Give me pros and cons of option A vs B.”
The model would spit out three options, neatly explained. I’d skim, pick the one that matched my gut, and move on. It felt like I was “considering trade‑offs”. In reality, I was outsourcing that part too.
The final step was subtle: I stopped starting with the problem.
I started with the prompt.
Instead of opening a notebook, drawing some boxes, and mapping dependencies, I opened a chat window. Instead of asking myself “What’s the real bottleneck here?”, I asked “How do I phrase this so the model gives me something usable?”
I didn’t become a better architect.
I became a better prompt router.
The Skills That Quietly Atrophy
Over‑reliance on AI doesn’t kill your skills in one big refactor.
It erodes them like a silent memory leak.
Here’s what I’ve noticed in myself.
Problem framing
When a ticket is fuzzy, I used to spend 20–30 minutes wrestling with requirements. Business constraints, edge cases, failure modes — mapping the real problem was half the job.
Now? It’s tempting to throw a vague prompt at the model and let it hand me a “structured” breakdown.
The muscle that sits with ambiguity, that asks “What’s actually being asked here?”, starts to atrophy.
Taste and judgment
When Copilot suggests three completions, it feels like you’ve seen alternatives. You haven’t. You’ve just seen three variations from the same model.
When ChatGPT gives you three architecture sketches, it feels like you’ve “evaluated options”. You haven’t. You’ve just picked the one that sounds right.
If you stop forming your own strong opinions about trade‑offs, your “senior” title is just a label on top of autocomplete.
Mental models
A good frontend architect carries an internal map of the app:
- where the state lives,
- where it leaks,
- which components are on a Bundle Diet,
- which features are one change detection cycle away from meltdown.
If every time you need that map, you ask a model “Explain this codebase to me”, you never build the long‑term memory. You treat your own brain like a cold cache.
Patience for deep work
AI rewards quick questions and quick answers.
- “Write a quick summary.”
- “Draft an email.”
- “Give me 5 bullet points.”
The problem is: the hardest problems in software don’t yield to quick questions. They require staring at an ugly Nx dependency graph for an hour. They require walking away, coming back, and seeing the bug you missed.
If every uncomfortable problem is a trigger to “just prompt it”, your tolerance for real deep work drops. You become optimized for throughput, not for architecture.
Why This Is Worse When You’re Supposed to Be “Senior”
Juniors get paid to type. Seniors get paid to think.
As a senior dev / frontend architect, I’m not paid for lines of code. I’m paid for:
- Framing problems correctly.
- Saying “no” to bad ideas.
- Choosing the 3 sprints that actually move the needle.
- Turning a legacy Angular monster into a maintainable, signal‑based system without taking the app down.
Those are judgment problems, not typing problems.
If I outsource the very parts that make my judgment valuable — problem framing, trade‑off analysis, long‑term mental models — what’s left? A “highly efficient AI operator” who can ship tickets quickly, as long as the model is online.
That’s not a senior. That’s a developer wrapped around a tool.
There’s another layer: juniors copy what seniors do.
If a junior watches me open Copilot for every second line of code and sees me feed vague prompts into ChatGPT for every design decision, they’ll learn one thing:
Being senior means you know which magic prompt to type.
They won’t see the draft diagrams, the dead‑end ideas, the notebooks full of failed state machines. They’ll see the last 10% — the polished answer that “the AI helped with”.
That’s not mentorship. That’s demo theater.
The Day I Tried to Think From Scratch
The moment this really hit me wasn’t in a fancy migration or a client workshop.
It was a small feature.
I was pairing with another dev, and we decided — just for fun — to solve it without AI. No Copilot, no chat, no “quick prompt just to check something”.
Simple enough task: refactor a clumsy, event‑driven flow into a clean, signal‑based API. Something I’ve done dozens of times.
We spun up a fresh branch. I opened the file.
And my first instinct wasn’t “What’s the shape of this state?”.
It wasn’t “Where does this event really belong?”.
It was “How do I phrase this so the model suggests a decent starting point?”.
There was a full second where my hands hovered over the keyboard, waiting for a prompt box that wasn’t there.
The feeling was… unpleasant.
Not “I’ve forgotten how to code” unpleasant.
More like “I’ve let an important muscle degrade while pretending I was getting stronger”.
I could still do the work. I sketched the new state, refactored the component, sliced some logic out, put a couple of streams on a Bundle Diet. It was fine.
But the friction was higher than it should have been for someone who sells “surgical Angular interventions” for a living.
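Since I can't paste the client's code, here's a minimal sketch of the shape of that refactor — a hand-rolled signal in plain TypeScript rather than Angular's real `signal`/`computed` API, with a hypothetical search feature standing in for the actual one. The point isn't the API; it's that after the refactor, "where does the state live?" is answered by the code itself instead of by chasing event handlers.

```typescript
// A hand-rolled stand-in for a signal — NOT Angular's actual API,
// just enough to show the before/after shape of the refactor.
type Listener = () => void;

function createSignal<T>(initial: T) {
  let value = initial;
  const listeners = new Set<Listener>();
  return {
    get: () => value,
    set: (next: T) => {
      value = next;
      listeners.forEach((l) => l()); // notify derived state
    },
    subscribe: (l: Listener) => listeners.add(l),
  };
}

// Before: an event-driven flow — a component fires an event, a handler
// somewhere else mutates shared state, and nobody owns the source of truth.
// After: one writable signal owns the state, and results are derived from it.
const query = createSignal("");
const results = createSignal<string[]>([]);

// Hypothetical data for the example.
const catalog = ["signals", "streams", "standalone components"];

// Derived state: recompute results whenever the query changes.
query.subscribe(() => {
  results.set(catalog.filter((item) => item.includes(query.get())));
});

query.set("si");
console.log(results.get()); // → ["signals"]
```

In the real Angular version, `createSignal` becomes `signal()` and the subscription becomes a `computed()` — but the design question we had to answer on paper first was the same: what is the single writable source of truth, and what is merely derived from it?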
That’s when the question landed:
Am I still using AI as a power tool, or am I using it as a thinking crutch?
My Unpopular Opinion: Senior Devs Should Use AI Less, Not More
Here’s the part that will annoy some people:
If you’re a senior dev, you should deliberately use AI less than the average engineer — not more.
Not because AI is bad.
Because your job is to keep your thinking muscles in production‑ready shape.
Juniors need the scaffolding. They’re still building basic patterns, still shipping their first real features. Give them Copilot. Give them prompt templates. Let them move faster while they learn.
But if you’re the person:
- leading architecture decisions,
- conducting interviews,
- reviewing critical PRs,
- deciding whether to go zoneless this quarter or next —
you can’t afford to offload the parts of your brain that do that heavy lifting.
Here’s the rule I’m experimenting with:
- No prompts for the first 20 minutes of any non‑trivial problem. Whiteboard, notebook, or plain text file only.
- Write my own solution or design sketch first, no matter how rough.
- Only then ask AI to challenge it, optimize it, or poke holes in it.
- At least one “no‑AI” task per week — a refactor, a design, or a small feature built like it’s 2015.
In other words: I want my brain to be the primary engine, and the model to be the turbocharger — not the other way around.
“But AI Makes Me Faster” (And Other Objections)
Let me pre‑empt some of the obvious pushback.
“Speed is all that matters.”
Speed matters when you’re shipping clones.
Speed matters when you’re burning through backlog.
But speed without judgment is just shipping bad decisions faster. You can modernize an Angular app into a worse mess at record velocity.
“You’re just nostalgic. Tools have always changed how we think.”
Yes, they have. Better IDEs changed how we navigate code. Git changed how we collaborate. Nx changed how we structure repos.
The difference is: those tools still required us to frame the problem ourselves.
Git doesn’t tell you if your feature should exist.
Nx doesn’t design your domain.
Your IDE doesn’t invent business requirements.
Generative tools can produce full answers, explanations, and designs — all without you forming a clear internal opinion first. That’s new. And that’s exactly why you need guardrails.
“If I don’t use AI aggressively, I’ll fall behind.”
You’ll fall behind if your only edge is “I know the right prompts”.
The people who will win long‑term are the ones who can think with the model, not through it:
- They bring strong domain models and mental maps.
- They use AI to stress‑test designs, not to produce the first draft of their thinking.
- They still know how to debug a broken app when the model decides to hallucinate.
If your core skill is “being slightly faster than the next person at prompt‑chaining”, that’s not a moat. That’s a race to the bottom.
I Don’t Want to Be a Prompt‑Only Senior
I fix Angular apps that generalists break.
I hunt down leaks, kill 500‑line components, and rip legacy patterns out of production codebases.
But none of that matters if, when you take away my AI, I hesitate in front of a blank file.
I don’t want to be the senior dev who can only think in prompts.
I want AI to be the power tool in my hands — not the place where my thinking lives.
So I’m rebuilding the habit of thinking first, prompting later.
What about you?
When was the last time you solved a hard problem without opening a chat window?
Karol Modelski is a senior Angular developer and frontend architect rescuing legacy B2B SaaS frontends.
If your Angular app is slowing your team down, start here: https://www.karol-modelski.scale-sail.io/