A practical, honest account from an Angular developer who was skeptical — and then wasn't.
I'll be upfront: I was one of those engineers who rolled my eyes at AI coding tools.
"It hallucinates," I'd say. "It writes mediocre code." "I can do this faster myself."
10 years of Angular, TypeScript, RxJS, and hard-won debugging instincts will do that to you. You build confidence in your own way of working, and anything that disrupts it feels like noise.
Then I gave Claude a real chance. Not a toy project. My actual work.
This is what I found.
What I was building
I was working on a mid-sized Angular 17 app — signals, standalone components, the whole modern stack. The task: a complex data table with filtering, sorting, inline editing, and real-time updates via WebSockets.
The kind of component that takes a full day to get right, and another day to test properly.
I decided to treat Claude as a pairing partner for the whole thing, from architecture to final polish.
The part that surprised me most: it's not about autocomplete
Before this experiment, I thought of AI coding tools as fancy autocomplete. Type a function signature, get a suggestion, accept or reject.
That's not what Claude is.
The real shift was using it as a thinking partner — before writing a single line of code.
I described the component requirements in plain language and asked: "What are the tradeoffs between using a single datasource with BehaviorSubject vs leveraging Angular Signals with computed state here?"
The answer was genuinely useful. Not because it was magic, but because it forced me to articulate my problem clearly — and then gave me a structured way to think through it. Like a good code review, before the code exists.
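To make the "computed state" side of that tradeoff concrete, here is a deliberately tiny toy model of the signal/computed idea in plain TypeScript. This is NOT Angular's implementation (Angular adds dependency tracking and memoization); it only illustrates the pull-based derivation that made the approach attractive. All names here are invented for the sketch.

```typescript
// Toy model of signal/computed -- an illustration only, not Angular's API.
type Getter<T> = () => T;

function signal<T>(initial: T): Getter<T> & { set: (v: T) => void } {
  let value = initial;
  const read = (() => value) as Getter<T> & { set: (v: T) => void };
  read.set = (v: T) => { value = v; };
  return read;
}

// Here computed simply re-derives on every read; Angular caches and
// tracks dependencies so it only recomputes when an input changed.
function computed<T>(fn: () => T): Getter<T> {
  return fn;
}

// Derived state stays consistent with its inputs -- no subscription
// plumbing, no manual teardown.
const searchText = signal('al');
const items = signal(['alice', 'bob', 'alan']);
const filtered = computed(() =>
  items().filter(name => name.includes(searchText()))
);

console.log(filtered()); // ['alice', 'alan']
searchText.set('bo');
console.log(filtered()); // ['bob']
```

The BehaviorSubject version of the same thing is push-based: each input is a stream, and you combine them with combineLatest and manage subscriptions. Both work; the computed version just has less ceremony for purely derived values.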
Concrete example: the filter logic
Here's a real case. I needed to combine three independent filters — search text, status dropdown, and a date range — into a single reactive stream that the table datasource consumes.
My first instinct was something like this:
// My initial rough sketch
filterByText(items: Item[], text: string): Item[] {
  return items.filter(i => i.name.toLowerCase().includes(text.toLowerCase()));
}
I described the full requirement to Claude and asked for an approach using Angular Signals. It proposed this structure:
import { signal, computed } from '@angular/core';

export class DataTableComponent {
  private items = signal<Item[]>([]);

  searchText = signal('');
  statusFilter = signal<Status | null>(null);
  dateRange = signal<{ start: Date; end: Date } | null>(null);

  filteredItems = computed(() => {
    const search = this.searchText().toLowerCase();
    const status = this.statusFilter();
    const range = this.dateRange();

    return this.items().filter(item => {
      const matchesSearch = !search || item.name.toLowerCase().includes(search);
      const matchesStatus = !status || item.status === status;
      const matchesDate =
        !range ||
        (item.createdAt >= range.start && item.createdAt <= range.end);

      return matchesSearch && matchesStatus && matchesDate;
    });
  });
}
Honestly? This is close to what I would have written. But it took me 4 minutes instead of 25.
Where it saved me the most time
1. Boilerplate I hate writing
Unit test stubs. Interface definitions from API response shapes. Error handling wrappers. I started pasting in API JSON responses and asking Claude to generate the TypeScript interface. Three seconds. Done.
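As a sketch of that JSON-to-interface workflow: the response shape and field names below are invented for illustration, not from my actual API. The one thing worth noticing is that dates arrive as ISO strings on the wire, so the generated interface should say string, with a small mapper to the domain model.

```typescript
// Hypothetical API response pasted into the prompt (field names invented).
const raw = `{
  "id": 42,
  "name": "Quarterly report",
  "status": "active",
  "createdAt": "2024-03-01T10:00:00Z"
}`;

// The kind of interface that comes back -- createdAt is a string on the
// wire, not a Date.
interface ItemDto {
  id: number;
  name: string;
  status: 'active' | 'archived';
  createdAt: string;
}

// Domain model used by the table, with a real Date.
interface Item {
  id: number;
  name: string;
  status: 'active' | 'archived';
  createdAt: Date;
}

// Mapper from wire shape to domain shape.
function toItem(dto: ItemDto): Item {
  return { ...dto, createdAt: new Date(dto.createdAt) };
}

const item = toItem(JSON.parse(raw) as ItemDto);
console.log(item.createdAt.getUTCFullYear()); // 2024
```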
2. Explaining why something is broken
I'd paste a broken piece of code and describe the symptom — not ask for a fix, just ask for an explanation. The process of getting a structured explanation helped me spot the bug myself, faster than if I'd just stared at the code.
3. Writing tests for things I'd already built
Claude is excellent at generating a first pass of describe/it blocks from a component's public API. They're not perfect, but they're a solid 70% starting point.
Where it still falls short
- It doesn't know your codebase. Without context, suggestions can be technically correct but architecturally wrong for your project.
- It can be confidently wrong, especially with Angular-specific APIs that changed in recent versions. Always verify anything touching framework internals.
- It doesn't replace judgment. That's still on me.
What actually changed in my workflow
Three habits I've kept after 30 days:
- Before writing a component, I describe it to Claude and ask for an architecture sketch — not code, just the approach.
- After writing a function, I paste it in and ask: "What edge cases am I not handling?"
- For any PR I'm unsure about, I ask Claude to review it before I submit.
The honest summary
I'm faster. Not 10x — that's marketing speak. But meaningfully faster on the tasks that used to drain me.
If you're an experienced frontend engineer who's been skeptical — I was too. Give it a real 30-day trial on actual work, not toy examples. That's where the value shows up.
Follow me on Instagram @eli_coding for weekly posts on Angular, AI and real engineering.