<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Don Angelillo</title>
    <description>The latest articles on DEV Community by Don Angelillo (@nativitymobile).</description>
    <link>https://dev.to/nativitymobile</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3787951%2F0cf2cbcc-bb9b-4126-9b83-18836c5dacf4.png</url>
      <title>DEV Community: Don Angelillo</title>
      <link>https://dev.to/nativitymobile</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/nativitymobile"/>
    <language>en</language>
    <item>
      <title>The Complete History of iOS Development (So Far)</title>
      <dc:creator>Don Angelillo</dc:creator>
      <pubDate>Mon, 23 Feb 2026 00:00:00 +0000</pubDate>
      <link>https://dev.to/nativitymobile/the-complete-history-of-ios-development-so-far-j8p</link>
      <guid>https://dev.to/nativitymobile/the-complete-history-of-ios-development-so-far-j8p</guid>
      <description>&lt;p&gt;I downloaded the first iPhone SDK the day it was available. March 6, 2008. There was no App Store yet - that wouldn't come until July. There was no documentation worth reading, no Stack Overflow answers, no tutorials, no community. I don't think I'd ever even heard of Objective-C before this. But none of that mattered. I just needed to be part of whatever this new &lt;em&gt;thing&lt;/em&gt; was going to become.&lt;/p&gt;

&lt;p&gt;What I didn't know - what none of us knew - was how many times Apple would tear up the foundation and make us rebuild. Not once. Not twice. Every few years, sometimes more often, Apple ships something that fundamentally changes how you build iOS apps. Sometimes it's a new language. Sometimes it's a new framework. Sometimes it's a design philosophy that renders every existing app obsolete overnight. And every single time, you adapt or you're done.&lt;/p&gt;

&lt;p&gt;I've adapted every single time. Here's what that actually felt like.&lt;/p&gt;

&lt;h2&gt;Retain, Release, and Pray&lt;/h2&gt;

&lt;p&gt;The original SDK was Objective-C with manual reference counting. If you've only ever written Swift, you cannot appreciate how much of your brain was consumed by memory management. Every object you created, you owned. Every object you owned, you had to release. Every object someone else owned that you wanted to keep, you had to retain. Get it wrong and your app leaked memory until it was killed. Get it wrong the other way and you'd access a deallocated object - a zombie - and crash with a stack trace that told you nothing useful.&lt;/p&gt;

&lt;p&gt;You'd spend hours in Instruments hunting for the one &lt;code&gt;retain&lt;/code&gt; that didn't have a matching &lt;code&gt;release&lt;/code&gt;. You'd write &lt;code&gt;dealloc&lt;/code&gt; methods longer than the actual logic of the class. This was just the tax you paid for writing any code at all.&lt;/p&gt;
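
&lt;p&gt;For anyone who never lived it, Swift's &lt;code&gt;Unmanaged&lt;/code&gt; still lets you feel those ownership rules today. This is an illustrative sketch, not production advice - the &lt;code&gt;Connection&lt;/code&gt; class is hypothetical:&lt;/p&gt;

```swift
// Manual ownership rules, using Swift's Unmanaged as a stand-in for
// Objective-C's retain/release. Every +1 you take, you must balance.
final class Connection {
    static var deallocated = false
    deinit { Connection.deallocated = true }
}

// passRetained takes ownership (+1), just like calling retain.
let owned = Unmanaged.passRetained(Connection())

// Use the object without touching its retain count.
_ = owned.takeUnretainedValue()

// Forget this line and the object leaks until the app is killed.
owned.release()

// Only now does deinit run - the balanced release, by hand.
```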

&lt;p&gt;I got good at it. We all did, the ones who stuck around. You developed an instinct for retain cycles, for ownership semantics, for when to &lt;code&gt;copy&lt;/code&gt; versus when to &lt;code&gt;retain&lt;/code&gt;. It became second nature. And then Apple automated it.&lt;/p&gt;

&lt;h2&gt;ARC Changed Everything and Nothing&lt;/h2&gt;

&lt;p&gt;Automatic Reference Counting shipped with iOS 5 in 2011, and the reaction was split down the middle. Half of us were relieved. The other half were insulted.&lt;/p&gt;

&lt;p&gt;I'd spent years mastering manual memory management. I could trace a retain cycle in my head. I'd earned that skill through pain and now it was being handed to everyone for free. But here's the thing - and this is a pattern that repeats with every paradigm shift - the skill didn't become worthless. It became context. I still understood what ARC was doing under the hood. I could still debug the retain cycles between closures and delegates that ARC couldn't magically solve. The people who'd never learned manual memory management just knew the compiler handled it now, and when it didn't, they were lost.&lt;/p&gt;
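
&lt;p&gt;The classic case is an object that owns a closure that captures the object right back. A hedged sketch, with hypothetical names:&lt;/p&gt;

```swift
// The retain cycle ARC can't solve for you: an object whose stored
// closure captures it strongly. Downloader is an illustrative name.
final class Downloader {
    static var deallocatedCount = 0
    var onComplete: (() -> Void)?
    deinit { Downloader.deallocatedCount += 1 }
}

do {
    let leaky = Downloader()
    // Strong capture of leaky inside a closure leaky owns: a cycle.
    leaky.onComplete = { _ = leaky }
}
// The cycle keeps the first instance alive past its scope.
assert(Downloader.deallocatedCount == 0)

do {
    let fixed = Downloader()
    // weak breaks the cycle, so ARC can reclaim the object normally.
    fixed.onComplete = { [weak fixed] in _ = fixed }
}
assert(Downloader.deallocatedCount == 1)
```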

&lt;h2&gt;Storyboards, or How Apple Tried to Kill My Workflow&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;Most experienced iOS developers came to the same conclusion eventually. Storyboards were a tool for demos, not for teams.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;I need to talk about Storyboards because my hatred for them is deeply personal.&lt;/p&gt;

&lt;p&gt;Apple introduced Storyboards in iOS 5 and spent years aggressively pushing them as the default. Every WWDC sample project used them. Every tutorial used them. The project templates defaulted to them.&lt;/p&gt;

&lt;p&gt;The problem is that Storyboards are opaque XML files that are impossible to review in a pull request, create merge conflicts that will ruin your afternoon, can't be meaningfully diffed, and encourage a monolithic approach to view controller management that does not scale. I watched teams spend more time resolving Storyboard merge conflicts than actually building features. I watched apps where the entire navigation flow was crammed into a single Storyboard file that crashed Interface Builder if you looked at it wrong.&lt;/p&gt;

&lt;p&gt;I went back to nibs - individual xib files paired with code. It was the ideal middle ground: visual interface design without the monolithic nightmare. You could design a view visually, keep it scoped to a single screen or component, and still review your pull requests without decoding XML soup. Most experienced iOS developers came to the same conclusion eventually. Storyboards were a tool for demos, not for teams.&lt;/p&gt;

&lt;h2&gt;iOS 7: Flat App Theory&lt;/h2&gt;

&lt;p&gt;iOS 7 was the most visually jarring update Apple has ever shipped - until Liquid Glass, anyway. Jony Ive killed skeuomorphism overnight. The green felt of Game Center, the leather stitching in Calendar, the wood grain in Newsstand - gone, replaced by flat surfaces, thin fonts, and translucency.&lt;/p&gt;

&lt;p&gt;And it wasn't just cosmetic. iOS 7 changed how navigation bars worked - content now scrolled underneath them by default, which broke the layout of essentially every existing app. You weren't just updating colors and icons. You were reworking view hierarchies, edge insets, and scroll behavior across entire codebases.&lt;/p&gt;

&lt;p&gt;I reworked multiple apps simultaneously, against a deadline that Apple set and didn't care whether you met. That was my first experience with what I'd later recognize as Apple's standard operating procedure: announce a massive change at WWDC in June, ship it in September, and good luck.&lt;/p&gt;

&lt;h2&gt;Swift: A New Language, Whether You Were Ready or Not&lt;/h2&gt;

&lt;p&gt;Apple announced Swift at WWDC 2014. A new programming language. After decades of Objective-C.&lt;/p&gt;

&lt;p&gt;My first reaction was resentment. Objective-C wasn't just a language I used - it was a language I thought in. The message passing, the dynamic runtime, the bracket syntax that looked insane to outsiders but felt like home. I'd spent years getting fluent in it, and now Apple was telling me that investment was effectively over. A lot of us felt that way. We'd done exactly what Apple asked - learned their platform, mastered their tools - and now the ground was shifting again.&lt;/p&gt;

&lt;p&gt;And Swift wasn't ready. Not even close. Swift 1.0 was a proof of concept, not a production language. The tooling was buggy, the compile times were brutal, and the language was changing between point releases. Nobody serious was shipping Swift in production that first year. It wasn't until Swift 2.0 that it started to feel like something you could actually build on.&lt;/p&gt;

&lt;p&gt;But once it got there, the truth was hard to argue with. Optionals forced you to think about nil in ways that Objective-C let you ignore. The type system caught entire categories of bugs at compile time that Objective-C would have let slip through to runtime. The resentment faded because the language was genuinely better, and I'd been doing this long enough to know that fighting the obvious is a waste of energy.&lt;/p&gt;
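
&lt;p&gt;A small hedged example of what that forcing function looks like - &lt;code&gt;parsePort&lt;/code&gt; is hypothetical:&lt;/p&gt;

```swift
// An Optional return makes "this might fail" part of the signature,
// where Objective-C would have silently handed back nil or 0.
func parsePort(_ text: String) -> Int? {
    guard let value = Int(text), value > 0, 65536 > value else {
        return nil   // callers are forced to handle the failure
    }
    return value
}

let good = parsePort("8080")       // Optional(8080)
let bad = parsePort("not a port")  // nil

// The compiler won't let you treat good as a plain Int; you unwrap
// explicitly or supply a default.
let port = good ?? 80
```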

&lt;h2&gt;The Swift 3 Apocalypse, or, The Great Renaming&lt;/h2&gt;

&lt;p&gt;And then Swift 3 happened.&lt;/p&gt;

&lt;p&gt;If you weren't writing Swift in 2016, you missed the single most painful migration in the history of iOS development. Swift 3 was a source-breaking release. Not "some things changed." The API naming conventions were completely overhauled. Method signatures you'd been writing for two years were renamed. The entire standard library was reorganized. Apple's migration tool handled maybe sixty percent of the changes. The other forty percent was you, manually fixing thousands of compiler errors.&lt;/p&gt;

&lt;p&gt;I had projects where the migrator generated more errors than the original code had lines. And Swift was still changing after that. Swift 4 broke some more, and it wasn't until Swift 5 achieved ABI stability that you could finally write Swift with confidence that it wouldn't be obsolete in twelve months. The early adopters paid the price for Swift's evolution in blood and billable hours.&lt;/p&gt;
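
&lt;p&gt;For a flavor of what the migrator was up against, here are a few representative before-and-after pairs - the Swift 2 spellings are commented out because they no longer compile:&lt;/p&gt;

```swift
// A taste of the Great Renaming. Swift 2 spellings in comments,
// their Swift 3+ replacements live. Representative, not exhaustive.
var names = ["Ive", "Federighi"]

// Swift 2: names.insert("Cook", atIndex: 0)
names.insert("Cook", at: 0)

// Swift 2: names.sortInPlace()
names.sort()

// Swift 2: for i in 0.stride(to: 6, by: 2) { ... }
var evens = [Int]()
for i in stride(from: 0, to: 6, by: 2) {
    evens.append(i)
}
```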

&lt;h2&gt;The Dependency Management Wars&lt;/h2&gt;

&lt;p&gt;This one doesn't get talked about enough.&lt;/p&gt;

&lt;p&gt;CocoaPods was first. It worked, mostly, if you were willing to hand over your Xcode project file to a Ruby gem that would modify it in ways you couldn't predict and occasionally couldn't undo. When it broke during major Xcode updates - which it did pretty much every time - you were debugging Ruby internals instead of building your app.&lt;/p&gt;

&lt;p&gt;Carthage came along as the "we won't touch your project file" alternative. Philosophically pure, practically miserable. Build times were brutal, framework search paths were a nightmare, and when Apple changed how frameworks worked, Carthage was always playing catch-up.&lt;/p&gt;

&lt;p&gt;Swift Package Manager was Apple's answer - years late and initially undercooked, but it's the one that won because it's built into Xcode. SPM isn't perfect, but it works, it's integrated, and you don't need Ruby installed to use it.&lt;/p&gt;

&lt;p&gt;I've used all three in production. I've migrated projects between all three. And developers who started after SPM became the default have no idea how good they have it.&lt;/p&gt;

&lt;h2&gt;SwiftUI: The Great Reset&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;But the industry did move on, because Apple said so. And when Apple says so, that's the end of the conversation.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Apple announced SwiftUI at WWDC 2019 and the promise was enormous: declarative UI, live previews, one framework across all Apple platforms. The demos were beautiful. The reality was humbler.&lt;/p&gt;

&lt;p&gt;And here was the resentment again, right on schedule. I'd spent years becoming a UIKit expert. Not just competent - expert. I knew every esoteric trick, every workaround, every undocumented behavior. I could fix layout bugs that made other developers give up and start over. I could make UIKit do things Apple never intended, and I'd earned every bit of that knowledge through years of shipping real apps. And now Apple was doing the same thing they did with Objective-C - taking the thing I'd mastered and telling me it was time to move on.&lt;/p&gt;

&lt;p&gt;SwiftUI in its first year did not make that any easier. It was not production-ready for anything complex. Navigation was broken. Lists were slow. The data flow model changed between betas.&lt;/p&gt;

&lt;p&gt;But the deeper problem wasn't the bugs. It was that SwiftUI required a completely different way of thinking. If you'd never done reactive programming before, it was like learning to be an iOS developer all over again. All those years of UIKit expertise, all that muscle memory for how views work, how layout works, how data flows through a screen - none of it transferred directly. The reactive paradigm is a fundamentally different mental model, and switching to it felt less like learning a new framework and more like switching platforms entirely.&lt;/p&gt;

&lt;p&gt;And honestly? UIKit was and still is easier to work with in a lot of ways. You had direct control over your UI. You told a view what to do and it did it. With SwiftUI, you're describing what you want and hoping it figures out the right way to get there, and when it doesn't, the hoops you have to jump through to course-correct are absurd. A lot of us would have happily stayed in UIKit if the industry hadn't decided to move on.&lt;/p&gt;

&lt;p&gt;But the industry did move on, because Apple said so. And when Apple says so, that's the end of the conversation. So I learned it. Not because it was obviously better the way Swift was obviously better. Because it was next, and staying current on this platform means going where Apple goes, whether you agree with the direction or not.&lt;/p&gt;

&lt;p&gt;And SwiftUI is still only as good as the version you're allowed to deploy, which is directly tied to the minimum iOS version your product team will let you ship to. If your app still supports iOS 15 or even 16, you're not writing modern SwiftUI - you're writing SwiftUI with one hand tied behind your back. The framework gets meaningfully better every year, but you only get to use those improvements when your deployment target catches up. It took five years from announcement before SwiftUI was mature enough for a full production app without caveats. Apple doesn't tell you that at the keynote.&lt;/p&gt;

&lt;h2&gt;The Async Saga&lt;/h2&gt;

&lt;p&gt;Asynchronous code on iOS has been reinvented so many times that its history amounts to a series of paradigm shifts all on its own.&lt;/p&gt;

&lt;p&gt;In the beginning, you had &lt;code&gt;NSThread&lt;/code&gt; and &lt;code&gt;performSelectorOnMainThread:&lt;/code&gt; and a prayer. Then Grand Central Dispatch arrived and blocks changed everything - suddenly you could dispatch work to background queues without managing threads directly. GCD was elegant and powerful and also the single easiest way to create bugs that only reproduced one time in fifty. Race conditions, deadlocks, priority inversions - all the classic concurrency nightmares, now available in a convenient closure-based API.&lt;/p&gt;
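
&lt;p&gt;The core GCD shape, for anyone who arrived after async/await - a minimal sketch, with a semaphore standing in for the usual hop back to the main queue:&lt;/p&gt;

```swift
import Dispatch

// Push work to a background queue; signal when it's done. In an app you
// would dispatch back to DispatchQueue.main to touch the UI instead.
final class Box { var total = 0 }   // reference box, keeps the demo simple
let box = Box()
let done = DispatchSemaphore(value: 0)

DispatchQueue.global(qos: .userInitiated).async {
    // Simulated expensive work off the calling thread.
    box.total = (1...100).reduce(0, +)
    done.signal()
}

// wait() provides the happens-before edge that makes this read safe.
// Skip it and you have the classic one-run-in-fifty race.
done.wait()
```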

&lt;p&gt;Combine showed up alongside SwiftUI in 2019 as Apple's reactive framework. Publishers, subscribers, operators, cancellables - a whole new mental model for handling asynchronous data streams. It was powerful for the right problems and wildly overengineered for the wrong ones. I watched teams wrap a single network call in a Combine pipeline with six operators when a completion handler would have been three lines.&lt;/p&gt;

&lt;p&gt;Then Swift 5.5 introduced async/await and structured concurrency, and suddenly Combine looked like a transitional technology. Actors, sendable types, task groups - the language itself now had opinions about how concurrency should work. The model was genuinely good. The migration was genuinely painful. Rewriting callback-based code to async/await isn't just a syntax change - it's rethinking control flow.&lt;/p&gt;
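
&lt;p&gt;The same background-work shape in the async/await style - a sketch, with a semaphore only to bridge back to synchronous top-level code:&lt;/p&gt;

```swift
import Dispatch

// Linear control flow instead of a pyramid of callbacks.
func sum(upTo n: Int) async -> Int {
    (1...n).reduce(0, +)
}

final class Box { var total = 0 }   // reference box for the demo
let box = Box()
let finished = DispatchSemaphore(value: 0)

Task {
    // Reads top to bottom, like the synchronous code it replaced.
    box.total = await sum(upTo: 100)
    finished.signal()
}

// Bridging hack for a command-line demo only; real async code would
// simply await this from another async context.
finished.wait()
```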

&lt;p&gt;And now Swift 6 strict concurrency is here, and it's a mess. The compiler is right about everything and helpful about nothing. You turn on strict concurrency checking and your project lights up with warnings about sendability violations that are technically correct but practically incomprehensible. The error messages read like they were written for the compiler team, not for the developer staring at them. And let's not even get into "approachable concurrency" - which is about as approachable as a porcupine dipped in poison. Half the community has turned strict concurrency on and is fighting through it. The other half is waiting for the dust to settle. I don't blame either camp. Apple shipped the right idea with the wrong developer experience, and they'll probably fix it in two years, the same way they fix everything - slowly, and after we've already done the hard part ourselves.&lt;/p&gt;
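
&lt;p&gt;For all the pain of the diagnostics, the shape strict concurrency pushes you toward looks like this - an illustrative actor, not an Apple API:&lt;/p&gt;

```swift
import Dispatch

// Shared mutable state lives behind an actor; the compiler, not code
// review, guarantees the increments can never interleave.
actor HitCounter {
    private var count = 0
    func record() -> Int {
        count += 1
        return count
    }
}

let counter = HitCounter()
final class Box { var last = 0 }   // reference box for the demo
let box = Box()
let finished = DispatchSemaphore(value: 0)

Task {
    for _ in 1...50 {
        // Each call hops onto the actor's serial executor.
        box.last = await counter.record()
    }
    finished.signal()
}

finished.wait()
```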

&lt;h2&gt;Liquid Glass: Here We Go Again&lt;/h2&gt;

&lt;p&gt;And then at WWDC 2025, Apple did it again. Liquid Glass is the biggest visual overhaul since iOS 7, maybe bigger. Every surface, every control, every navigation pattern has a new material language. It's beautiful and it means every custom UI component you've built needs to be reconsidered.&lt;/p&gt;

&lt;p&gt;I watched the keynote and had the same feeling I had in 2013. Here we go again. Every app is going to look dated. Every client is going to want an update. The timeline is going to be aggressive because Apple's timeline is always aggressive.&lt;/p&gt;

&lt;p&gt;But this time I'm not anxious about it. I've done this before. I've done this so many times that the process is familiar: watch the sessions, download the beta, start a branch, figure out what breaks, figure out what's new, figure out what's better. Same cycle, different details.&lt;/p&gt;

&lt;h2&gt;And After All of That&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;It's the closest thing in software to a doctor's obligation to stay current - except instead of a medical board enforcing it, it's Cupertino.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Here's what people don't talk about when they talk about iOS development: it demands continuing education in a way that most platforms don't. A web developer can write JavaScript the same way they wrote it five years ago and still ship. A backend developer can run the same framework for a decade and never be forced to change. iOS doesn't work like that. On this platform, what's current isn't determined by you, or by the community, or by market trends. It's determined by Apple. And when Apple moves, you move, or your skills start expiring. It's the closest thing in software to a doctor's obligation to stay current - except instead of a medical board enforcing it, it's Cupertino.&lt;/p&gt;

&lt;p&gt;I've met that obligation every time. I've manually managed memory, survived three major language transitions, rebuilt UIs for two complete design overhauls, navigated the dependency management wars, relearned async patterns three times, and shipped production apps on every version of every framework Apple has offered. Not because I had to. Because this is what I chose, and I keep choosing it.&lt;/p&gt;

&lt;p&gt;I've literally forgotten more about iOS development than most current iOS developers know. That's not a boast - it's just math. When you've been doing something for this long, through this many changes, the sheer volume of knowledge you've accumulated and then replaced is enormous.&lt;/p&gt;

&lt;p&gt;Every paradigm shift felt like the end of something. And every single time, it wasn't the end. It was a layer. The new thing built on the bones of the old thing, and understanding the old thing made you better at the new thing.&lt;/p&gt;

&lt;p&gt;That's what experience is. Not just knowing the current tools, but knowing every tool that came before them and why they were replaced.&lt;/p&gt;

&lt;p&gt;That's not something you can shortcut. It's not something you can tutorial your way into. It's what happens when you show up, every year, for nearly two decades, and do the work.&lt;/p&gt;

</description>
      <category>iosdevelopment</category>
      <category>warstories</category>
    </item>
    <item>
      <title>Human-Led, AI-Assisted Development: What It Actually Looks Like</title>
      <dc:creator>Don Angelillo</dc:creator>
      <pubDate>Sun, 22 Feb 2026 00:00:00 +0000</pubDate>
      <link>https://dev.to/nativitymobile-org/human-led-ai-assisted-development-what-it-actually-looks-like-1cpe</link>
      <guid>https://dev.to/nativitymobile-org/human-led-ai-assisted-development-what-it-actually-looks-like-1cpe</guid>
      <description>&lt;p&gt;I was wrong.&lt;/p&gt;

&lt;p&gt;Not about everything. But about enough. If you've been following me for a while, you might remember my position on AI-assisted development. I called it autocomplete on steroids. I mocked vibe coding. I was the guy in the comments pushing back every time someone posted about their AI-generated miracle app.&lt;/p&gt;

&lt;p&gt;Then I started watching people I respected change their minds. Not the hype crowd - the serious engineers. The ones whose opinions I'd actually trusted for years. And I realized that if this was real and I refused to look, I wasn't being principled. I was being stubborn - even for me - and reacting out of fear, not reason.&lt;/p&gt;

&lt;p&gt;I could be wrong again. Maybe we're all about to lose our jobs. But I'm betting our jobs change rather than disappear - because that's been the outcome every single time something was supposed to make developers obsolete.&lt;/p&gt;

&lt;h2&gt;What I Got Wrong&lt;/h2&gt;

&lt;p&gt;Let me be honest about what was actually happening. I was afraid. Afraid of becoming obsolete. Afraid that the thing I'd spent my entire career getting good at was about to stop mattering. When you've been writing code since you were eight years old and building software professionally since 1997, the idea that a machine might make all of that irrelevant isn't an abstract concern. It's existential.&lt;/p&gt;

&lt;p&gt;I did what scared people do. I dismissed it. I mocked it. I found every bad example I could and held it up as proof that the whole thing was a joke. And I did all of that without actually trying the tools myself. That's the part that bothers me - not that I was wrong, but that I let the fear make me shortsighted. I had a strong opinion with no firsthand experience behind it, and I should have known better.&lt;/p&gt;

&lt;p&gt;So I spent months - not days, not a weekend hackathon - months learning how to actually work with AI coding tools. Deliberately. Methodically. The same way I learned Objective-C in 2008, the same way I learned Swift when it dropped, the same way I've learned every tool that ended up mattering. I sat down, shut up, and did the work.&lt;/p&gt;

&lt;p&gt;What I found surprised me. The fear was wrong - but not in the way the hype crowd thinks.&lt;/p&gt;

&lt;h2&gt;What I Got Right&lt;/h2&gt;

&lt;p&gt;I wasn't wrong about everything, though.&lt;/p&gt;

&lt;p&gt;The people who can't write code and think AI means they don't have to learn? They're still wrong. The vibe coders who prompt their way to a demo and call it a product? Still building on sand. The LinkedIn influencers posting "I bUiLt a SaaS in FoUr HoUrS wItH AI" and neglecting to mention it falls over the second a real user touches it? Still full of shit.&lt;/p&gt;

&lt;p&gt;AI doesn't replace the need to understand what you're building. It replaces some of the typing. Those are very different things.&lt;/p&gt;

&lt;h2&gt;The Binary Is Broken&lt;/h2&gt;

&lt;p&gt;The industry has framed this as two camps. Camp one: AI is going to replace all developers, learn prompt engineering or die. Camp two: AI is a toy, real developers don't need it, it's all hype.&lt;/p&gt;

&lt;p&gt;Both camps are wrong, and they're wrong in the same way - they're both treating AI as a single thing that either works or doesn't. That's not how any tool works. A table saw doesn't replace a carpenter. It also isn't useless. It makes a skilled person faster. It makes an unskilled person dangerous.&lt;/p&gt;

&lt;p&gt;I've been building software professionally since 1997. I've been writing code since I was eight years old. That experience didn't become obsolete the moment Claude learned to write a for loop. It became the thing that lets me tell Claude when it's writing the wrong for loop.&lt;/p&gt;

&lt;h2&gt;The Difference Experience Makes&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;I was terrified that AI would make experienced developers obsolete. The reality is the opposite. It makes us extraordinarily valuable - because we're the ones who can actually use it well.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;When an AI tool generates code, someone has to evaluate it. Someone has to know whether the architecture makes sense, whether the approach will scale, whether the edge cases are handled, whether it's going to pass App Store review, whether it's the kind of code that's going to haunt you in six months. Someone has to know what good looks like.&lt;/p&gt;

&lt;p&gt;That's not a junior developer skill. That's not a prompt engineering skill. That's a "I've shipped dozens of apps and watched hundreds of decisions play out over years" skill. It's judgment. It's taste. And you can't prompt your way into it.&lt;/p&gt;

&lt;p&gt;This is where the fear inverts. I was terrified that AI would make experienced developers obsolete. The reality is the opposite. It makes us extraordinarily valuable - because we're the ones who can actually use it well.&lt;/p&gt;

&lt;p&gt;The vibe coder sees code that compiles and calls it done. The experienced developer sees code that compiles and asks why it chose that approach, whether the error handling is real or decorative, and whether this is going to blow up the first time someone's phone loses connectivity in an elevator.&lt;/p&gt;

&lt;p&gt;And here's the part that should genuinely concern this industry: what happens to the pipeline? If companies stop hiring junior developers because AI can generate code cheaper, and the juniors who do get hired never learn to actually evaluate what they're building because the AI wrote it - who becomes the senior developer in ten years? Who develops the judgment? The experience that makes AI-assisted development work isn't something you're born with. It's something you build over decades of writing bad code, shipping it, watching it break, and learning why. If we don't create junior developers with the skills and critical thinking to become senior developers, what happens when all the senior developers retire?&lt;/p&gt;

&lt;h2&gt;The Accidental Manager&lt;/h2&gt;

&lt;p&gt;Here's another part I didn't see coming.&lt;/p&gt;

&lt;p&gt;There's a false binary in tech careers that's been around forever: grow into management or stagnate. For years the only way to advance was to stop doing the work and start managing the people who do the work. The industry has gotten better about this - staff and principal engineer roles exist now specifically because companies realized they were losing their best technical people by forcing them into org charts. But for a long time, the message was clear: manage or plateau.&lt;/p&gt;

&lt;p&gt;I tried the management path. I hated it. I "demoted" myself back to individual contributor and never looked back. I'm a builder. That's what I'm good at, that's what I care about, and managing humans was making me worse at both.&lt;/p&gt;

&lt;p&gt;And now, after all of that, I'm a manager again.&lt;/p&gt;

&lt;p&gt;Except I don't manage people. I manage AI agents. I set direction, define constraints, review output, reject bad work, and push for better results. I'm doing everything a good engineering manager does - except my direct reports are LLMs, they don't have feelings about my code reviews, and they never schedule a meeting that should have been a Slack message.&lt;/p&gt;

&lt;p&gt;It turns out the skills I hated using on people - the constant steering, the quality checks, the "No, go back and do it differently" conversations - work beautifully when your team runs on GPUs instead of coffee.&lt;/p&gt;

&lt;h2&gt;The Force Multiplier&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;Twenty-five years of experience didn't become worthless. It became a force multiplier.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;The thing I was most afraid of - that my experience would become worthless - turned out to be exactly backwards. Twenty-five years of experience didn't become worthless. It became a force multiplier.&lt;/p&gt;

&lt;p&gt;I can build things faster now. Not because I stopped being a programmer, but because I stopped confusing typing with building. The act of writing code was never the hard part. The hard part was always knowing what to write, why to write it, and when to stop. AI handles the first part better than I expected. It's useless at the other two.&lt;/p&gt;

&lt;p&gt;The future isn't AI replacing developers, but it's not AI being irrelevant either. It's experienced developers who know how to wield it - who have the judgment to direct it, the knowledge to evaluate its output, and the discipline to reject it when it's wrong. Not a developer replaced by a chatbot. Not a Luddite pretending the tools don't work. It's a professional using every tool available to do better work, the same way professionals have always done.&lt;/p&gt;

&lt;p&gt;I was a skeptic. I did the work. I changed my mind. And I'm building better and faster because of it.&lt;/p&gt;

&lt;p&gt;If you're still in the "AI is useless" camp, I get it. I was there. But do the work before you decide. You might be surprised.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>softwaredevelopment</category>
      <category>agenticcoding</category>
    </item>
  </channel>
</rss>
