Here's a line I keep coming back to:
> Anything that can be spec'd out, AI will replace.
Write a login flow? Spec'd out. AI handles it.
Build a REST API with pagination? Spec'd out. Done in seconds.
Implement a sorting algorithm? Spec'd out decades ago.
The list of things that can be cleanly specified is enormous, and AI is eating through it fast. If you can write a detailed requirements document, AI can probably build it.
But here's the gap: most software that fails doesn't fail because the code is wrong. It fails because the experience is wrong.
## The Photo Studio Lesson, Again
Remember the photo studios from the last post? The ones that survived the digital camera revolution?
Here's something I didn't mention. When cameras got cheap, a lot of people thought they could take their own family portraits. They bought DSLRs, set up backdrops, tried to replicate what the studio did.
The results were technically fine. Proper exposure. In focus. Everyone facing the camera.
But they felt wrong. The lighting was flat. The poses were stiff. Nobody looked relaxed. The photo was technically correct but emotionally dead.
Because the studio photographer wasn't operating a camera. They were reading the room. Noticing when someone's smile looked forced. Adjusting the light to soften a face. Telling a joke to make a child laugh naturally. Positioning bodies so a family looked like they loved each other.
The photographer was designing an experience. The camera was just the tool.
## Software Is Experienced Through Human Senses
When you use an app, you're not reading code. You're seeing it, hearing it, feeling it through the interface.
You see the layout and instantly judge whether it's trustworthy.
You hear a notification sound and feel either pleased or annoyed.
You tap a button and the response time — even 200ms vs 50ms — creates a gut feeling of quality or cheapness.
These are human sensory reactions. They happen before conscious thought. You can't spec them out because they're not logical — they're embodied.
AI has never been confused. It has never stared at a screen unable to find the "save" button. It has never felt a surge of frustration when a form resets itself. It has never felt delight when an animation is perfectly smooth.
AI can optimize for metrics. But it can't feel what your user feels.
## Why UX Is the Last Moat
Think about the apps you love vs. the apps you tolerate.
Instagram in the early days — the magic wasn't the photo filters. It was the feel. Open the app, snap, pick a filter, share. Three taps. The entire flow was designed around a feeling: "I just captured something beautiful and I want to share it NOW."
Superhuman — an email client that charges $30/month in a world of free Gmail. Why? Because it feels fast. Every animation, every keyboard shortcut, every transition is designed to make you feel like you're moving at the speed of thought.
TikTok — the swipe gesture, the instant full-screen video, the algorithmic feed. The entire UX is designed around one feeling: "one more." It bypasses your rational brain and goes straight to dopamine.
None of this is in a spec. You can't write "make it feel delightful" in a requirements document. It requires someone who can imagine themselves in the user's body and feel what they feel.
## The Spec-able vs. The Sens-able
Here's the dividing line:
| Spec-able (AI does this) | Sens-able (Humans must do this) |
|---|---|
| Build a responsive layout | Choose a layout that feels calm vs energetic |
| Implement form validation | Decide the error message tone — helpful vs punitive |
| Create a color palette | Feel whether a color combination is trustworthy |
| Optimize page load time | Know that 200ms feels instant but 500ms feels slow |
| Build a notification system | Understand that a buzz at 11pm feels invasive |
| Implement a checkout flow | Sense where users feel anxious about spending money |
| Generate an icon set | Know which icon feels like "delete" vs "archive" |
The left column is engineering. The right column is design. AI is demolishing the left column. The right column is untouchable — because it requires lived human experience.
## What This Means for Software Teams
The shift is already happening. The most valuable person on a software team is no longer the one who writes the most code. It's the one who can:
- **See through the user's eyes.** Look at a screen and feel "this is confusing" before the user even reports it. Not because of analytics, but because of empathy.
- **Hear the silence.** Notice when a user journey has a dead end — a moment where the user thinks "now what?" — that no error handler will catch.
- **Feel the friction.** Sense that a three-step flow could be one step. That a confirmation dialog feels like distrust. That loading spinners on every interaction make the app feel broken even when it's working.
- **Design for emotion.** Not just usability, but how the product makes people feel. Confident? Relaxed? Playful? In control? That feeling IS the product.
## The Studio Photographer's Instinct
Go back to the photo studio. The best photographers have an instinct — a sense — for what makes a great portrait. They can't always explain it. It's not a rulebook. It's years of observing humans, understanding light, reading emotion.
The best UX designers have the same instinct. They can look at a wireframe and feel "this won't work." They can use an app for 30 seconds and tell you exactly where users will abandon. Not because of data, but because they've trained themselves to feel what users feel.
AI can generate a thousand wireframes. It can A/B test a million variants. But it can't sit in a room with a confused user and feel the embarrassment of not understanding a simple task.
It can't feel.
And in a world where code is free and features are commoditized, feeling is the only thing that differentiates great software from adequate software.
## The Future: AI Builds, Humans Feel
The software team of the future isn't bigger. It's different.
Instead of 10 engineers writing code, you'll have 2 engineers directing AI and 1 designer who owns the experience. The engineers handle the spec-able parts. The designer handles the sens-able parts.
The code will be fine. AI ensures that.
But does the app feel right? Does it make people feel smart, capable, in control? Does it respect their time, their attention, their emotions?
That's not a spec. That's a senses check.
And only a human can do it.
What app makes you feel something — good or bad? I'd love to hear about the software experiences that hit you on a sensory level.