Here's something I've noticed: companies will spend six months building a feature that solves a problem customers don't have, then wonder why adoption is terrible.
Meanwhile, the actual friction points—the places where customers rage-quit your product or call support at 2 AM—sit in a backlog labeled "minor improvements." Because apparently making your product actually pleasant to use is less important than adding AI to something. (Everything needs AI now. Your coffee maker. Your shoelaces. Definitely your B2B SaaS dashboard.)
The gap between what companies think CX design is and what it actually should be has never been wider. And I'm not talking about journey maps that look beautiful in stakeholder presentations but never get implemented. I'm talking about the unglamorous work of finding where your experience breaks down and fixing it.
Let's dig into what customer experience design looks like when you're focused on outcomes instead of aesthetics.
The Friction Audit Nobody Wants to Do
Start here: use your own product like a customer would. Not the happy path you demoed to investors. The actual path.
Sign up with a new email. Go through onboarding without skipping steps. Try to accomplish something without using your internal knowledge. Watch where you get stuck. Those moments of confusion? That's where your customers are bleeding out.
I worked with a SaaS company last year that had a gorgeous interface. Clean. Modern. Minimalist. Their churn rate was 40% in the first 90 days. Turns out their "intuitive" navigation buried the most-used feature three clicks deep, and their onboarding never explained what the product actually did. Everyone was too busy congratulating themselves on the design awards to notice nobody could figure out how to use the thing.
The friction audit is simple but uncomfortable:
Map every step of your core user journey. Not the journey you wish they took. The one they actually take. Use session recordings (Hotjar, FullStory, whatever). Watch real humans interact with your product. The confusion is immediate and painful to watch.
Identify drop-off points. Where do people abandon the process? Where do support tickets spike? Where do users stop and Google "how to [basic thing] in [your product]"? That last one is particularly telling.
Measure time-to-value. How long before a customer gets their first win? If it's more than 10 minutes, you're losing people. Slack figured this out early—they measured time until a team sent their first message. Everything in onboarding optimized for that metric. Not time watching tutorial videos. Not time setting up integrations. Time until actual value.
Talk to the people who quit. Exit interviews aren't just for employees. Your churned customers know exactly where the experience failed. They're usually happy to tell you. Sometimes in great detail.
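To make the measurement steps above concrete, here's a minimal sketch of computing time-to-first-value from an event log. The event names (`signed_up`, `first_value`) and the log format are assumptions for illustration; substitute whatever your analytics pipeline actually emits.

```python
from datetime import datetime

# Hypothetical event log: (user_id, event_name, timestamp).
# "first_value" is whatever your product's first win is:
# first message sent, first report run, first invoice paid.
events = [
    ("u1", "signed_up",   datetime(2025, 1, 6, 9, 0)),
    ("u1", "first_value", datetime(2025, 1, 6, 9, 7)),
    ("u2", "signed_up",   datetime(2025, 1, 6, 10, 0)),  # u2 never gets a win
    ("u3", "signed_up",   datetime(2025, 1, 6, 11, 0)),
    ("u3", "first_value", datetime(2025, 1, 7, 8, 30)),
]

def time_to_first_value(events):
    """Minutes from signup to first win per user; None if they never got there."""
    starts, wins = {}, {}
    for user, name, ts in events:
        if name == "signed_up":
            starts[user] = ts
        elif name == "first_value":
            wins.setdefault(user, ts)  # keep only the first win
    return {
        user: (wins[user] - start).total_seconds() / 60 if user in wins else None
        for user, start in starts.items()
    }

ttv = time_to_first_value(events)
# u1 reached value in 7 minutes; u2 dropped off before ever getting a win.
```

The users with `None` are your drop-off list, and the long tail of the distribution tells you who's struggling even when they eventually succeed.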
The Features vs. Friction Trade-off
Every new feature adds complexity. This isn't controversial, but teams act like it is.
You add a feature. Now users have to discover it, understand it, decide if they need it, learn how to use it, and integrate it into their workflow. That's cognitive load. That's friction. The feature better be solving a real problem for enough people to justify that cost.
Notion is a fascinating case study here. Incredibly powerful tool. Also incredibly overwhelming for new users. They know this. Their solution isn't to remove features—it's to provide templates that hide complexity until you need it. You can start with a simple doc and gradually discover more advanced capabilities. The friction is deferred, not eliminated, but that timing matters.
Compare that to products that throw every feature at you immediately. Seventeen tooltips on first login. A settings menu with 200 options. A dashboard that looks like a fighter jet cockpit. Sure, power users eventually learn it. But you've already lost everyone else.
The question isn't "should we build this feature?" It's "does this feature reduce more friction than it creates?"
Sometimes the answer is no. And that's fine. Basecamp built an entire company philosophy around saying no to features. They're doing okay.
Intentional Friction (Yes, Really)
Not all friction is bad. Some friction prevents mistakes. Some friction creates value.
Gmail's "undo send" feature adds a short delay before your message actually goes out (5 seconds by default, configurable up to 30). That's intentional friction. It's saved approximately one billion people from career-ending email mistakes. The friction is the feature.
Amazon's one-click ordering removes friction. Their "are you sure you want to cancel Prime?" flow adds friction. Both are intentional design choices based on what behavior they want to encourage.
The key is being deliberate. Ask yourself:
Where should users slow down? Deleting data. Making purchases. Sharing sensitive information. Add confirmation steps. Make the destructive action slightly harder than the constructive one.
Where are you protecting users from themselves? Stripe's API has a test mode that's visually distinct from live mode. That tiny bit of friction (having to explicitly switch modes) prevents developers from accidentally processing real transactions while testing. Smart.
What friction builds investment? IKEA's furniture is harder to assemble than pre-built options. People value it more because they built it. The "IKEA effect" is real. Some onboarding friction—customizing your workspace, inputting your data, making initial choices—creates ownership.
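The "undo send" pattern is simple enough to sketch. This is an illustrative Python version, not how Gmail actually implements it; the class name, API, and delay are invented for the example.

```python
import threading

class DelayedSend:
    """'Undo send' sketch: the action only runs after a grace period,
    and can be canceled until then. Class name, delay, and API are
    invented for illustration -- this is not Gmail's implementation."""

    def __init__(self, action, delay_seconds=5.0):
        self._done = threading.Event()
        self._action = action
        self._timer = threading.Timer(delay_seconds, self._fire)

    def _fire(self):
        self._action()
        self._done.set()

    def start(self):
        self._timer.start()
        return self

    def undo(self):
        """True if we canceled in time, False if the action already ran."""
        self._timer.cancel()
        return not self._done.is_set()

outbox = []
send = DelayedSend(lambda: outbox.append("quarterly numbers"), delay_seconds=0.2).start()
send.undo()  # caught it in time; outbox stays empty
```

The design choice worth noticing: the user's action (clicking "send") feels immediate, but the irreversible part is deferred. The friction lives in the system, not in the user's face.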
The Personalization Trap
Personalization is supposed to reduce friction by showing people what's relevant to them. In practice, it often just creates different friction.
Spotify's personalized playlists are genuinely useful. They reduce the friction of finding new music. LinkedIn's personalized feed is... less useful. It's mostly showing me content from people I don't follow based on an algorithm that thinks I care about hustle culture and passive income. The personalization creates friction by making me hunt for the content I actually want.
The difference? Spotify's personalization is based on clear behavioral data (what I listen to) and has an obvious value proposition (discover music I'll like). LinkedIn's personalization is based on murky engagement metrics and serves LinkedIn's goals (keep you scrolling) more than mine (see updates from my actual network).
Before you personalize something, ask:
- Do you have enough data to personalize accurately?
- Is the personalized experience clearly better than a simple default?
- Can users easily override the personalization when it's wrong?
- Are you personalizing to help users or to manipulate behavior?
That last one is the uncomfortable question. A lot of "personalization" is just dark patterns with better PR.
Design for Recovery, Not Just Success
Your CX design probably focuses on the happy path. User does A, then B, then C, success! Confetti animation!
But what happens when something goes wrong? That's where most experiences completely fall apart.
Error messages that say "Something went wrong." Okay, what? What should I do about it? Can I fix it or do I need to contact support? Is my data gone? The error message is where good CX design proves itself.
Stripe (them again) has exceptional error handling in their API. Errors include specific codes, clear descriptions, and links to documentation. They design for the failure case as carefully as the success case. Because they know developers will hit errors. Making those errors easy to understand and fix reduces support load and improves the overall experience.
Your recovery design should include:
Clear error states. What went wrong? Why? What can the user do about it?
Easy paths back. If someone navigates into a dead end, how do they get back to somewhere useful? Breadcrumbs. Back buttons. Clear navigation. This seems obvious until you're trapped in a settings menu with no clear way out.
Saved progress. If something fails, did you lose everything they just did? Auto-save exists. Use it. Google Docs taught everyone to expect this. Your form that loses all data when the session times out is unacceptable in 2025.
Support access at the point of friction. When someone's stuck, that's when they need help. Not after they've given up and navigated to your contact page. Intercom and similar tools let you trigger contextual help based on behavior. Use it.
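Putting the error-state advice together, here's a rough sketch of a structured error object in the Stripe spirit. The field names, error code, and docs URL are all illustrative, not Stripe's actual API.

```python
from dataclasses import dataclass

@dataclass
class ApiError:
    """A recoverable error: a stable code, a human explanation, a next
    step, and a docs link. Fields and codes here are illustrative --
    not Stripe's actual API shape."""
    code: str        # stable identifier callers can branch on
    message: str     # what went wrong, in plain language
    action: str      # what the user or developer can do about it
    docs_url: str    # where to read more

def render_for_user(err: ApiError) -> str:
    # The anti-pattern is "Something went wrong." This answers the
    # three questions: what happened, what to do, where to learn more.
    return f"{err.message} {err.action} (Details: {err.docs_url})"

err = ApiError(
    code="card_declined",
    message="The card was declined by the issuing bank.",
    action="Try a different payment method or contact your bank.",
    docs_url="https://example.com/docs/errors#card_declined",
)
```

The stable `code` field matters as much as the prose: it's what lets client code handle specific failures programmatically instead of string-matching error messages.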
The Mobile Experience Isn't Optional Anymore
I know. You've heard this for a decade. But I still see products in 2025 that are clearly designed for desktop and "adapted" for mobile as an afterthought.
More than 60% of web traffic is mobile now. For some industries, it's over 80%. If your mobile experience is a cramped, hard-to-tap version of your desktop site, you're telling the majority of your audience they don't matter.
This doesn't mean mobile-first for everything. It means mobile-appropriate. Some tasks genuinely work better on desktop. Complex data analysis. Detailed content creation. Multi-window workflows. That's fine. But basic tasks—checking status, making simple updates, consuming content—should work seamlessly on mobile.
Duolingo's mobile experience is better than their desktop one. Because that's where their users are. They designed for the actual context of use (practicing language skills in spare moments throughout the day) rather than the context they wished for.
Measuring What Actually Matters
You can't improve what you don't measure. But most CX metrics are either too vague (NPS) or too narrow (button click rates).
Here's what actually tells you if your CX design is working:
Time to first value. How long until a user accomplishes something meaningful? Track this obsessively.
Task completion rates. Can users actually do what they came to do? Sounds basic. Most companies don't measure it.
Support ticket trends. What are people getting stuck on? Your support team knows where the experience breaks. Listen to them.
Feature adoption vs. feature awareness. Did people know the feature exists? Did they try it? Did they keep using it? The gap between these numbers tells you a lot.
Return usage patterns. Do people come back? Do they go deeper into the product over time or stay surface-level? Growth in usage depth indicates good CX.
Qualitative feedback loops. Surveys, user interviews, session recordings. Numbers tell you what's happening. Humans tell you why.
The goal isn't to optimize every micro-interaction. It's to understand where the experience is genuinely helping people and where it's getting in the way.
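The awareness/trial/retention gap mentioned above is easy to quantify once you have the counts. A minimal sketch, with made-up numbers:

```python
def adoption_funnel(aware, tried, retained):
    """Split feature adoption into its two gaps. 'aware' = users who saw
    or were told about the feature, 'tried' = used it at least once,
    'retained' = used it again later. Counts below are illustrative."""
    return {
        "awareness_to_trial": tried / aware if aware else 0.0,
        "trial_to_retention": retained / tried if tried else 0.0,
    }

funnel = adoption_funnel(aware=1000, tried=400, retained=80)
# A low first ratio is a discovery problem; a low second ratio means
# people found the feature but it didn't deliver enough value to stick.
```

Splitting the funnel this way tells you which fix to make: better discoverability (surfacing, onboarding) versus better value (the feature itself).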
Implementation Reality Check
Everything I've described sounds reasonable in an article. Implementation is messier.
You'll have stakeholders who want their pet feature prioritized over fixing friction. You'll have technical debt that makes seemingly simple changes complicated. You'll have limited resources and competing priorities. You'll have data that's incomplete or contradictory.
Welcome to actually doing CX design.
The approach that works: start small, measure everything, prove value, expand.
Pick one high-friction point. Fix it. Measure the impact. Show the results to stakeholders. Use that win to justify tackling the next friction point. Build momentum gradually.
Airbnb didn't redesign their entire experience overnight. They ran thousands of experiments, kept what worked, killed what didn't. Boring? Yes. Effective? Also yes.
The companies with exceptional customer experiences didn't get there with one brilliant redesign. They got there by consistently prioritizing experience improvements over shiny new features. By measuring what matters. By actually using their own products and fixing what's broken.
That's the work. It's not glamorous. It won't win design awards. But it's what keeps customers around.
Where to Start Tomorrow
Stop reading articles about CX design (including this one) and go use your product.
Sign up with a new account. Try to accomplish your core use case. Note every moment of confusion, every unclear label, every time you have to think harder than you should.
That list? That's your roadmap.
You don't need a complete journey map. You don't need stakeholder alignment on a comprehensive CX strategy. You need to fix the thing that's annoying your customers right now.
Start there. The rest follows.