DEV Community

Itay Maman

No Habits to Break

I switched from Claude Code to Codex this week

Codex 5.3 dropped the same day as Opus 4.6, and early results favored Codex — so after six-plus months as a die-hard Claude Code user, I switched. Took me minutes. Literally, minutes. I'll probably switch back the moment Opus pulls ahead again.

Switching IDEs used to be nothing like this. Moving from IntelliJ to VS Code was a genuine project: weeks of hunting for equivalent plugins, remapping keybindings etched into muscle memory, recreating snippets and templates. Real lock-in, built from years of accumulated customization.

Why the difference?

It's the text interface. Plain text is a far less rigid medium than the surfaces we used to rely on: dropdown menus, key combinations, configuration files. The prompts I wrote for Claude Code work just as well in Codex. There's almost nothing to learn.

In tech circles, the question of whether apps built on LLMs have moats gets a lot of attention. But as this anecdote shows, the AI labs themselves might have even less.

What this means

  1. If you're behind, catching up is purely a technical problem. Google with Gemini doesn't need to build a better model and break user habits. They "just" need to build a better model. That's hard, but it's one hard thing — and arguably easier than changing user behavior. The flip side: if you're ahead, you're only ahead until the next benchmark. Today's lead means nothing when switching takes minutes.

  2. Benchmarks matter and will keep mattering. Low switching friction keeps competition healthy and value-based — which is exactly what we're seeing. That's why there's so much discussion about LLM benchmarks. Nobody obsessed over IDE benchmarks. There was no point — you weren't going to switch anyway.

  3. The no-moat effect extends beyond coding agents. It applies to most products where the primary interface is chat. Anywhere the interaction is primarily text, switching friction drops. Though once a product accumulates your data, or you've built workflows around it, switching costs return.

For decades, "incumbent advantage" meant something. Users accumulated habits, configurations, workflows — and that accumulation was a moat. We're used to thinking this way. But in the era of LLM-based products, the only thing that matters is whether your product is better right now. Yesterday's gone.
