Over the past year, I’ve been using a lot of AI tools. At first, they felt impressive—fast, capable, almost magical in how quickly they could respond. But over time, something started to feel off. The more I used them, the more every interaction began to feel the same. Predictable. Structured. Transactional.
Most conversations followed a simple loop: you type something, you get a response, and then you move on. Even when systems try to feel conversational, they’re still built around that same pattern—input, output, done. There’s no real sense of continuity, no presence, nothing that feels like an actual interaction unfolding over time.
What made this more uncomfortable was realizing how these systems handle conversations behind the scenes. Something that feels personal often isn’t treated that way. Conversations are stored, analyzed, sometimes reused. Once you become aware of that, it subtly changes how you engage. You hesitate more. You filter more. The interaction becomes less natural.
I thought avatars might change that. Adding a face, a voice, a sense of presence—on paper, it sounds like the missing piece. But in practice, it introduced a different kind of friction. Many of these systems are metered, charging per minute or per interaction. That changes behavior immediately. Instead of speaking freely, you start optimizing. You become aware of time, cost, efficiency. And that awareness breaks the illusion completely.
At some point, I stopped thinking about features and started thinking about the experience itself. Not how to make AI more powerful, but how to make it feel different. What would happen if conversations weren’t stored at all? If interaction wasn’t limited or measured? If the goal wasn’t just to generate responses, but to create something that felt more like a real exchange?
That question led me to start building something of my own. Not as a finished product, but as an experiment. The idea was simple in theory: remove as much friction as possible and see how people behave. In practice, it turned out to be much harder than expected. Most systems are built around persistence, tracking, and optimization. Removing those assumptions forces you to rethink how everything works—from session handling to how continuity is maintained without storing history.
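To make that concrete, here is a minimal sketch (not my actual implementation, and all names like `EphemeralSession` are made up for illustration) of what "session handling without stored history" can look like: context lives only in process memory under a random token, and closing the session is the deletion.

```python
# Illustrative sketch of ephemeral session handling.
# Assumption: conversation context is kept only in process memory,
# so there is no database or log to scrub afterwards.

import secrets

class EphemeralSession:
    """Holds conversation context in memory only; nothing is persisted."""
    def __init__(self):
        self.token = secrets.token_urlsafe(16)  # random, unlinkable id
        self.turns = []                         # context for this session only

    def add_turn(self, user_text, reply_text):
        self.turns.append((user_text, reply_text))

class SessionStore:
    """In-memory registry; ending a session erases its history."""
    def __init__(self):
        self._sessions = {}

    def open(self):
        s = EphemeralSession()
        self._sessions[s.token] = s
        return s

    def close(self, token):
        # Dropping the object is the only "deletion" needed.
        self._sessions.pop(token, None)
```

The interesting design consequence is that continuity only exists while a session is open; once it closes, there is nothing left to track, which is exactly the constraint that forces the rethinking described above.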
Another challenge was realizing how vague the idea of “more human” actually is. It’s easy to say, but difficult to implement. It doesn’t come from one feature or one breakthrough. It comes from small details—timing, tone, how responses flow, how natural the interaction feels over time. These are subtle things, but they shape the entire experience.
One of the most interesting things I noticed during this process was how behavior changes when constraints are removed. When people feel like they’re not being tracked and not being limited, they interact differently. More openly. More casually. Less like they’re issuing commands, and more like they’re actually engaging in something.
This experiment eventually became what I’m building now, but the product itself feels secondary to the question behind it. Should AI remain a tool—efficient, structured, predictable? Or is there space for something that feels closer to an interaction, even if it’s imperfect?
I don’t think there’s a single correct answer yet. I’m still figuring it out as I go. But it’s been interesting to explore what happens when you shift the focus away from output and toward experience.
I’d be genuinely curious to hear how others think about this. When you use AI, do you want it to stay as a tool, or do you find yourself wanting something that feels more like a conversation?