I've built things that nobody used. More than once. And looking back, the frustrating part is that I thought I had validated the idea before building. I asked people if they'd use it, they said yes, and then when I launched they didn't. It took me a while to understand why this kept happening, and it comes down to one thing: hypothetical questions get hypothetical answers.
When you ask someone "would you use this?" you're asking them to predict their own future behavior. People are bad at this. They're also nice, and they don't want to crush your enthusiasm, so they say yes. They genuinely believe it too. But there's a massive gap between "yeah I'd probably use that" and actually signing up, learning how it works, and making it part of their workflow.
Here's what I do instead.
1. Ask about their past, not their future
Instead of asking people if they'd use your thing, ask them what they're already doing to solve the problem you're addressing. This tells you way more. This is the core idea behind The Mom Test by Rob Fitzpatrick, which every founder should read. But knowing the concept and actually applying it are different things.
If you're thinking about building a tool to help people track their habits, don't ask "would you use a habit tracking app?" Ask "what did you do the last time you tried to build a new habit? What tools or methods did you use? Why did you stop using them?"
Now you're getting real information. You're learning what they actually tried, what worked, what didn't, and where they gave up. This is gold compared to a hypothetical yes or no.
The same applies if you're building a developer tool, a SaaS product, or anything else. Find people who have the problem and ask them to walk you through how they currently deal with it. If they can't describe their current workaround, they probably don't have the problem badly enough to pay for a solution.
2. Look for people already using bad solutions
One of the strongest validation signals I've found is discovering that people are already solving your problem with something that wasn't designed for it. Spreadsheets are the classic example. If you find people tracking customer feedback in Google Sheets, manually copying data between tools, or building janky internal scripts to automate something, that's real demand.
These people have already proven they care enough about the problem to spend time on a workaround. They're not giving you a hypothetical yes, they're showing you with their behavior that this problem matters to them. Your job is to give them something better than their spreadsheet.
When I was building UserJot, I noticed a lot of indie hackers and small teams tracking feature requests in Notion tables, Trello boards, or just their email inbox. The problem was that users couldn't see any of it. They'd submit feedback and have no idea if anyone read it, leading to duplicate requests and the feeling that their input went into a black hole. People cared about user feedback enough to set up some system, but the systems weren't solving the actual problem. That told me there was demand for something purpose-built.
3. Watch for repeated complaints
Another signal I trust is when people complain about the same problem repeatedly in communities you're part of. Not one tweet, but a pattern. If you keep seeing developers complain about a specific friction point, or founders asking how to solve the same thing in different Slack groups, that's validation.
The key word is repeated. One person complaining once doesn't mean much. But if you see the same frustration come up again and again across different people, you're onto something real. I keep a running list of complaints I see in communities I'm part of. Patterns emerge over time.
The caveat here is that complaining doesn't always mean people will pay for a solution. Sometimes people complain about things they wouldn't actually spend money to fix. So the complaints are a starting point, not proof. But they're a much better starting point than asking "would you use this?" and getting polite yeses.
4. Make people put something on the line
The reason hypothetical validation fails is that there's no cost to saying yes. Real validation requires some form of commitment.
The classic version is asking for money. Can you get someone to pre-order? Can you charge $10 for early access? Even a small amount separates people who are genuinely interested from people who are being nice. If you ask 50 people if they'd pay for your tool and all 50 say yes, but then you ask those same 50 people to pay $20 for early access and only 2 do, you've learned something important.
Money isn't the only option though. Time works too. If you set up a landing page and someone gives you their email and then actually opens your emails and responds to them, that's a real signal. If someone spends 30 minutes on a call with you to talk about their problems, that's a real signal. The point is that some form of friction separates real interest from polite interest.
Waitlists can be misleading here because email signups are pretty low-friction. A waitlist with thousands of signups doesn't mean thousands of people will use your product. I've seen products with huge waitlists launch to crickets because the signups were just casual curiosity, not real demand.
5. Pay attention to what users do after launch
Validation doesn't stop when you ship. In some ways, the most useful validation happens after you have real users because now you can watch behavior instead of asking questions.
The difference between what users say they want and what they actually use is huge. Someone might request a feature, you build it, and they never touch it. Someone else might never request anything but use your product every single day. Behavior tells the truth.
I track a few things to understand real demand after launch. First, do people come back? If someone signs up and then never logs in again, they didn't have the problem badly enough. Second, do people tell others about it? Organic referrals are a strong signal that you're solving a real problem. Third, what features get the most engagement versus what features people ask for? And here's one people forget: do users care when you actually ship something? If you publish a changelog update and nobody reacts or clicks through, that feature might not have been as important as they said it was.
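To make the "do people come back?" check concrete, here's a rough sketch of how a simple return rate could be computed from an event log. The event shape, the 7-day threshold, and the sample data are all made up for illustration, and most analytics tools give you this number out of the box, but the underlying logic is roughly this:

```typescript
// Rough sketch: measuring "do people come back?" from an event log.
// The event shape and the numbers are illustrative assumptions, not a real API.

type Event = {
  userId: string;
  type: "signup" | "login" | "feature_used";
  timestamp: Date;
};

const DAY_MS = 24 * 60 * 60 * 1000;

// A user counts as "returned" if they log in again at least 7 days after
// signing up. Crude, but it separates tire-kickers from real users.
function returnRate(events: Event[]): number {
  const signups = new Map<string, number>();
  for (const e of events) {
    if (e.type === "signup") signups.set(e.userId, e.timestamp.getTime());
  }

  const returned = new Set<string>();
  for (const e of events) {
    const signedUpAt = signups.get(e.userId);
    if (
      e.type === "login" &&
      signedUpAt !== undefined &&
      e.timestamp.getTime() - signedUpAt >= 7 * DAY_MS
    ) {
      returned.add(e.userId);
    }
  }

  return signups.size === 0 ? 0 : returned.size / signups.size;
}

// Fake data: two signups, only one comes back after a week.
const events: Event[] = [
  { userId: "a", type: "signup", timestamp: new Date("2024-01-01") },
  { userId: "a", type: "login", timestamp: new Date("2024-01-10") },
  { userId: "b", type: "signup", timestamp: new Date("2024-01-01") },
  { userId: "b", type: "login", timestamp: new Date("2024-01-02") },
];

console.log(`7-day return rate: ${(returnRate(events) * 100).toFixed(0)}%`); // 50%
```

Whatever threshold you pick, the point is the same: a number like this reflects behavior, not stated intent, which is exactly what the pre-launch signals can't give you.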
This is where structured feedback becomes useful. I think feedback loops are important enough that I built my own tool for it. With UserJot, users can submit ideas and vote on each other's requests, so I can see what actually has demand based on engagement, not just based on whoever emailed me most recently. When multiple people upvote the same feature request, that's a stronger signal than one person mentioning it once. There are also open source options like Fider if you want to self-host.
6. Accept that you can't fully validate without building
Here's the uncomfortable truth: you can never be 100% certain there's demand until you build something and see if people use it. All the validation tactics above reduce risk, but they don't eliminate it. At some point you have to ship.
The goal isn't perfect validation. The goal is to derisk enough that you're not spending six months building something nobody wants. If you've found people using workarounds, heard repeated complaints, and gotten a few people to commit money or serious time, you've validated enough to build a first version. It doesn't have to be the full product. Build the smallest thing that solves the core problem and see if people actually use it.
I used to spend way too long in "validation mode" when really I was just procrastinating because building is scary and rejection is uncomfortable. Now I try to validate quickly and build small. If I'm wrong, I want to find out in weeks, not months.
Quick reference: validation signals ranked
Weak signals (don't trust these alone):
- Friends and family saying it's a good idea
- Positive responses to "would you use this?"
- Likes and upvotes on a concept post
- Large waitlist with low-friction signup
Stronger signals:
- People describing their current janky workarounds
- Repeated complaints about the same problem in communities
- Someone giving you their email and then actually engaging with your emails
- People spending real time on a call to discuss their problems
Strong signals:
- Someone paying money, even a small amount
- Beta users who come back without being prompted
- Users referring others organically
- Multiple users requesting the same thing with engagement behind it
The difference between validation and reality is behavior. Watch what people do, not what they say they'd do. If someone is already spending time or money to solve the problem you're addressing, that's real. If they're just nodding along to your pitch, that's polite interest, and polite interest doesn't convert to users.
I'm building UserJot to solve the feedback loop problem for my own products. If you're tired of tracking feature requests in spreadsheets, give it a try. There's a free tier.