Mohammed Ashraf
From 38% to 58% Activation Rate in One Week — The Single UX Change That Did It

Last month, 38% of polls created on AnonPolls got at
least one vote.

This week it's 58%.

No new traffic. No new features. No marketing campaign.

One UX change in a single afternoon.

Here's what happened.

The problem I was ignoring

AnonPolls is a free anonymous
polling tool — no signup for creators or voters. People
create polls and share them via link or QR code.

I was tracking a metric I called "activation rate" —
the percentage of polls created that received at least
one vote.

For weeks it sat at 38%. That meant 62% of polls were
created and never voted on.

I assumed this was a distribution problem. People were
creating polls but not sharing them effectively. My plan:
drive more traffic, get more creators, some of them would
share better.

I was wrong about the cause.

What the data actually showed

When I looked more carefully at the user flow, I noticed
something. A lot of visitors were landing on poll pages
— they had received the shared link. They arrived.
They just weren't voting.

The drop-off was happening on the poll page itself,
not before it.

I watched a few Hotjar recordings. The pattern was clear:

Users landed on a poll page. They could immediately see
the current results — which option was leading, by how
much, how many people had voted. They read the results.
Then they left without voting.

They got the answer without participating.

The anchoring problem

There's a well-documented cognitive bias called the
anchoring effect. When people see existing data before
making a decision, that data disproportionately
influences their choice.

In the context of polls:

  • If someone sees "Option A is leading 70% to 30%" before voting, they're more likely to vote for Option A (the bandwagon effect)
  • Or they decide their vote won't change anything (the futility effect)
  • Either way, the visible results are reducing honest participation

The same problem exists in workplace surveys and
classroom polls. When people can see what others
answered, they stop answering for themselves.

The fix

I implemented vote-first display.

The logic:

```js
// Before rendering poll results, check whether this visitor has voted.
// userIdentifier is part of the query key, so it's passed to the fetcher too.
const { data: voteStatus } = useQuery({
  queryKey: ['voteStatus', pollId, userIdentifier],
  queryFn: () => checkVoteStatus(pollId, userIdentifier)
});

// Only show results if the user has already voted or the poll is closed
const showResults = voteStatus?.hasVoted || poll.isClosed;
```

Simple in principle:

  • If you haven't voted → you see the question and options as clickable buttons, no results visible
  • After you vote → results appear immediately
  • If the poll is closed → results are always visible (voting is over, results are public)
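The three states above reduce to a tiny predicate. Here's a sketch as a pure function (the function name and field names are illustrative, not AnonPolls' actual code, but they mirror the `hasVoted` / `isClosed` fields in the snippet):

```javascript
// Illustrative sketch of the vote-first display rule as a pure function.
function shouldShowResults({ hasVoted, isClosed }) {
  // Closed polls are public; otherwise results stay hidden until you vote.
  return Boolean(hasVoted || isClosed);
}

// Open poll, visitor hasn't voted yet: hide results, show clickable options
console.log(shouldShowResults({ hasVoted: false, isClosed: false })); // false
// Same visitor right after voting: reveal results immediately
console.log(shouldShowResults({ hasVoted: true, isClosed: false }));  // true
// Closed poll: voting is over, results are always visible
console.log(shouldShowResults({ hasVoted: false, isClosed: true }));  // true
```

Keeping the rule in one place like this makes it easy to apply consistently everywhere results could leak: the poll page, embeds, and share previews.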

One additional change: I stopped showing per-option vote
counts on poll cards in the browse page. Previously cards
showed "Option A: 67% — Option B: 33%". This spoiled the
result before someone even clicked through to vote.

Cards now show the question, total vote count, and a
"Vote Now" button. Options are listed but without counts.
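The card change amounts to building a view model that keeps the total but drops per-option counts. A minimal sketch (field names like `voteCount` are assumptions, not AnonPolls' actual schema):

```javascript
// Illustrative sketch: a browse-card view model that omits per-option
// counts so the result isn't spoiled before the click-through.
function toBrowseCard(poll) {
  return {
    question: poll.question,
    // Total vote count is fine to show: it signals activity without
    // revealing which option is winning.
    totalVotes: poll.options.reduce((sum, o) => sum + o.voteCount, 0),
    options: poll.options.map(o => o.label), // labels only, no counts
    cta: 'Vote Now'
  };
}

const card = toBrowseCard({
  question: 'Tabs or spaces?',
  options: [
    { label: 'Tabs', voteCount: 67 },
    { label: 'Spaces', voteCount: 33 }
  ]
});
// card.totalVotes is 100; card.options is ['Tabs', 'Spaces'] with no counts
```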

The result

Activation rate: 38% → 58% in one week.

Avg votes per poll: 1.4 → 2.3 in the same week.

No new traffic. The same visitors, the same polls, the
same sharing patterns. Just removing the thing that was
killing their motivation to participate.

Why this took me so long to fix

I was measuring the wrong thing.

I was watching total votes and total polls. Both were
growing, slowly. Nothing looked obviously broken.

It took tracking activation rate specifically — polls
getting at least one vote as a percentage of polls created
— to surface the problem clearly.
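The metric itself is a one-liner. A sketch of how it could be computed, assuming each poll record carries a `voteCount` (an illustrative field name):

```javascript
// Illustrative sketch: activation rate = polls with at least one vote,
// as a fraction of all polls created in the window.
function activationRate(polls) {
  const activated = polls.filter(p => p.voteCount > 0).length;
  return activated / polls.length;
}

// 3 of these 5 polls got at least one vote
const polls = [3, 0, 1, 0, 5].map(v => ({ voteCount: v }));
console.log(activationRate(polls)); // 0.6
```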

Once I saw 62% of polls were dying with zero votes, the
question became: where in the flow is this happening?
Only then did I look at the poll page experience and
find the actual cause.

The lesson: build the metric that measures your core
value delivery, not just volume.

For a polling tool, volume metrics (total polls, total
votes) tell you how busy the system is. The activation
rate tells you whether the product is actually working.

One more thing

The same principle applies beyond polling tools.

If your product shows aggregate data, leaderboards,
or other people's activity before the user has done
the thing you want them to do — you might be creating
the same problem.

Social proof is powerful for conversion. But visible
outcomes before participation can kill the participation
itself.

Test hiding it. The results might surprise you.


AnonPolls is free, with no signup
for creators or voters. If you're building something that
needs honest anonymous feedback — team retros, classroom
check-ins, group decisions — give it a try.

And if you've hit a similar activation problem in your
own product, I'd love to hear how you diagnosed it in
the comments.