The Feedback Collector That Published Negative Reviews Publicly (Before Human Review)

I built a feedback collection system for a SaaS company. Customers submitted reviews, the system collected them, and the marketing team displayed the best ones on the website. Automated social proof. Standard practice.
Two weeks in, a one-star review appeared on the homepage. Then another. Then five more. All visible to every site visitor. All brutally negative.
The marketing director called screaming. "Why are you publishing our worst reviews on the front page?"

The Setup
SaaS company wanted customer testimonials on their site. They had been manually collecting reviews via email, then copying approved ones to the website. Slow process. They wanted automation.
The system I built was simple. After a customer used the product for thirty days, send an automated email asking for feedback. Customer clicks a link, fills out a review form, submits. The review posts automatically to a testimonials section on the homepage.
Fast. Efficient. No manual work required. Tested with eight beta users who all left positive reviews. Deployed.
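For illustration, this is roughly what the original flow amounted to. It's a minimal sketch, not the production code; the Flask framing and names like `published_reviews` are stand-ins:

```python
# Minimal sketch of the original, flawed flow. The Flask framing and
# names like `published_reviews` are illustrative, not the real system.
from flask import Flask, request, jsonify

app = Flask(__name__)
published_reviews = []  # stands in for the homepage testimonials store

@app.route("/reviews", methods=["POST"])
def submit_review():
    review = {
        "author": request.form["name"],
        "stars": int(request.form["stars"]),
        "text": request.form["text"],
    }
    # The mistake: every submission goes straight to the public page.
    # No filter, no approval step, no human review.
    published_reviews.append(review)
    return jsonify({"status": "published"}), 201
```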

The Public Disaster
The first few days were fine. Positive reviews appeared. Marketing was happy. Then the negative ones started showing up.
"Terrible customer support. Took 5 days to get a response." One star. Published immediately on the homepage.
"Product is buggy and crashes constantly. Waste of money." One star. Live on the site.
"Tried to cancel my subscription, they made it impossible. Avoid this company." One star. Front and center.
By the end of week two, the homepage testimonials section showed seventeen reviews. Nine were one or two stars. Only eight were positive.
Every potential customer visiting the site saw a wall of complaints before seeing any product information.

Why This Happened
My automation posted every review immediately upon submission. No filter. No approval step. No human review. The form said "Share your feedback" and the system shared it, instantly and publicly, exactly as written.
I had assumed most reviews would be positive. The company had good customer satisfaction scores. Surely most feedback would be praise. A few negative reviews mixed in would look authentic, I thought.
I was wrong on every assumption.

The Pattern
Customers who loved the product rarely filled out feedback forms. They were busy using the product. Happy customers are silent customers.
Customers with problems filled out the form immediately. Frustrated users had time and motivation to write detailed complaints. Angry customers wanted to be heard.
The result was selection bias. The feedback system captured complaints at a much higher rate than praise, because complaints were the primary driver of form completion.
The second problem was tone. When someone fills out a private feedback form, they write differently than when writing a public review. The form instructions said "Share your feedback with us," which sounded internal. Users wrote raw, unfiltered complaints meant for the support team.
Those raw complaints went live on the homepage word-for-word. Profanity included. Spelling errors included. No context, no resolution status, no company response. Just pure venting, published as social proof.

The Failed Fix
I tried adding a simple profanity filter. If the review contained bad words, do not publish it automatically.
That stopped the worst language but did not solve the core issue. Reviews like "Your product is garbage" and "Worst experience ever" did not trigger profanity filters but were still terrible homepage content.
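A wordlist check amounts to something like the sketch below (the list and helper name are illustrative). It's easy to see why clean but purely negative language sails through:

```python
# Sketch of the first failed fix: a simple wordlist filter.
# The wordlist is abbreviated for the example.
PROFANITY = {"damn", "hell"}

def passes_profanity_filter(text: str) -> bool:
    words = {w.strip(".,!?").lower() for w in text.split()}
    return words.isdisjoint(PROFANITY)

# Both clear the filter, and neither belongs on a homepage:
print(passes_profanity_filter("Your product is garbage"))  # True
print(passes_profanity_filter("Worst experience ever"))    # True
```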
I tried a sentiment analysis filter. Only publish reviews with positive sentiment scores.
That made the testimonials section look fake. Every review was glowing. No criticism. No authenticity. Potential customers assumed the reviews were fabricated or cherry-picked. Trust actually decreased.
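A sentiment gate of this kind might look like the following sketch, with NLTK's VADER standing in for whatever sentiment scorer you prefer; the 0.3 threshold is an arbitrary illustration:

```python
# Sketch of the second failed fix: gate publication on sentiment.
from nltk.sentiment.vader import SentimentIntensityAnalyzer
# Requires a one-time: nltk.download("vader_lexicon")

analyzer = SentimentIntensityAnalyzer()

def passes_sentiment_filter(text: str, threshold: float = 0.3) -> bool:
    # The compound score runs from -1 (most negative) to +1 (most positive).
    return analyzer.polarity_scores(text)["compound"] >= threshold

# The gate works exactly as designed, and that is the problem: only
# glowing reviews survive, so the testimonials read as fabricated.
```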

The Real Solution Was Human Approval
The fix required stepping back from full automation. Reviews are now collected automatically but published manually.
When a customer submits feedback, it goes into a review queue visible to the marketing team. Positive reviews can be approved for public display. Negative reviews are routed to customer support for follow-up instead of being published.
Critical feedback is still valued and acted on internally. But the homepage only shows reviews that were explicitly approved for public use.
I also split the form into two paths. "Leave a public testimonial" versus "Send private feedback to our team." The wording sets expectations. Public testimonials go through approval. Private feedback goes to support and stays internal.
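Roughly, the reworked flow looks like this sketch. The field names, the `route_to_support` handoff, and the in-memory stores are assumptions for illustration, not the actual implementation:

```python
# Sketch of the reworked flow: two intake paths plus a human-approval
# queue. Names and data structures are illustrative assumptions.
from enum import Enum

class Status(Enum):
    PENDING = "pending"    # collected, awaiting marketing review
    APPROVED = "approved"  # explicitly cleared for the homepage
    INTERNAL = "internal"  # routed to support, never published

review_queue: list[dict] = []
published_reviews: list[dict] = []

def route_to_support(text: str, stars: int) -> None:
    """Hypothetical handoff to the support ticketing system."""

def handle_submission(text: str, stars: int, channel: str) -> Status:
    if channel == "private":  # "Send private feedback to our team"
        route_to_support(text, stars)
        return Status.INTERNAL
    # "Leave a public testimonial": collected automatically,
    # but published only after explicit human approval.
    review_queue.append({"text": text, "stars": stars, "status": Status.PENDING})
    return Status.PENDING

def approve(review: dict) -> None:
    # Only a deliberate human action moves a review onto the site.
    review["status"] = Status.APPROVED
    published_reviews.append(review)
    review_queue.remove(review)
```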

What Changed
After implementing the approval workflow, the homepage showed only reviews that customers explicitly intended as public testimonials or that the marketing team felt comfortable displaying.
Negative feedback still came in, but it went to the support team where it belonged instead of being broadcast to potential customers.
Customers with complaints got responses and solutions. Many of those customers later submitted positive testimonials after their issues were resolved.

The Results
Before the fix, the homepage displayed every raw submission immediately, including nine one- or two-star reviews out of seventeen total. Negative reviews were visible to every site visitor. The conversion rate dropped thirty-two percent. Several potential customers mentioned the bad reviews in sales calls as reasons for hesitation.
After the fix, the homepage showed only approved testimonials. Negative feedback was still captured, but routed internally. The conversion rate recovered and exceeded baseline. The sales team stopped hearing objections about bad reviews.
The business impact was significant. Raw negative reviews had been costing conversions. Filtered testimonials built trust. Internal feedback still reached support teams for product improvement without damaging the public brand.

What I Learned
Automation is not always the answer. Some processes need human judgment. Feedback collection can be automated, but publication requires curation. Selection bias means negative feedback often outweighs positive in voluntary submission systems.
Most importantly, users write differently when they think feedback is private versus public. Instructions and expectations matter. A form asking for "your honest feedback" will get brutal honesty that should not be published verbatim.

The Bottom Line
An automated review system published every submission immediately, including raw negative feedback that was meant to be private. The homepage became a wall of complaints. The fix was adding human approval for public display while keeping internal feedback channels open for genuine customer input.

Written by Farhan Habib Faraz
Senior Prompt Engineer building conversational AI and voice agents

Tags: reviews, testimonials, automation, feedbackcollection, contentmoderation, socialproof
