I built MentionMaster as a paid SaaS, ran it for a year,
and today I'm open-sourcing the full backend.
Here's what it does, how it works, and how to run 600 AI-powered mentions/month for ~$50.
## The Problem
If you're a founder or indie hacker, you know this loop:
- Someone on Reddit asks "what tool helps with X?" — your product is the perfect answer
- You never see it
- A competitor replies, gets the customer
Manual monitoring doesn't scale. Paid tools like ReplyGuy charge $49–$299/month.
So I built my own and now I'm giving it away.
## What It Does
MentionMaster monitors Reddit, Twitter/X, Hacker News, YouTube, and TikTok for
posts matching your product keywords. When it finds a relevant post, it:
- Scores it for relevance (saves OpenAI API costs — no point replying to irrelevant posts)
- Generates a context-aware reply using GPT-4o-mini
- Queues it for your review
- Publishes on approval, with auto-translation if your product targets non-English markets
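The flow for a single post can be sketched roughly like this (the function names and callable-injection style are mine for illustration, not the repo's actual API):

```python
def process_post(post_text, product, is_relevant, generate_reply):
    """Run one scraped post through the pipeline:
    relevance gate -> AI reply -> review queue."""
    if not is_relevant(post_text, product):
        return None  # irrelevant post: skip the expensive reply call
    reply = generate_reply(post_text, product)
    return {"post": post_text, "reply": reply, "status": "pending_review"}

# Usage with stand-in callables:
item = process_post(
    "What tool helps with monitoring brand mentions?",
    "MentionMaster, an AI mention-monitoring tool",
    is_relevant=lambda post, product: "tool" in post,
    generate_reply=lambda post, product: f"You might like {product.split(',')[0]}.",
)
```

The real system swaps in the GPT-4o-mini relevance check and reply generator for the two callables.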
## Tech Stack
| Layer | Tech |
|---|---|
| API | FastAPI |
| Background jobs | Celery + Redis |
| Database | Firebase Firestore |
| Auth | Firebase Admin SDK |
| AI | OpenAI GPT-4o-mini |
| Reddit | PRAW |
| Twitter/X | Selenium scraper |
| Payments | LemonSqueezy |
| Observability | Prometheus + Grafana + OpenTelemetry |
| Deploy | Docker Compose |
## Architecture

```
FastAPI (REST API)
│
├── Redis (Celery broker)
│   │
│   └── Celery workers
│       ├── Scrape Reddit every 12h
│       ├── Scrape Twitter/X every 12h
│       ├── Scrape HN, YouTube, TikTok
│       └── Approve queued posts every 2h
│
└── Firebase Firestore (users, projects, posts)
```
## The Interesting Parts

### Relevance scoring before AI generation
Before calling OpenAI, every scraped post goes through a relevance check:
```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment


def is_post_suitable_for_help(text, product_description):
    prompt = (
        f"Given: '{text}', can our product '{product_description}' "
        f"solve this problem? Return only True or False."
    )
    result = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return result.choices[0].message.content.strip() == "True"
```
This cuts OpenAI costs by ~60% — you only generate full replies for posts
where your product is actually relevant.
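One caveat: models occasionally wrap the verdict in punctuation or extra casing, so strict equality against `"True"` can silently drop relevant posts. A more forgiving parse (my suggestion, not the repo's code) looks like:

```python
def parse_boolean_verdict(raw: str) -> bool:
    """Interpret an LLM 'True'/'False' answer leniently:
    strip whitespace and punctuation, compare case-insensitively."""
    cleaned = raw.strip().strip(".!\"'").lower()
    return cleaned == "true"

# parse_boolean_verdict("True.") and parse_boolean_verdict(" TRUE\n")
# both count as a yes; anything else is treated as a no.
```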
### Scheduled scraping with Celery + APScheduler

```python
from apscheduler.triggers.cron import CronTrigger

# Runs every 12 hours in production
scheduler.add_job(create_scrape_reddit_task, CronTrigger(hour='*/12'))
scheduler.add_job(create_scrape_twitter_task, CronTrigger(hour='*/12'))

# Promotes unpublished → approved every 2 hours
scheduler.add_job(create_move_posts_to_approved, CronTrigger(hour='*/2'))
```
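If the cron syntax is unfamiliar: `hour='*/12'` means "every hour divisible by 12", i.e. the job fires at 00:00 and 12:00. A tiny illustration of how such a field expands (simplified, not APScheduler's actual parser):

```python
def expand_cron_hours(field: str) -> list[int]:
    """Expand a cron hour field like '*/12' into the concrete
    hours of the day it fires on (step expressions only)."""
    if field == "*":
        return list(range(24))
    if field.startswith("*/"):
        step = int(field[2:])
        return list(range(0, 24, step))
    return [int(h) for h in field.split(",")]

# expand_cron_hours("*/12") -> [0, 12]
# expand_cron_hours("*/2")  -> every even hour, 12 firings per day
```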
### Multi-language support

If your product targets non-English markets, replies are auto-translated
before publishing:

```python
from deep_translator import GoogleTranslator

if product_language != 'en':
    ai_response = GoogleTranslator(
        source='auto',
        target=product_language,
    ).translate(ai_response)
```
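The translation call goes over the network, so in practice it's worth guarding: skip it for English and fall back to the untranslated reply if the translator fails. A defensive sketch (the translator callable is injected so the example stays self-contained; this is my suggestion, not the repo's code):

```python
def localize_reply(reply: str, product_language: str, translate) -> str:
    """Translate the AI reply for non-English products; on any
    translator failure, publish the English original instead."""
    if product_language == "en":
        return reply
    try:
        return translate(reply, product_language)
    except Exception:
        return reply  # better an English reply than no reply

# Usage with a fake translator:
fake = lambda text, lang: f"[{lang}] {text}"
```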
## 💰 How to Run 600 Mentions/Month for $50
If you want to scale beyond one account, here's the setup I used:
Tools needed:
- Dolphin{anty} antidetect browser — $10/mo for 20 profiles
- Proxy per profile — ~$2/mo each
The math:
- 20 profiles × 30 posts/month each (spread across 5 social networks) = 600 mentions/month
- Total cost: $10 (Dolphin) + $40 (20 proxies × $2) = $50/month
- Cost per mention: ~$0.08
Each profile posts on an automatic, staggered schedule — frequent enough to
cover fresh conversations, slow enough to look natural.
Compare: ReplyGuy charges $49–$299/month for similar volume.
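As a quick sanity check, the arithmetic above in a few lines:

```python
profiles = 20
posts_per_profile = 30                   # per month, across 5 networks
mentions = profiles * posts_per_profile  # 600 mentions/month

dolphin = 10                             # Dolphin{anty}, 20 profiles
proxies = profiles * 2                   # ~$2/month per proxy
total = dolphin + proxies                # $50/month

cost_per_mention = total / mentions      # ≈ $0.083
```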
## Getting Started

```bash
git clone https://github.com/youlast/mentionmaster_backend.git
cd mentionmaster_backend
cp .env.example .env   # fill in your keys
docker compose up --build
```
You'll need:
- OpenAI API key
- Firebase project (Firestore + Auth)
- Reddit app credentials (free at reddit.com/prefs/apps)
Full setup guide in the README.
## Links
- 🔧 Backend repo: https://github.com/youlast/mentionmaster_backend
- 🎨 Frontend repo: https://github.com/youlast/mentionmaster_frontend
If you found this useful, a ⭐ on GitHub goes a long way.
And if you have questions about the architecture or scraping approach — drop them below!