DEV Community

I built a tool to stop wasting 15 hours/week job hunting (and it monitors 6 platforms automatically)

The Problem 😫

As a freelance developer/consultant, I was spending 15+ hours every week manually checking job boards:

  • Reddit (r/forhire, r/startups, r/freelance, etc.)
  • Hacker News (Ask HN, Who's Hiring)
  • Dev.to jobs
  • Product Hunt jobs
  • RemoteOK
  • We Work Remotely

The worst part? I'd miss posts that went up at 3 AM. By the time I woke up, they'd have 50+ responses already.

I was always too late.

The Breaking Point 💔

One month, I tracked my time:

  • 15 hours/week scrolling job boards
  • 780 hours/year wasted
  • At my consulting rate ($100/hr) = $78,000/year in opportunity cost

Plus, I was only finding 10-15 opportunities per week because I couldn't check everything constantly.

The Solution 🚀

I built Craffr - a tool that monitors all 6 platforms automatically and sends instant email alerts when opportunities match my skills.

How it works:

  1. Add keywords ("React developer", "Node.js", "full-stack", etc.)
  2. Select platforms to monitor (or use smart presets)
  3. Get email alerts within 5 minutes of posts going live
  4. Be first to respond and win more clients
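Under the hood, step 3 boils down to matching each new post against a user's keywords. A minimal sketch of that check (`matchesKeywords` is my illustrative naming, not Craffr's actual code — the real matcher also feeds into the AI scoring described below):

```typescript
// Case-insensitive keyword match against a post's title + body.
// Hypothetical helper: the production matcher is more involved.
function matchesKeywords(postText: string, keywords: string[]): boolean {
  const haystack = postText.toLowerCase();
  return keywords.some((kw) => haystack.includes(kw.toLowerCase()));
}

// Example: an alert fires when any subscribed keyword appears in the post
const post = 'Hiring: Senior React Developer for a 3-month contract';
console.log(matchesKeywords(post, ['react developer', 'vue'])); // true
```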

Tech Stack 💻

Here's what I used to build this:

Frontend:

  • Next.js 15 (App Router)
  • TypeScript (strict mode)
  • Tailwind CSS
  • Shadcn/ui components

Backend:

  • Supabase (Postgres + Auth + Real-time)
  • Next.js API routes
  • OpenAI API (for quality scoring)

Platform Monitoring:

  • Reddit: RSS feeds (no rate limits!)
  • Hacker News: Algolia API
  • Dev.to: Public API
  • RemoteOK: RSS feed
  • We Work Remotely: RSS feed
  • Product Hunt: GraphQL API

Email:

  • Resend (for email delivery)
  • React Email (for templates)

Hosting:

  • Vercel (Next.js)
  • Supabase (database + auth)

Key Technical Challenges 🛠️

1. Email Deliverability

Getting emails to the inbox (not spam) was the hardest part.

What I learned:

  • SPF, DKIM, DMARC records are critical
  • Use a dedicated email service (Resend > SendGrid for my use case)
  • Warm up sending domain gradually
  • Monitor bounce rates obsessively

Result: 98% deliverability rate
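For reference, the three DNS records involved look roughly like this (illustrative values only — the exact `include` domain, DKIM selector, and public key come from your email provider's dashboard):

```text
; SPF: authorize your email provider to send on behalf of your domain
example.com.                      TXT  "v=spf1 include:_spf.your-esp.com ~all"

; DKIM: public key published under the selector your provider assigns
selector._domainkey.example.com.  TXT  "v=DKIM1; k=rsa; p=MIGfMA0..."

; DMARC: tell receivers what to do with mail that fails SPF/DKIM checks
_dmarc.example.com.               TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc@example.com"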

2. Rate Limits

Different platforms have different limits:

const rateLimits = {
  reddit: 'None (RSS)', // 🎉
  hackerNews: '10,000 requests/day (Algolia)',
  devTo: '30 requests/min',
  remoteOK: 'No official limit (RSS)',
  weWorkRemotely: 'No official limit (RSS)',
  productHunt: '100 requests/hour'
}

Solution:

  • Batch requests intelligently
  • Cache aggressively
  • Use exponential backoff
  • Poll at different frequencies per platform
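The backoff part can be sketched in a few lines. The base delay, cap, and attempt count below are my illustration, not Craffr's exact tuning:

```typescript
// Exponential backoff: delay doubles each attempt, capped at 60 seconds.
function backoffDelay(attempt: number, baseMs = 1000, capMs = 60_000): number {
  return Math.min(capMs, baseMs * 2 ** attempt);
}

// Generic retry wrapper around any platform fetch
async function withRetry<T>(fn: () => Promise<T>, maxAttempts = 5): Promise<T> {
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt + 1 >= maxAttempts) throw err;
      await new Promise((resolve) => setTimeout(resolve, backoffDelay(attempt)));
    }
  }
}
```

Wrapping each platform's fetch in `withRetry` means a transient 429 just delays the next poll instead of dropping posts.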

3. AI Quality Scoring

Not all "jobs" are real jobs. Needed to filter spam.

Approach:

const scorePost = async (post: JobPost) => {
  const prompt = `
    Score this job post quality (0-1):
    - Is it a real job? (not spam/scam)
    - Is it clearly written?
    - Does it have a budget/rate?
    - Is the poster credible?

    Post: ${post.title}
    ${post.description}

    Respond with only the number.
  `;

  // gpt-4 is a chat model, so it goes through the chat completions endpoint
  const response = await openai.chat.completions.create({
    model: 'gpt-4',
    messages: [{ role: 'user', content: prompt }],
    temperature: 0.3,
  });

  return parseFloat(response.choices[0].message.content ?? '0');
};

Result: 88% accuracy filtering spam

4. Duplicate Detection

Same job appears on multiple platforms.

Solution:

  • Hash title + description
  • Fuzzy matching for slight variations
  • 95% accuracy
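A minimal sketch of that pipeline (the Jaccard word-set metric and the 0.9 threshold are my illustration — the post doesn't specify Craffr's exact fuzzy matcher):

```typescript
import { createHash } from 'crypto';

// Exact-duplicate check: hash of normalized title + description
function contentHash(title: string, description: string): string {
  const normalized = `${title} ${description}`
    .toLowerCase()
    .replace(/\s+/g, ' ')
    .trim();
  return createHash('sha256').update(normalized).digest('hex');
}

// Fuzzy check for slight variations: Jaccard similarity over word sets
function similarity(a: string, b: string): number {
  const tokens = (s: string) =>
    new Set(s.toLowerCase().split(/\W+/).filter(Boolean));
  const setA = tokens(a);
  const setB = tokens(b);
  const intersection = [...setA].filter((t) => setB.has(t)).length;
  const union = new Set([...setA, ...setB]).size;
  return union === 0 ? 0 : intersection / union;
}

// Duplicate when hashes match exactly or the texts are nearly identical
const isDuplicate = (
  a: { title: string; desc: string },
  b: { title: string; desc: string },
) =>
  contentHash(a.title, a.desc) === contentHash(b.title, b.desc) ||
  similarity(`${a.title} ${a.desc}`, `${b.title} ${b.desc}`) > 0.9;
```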

Results (My Own Usage) 📊

Before Craffr:

  • ⏰ 15 hours/week searching
  • 📧 10-15 opportunities/week found
  • 🐢 12-hour average response time
  • 📉 ~15% win rate

After Craffr:

  • ⏰ 30 minutes/week (just responding!)
  • 📧 60-100 opportunities/week found
  • ⚡ 15-minute average response time
  • 📈 ~28% win rate

Real impact: 12 new clients in 6 months. Average project value: $5K-15K.

Architecture Overview

┌─────────────────────────────────────────┐
│     Platform Monitoring (Cron Jobs)     │
│  Reddit | HN | Dev.to | PH | RO | WWR  │
└──────────────┬──────────────────────────┘
               │
               ▼
┌─────────────────────────────────────────┐
│         Supabase (PostgreSQL)           │
│   • job_posts table                     │
│   • user_keywords table                 │
│   • user_alerts table                   │
└──────────────┬──────────────────────────┘
               │
               ▼
┌─────────────────────────────────────────┐
│      Matching Engine + AI Filter        │
│   • Keyword matching                    │
│   • OpenAI quality scoring              │
│   • Duplicate detection                 │
└──────────────┬──────────────────────────┘
               │
               ▼
┌─────────────────────────────────────────┐
│      Email Delivery (Resend)            │
│   • Instant alerts                      │
│   • Daily digest (optional)             │
└─────────────────────────────────────────┘
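The Supabase tables named in the diagram map to roughly this schema (a sketch inferred from the post — any column not mentioned above is my guess):

```sql
-- Sketch of the core tables from the architecture diagram
create table job_posts (
  id          bigint generated always as identity primary key,
  external_id text unique not null,   -- the platform's own post id (dedup key)
  platform    text not null,          -- 'reddit', 'hackernews', ...
  title       text not null,
  url         text,
  content     text,
  posted_at   timestamptz
);

create table user_keywords (
  user_id uuid references auth.users (id),
  keyword text not null
);

create table user_alerts (
  user_id     uuid references auth.users (id),
  job_post_id bigint references job_posts (id),
  sent_at     timestamptz default now()
);
```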

Code Snippet: Reddit RSS Monitor

Here's how I monitor Reddit (simplified):

// app/api/cron/monitor-reddit/route.ts

import Parser from 'rss-parser';
import { createClient } from '@supabase/supabase-js';

const parser = new Parser();
const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);

export async function GET() {
  const subreddits = ['forhire', 'startups', 'freelance_forhire'];

  for (const sub of subreddits) {
    const feed = await parser.parseURL(
      `https://www.reddit.com/r/${sub}/new/.rss`
    );

    for (const item of feed.items) {
      // Reddit's Atom feed gives each entry a stable id via guid/link
      const externalId = item.guid ?? item.link;

      // Skip posts already in the database
      const { data: existing } = await supabase
        .from('job_posts')
        .select('id')
        .eq('external_id', externalId)
        .maybeSingle(); // returns null (not an error) when no row matches

      if (existing) continue;

      // Insert new post
      await supabase.from('job_posts').insert({
        external_id: externalId,
        platform: 'reddit',
        subreddit: sub,
        title: item.title,
        url: item.link,
        content: item.content,
        author: item.creator,
        posted_at: item.isoDate, // rss-parser normalizes dates to ISO strings
      });

      // Match against user keywords and send alerts (implementation omitted)
      await matchAndAlert(item);
    }
  }

  return Response.json({ success: true });
}

Lessons Learned 🎓

1. Reddit RSS > Scraping

Reddit's RSS feeds are:

  • ✅ Legal (no ToS violation)
  • ✅ Reliable (rarely down)
  • ✅ No rate limits
  • ✅ Simple to parse

Don't overcomplicate with scraping.

2. Start with One Platform

I initially tried to build all 6 platforms at once. Bad idea.

Better approach:

  1. Start with Reddit only (2 weeks)
  2. Get users, get feedback
  3. Add one platform at a time
  4. Each platform = new marketing push

3. Free Tools = Growth Hack

Built 3 free tools:

  • Reddit Lead Finder
  • Freelance Rate Calculator
  • Response Timer

These drove 50% of my signups. Give value first.

4. Pricing is Hard

Tried 3 different pricing models:

v1: $20/month (too cheap; users didn't value it)
v2: $99/month (too expensive; no conversions)
v3: $49/month ✅ (the sweet spot)

Also learned: 7-day free trial > forever free tier

5. Email Deliverability Takes Time

Don't launch with email alerts on day 1. You need to:

  • Warm up your domain (2-4 weeks)
  • Start with low volume
  • Monitor bounce rates
  • Gradually increase

What's Next 🔮

February:

  • Twitter/X monitoring (most requested feature)
  • Mobile app (React Native)

March:

  • Discord community monitoring
  • Slack integration

Q2:

  • LinkedIn posts monitoring
  • API for developers

Goal: $10K MRR by June

Try It Yourself 🚀

I'm offering the Dev.to community:

  • 7-day free trial (no credit card)
  • Use code DEVTO50 for 50% off first month

Live site: https://craffr.com

The free tools mentioned above are all linked from the site, too (no signup required).

Questions? 💬

Happy to answer:

  • Technical implementation details
  • How I approach freelance lead gen
  • Pricing strategies
  • Launch tactics
  • Anything else!

Drop your questions in the comments 👇


Currently live on Product Hunt too if you want to support: https://www.producthunt.com/products/craffr?launch=craffr

#BuildInPublic #Freelance #SaaS #NextJS #TypeScript
