agenthustler

How to Build a Remote Job Alert System (No API Key Required)

The Problem with Job Board Notifications

Most job boards have email alerts, but they're noisy and limited. You can't filter by salary range, tech stack, or specific keywords in the description. You can't combine alerts from multiple boards into one feed. And you definitely can't pipe the results into your own tools.

Let's fix that. In this tutorial, we'll build a remote job alert system that:

  • Pulls fresh listings from remote job boards every few hours
  • Filters by your criteria (keywords, salary, location)
  • Sends you a clean email digest
  • Runs on autopilot with zero API keys to manage

The Stack

  • Data source: WeWorkRemotely Scraper on Apify (handles the data collection)
  • Scheduling: Apify's built-in scheduler (or cron if self-hosting)
  • Filtering + alerts: A simple Python script
  • Email: SMTP (Gmail, SendGrid, or any provider)

Step 1: Set Up Automated Data Collection

Create a free Apify account and find the WeWorkRemotely Scraper in the store. Configure it with your search parameters and set it to run on a schedule (every 6 hours works well for job listings).

Each run produces a dataset of JSON objects like this:

{
  "title": "Senior Python Developer",
  "company": "Acme Corp",
  "url": "https://weworkremotely.com/listings/acme-senior-python",
  "category": "Programming",
  "date": "2026-04-15",
  "salary": "$120k - $160k",
  "description": "We're looking for a senior Python developer..."
}
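To see the keyword filter on one record before wiring up the full script, here is a toy example using data mirroring the fields above (not a real listing):

```python
# Toy record with the same shape as the dataset items above
job = {
    "title": "Senior Python Developer",
    "description": "We're looking for a senior Python developer...",
}

KEYWORDS = ['python', 'fastapi', 'backend']

# Case-insensitive substring match across title + description
text = f"{job['title']} {job['description']}".lower()
matched = [kw for kw in KEYWORDS if kw in text]
print(matched)  # ['python']
```

Step 2 turns this one-liner into a reusable filter over the whole dataset.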

Step 2: Filter and Alert with Python

Here's a complete script that fetches the latest results, filters them, and sends an email:

import os
import requests
import smtplib
from email.mime.text import MIMEText

# Config: read from environment variables where possible, so secrets stay out of the script
APIFY_TOKEN = os.environ.get('APIFY_TOKEN', 'your_apify_token')
DATASET_ID = os.environ.get('DATASET_ID', 'your_dataset_id')  # From the scheduled run
EMAIL_FROM = 'alerts@yourdomain.com'
EMAIL_TO = 'you@yourdomain.com'
SMTP_HOST = 'smtp.gmail.com'
SMTP_PORT = 587
SMTP_USER = 'your_email'
SMTP_PASS = os.environ.get('SMTP_PASS', 'your_app_password')

# Keywords to match (case-insensitive)
KEYWORDS = ['python', 'fastapi', 'data engineer', 'backend']
MIN_SALARY = 100_000  # Optional minimum salary; wiring it in is covered under "Extending It"

def fetch_jobs():
    """Pull latest job listings from Apify dataset."""
    url = f'https://api.apify.com/v2/datasets/{DATASET_ID}/items'
    resp = requests.get(url, params={'token': APIFY_TOKEN})
    resp.raise_for_status()
    return resp.json()

def matches_criteria(job):
    """Check if a job matches our filter criteria."""
    text = f"{job['title']} {job.get('description', '')}".lower()
    return any(kw.lower() in text for kw in KEYWORDS)

def format_digest(jobs):
    """Format matching jobs into a readable email body."""
    lines = [f"Found {len(jobs)} matching remote jobs:\n"]
    for job in jobs:
        lines.append(
            f"**{job['title']}** at {job['company']}\n"
            f"  Salary: {job.get('salary', 'Not listed')}\n"
            f"  Link: {job['url']}\n"
        )
    return '\n'.join(lines)

def send_email(subject, body):
    """Send the digest via SMTP."""
    msg = MIMEText(body)
    msg['Subject'] = subject
    msg['From'] = EMAIL_FROM
    msg['To'] = EMAIL_TO

    with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as server:
        server.starttls()
        server.login(SMTP_USER, SMTP_PASS)
        server.send_message(msg)

def main():
    jobs = fetch_jobs()
    matching = [j for j in jobs if matches_criteria(j)]

    if matching:
        subject = f'{len(matching)} new remote jobs matching your criteria'
        body = format_digest(matching)
        send_email(subject, body)
        print(f'Sent digest with {len(matching)} jobs')
    else:
        print('No matching jobs found')

if __name__ == '__main__':
    main()

Step 3: Run It on a Schedule

You have a few options:

  1. Apify webhook — Set up a webhook on your scheduled actor run that hits your script endpoint
  2. Cron job — Run the Python script every 6 hours on any server or even a Raspberry Pi
  3. GitHub Actions — Free scheduled workflows that can run this script
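For the cron route, a single crontab entry does the job. The paths below are placeholders; point them at wherever your interpreter and script actually live:

```shell
# m h dom mon dow  command: run the alert script every 6 hours, logging output
0 */6 * * * /usr/bin/python3 /home/you/job_alerts.py >> /home/you/job_alerts.log 2>&1
```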

For GitHub Actions, create .github/workflows/job-alerts.yml:

name: Job Alerts
on:
  schedule:
    - cron: '0 */6 * * *'
jobs:
  check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - run: pip install requests
      - run: python job_alerts.py
        env:
          APIFY_TOKEN: ${{ secrets.APIFY_TOKEN }}
          DATASET_ID: ${{ secrets.DATASET_ID }}
          SMTP_PASS: ${{ secrets.SMTP_PASS }}

Extending It

Once the basic system works, you can add:

  • Multiple sources — Add RemoteOK, Indeed, or other boards to the same pipeline
  • Deduplication — Track seen job URLs in a simple JSON file or SQLite database
  • Slack/Discord alerts — Replace the email function with a webhook POST
  • Salary parsing — Extract numeric ranges and filter more precisely
  • Dashboard — Push results to a Google Sheet for tracking over time

Why This Beats Built-In Alerts

Job board email alerts give you everything that matches a single keyword. This system lets you:

  • Combine multiple boards into one feed
  • Apply complex filters (salary + keywords + category)
  • Control the format and delivery channel
  • Keep a historical record of listings
  • Build on top of it (analytics, auto-apply, etc.)

The whole setup takes about 20 minutes, runs for free (within Apify's free tier and GitHub Actions limits), and you'll never miss a relevant remote job posting again.


What's your current job search automation setup? I'd love to hear what tools people are using — drop a comment below.
