DEV Community

Vhub Systems


I've Been Tracking Competitor Job Postings for 6 Months. Here's What I Learned.

Six months ago I started automatically scraping the careers pages of 24 competitors. I refresh the data weekly and track changes.

The strategic intelligence it produces is more useful than any paid market research I have bought.

What Job Postings Actually Reveal

Job descriptions are accidentally honest. They reveal:

Technology decisions: A job posting for "Senior Engineer — Kafka migration" tells you they are moving from batch to streaming data, often a year before that shift shows up in any product announcement.

Organisational structure: Whether engineers report to Product or to an Engineering VP reveals their build culture. A VP of Revenue Operations means they are operationalising their sales motion.

Current pain points: Job descriptions describe real problems. "Experience debugging distributed systems at scale" means they have distributed system problems right now.

Pricing and packaging signals: "Experience with usage-based billing" means a pricing model shift is underway. "Enterprise account management" means they are going upmarket.
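These reads can be mechanised with simple keyword matching. A minimal sketch, where both the signal categories and the keyword lists are my own illustrative choices, not an established taxonomy:

```python
# Map signal categories to tell-tale phrases in job descriptions.
# Categories and keyword lists are illustrative assumptions.
SIGNAL_KEYWORDS = {
    "tech_shift": ["kafka", "migration", "rewrite", "replacing"],
    "pain_point": ["debugging distributed systems", "at scale", "reliability"],
    "pricing_shift": ["usage-based billing", "metering", "packaging"],
    "upmarket": ["enterprise account management", "enterprise sales"],
}

def extract_signals(description: str) -> list[str]:
    """Return the signal categories whose keywords appear in the text."""
    text = description.lower()
    return [
        category
        for category, keywords in SIGNAL_KEYWORDS.items()
        if any(kw in text for kw in keywords)
    ]
```

It's crude, but string matching against a curated phrase list catches a surprising share of these signals; anything fancier (embeddings, an LLM classifier) can come later.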

The 6 Months — What I Found

Month 1-2: Baseline

Established baseline job mix for each competitor. Most had predictable ratios: 60% engineering, 20% sales, 10% marketing, 10% ops.
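Computing that baseline is a one-liner once roles are classified into functions; a sketch, assuming each posting has already been labelled:

```python
from collections import Counter

def job_mix(functions: list[str]) -> dict[str, float]:
    """Share of postings per function, e.g. {'engineering': 0.6, ...}."""
    counts = Counter(functions)
    total = sum(counts.values())
    return {func: count / total for func, count in counts.items()}
```

Feeding it a typical snapshot of ten roles reproduces the 60/20/10/10 split described above.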

Month 3: Competitor A Shifts

Competitor A suddenly posted 4 enterprise sales roles and a Director of Customer Success (Enterprise). Their standard ratio was 0 enterprise CS.

My interpretation: going upmarket. Adjusted our own pricing page to emphasise enterprise features we already had but were not merchandising.

Five weeks later: Competitor A announced an enterprise tier at 3x their SMB price.

Month 4: Competitor B's Tech Rewrite

Two "Staff Engineer" postings both mentioned "Rust" and "replacing our Python data pipeline." Standard job for them is Python/Django.

Interpretation: performance issues at scale, major infrastructure rewrite in progress.

Implication: their product will be slower to ship features for the next 6-9 months. Window to compete on features.

Month 5: The Fundraising Signal

Competitor C posted 12 jobs in 3 weeks (baseline: 2/month). All growth roles — demand gen, SDRs, sales engineers.

I checked SEC EDGAR. Found a Form D filing 18 days prior: $22M Series B.

TechCrunch announced it 3 weeks after I spotted the job spike.
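The EDGAR check can be semi-automated. The sketch below builds a query URL for EDGAR's full-text search endpoint (the efts.sec.gov URL is what the public search UI appears to call; it is not officially documented, so treat both the URL and the response shape here as assumptions to verify) and filters hits for Form D filings:

```python
from urllib.parse import urlencode

# Endpoint used by EDGAR's full-text search UI; assumed, verify before relying on it.
EDGAR_SEARCH = "https://efts.sec.gov/LATEST/search-index"

def edgar_form_d_url(company_name: str) -> str:
    """Build a full-text search URL restricted to Form D filings."""
    params = {"q": f'"{company_name}"', "forms": "D"}
    return f"{EDGAR_SEARCH}?{urlencode(params)}"

def form_d_hits(response: dict) -> list[dict]:
    """Pull Form D hits out of an (assumed) EDGAR search response payload."""
    hits = response.get("hits", {}).get("hits", [])
    return [h for h in hits if h.get("_source", {}).get("file_type") == "D"]
```

Polling that query weekly for each competitor turns the fundraise signal from a manual lookup into an alert.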

Month 6: Competitor D's Pivot

Competitor D cut 8 customer success roles and posted 3 new ones explicitly titled "Scale CS", with "1:many customer management" in the description.

Interpretation: moving from high-touch to low-touch CS. Probably raising prices on SMB tier or cutting it entirely.

Implication: their SMB customers will be underserved. An acquisition opportunity.

The Technical Setup

import hashlib, json
from collections import Counter
from datetime import datetime

class CompetitorJobTracker:
    def __init__(self, db):
        self.db = db

    def track_weekly(self, company: dict):
        current_jobs = self.scrape_careers_page(company['careers_url'])
        stored_jobs = self.db.get_latest(company['id'])

        # Detect changes
        current_ids = {j['id'] for j in current_jobs}
        stored_ids = {j['id'] for j in stored_jobs}

        new_jobs = [j for j in current_jobs if j['id'] not in stored_ids]
        removed_jobs = [j for j in stored_jobs if j['id'] not in current_ids]

        if new_jobs or removed_jobs:
            self.analyze_signal(company, new_jobs, removed_jobs)
            self.db.store(company['id'], current_jobs)

        return {'new': len(new_jobs), 'removed': len(removed_jobs)}

    def analyze_signal(self, company, new_jobs, removed_jobs):
        signals = []

        # Check for hiring surge
        if len(new_jobs) > company['baseline_weekly_hires'] * 3:
            signals.append('HIRING_SURGE: possible fundraise or major win')

        # Check for function shift
        new_functions = [self.classify_role(j['title']) for j in new_jobs]
        for func, count in Counter(new_functions).items():
            if count >= 3:
                signals.append(f'FUNCTION_SHIFT: {count} new {func} roles')

        # Check for senior/leadership hires
        senior_new = [j for j in new_jobs if any(
            t in j['title'].lower() for t in ['vp', 'director', 'head of', 'chief']
        )]
        if senior_new:
            signals.append(f'LEADERSHIP_HIRE: {[j["title"] for j in senior_new]}')

        if signals:
            self.send_alert(company['name'], signals)
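One detail the snippet glosses over: many careers pages don't expose a stable job ID, which is presumably why hashlib is imported. A common workaround (my assumption, not necessarily the author's exact approach) is to derive a synthetic id from fields that rarely change:

```python
import hashlib

def job_id(title: str, location: str) -> str:
    """Stable synthetic id for a posting that lacks one.

    Normalising case and whitespace keeps the id stable across
    cosmetic edits to the careers page.
    """
    raw = f"{title.strip().lower()}|{location.strip().lower()}"
    return hashlib.sha256(raw.encode()).hexdigest()[:12]
```

The same posting then diffs cleanly week over week even when the page renders it slightly differently.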

Cost and Infrastructure

  • 24 companies scraped weekly = ~480 pages/month
  • At $0.003/page with residential proxies: $1.44/month
  • VPS (shared): $3/month
  • Total: under $5/month for 24-company competitive intelligence
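For anyone scaling the company count, the arithmetic generalises (the ~5 pages per careers site per scrape is my inference from the 480 pages/month figure):

```python
def monthly_cost(companies: int, pages_per_site: int = 5,
                 price_per_page: float = 0.003, vps: float = 3.0) -> float:
    """Rough monthly cost, assuming ~4 scrape runs per company per month."""
    pages = companies * pages_per_site * 4
    return round(pages * price_per_page + vps, 2)
```

`monthly_cost(24)` lands at the ~$4.44 total quoted above; even at 100 companies it stays under $10/month.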

Comparable paid services (Crayon, Klue, Kompyte): $500-$2,000/month.

The Three Questions Worth Tracking

Not all job signals matter equally. Focus on these:

  1. Is headcount growing faster than product revenue would justify? (fundraise signal)
  2. Are new hires concentrated in one function? (strategic shift)
  3. Are leadership hires from enterprise companies? (upmarket signal)

Everything else is noise.
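Those three checks can be sketched as a single scoring function; the thresholds and field names here are my own assumptions, not a definitive implementation:

```python
def priority_signals(snapshot: dict) -> list[str]:
    """Apply the three checks to one competitor's weekly snapshot.

    Expected keys (all illustrative): new_jobs_count, baseline_weekly_hires,
    new_roles_by_function, leadership_hires_from_enterprise.
    """
    signals = []
    # 1. Headcount growing faster than revenue would justify (fundraise).
    if snapshot["new_jobs_count"] > snapshot["baseline_weekly_hires"] * 3:
        signals.append("possible fundraise")
    # 2. New hires concentrated in one function (strategic shift).
    by_func = snapshot["new_roles_by_function"]
    total = sum(by_func.values())
    if total and max(by_func.values()) / total >= 0.6:
        signals.append("strategic shift")
    # 3. Leadership hires with enterprise backgrounds (upmarket move).
    if snapshot["leadership_hires_from_enterprise"]:
        signals.append("going upmarket")
    return signals
```

Anything that doesn't trip one of these three checks goes into the weekly digest rather than an immediate alert.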

Get the Scraper

The careers page scraper handles the main technical challenges: varying page structures, JavaScript rendering, and anti-bot measures.

Competitor Intelligence Bundle — €29

Includes job posting tracker, weekly change detection, Slack/Telegram alert integration, and the signal classification logic.


What is the most surprising thing you have learned from tracking competitor job postings? I'd love to hear in the comments.
