DEV Community

Vhub Systems

How to Track Competitor Job Postings to Predict Their Product Roadmap

Your competitor just announced a major new product direction. An AI-powered feature suite. A new payments module. An enterprise security tier. You found out when a prospect forwarded you their press release on a sales call.

Here's what you did not know: they had been hiring for it for six months. The signals were publicly visible on LinkedIn and their careers page — but nobody was watching.

Job postings are one of the most reliable leading indicators in competitive intelligence. Unlike press releases, which are crafted for a specific narrative, hiring decisions reflect real resource allocation. When a company opens multiple ML engineer roles in a single quarter, they are not just hiring — they are committing budget, time, and organizational capacity to a direction. That commitment shows up in job postings months before it shows up in a product announcement.

This article walks through how to monitor competitor hiring data automatically using Apify actors, categorize signals by function, and get notified when a meaningful cluster emerges.

Why Hiring Signals Work

Every job posting reveals something about internal priorities. Consider what different role clusters communicate:

  • ML engineers or data scientists → AI/ML feature investment or data pipeline build-out
  • Payments engineers or fintech specialists → expansion into financial services, billing, or payment processing
  • Security or compliance engineers → enterprise tier preparation, SOC 2 or regulatory certification pursuit
  • Developer relations or solutions engineers → API ecosystem expansion, partner integrations
  • Field sales or enterprise AEs → moving upmarket, away from self-serve toward enterprise deals

None of this is speculation. Headcount is expensive. When a company allocates it in a new direction, that allocation reflects an executive-level decision made weeks or months earlier. The job posting is a late artifact of that decision — but it still precedes the public product announcement by months.

The playbook: scrape competitor job postings weekly, categorize by function, and alert when any function shows a spike.

The System

The monitoring system has four components:

  1. Weekly job scrape — pull all open roles at target competitors from LinkedIn and their careers pages
  2. Function classification — tag each role by category (engineering, ML/AI, sales, security, DevRel, etc.)
  3. Role count diff — compare this week's counts to last week's by category
  4. Spike alert — trigger a notification when any category adds 3 or more roles in a single week

This runs entirely on the Apify free tier. No paid competitive intelligence platform required.

Step 1: Scrape LinkedIn Job Listings

Use the linkedin-job-scraper actor to pull open roles by company. Pass the competitor's LinkedIn company ID and cap results at 100 per run:

import { ApifyClient } from 'apify-client';

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

const competitors = [
  { name: 'competitor-a', linkedinId: 'linkedin-company-id-a' },
  { name: 'competitor-b', linkedinId: 'linkedin-company-id-b' },
];

async function scrapeLinkedInJobs(competitor) {
  const run = await client.actor('lanky_quantifier/linkedin-job-scraper').call({
    companyIds: [competitor.linkedinId],
    maxResults: 100,
  });

  const { items } = await client.dataset(run.defaultDatasetId).listItems();

  return items.map(item => ({
    competitor: competitor.name,
    title: item.title,
    function: classifyRole(item.title),
    postedAt: item.postedAt,
    source: 'linkedin',
  }));
}
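With several competitors on the watchlist, the per-company scrapes can run concurrently. One way to sketch that wrapper (`scrapeOne` here stands in for `scrapeLinkedInJobs` above; injecting it keeps the helper testable):

```javascript
// Run one scrape per competitor concurrently and flatten the results
// into a single array of role records.
async function scrapeAll(competitors, scrapeOne) {
  const perCompetitor = await Promise.all(competitors.map(c => scrapeOne(c)));
  return perCompetitor.flat();
}
```

Each actor run happens in parallel, so total wall-clock time stays close to the slowest single competitor rather than the sum of all of them.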

Step 2: Scrape Careers Pages Directly

LinkedIn does not always reflect every open role — some companies post to their own careers pages first or exclusively. Use website-content-crawler to scrape competitor careers pages:

async function scrapeCareersPage(competitorBaseUrl) {
  const run = await client.actor('apify/website-content-crawler').call({
    startUrls: [{ url: `${competitorBaseUrl}/careers` }],
    maxCrawlDepth: 2,
    maxCrawlPages: 50,
  });

  const { items } = await client.dataset(run.defaultDatasetId).listItems();

  // Return raw page text; job titles are extracted downstream
  return items.map(page => ({
    url: page.url,
    text: page.text,
  }));
}
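The crawler returns raw page text, so you still need to pull job titles out of it. A minimal heuristic sketch — the pattern below is an assumption about how titles typically look, and will need tuning per competitor:

```javascript
// Hypothetical helper: pull likely job titles out of crawled page text.
// Assumes titles sit on their own lines and contain a common role keyword.
const TITLE_PATTERN =
  /^(senior|staff|principal|lead)?\s*[\w\s/-]*(engineer|scientist|manager|designer|executive|analyst)\b/i;

function extractJobTitles(pageText) {
  return pageText
    .split('\n')
    .map(line => line.trim())
    .filter(line => line.length > 0 && line.length < 80 && TITLE_PATTERN.test(line));
}
```

Keyword matching on line-level text is crude but good enough here: false positives get bucketed as "Other" by the classifier, and spike detection only cares about week-over-week deltas.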

Step 3: Classify Roles by Function

A simple keyword classifier buckets each role title into a function category:

function classifyRole(title) {
  const t = title.toLowerCase();
  if (/machine learning|ml engineer|data scientist|ai engineer|llm/.test(t)) return 'ML/AI';
  if (/payment|fintech|billing|financial/.test(t)) return 'Payments';
  if (/security|compliance|soc |infosec|appsec/.test(t)) return 'Security';
  if (/developer relations|devrel|solutions engineer/.test(t)) return 'DevRel';
  if (/enterprise|field sales|account executive/.test(t)) return 'Enterprise Sales';
  if (/infrastructure|platform|sre|devops|reliability/.test(t)) return 'Infrastructure';
  if (/product manager|head of product|principal pm/.test(t)) return 'Product';
  return 'Other';
}
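Once each role carries a function tag, roll the records up into per-competitor counts — the `{ competitor: { function: count } }` shape that spike detection consumes:

```javascript
// Aggregate classified role records into counts by competitor and function.
function countByFunction(roles) {
  const counts = {};
  for (const role of roles) {
    counts[role.competitor] ??= {};
    counts[role.competitor][role.function] =
      (counts[role.competitor][role.function] ?? 0) + 1;
  }
  return counts;
}
```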

Step 4: Detect Spikes

Compare current role counts to the previous week's snapshot stored in a JSON file or Google Sheets:

function detectSpikes(current, previous, threshold = 3) {
  const spikes = [];

  for (const [competitor, functions] of Object.entries(current)) {
    for (const [fn, count] of Object.entries(functions)) {
      const prev = previous[competitor]?.[fn] ?? 0;
      const delta = count - prev;

      if (delta >= threshold) {
        spikes.push({
          competitor,
          function: fn,
          previousCount: prev,
          currentCount: count,
          delta,
          alert: `${competitor} added ${delta} ${fn} roles this week (${prev} → ${count})`,
        });
      }
    }
  }

  return spikes;
}
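For the JSON-file option, the snapshot store is a few lines. A minimal sketch, assuming a local `counts.json` as the baseline file:

```javascript
// Minimal JSON-file snapshot store for week-over-week comparison.
import { readFileSync, writeFileSync, existsSync } from 'node:fs';

function loadSnapshot(path = 'counts.json') {
  // First run: no baseline yet, so every current count reads as a delta.
  return existsSync(path) ? JSON.parse(readFileSync(path, 'utf8')) : {};
}

function saveSnapshot(counts, path = 'counts.json') {
  writeFileSync(path, JSON.stringify(counts, null, 2));
}
```

Note the first-run behavior: with an empty baseline, every competitor's existing headcount looks like a spike, so either seed the file manually or suppress alerts on the first run.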

Step 5: Send Slack Alerts

Route detected spikes to a Slack channel using an incoming webhook:

async function sendSlackAlert(spikes) {
  if (spikes.length === 0) return;

  const message = spikes.map(s => `⚠️ ${s.alert}`).join('\n');

  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      text: `*Competitor Hiring Signal Alert*\n${message}`,
    }),
  });
}

Schedule this pipeline weekly using Apify Schedules (free) or a cron job. The alert lands in Slack before your team's Monday standup. When the product announcement drops three months later, you already saw it coming.

Full Weekly Run Sequence

  1. Scrape LinkedIn job listings for each competitor (linkedin-job-scraper)
  2. Scrape careers pages for direct postings (website-content-crawler)
  3. Classify each role by function keyword
  4. Load last week's counts from storage
  5. Run spike detection — threshold: 3+ new roles in one function
  6. Send Slack alert if spikes detected
  7. Save current counts as new baseline

Total runtime: under 5 minutes per weekly run.

What to Watch For

The most actionable signal patterns:

  • Sudden ML/AI cluster — multiple ML or AI engineer roles in a single quarter → AI feature announcement likely within 3–6 months
  • Payments + compliance together — combined hiring signal → fintech expansion or regulated-market push
  • Enterprise sales + solutions engineering simultaneously — moving upmarket, likely adding an enterprise tier
  • Hiring spike followed by silence — the project may have been completed, cancelled, or paused

Hiring data is not a crystal ball. But it's a directional signal that costs nothing to collect and consistently arrives months before the press release.

Cost

| Component | Tool | Cost |
| --- | --- | --- |
| LinkedIn job scraping | linkedin-job-scraper | Free tier |
| Careers page scraping | website-content-crawler | Free |
| Role classification | Custom JS | Free |
| Spike detection | Custom JS | Free |
| Alerts | Slack webhook | Free |
| **Total** | | **$0** |

Commercial competitive intelligence platforms with hiring signal tracking: $500–$2,000/month. Manual analyst tooling for hiring analysis: $300–$800/month.

The Apify free tier is sufficient for monitoring 3–5 competitors at weekly frequency. If you need to watch 10+ competitors daily, you move into paid tiers — but still at a fraction of platform pricing.

Summary

Set it up once. Run it weekly. The next time your competitor makes a major product announcement, you will have had the signal in your Slack channel months earlier — not forwarded to you by a prospect on a sales call.
