
Vhub Systems


How to Build a Competitor Pricing Tracker That Emails You When Prices Change

You find out a competitor dropped their free plan — from a churned customer.

"We switched because your pricing doesn't match what we need now," they wrote. You go to the competitor's pricing page. Sure enough: three weeks ago they added a higher seat limit on the mid-tier plan and quietly dropped the free plan entirely. Your AE had no idea. Your CSM had no idea. Nobody had a Google Alert for it.

This happens constantly. Competitors change pricing structures — mid-cycle, mid-quarter, mid-sales-conversation — and most teams only notice when it hurts them.

A weekly automated scraper would have flagged it within days of the change. Here's how to build one.


Why Pricing Pages Are Unusually Hard to Monitor

The naive approach — download the page, hash it, compare to last week — fails immediately in practice.

Dynamic rendering. Most SaaS pricing pages render via JavaScript. A simple curl returns a skeleton with no pricing data. You need a browser-based scraper that executes JavaScript before reading the DOM.

A/B testing. Pricing page copy is heavily A/B-tested. The "Get started" button might read "Start free trial" or "Try for free" depending on which cohort you land in. A raw HTML diff will trigger on copy variants, not actual pricing changes.

Geo-gating. Some vendors show different prices by region. USD pricing may differ from EUR pricing even at the same URL. Your scraper's IP location matters.

Trivial markup noise. The page title might include a dynamic count ("Join 12,847 teams →"), the footer might timestamp itself, and ad scripts update on every load. A hash diff on raw HTML produces constant false positives.

The answer isn't to hash the whole page. It's to extract only the structured elements you care about — plan names, price points, feature counts, CTA text — and diff those against last week's snapshot.


The Defense Pattern

The architecture is straightforward:

  1. Weekly scheduled run — trigger a scraper every Monday at 9am
  2. Structured extraction — pull plan names, prices, billing cadence, feature list items, and CTA labels into JSON
  3. Snapshot storage — save the structured output per run with a timestamp
  4. Diff engine — compare this week's snapshot to last week's; flag any changed fields
  5. Email/Slack alert — send a formatted alert only when a meaningful change is detected

"Meaningful change" is the key filter. You want alerts for: price point changes, plan name changes, plan additions or removals, seat limit changes, feature additions or removals from a plan. You don't want alerts for: footer copy changes, testimonial rotations, nav link updates.


Step 1 — Scrape the Pricing Page with Apify

Apify's website-content-crawler renders JavaScript-heavy pages and extracts clean text content. It handles the browser execution layer for you — no Puppeteer setup, no Playwright infrastructure.

Configure the actor:

{
  "startUrls": [
    { "url": "https://competitor-a.com/pricing" },
    { "url": "https://competitor-b.com/pricing" }
  ],
  "maxCrawlPages": 1,
  "crawlerType": "playwright:firefox"
}

The actor returns structured output including text (the readable content) and markdown (a clean markdown rendering of the page). For pricing pages, the markdown output is particularly useful — it strips layout noise and surfaces the actual plan structure.

Run it via the Apify API:

const { ApifyClient } = require('apify-client');

const client = new ApifyClient({ token: process.env.APIFY_TOKEN });

async function scrapePricingPages(urls) {
  const run = await client.actor('apify/website-content-crawler').call({
    startUrls: urls.map(url => ({ url })),
    maxCrawlPages: 1,
    crawlerType: 'playwright:firefox'
  });

  const { items } = await client.dataset(run.defaultDatasetId).listItems();
  return items;
}

Cost: ~$0.002–$0.005 per page crawl on the Apify free tier. Monitoring 5 competitor pricing pages weekly costs less than $0.10/month.


Step 2 — Extract Structured Pricing Data

Raw page content isn't enough. You need to parse it into a schema you can diff.

function extractPricingStructure(pageText) {
  // Extract price patterns: $X, $X/mo, $X/month, $X/user
  const priceMatches = pageText.match(/\$[\d,]+(?:\/mo(?:nth)?|\/user)?/gi) || [];

  // Extract plan-like headings ("professional" before "pro" so the longer name wins)
  const planMatches = pageText.match(/(?:starter|basic|professional|pro|business|enterprise|free|growth|scale)\s*(?:plan)?/gi) || [];

  // Extract seat/user limits
  const seatMatches = pageText.match(/(?:up to\s+)?\d+\s+(?:seats?|users?|team members?)/gi) || [];

  return {
    prices: [...new Set(priceMatches.map(p => p.toLowerCase()))].sort(),
    plans: [...new Set(planMatches.map(p => p.toLowerCase().trim()))].sort(),
    seats: [...new Set(seatMatches.map(s => s.toLowerCase().trim()))].sort(),
    scrapedAt: new Date().toISOString()
  };
}

This is intentionally loose — it catches patterns rather than relying on fragile CSS selectors that break on redesigns. Adjust the regex patterns for your specific competitors' pricing vocabulary.
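To sanity-check the extractor, run it over a snippet of pricing-page text. The sample text below is invented; the function is repeated so the snippet runs standalone:

```javascript
// extractPricingStructure as defined above, repeated so this check runs standalone
function extractPricingStructure(pageText) {
  const priceMatches = pageText.match(/\$[\d,]+(?:\/mo(?:nth)?|\/user)?/gi) || [];
  const planMatches = pageText.match(/(?:starter|basic|professional|pro|business|enterprise|free|growth|scale)\s*(?:plan)?/gi) || [];
  const seatMatches = pageText.match(/(?:up to\s+)?\d+\s+(?:seats?|users?|team members?)/gi) || [];

  return {
    prices: [...new Set(priceMatches.map(p => p.toLowerCase()))].sort(),
    plans: [...new Set(planMatches.map(p => p.toLowerCase().trim()))].sort(),
    seats: [...new Set(seatMatches.map(s => s.toLowerCase().trim()))].sort(),
    scrapedAt: new Date().toISOString()
  };
}

// Invented snippet of pricing-page text
const sample = 'Starter plan: $29/mo for up to 5 seats. Pro plan: $79/mo, up to 25 seats.';
const result = extractPricingStructure(sample);
console.log(result.prices); // ['$29/mo', '$79/mo']
console.log(result.plans);  // ['pro plan', 'starter plan']
```

Because the output arrays are deduplicated and sorted, two scrapes of the same page content always produce identical snapshots, which keeps the diff in Step 3 honest.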


Step 3 — Compare Snapshots and Detect Changes

Store each week's snapshot in a simple JSON file or key-value store. Compare against the previous snapshot:

function compareSnapshots(current, previous, competitorName) {
  const changes = [];

  const addedPrices = current.prices.filter(p => !previous.prices.includes(p));
  const removedPrices = previous.prices.filter(p => !current.prices.includes(p));

  if (addedPrices.length) changes.push(`New prices detected: ${addedPrices.join(', ')}`);
  if (removedPrices.length) changes.push(`Prices removed: ${removedPrices.join(', ')}`);

  const addedPlans = current.plans.filter(p => !previous.plans.includes(p));
  const removedPlans = previous.plans.filter(p => !current.plans.includes(p));

  if (addedPlans.length) changes.push(`New plans: ${addedPlans.join(', ')}`);
  if (removedPlans.length) changes.push(`Plans removed: ${removedPlans.join(', ')}`);

  const addedSeats = current.seats.filter(s => !previous.seats.includes(s));
  const removedSeats = previous.seats.filter(s => !current.seats.includes(s));

  if (addedSeats.length || removedSeats.length) {
    changes.push(`Seat/user limits changed`);
  }

  return changes.length > 0 ? { competitor: competitorName, changes } : null;
}
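To see the diff engine in action, feed it two hand-built snapshots. The data below is invented, and the function is repeated so the example runs standalone:

```javascript
// compareSnapshots as defined above, repeated so this example runs standalone
function compareSnapshots(current, previous, competitorName) {
  const changes = [];

  const addedPrices = current.prices.filter(p => !previous.prices.includes(p));
  const removedPrices = previous.prices.filter(p => !current.prices.includes(p));
  if (addedPrices.length) changes.push(`New prices detected: ${addedPrices.join(', ')}`);
  if (removedPrices.length) changes.push(`Prices removed: ${removedPrices.join(', ')}`);

  const addedPlans = current.plans.filter(p => !previous.plans.includes(p));
  const removedPlans = previous.plans.filter(p => !current.plans.includes(p));
  if (addedPlans.length) changes.push(`New plans: ${addedPlans.join(', ')}`);
  if (removedPlans.length) changes.push(`Plans removed: ${removedPlans.join(', ')}`);

  const addedSeats = current.seats.filter(s => !previous.seats.includes(s));
  const removedSeats = previous.seats.filter(s => !current.seats.includes(s));
  if (addedSeats.length || removedSeats.length) changes.push(`Seat/user limits changed`);

  return changes.length > 0 ? { competitor: competitorName, changes } : null;
}

// Invented snapshots: the competitor dropped the free tier, raised the top
// price, and bumped the seat limit
const lastWeek = {
  prices: ['$0/mo', '$29/mo', '$79/mo'],
  plans: ['free', 'pro plan', 'starter plan'],
  seats: ['up to 5 seats']
};
const thisWeek = {
  prices: ['$29/mo', '$99/mo'],
  plans: ['pro plan', 'starter plan'],
  seats: ['up to 10 seats']
};

const alert = compareSnapshots(thisWeek, lastWeek, 'competitor-a');
console.log(alert.changes);
// ['New prices detected: $99/mo', 'Prices removed: $0/mo, $79/mo',
//  'Plans removed: free', 'Seat/user limits changed']
```

Identical snapshots return `null`, so weeks with no pricing movement generate no alert at all.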

Step 4 — Send the Email Alert

Use Nodemailer with any SMTP provider (Gmail, SendGrid, Mailgun) to send an alert when changes are detected:

const nodemailer = require('nodemailer');

async function sendPricingAlert(alerts) {
  if (!alerts.length) return;

  const transporter = nodemailer.createTransport({
    service: 'gmail', // with Gmail, use an app password rather than the account password
    auth: { user: process.env.ALERT_EMAIL, pass: process.env.ALERT_PASSWORD }
  });

  const body = alerts.map(a =>
    `**${a.competitor}**\n${a.changes.map(c => `  - ${c}`).join('\n')}`
  ).join('\n\n');

  await transporter.sendMail({
    from: process.env.ALERT_EMAIL,
    to: process.env.NOTIFY_EMAIL,
    subject: `⚠️ Competitor pricing change detected — ${new Date().toDateString()}`,
    text: `Pricing changes detected this week:\n\n${body}\n\nRun your weekly competitor pricing review.`
  });
}

For Slack, replace sendMail with a fetch call to your Slack webhook URL and format the body as Slack blocks.
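A minimal Slack variant might look like this. `SLACK_WEBHOOK_URL` is an assumed env var pointing at an incoming webhook you've created, and the global `fetch` requires Node 18+; this uses a plain mrkdwn `text` payload rather than full Slack blocks:

```javascript
// Format the alerts as Slack mrkdwn and post them to an incoming webhook.
function formatSlackPayload(alerts) {
  const sections = alerts.map(a =>
    `*${a.competitor}*\n${a.changes.map(c => `• ${c}`).join('\n')}`
  );
  return { text: `:warning: Competitor pricing changes\n\n${sections.join('\n\n')}` };
}

async function sendSlackAlert(alerts) {
  if (!alerts.length) return;
  await fetch(process.env.SLACK_WEBHOOK_URL, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(formatSlackPayload(alerts))
  });
}
```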


Scheduling It Weekly

In Apify, you can schedule the entire pipeline as an Actor task with a cron expression:

0 9 * * 1

This triggers every Monday at 9am. Apify's free tier includes scheduled runs — monitoring 5 pricing pages weekly fits comfortably within the free compute allocation.
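Gluing the steps together, the scheduled run is a short loop per competitor. This sketch injects the step functions as parameters purely to keep the flow decoupled from Apify, disk, and SMTP; in your actor you'd wire in the functions from the sections above (adapting `scrapePricingPages`, which returns dataset items rather than raw text):

```javascript
// One weekly run: scrape, extract, diff against last week, persist, notify.
// Step functions are injected; wire in the functions defined earlier.
async function runWeeklyCheck(competitors, { scrape, extract, loadPrevious, save, notify, compare }) {
  const alerts = [];

  for (const { name, url } of competitors) {
    const pageText = await scrape(url);        // Step 1: crawl the pricing page
    const current = extract(pageText);         // Step 2: structured extraction
    const previous = await loadPrevious(name); // Step 3: last week's snapshot

    if (previous) {
      const diff = compare(current, previous, name);
      if (diff) alerts.push(diff);
    }
    await save(name, current);                 // always persist this week's snapshot
  }

  if (alerts.length) await notify(alerts);     // Step 4: email or Slack
  return alerts;
}
```

Note that the new snapshot is saved even when nothing changed, so next week's diff always compares against the most recent scrape.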


Cost vs. Commercial Alternatives

| Option | Monthly cost | Coverage |
| --- | --- | --- |
| This setup (Apify free tier) | $0 | Your specific competitor list |
| Kompyte / Crayon / Klue | $300–$800/mo | Broad but generic |
| Manual spot-checks | Free but unreliable | Whatever someone remembers |

Commercial pricing intelligence platforms do more — they track G2 reviews, news mentions, and job postings. But if your primary need is "alert me when a competitor changes their pricing page," this setup covers it completely at zero marginal cost.


What to Do When You Get the Alert

The alert is a signal, not a verdict. When you receive a notification:

  1. Visit the page manually — confirm the change is real, not a scraper artifact
  2. Screenshot the new structure — for your records and battlecard updates
  3. Brief your sales team within 24 hours — they'll face objections in the next call
  4. Update your competitive battlecard — pricing comparisons go stale fast
  5. Flag to product — pricing structure changes often signal strategic pivots

The scraper buys you the lead time. What you do with that time is the actual competitive advantage.


Pricing pages are one of the most strategically important pages a competitor publishes — and one of the most overlooked signals in competitive intelligence. A 30-minute setup running weekly on Apify's free tier means you're never the last to know.
