DEV Community

Olamide Olaniyan

How I'd Build a Competitor Pricing Change Detector From Ads and Landing Pages

Competitors almost never announce pricing changes cleanly.

What usually happens is messier.

An ad that used to say "book a demo" now says "start free." A landing page introduces team pricing. A comparison page suddenly starts talking about replacing three tools. A savings claim appears where there used to be none.

By the time sales hears about it in calls, the move is already live.

That is why I like building pricing detection from public signals instead of waiting for market gossip.

This post is the workflow I would use: watch ad-library message shifts, pull the landing pages behind those ads, extract the parts that matter, and diff them over time.

It is a much better system than casually checking a pricing page once a month and pretending that counts as monitoring.

Why Ads Matter Here

When a company changes pricing, it usually changes the framing around the offer too.

That shows up as:

  • explicit price language
  • a new CTA like "start free"
  • annual savings messaging
  • bundles or consolidation language
  • more aggressive ROI framing

So ad libraries are not just creative research. They are early pricing signals.

That is what makes this workflow useful.
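Before any API wiring, those phrasings can be bucketed into rough signal categories. Here is a minimal sketch; the category names and phrase lists are my own assumptions, so tune them to the competitors you actually track:

```python
import re

# Hypothetical buckets for pricing-related ad language.
SIGNAL_CATEGORIES = {
    'price_language': [r'pricing', r'per month', r'/month', r'\$\d+'],
    'cta_shift': [r'start free', r'try free', r'get started'],
    'savings': [r'save', r'annual', r'% off'],
    'bundling': [r'bundle', r'all.in.one', r'replace \w+ tools'],
    'roi': [r'roi', r'payback', r'pays for itself'],
}


def classify_signal(text):
    """Return every category whose phrases appear in the ad text."""
    lowered = text.lower()
    return [
        category
        for category, patterns in SIGNAL_CATEGORIES.items()
        if any(re.search(pattern, lowered) for pattern in patterns)
    ]
```

Tagging each flagged ad with its categories makes the later diff step more readable: a `cta_shift` plus `savings` combination is a stronger pricing tell than either alone.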

The Two Layers I Actually Need

I want two layers working together.

Layer 1: Ad monitoring

Catch pricing-related message shifts as soon as they show up in paid campaigns.

Layer 2: Landing-page monitoring

Confirm whether the actual page now reflects new pricing, packaging, or CTA structure.

If you only look at the landing page, you are often late.

If you only look at the ads, you miss the underlying offer detail.

You need both.

JavaScript Version: Detect Pricing Signals From Public Ads

This version pulls Facebook and LinkedIn ad-library data, then flags ads that carry likely pricing signals.

const headers = {
  'X-API-Key': process.env.SOCIAVAULT_API_KEY,
};

async function fetchJson(url) {
  const response = await fetch(url, { headers });
  if (!response.ok) {
    throw new Error(`Request failed with ${response.status}`);
  }
  return response.json();
}

// Flatten the per-platform response shapes into one common ad shape.
function normalizeAds(items = []) {
  return (items || []).map(item => ({
    headline: item.headline || item.title || item.snapshot?.title || '',
    body: item.body || item.text || item.snapshot?.body?.markup || '',
    cta: item.cta || item.call_to_action || item.snapshot?.cta_text || '',
    url: item.url || item.landingPageUrl || item.snapshot?.link_url || '',
  }));
}

// Rough keyword screen for pricing-relevant language in ad copy.
function isPricingSignal(ad) {
  const text = `${ad.headline} ${ad.body} ${ad.cta}`.toLowerCase();
  return /(pricing|save|annual|per month|\/month|start free|bundle|replace|book a demo|talk to sales)/.test(text);
}

function summarizeSignals(ads) {
  return ads
    .filter(isPricingSignal)
    .map(ad => ({
      headline: ad.headline,
      cta: ad.cta,
      url: ad.url,
    }));
}

async function collectSignals(company) {
  const [facebookJson, linkedinJson] = await Promise.all([
    fetchJson(
      `https://api.sociavault.com/v1/scrape/facebook-ad-library/company-ads?companyName=${encodeURIComponent(company)}&status=ACTIVE&trim=true`
    ),
    fetchJson(
      `https://api.sociavault.com/v1/scrape/linkedin-ad-library/search?company=${encodeURIComponent(company)}`
    ),
  ]);

  return {
    facebook: summarizeSignals(normalizeAds(facebookJson.data)),
    linkedin: summarizeSignals(normalizeAds(linkedinJson.data)),
  };
}

collectSignals('HubSpot')
  .then(signals => console.log(signals))
  .catch(error => console.error(error));

That gets me the first layer: which ads currently look pricing-relevant.

If you want a public ad-data layer without wiring every platform separately, SociaVault is the piece I would use for that.

Python Version: Add Landing Page Extraction

Once I have the ad URLs, I want to inspect the pages too.

import os
import re
import requests


HEADERS = {'X-API-Key': os.environ['SOCIAVAULT_API_KEY']}


def fetch_json(url):
    response = requests.get(url, headers=HEADERS, timeout=30)
    response.raise_for_status()
    return response.json()


def normalize_ads(items=None):
    items = items or []
    normalized = []
    for item in items:
        normalized.append({
            'headline': item.get('headline') or item.get('title') or item.get('snapshot', {}).get('title', ''),
            'body': item.get('body') or item.get('text') or item.get('snapshot', {}).get('body', {}).get('markup', ''),
            'cta': item.get('cta') or item.get('call_to_action') or item.get('snapshot', {}).get('cta_text', ''),
            'url': item.get('url') or item.get('landingPageUrl') or item.get('snapshot', {}).get('link_url', ''),
        })
    return normalized


def is_pricing_signal(ad):
    text = f"{ad['headline']} {ad['body']} {ad['cta']}".lower()
    return any(term in text for term in ['pricing', 'save', 'annual', 'per month', '/month', 'start free', 'bundle', 'replace', 'book a demo', 'talk to sales'])


def fetch_html(url):
    response = requests.get(
        url,
        headers={'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36'},
        timeout=30,
    )
    response.raise_for_status()
    return response.text


def extract(html, pattern):
    match = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
    return re.sub(r'\s+', ' ', match.group(1)).strip() if match else None


def parse_page(html, url):
    return {
        'url': url,
        'title': extract(html, r'<title>(.*?)</title>'),
        'h1': extract(html, r'<h1[^>]*>(.*?)</h1>'),
        'cta': extract(html, r'<a[^>]*>(Start free|Book a demo|Talk to sales|Try free|Get started)</a>'),
    }


def collect_pricing_detector(company):
    # URL-encode the company name, matching the JS version's encodeURIComponent.
    encoded = requests.utils.quote(company)
    facebook = fetch_json(
        f'https://api.sociavault.com/v1/scrape/facebook-ad-library/company-ads?companyName={encoded}&status=ACTIVE&trim=true'
    )
    linkedin = fetch_json(
        f'https://api.sociavault.com/v1/scrape/linkedin-ad-library/search?company={encoded}'
    )

    ads = normalize_ads(facebook.get('data')) + normalize_ads(linkedin.get('data'))
    candidate_ads = [ad for ad in ads if is_pricing_signal(ad) and ad.get('url')]

    pages = []
    for ad in candidate_ads[:10]:
        try:
            html = fetch_html(ad['url'])
            pages.append(parse_page(html, ad['url']))
        except Exception as error:
            print(f'Failed to fetch {ad["url"]}: {error}')

    return {
        'signals': candidate_ads,
        'pages': pages,
    }


print(collect_pricing_detector('HubSpot'))

That gives me both layers:

  • the pricing-related ads
  • the current landing pages behind them

Which is exactly what I need to spot changes earlier.
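The "over time" part is the piece neither block above shows: persist each run and diff it against the previous one. A minimal sketch, assuming a local JSON file as the store and pages keyed by URL (the file name and field choices are mine):

```python
import json
from pathlib import Path

SNAPSHOT_FILE = Path('snapshots.json')  # hypothetical local store


def load_previous():
    """Load the last run's pages, or an empty dict on first run."""
    if SNAPSHOT_FILE.exists():
        return json.loads(SNAPSHOT_FILE.read_text())
    return {}


def save_snapshot(pages):
    SNAPSHOT_FILE.write_text(json.dumps(pages, indent=2))


def diff_pages(previous, current):
    """Compare pages keyed by URL and report new pages and changed fields."""
    changes = []
    for url, page in current.items():
        old = previous.get(url)
        if old is None:
            changes.append({'url': url, 'change': 'new page'})
            continue
        for field in ('title', 'h1', 'cta'):
            if old.get(field) != page.get(field):
                changes.append({
                    'url': url,
                    'change': f'{field}: {old.get(field)!r} -> {page.get(field)!r}',
                })
    return changes
```

Run it on a schedule, and a CTA flipping from "Book a demo" to "Start free" surfaces as a one-line change instead of something sales hears about weeks later.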

What I Watch For First

The most useful signs are usually:

  • CTA shift from demo to self-serve
  • explicit price language appearing in ads
  • bundling or consolidation language on the page
  • hero headline changing from feature-led to savings-led
  • comparison page rollout tied to active ad campaigns

Those often tell you more than a formal pricing page refresh.
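The first of those, the demo-to-self-serve shift, is the one I would wire up as an explicit check once two snapshots exist. A sketch, with my own guesses at the two phrase sets:

```python
# Assumed phrase sets; extend with whatever CTA wording your competitors use.
DEMO_CTAS = {'book a demo', 'talk to sales', 'request a demo'}
SELF_SERVE_CTAS = {'start free', 'try free', 'get started'}


def is_demo_to_self_serve_shift(old_cta, new_cta):
    """Flag a CTA moving from sales-led to self-serve, which often
    accompanies a pricing or packaging change."""
    old = (old_cta or '').strip().lower()
    new = (new_cta or '').strip().lower()
    return old in DEMO_CTAS and new in SELF_SERVE_CTAS
```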

Honest Alternatives

There are a few other approaches.

Manual competitor review

Fine for one or two companies once a quarter.

Usually too inconsistent to rely on.

Pure landing-page monitoring

Helpful, but often later than the ad-side signal.

Full site crawler

Powerful, but more breadth than you need if the real interest is active paid traffic.

That is why I like starting from ad-linked pages.

They are higher signal.

Final Take

Competitor pricing changes usually show up as public clues before they show up as clean announcements.

That is why I like building a detector around ad-library signals plus landing-page diffs.

It is faster, more repeatable, and much less dependent on memory.

If you want to build that workflow without stitching together the public ad layer by hand, SociaVault is a good place to start.

Then keep the rest straightforward: collect the ads, identify pricing signals, fetch the landing pages, diff the language.

That alone gives you a very useful early-warning system.

#webdev #python #javascript #marketing #competitiveanalysis
