
Sen Shi
Building a Lightweight Website Monitor with Cron and an API

You can build a working website change monitor in under 100 lines of Node.js. No database, no queue, no container orchestration. Just cron, a screenshot API, pixelmatch, and a notification hook. The whole thing runs on a $5/mo VPS or a free GitHub Actions workflow.

Here's the full build, from first line to deployment.

The Stack

| Component | Tool | Cost |
| --- | --- | --- |
| Scheduler | System cron or node-cron | Free |
| Screenshot capture | Screenshot API (GET request) | Free tier / $9/mo |
| Image comparison | pixelmatch | Free (npm) |
| Notification | Slack webhook or email (nodemailer) | Free |
| Storage | Local filesystem | Free |

Total infrastructure cost: $0-5/month. Screenshot API cost depends on how many URLs you monitor and how often. Twenty URLs captured once daily is 600 screenshots per month, which fits comfortably in the free tier of most providers.
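The capture-count arithmetic is simple enough to sketch (`monthlyCaptures` is a hypothetical helper for estimating, not part of the monitor itself):

```javascript
// Estimate monthly screenshot volume for a monitoring setup.
// capturesPerDay is per URL; 30 approximates a month.
function monthlyCaptures(urlCount, capturesPerDay) {
  return urlCount * capturesPerDay * 30;
}

console.log(monthlyCaptures(20, 1)); // 20 URLs, once daily -> 600
```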

Prerequisites

mkdir website-monitor && cd website-monitor
npm init -y
npm pkg set type=module   # the script below uses ES module imports
npm install pngjs pixelmatch nodemailer

That's three dependencies. pngjs parses PNG buffers. pixelmatch does pixel-level comparison. nodemailer sends email alerts (swap this for a Slack webhook fetch if you prefer).

The Complete Script

This is the entire monitor. Under 100 lines of actual logic.

// monitor.js
import fs from "fs/promises";
import path from "path";
import { PNG } from "pngjs";
import pixelmatch from "pixelmatch";
import nodemailer from "nodemailer";

// ─── Config ──────────────────────────────────────────────
const SCREENSHOT_API = "<your API of choice>";
const API_KEY = process.env.SCREENSHOT_API_KEY;
const STORAGE = process.env.STORAGE_DIR || "./captures";
const DIFF_THRESHOLD = 0.005; // 0.5% of pixels must differ

const URLS = [
  { name: "competitor-pricing", url: "https://competitor.com/pricing" },
  { name: "partner-terms", url: "https://partner.com/terms" },
  { name: "our-homepage", url: "https://our-app.com" },
];

const EMAIL = {
  from: "monitor@yourdomain.com",
  to: "you@yourdomain.com",
  smtp: {
    host: process.env.SMTP_HOST || "smtp.mailgun.org",
    port: 587,
    auth: {
      user: process.env.SMTP_USER,
      pass: process.env.SMTP_PASS,
    },
  },
};

// ─── Capture ─────────────────────────────────────────────
async function capture(url) {
  const params = new URLSearchParams({
    url,
    format: "png",
    width: "1280",
    height: "800",
    full_page: "true",
    block_ads: "true",
    block_cookie_banners: "true",
  });

  const res = await fetch(`${SCREENSHOT_API}?${params}`, {
    headers: { "X-API-Key": API_KEY },
  });

  if (!res.ok) throw new Error(`API ${res.status}: ${res.statusText}`);
  return Buffer.from(await res.arrayBuffer());
}

// ─── Compare ─────────────────────────────────────────────
function compare(bufA, bufB) {
  const a = PNG.sync.read(bufA);
  const b = PNG.sync.read(bufB);

  if (a.width !== b.width || a.height !== b.height) {
    return { changed: true, pct: 100 };
  }

  const { width, height } = a;
  const diff = new PNG({ width, height });
  const badPixels = pixelmatch(
    a.data, b.data, diff.data,
    width, height,
    { threshold: 0.1 }
  );

  const pct = (badPixels / (width * height)) * 100;
  return { changed: pct / 100 > DIFF_THRESHOLD, pct, diffBuf: PNG.sync.write(diff) };
}

// ─── Alert ───────────────────────────────────────────────
async function alert(site, pct, diffBuf) {
  const transport = nodemailer.createTransport(EMAIL.smtp);
  await transport.sendMail({
    from: EMAIL.from,
    to: EMAIL.to,
    subject: `[Monitor] ${site.name} changed (${pct.toFixed(2)}%)`,
    text: `${site.url} has changed by ${pct.toFixed(2)}%.\nDiff image attached.`,
    attachments: diffBuf
      ? [{ filename: "diff.png", content: diffBuf }]
      : [],
  });
}

// ─── Main loop ───────────────────────────────────────────
async function run() {
  for (const site of URLS) {
    const dir = path.join(STORAGE, site.name);
    await fs.mkdir(dir, { recursive: true });

    const latestPath = path.join(dir, "latest.png");
    const prevPath = path.join(dir, "previous.png");

    try {
      const screenshot = await capture(site.url);

      // Load previous capture if it exists
      let prev = null;
      try {
        prev = await fs.readFile(latestPath);
      } catch {
        // First run, no previous file
      }

      if (prev) {
        const result = compare(prev, screenshot);
        const status = result.changed ? "CHANGED" : "ok";
        console.log(`${site.name}: ${result.pct.toFixed(2)}% diff [${status}]`);

        if (result.changed) {
          await alert(site, result.pct, result.diffBuf);
          // Archive the previous version
          const ts = new Date().toISOString().replace(/[:.]/g, "-");
          await fs.copyFile(latestPath, path.join(dir, `${ts}.png`));
        }
      } else {
        console.log(`${site.name}: first capture`);
      }

      // Rotate: current latest becomes previous, new capture becomes latest
      try {
        await fs.copyFile(latestPath, prevPath);
      } catch {
        // No latest yet
      }
      await fs.writeFile(latestPath, screenshot);
    } catch (err) {
      console.error(`${site.name}: ${err.message}`);
    }
  }
}

run();

What Each Step Does

  1. Capture: sends a GET request to the screenshot API with the target URL. Returns a PNG buffer. Full-page capture with ad blocking and cookie banner removal keeps the captures clean.

  2. Load previous: reads latest.png from the site's directory. On first run this file doesn't exist, so we skip comparison and just save.

  3. Compare: pixelmatch walks both images pixel by pixel. The threshold: 0.1 option handles anti-aliasing and sub-pixel rendering differences. We calculate the percentage of pixels that differ.

  4. Alert: if the diff exceeds 0.5%, we send an email with the diff image attached. The diff image highlights changed pixels in red, making it easy to spot what moved.

  5. Save: the new screenshot becomes latest.png. If a change was detected, we also archive the old version with a timestamp so you have a history of what changed and when.
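To make the 0.5% threshold concrete, here's the pixel math for the 1280×800 viewport configured above (a standalone sketch):

```javascript
// How many pixels must differ before DIFF_THRESHOLD (0.5%) trips,
// at the capture dimensions used in the main script.
const width = 1280;
const height = 800;
const DIFF_THRESHOLD = 0.005;

const totalPixels = width * height;                // 1,024,000
const pixelsToTrip = totalPixels * DIFF_THRESHOLD; // 5,120

console.log(`${pixelsToTrip} of ${totalPixels} pixels`);
```

Note that with `full_page: "true"` the actual capture is usually taller than 800px, so the real pixel count (and trip point) will be higher.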

No database needed. The filesystem is your storage. Each monitored URL gets a directory, each directory holds latest.png, previous.png, and any archived snapshots from detected changes.
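Since archived snapshots accumulate indefinitely, you may eventually want to prune old ones. A minimal sketch (`pruneArchives` is a hypothetical helper, assuming the timestamped filenames the script above produces):

```javascript
import fs from "fs/promises";
import path from "path";

// Delete archived snapshots older than maxAgeDays. Archive names look like
// 2024-01-15T09-00-00-000Z.png (ISO timestamp with ":" and "." replaced by "-").
async function pruneArchives(dir, maxAgeDays) {
  const cutoff = Date.now() - maxAgeDays * 24 * 60 * 60 * 1000;
  const pattern = /^(\d{4}-\d{2}-\d{2})T(\d{2})-(\d{2})-(\d{2})-(\d{3})Z\.png$/;

  for (const file of await fs.readdir(dir)) {
    const m = file.match(pattern);
    if (!m) continue; // leaves latest.png and previous.png alone
    const ts = Date.parse(`${m[1]}T${m[2]}:${m[3]}:${m[4]}.${m[5]}Z`);
    if (ts < cutoff) await fs.unlink(path.join(dir, file));
  }
}
```

Call it per site directory at the end of `run()` to cap disk usage.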

Swapping Email for Slack

If your team lives in Slack, replace the alert function:

async function alert(site, pct) {
  const webhook = process.env.SLACK_WEBHOOK_URL;
  await fetch(webhook, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      blocks: [
        {
          type: "section",
          text: {
            type: "mrkdwn",
            text: `*Website change detected*\n<${site.url}|${site.name}>: ${pct.toFixed(2)}% different`,
          },
        },
      ],
    }),
  });
}

For Slack image attachments, you'd need to upload the diff image to a public URL (S3, R2, or your own server) and include it in the block. For most cases, the text notification is enough to prompt someone to go look.
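If you do host the diff somewhere public, the payload just gains an image block. A sketch (`buildSlackPayload` is a hypothetical helper; `imageUrl` is whatever public URL your upload step produced):

```javascript
// Build a Slack Block Kit payload, with an optional image block
// when a public diff-image URL is available.
function buildSlackPayload(site, pct, imageUrl) {
  const blocks = [
    {
      type: "section",
      text: {
        type: "mrkdwn",
        text: `*Website change detected*\n<${site.url}|${site.name}>: ${pct.toFixed(2)}% different`,
      },
    },
  ];
  if (imageUrl) {
    blocks.push({
      type: "image",
      image_url: imageUrl,
      alt_text: `Visual diff for ${site.name}`,
    });
  }
  return { blocks };
}
```

The returned object is what you'd `JSON.stringify` into the webhook body.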

Deployment Options

Option 1: $5/mo VPS

Any cheap VPS (Hetzner, DigitalOcean, Vultr) works. Install Node.js, clone your monitor, set up cron.

# Install
ssh root@your-vps
apt update && apt install -y nodejs npm  # needs Node 18+ for built-in fetch; use NodeSource if apt's version is older
git clone https://github.com/you/website-monitor.git /opt/monitor
cd /opt/monitor && npm install

# Environment
cat > /opt/monitor/.env << 'EOF'
SCREENSHOT_API_KEY=your_key_here
SMTP_HOST=smtp.mailgun.org
SMTP_USER=postmaster@yourdomain.com
SMTP_PASS=your_smtp_password
STORAGE_DIR=/opt/monitor/captures
EOF

# Cron: run daily at 9 AM UTC
crontab -e
# Add:
0 9 * * * cd /opt/monitor && env $(cat .env | xargs) node monitor.js >> /var/log/monitor.log 2>&1

Option 2: Raspberry Pi

Same setup as the VPS but runs on your local network. Good if you want to avoid monthly hosting costs and don't mind the Pi being on 24/7. Use node-cron instead of system cron for easier management:

// schedule.js (npm install node-cron)
// Note: monitor.js must export run — replace its trailing `run();`
// with `export { run };` so it doesn't fire on import.
import cron from "node-cron";
import { run } from "./monitor.js";

cron.schedule("0 9 * * *", () => {
  console.log(`[${new Date().toISOString()}] Running check...`);
  run().catch(console.error);
});

console.log("Monitor started. Checking daily at 09:00.");
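One thing worth guarding against on a Pi with a slow connection: if a check runs longer than the schedule interval, runs can overlap and race on the capture files. A small wrapper (`noOverlap` is a hypothetical helper) skips a tick while the previous one is still in flight:

```javascript
// Wrap an async job so overlapping invocations are skipped
// instead of running concurrently.
function noOverlap(job) {
  let running = false;
  return async () => {
    if (running) {
      console.log("Previous run still in progress, skipping.");
      return;
    }
    running = true;
    try {
      await job();
    } finally {
      running = false;
    }
  };
}

// Usage with node-cron:
// cron.schedule("0 9 * * *", noOverlap(run));
```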

Option 3: GitHub Actions (Free)

This is the zero-infrastructure option. No server to maintain. GitHub gives you 2,000 minutes/month on the free plan, and each monitor run takes under a minute.

# .github/workflows/monitor.yml
name: Website Monitor

on:
  schedule:
    - cron: "0 9 * * *"  # Daily at 9 AM UTC
  workflow_dispatch:       # Manual trigger

jobs:
  monitor:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - uses: actions/setup-node@v4
        with:
          node-version: 20

      - run: npm ci

      - name: Download previous captures
        uses: actions/download-artifact@v4
        with:
          name: captures
          path: ./captures
        continue-on-error: true  # First run has no artifact

      - name: Run monitor
        env:
          SCREENSHOT_API_KEY: ${{ secrets.SCREENSHOT_API_KEY }}
          SLACK_WEBHOOK_URL: ${{ secrets.SLACK_WEBHOOK_URL }}
        run: node monitor.js

      - name: Upload captures
        uses: actions/upload-artifact@v4
        with:
          name: captures
          path: ./captures
          retention-days: 30

The trick is using GitHub Actions artifacts to persist screenshots between runs: download-artifact grabs the previous captures, the script runs and compares, then upload-artifact saves the new state. One caveat: actions/download-artifact@v4 only sees artifacts uploaded by the same workflow run by default, so to pull the previous run's captures you need to pass the prior `run-id` together with a `github-token`, or swap the artifact steps for actions/cache.

Limitation: artifacts are retained for a maximum of 90 days, and storing large numbers of screenshots will eat into your storage quota. For 20 URLs, this is fine. For 200, use a VPS.

Dealing with Noise

The first time you run website change monitoring with screenshots, you'll get false positives. Dynamic content like timestamps, ad rotations, and chat widgets cause small differences on every capture.

Three fixes:

1. Hide noisy elements. Add hide_selectors to your API call to remove dynamic content before capture:

const params = new URLSearchParams({
  url,
  format: "png",
  width: "1280",
  full_page: "true",
  block_ads: "true",
  block_cookie_banners: "true",
  hide_selectors: ".chat-widget,.timestamp,.ad-slot,.visitor-count",
});

2. Raise the threshold for noisy pages. Instead of a global 0.5% threshold, use per-URL thresholds:

const URLS = [
  { name: "legal-page", url: "https://example.com/terms", threshold: 0.001 },
  { name: "news-site", url: "https://news.example.com", threshold: 0.03 },
  { name: "pricing", url: "https://competitor.com/pricing", threshold: 0.005 },
];

Then in the comparison: result.pct / 100 > site.threshold.
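Concretely, the change test becomes a function of the site's own threshold (a sketch; `exceedsThreshold` is a hypothetical helper, with the global 0.5% as fallback for sites that don't set one):

```javascript
const DIFF_THRESHOLD = 0.005; // global fallback, as in the main script

// pct is the percentage of differing pixels (0-100);
// threshold is a fraction (0-1), matching the URLS entries above.
function exceedsThreshold(pct, threshold = DIFF_THRESHOLD) {
  return pct / 100 > threshold;
}

// In run(), result.changed becomes:
//   exceedsThreshold(result.pct, site.threshold)
```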

3. Use a file size pre-check. If the PNG file size barely changed, skip the expensive pixel comparison. PNG is compressed, so similar sizes don't guarantee similar pixels; treat this as a coarse speed-for-accuracy trade:

function quickCheck(bufA, bufB) {
  const sizeDiff = Math.abs(bufA.length - bufB.length) / bufA.length;
  return sizeDiff > 0.05; // Only run pixelmatch if size changed > 5%
}
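Wiring the pre-check into the pipeline might look like this (a sketch; `maybeCompare` is a hypothetical wrapper that takes the full `compare` function as a parameter so the pixel walk only runs when the size heuristic fires):

```javascript
// Cheap size heuristic, as above.
function quickCheck(bufA, bufB) {
  const sizeDiff = Math.abs(bufA.length - bufB.length) / bufA.length;
  return sizeDiff > 0.05;
}

// Only fall through to the expensive pixel diff when quickCheck fires.
function maybeCompare(prev, next, fullCompare) {
  if (!quickCheck(prev, next)) {
    // Sizes are close; assume unchanged without a pixel walk.
    return { changed: false, pct: 0, skipped: true };
  }
  return { ...fullCompare(prev, next), skipped: false };
}
```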

Monitoring More URLs Without More Code

The script handles any number of URLs. Just add them to the URLS array. To load targets from a file instead of hardcoding them:

// urls.json
[
  { "name": "competitor-a", "url": "https://a.com/pricing", "threshold": 0.005 },
  { "name": "competitor-b", "url": "https://b.com/pricing", "threshold": 0.005 },
  { "name": "our-terms", "url": "https://our.com/terms", "threshold": 0.001 }
]
const URLS = JSON.parse(
  await fs.readFile("./urls.json", "utf-8")
);

Now adding a new monitored URL is a one-line JSON edit. No code changes, no redeployment.
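Since the targets file is now hand-edited, a quick validation pass at startup catches typos before they waste API credits. A sketch (`validateUrls` is a hypothetical helper):

```javascript
// Validate loaded URL entries; throws with a useful message on bad input.
function validateUrls(urls) {
  if (!Array.isArray(urls)) throw new Error("urls.json must be an array");
  for (const [i, u] of urls.entries()) {
    if (!u.name || typeof u.name !== "string")
      throw new Error(`entry ${i}: missing "name"`);
    try {
      new URL(u.url); // throws on malformed URLs
    } catch {
      throw new Error(`entry ${i} (${u.name}): invalid "url"`);
    }
    if (u.threshold !== undefined && !(u.threshold >= 0 && u.threshold <= 1))
      throw new Error(`entry ${i} (${u.name}): threshold must be 0-1`);
  }
  return urls;
}
```

Wrap the `JSON.parse` call with it: `const URLS = validateUrls(JSON.parse(...))`.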

Cost Breakdown

| Monitoring Setup | URLs | Frequency | Monthly Captures | API Cost | Infra Cost |
| --- | --- | --- | --- | --- | --- |
| Minimal | 5 | Daily | 150 | Free | $0 (GitHub Actions) |
| Typical | 20 | Daily | 600 | Free | $0 (GitHub Actions) |
| Active | 20 | 4x daily | 2,400 | ~$9/mo | $0 (GitHub Actions) |
| Heavy | 50 | Hourly | 36,000 | ~$29/mo | $5/mo VPS |

For most indie dev use cases, 20 URLs monitored daily lands you at 600 screenshots per month. That's well within the free tier of most screenshot APIs, or barely into a $9/mo starter plan. Website change monitoring with screenshots doesn't have to cost anything meaningful.

Extending the Monitor

Once the base monitor works, useful additions include:

Screenshot history viewer. A simple Express server that lists archived screenshots by URL and date. Ten lines of code to serve the captures/ directory with a file listing.

Change log. Append a line to a CSV or JSON file whenever a change is detected. Over time, this gives you a timeline of when each monitored page changed.

async function logChange(site, pct) {
  const entry = JSON.stringify({
    site: site.name,
    url: site.url,
    diff: pct,
    time: new Date().toISOString(),
  });
  await fs.appendFile("./change-log.jsonl", entry + "\n");
}
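Once the log has some history, a few more lines can summarize it, e.g. counting detected changes per site (`summarizeLog` is a hypothetical helper that works on the JSONL format written above):

```javascript
// Count logged changes per site from change-log.jsonl content.
function summarizeLog(jsonl) {
  const counts = {};
  for (const line of jsonl.split("\n")) {
    if (!line.trim()) continue; // skip blank trailing lines
    const { site } = JSON.parse(line);
    counts[site] = (counts[site] || 0) + 1;
  }
  return counts;
}
```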

Multiple notification channels. Alert via email for critical pages, Slack for informational ones, and a webhook for integration with your internal tools.
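Routing by severity can stay just as small: tag each URL entry with a channel and dispatch in one place. A sketch (`notify` is a hypothetical dispatcher; the injected handlers would wrap the email and Slack code shown earlier):

```javascript
// Dispatch an alert to the channel configured per site.
// Handlers are injected so this stays testable and transport-agnostic.
async function notify(site, pct, handlers) {
  const channel = site.channel || "slack"; // default channel is an assumption
  const handler = handlers[channel];
  if (!handler) throw new Error(`no handler for channel "${channel}"`);
  await handler(site, pct);
}
```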

The core stays under 100 lines. Each extension adds maybe 10-20 lines. That's the benefit of building your own website change monitoring with screenshots instead of paying for a SaaS tool: you control every part of the pipeline and can add exactly what you need.
