Custodia-Admin

Posted on • Originally published at pagebolt.dev

Website Monitoring: Automatic Screenshots Every Hour (Detect Changes & Prove Performance)

Imagine this: your website goes down at 3 AM. By the time your on-call engineer notices, it's been offline for 45 minutes. When the incident post-mortem happens, your CEO asks: "Where's the proof of the outage? How long was it actually down?"

You have logs. But logs aren't proof — they're just text. What you need are screenshots — visual evidence of what your website looked like at 2:15 AM, 3:00 AM, and 3:45 AM.

Or worse: you deploy a CSS change on Friday. By Monday, you have 10 angry support tickets about "the site looks broken now" — but it looks fine on your machine. You need a side-by-side visual comparison: Friday 5 PM vs. Monday 9 AM.

Here's how to set up automatic website monitoring with hourly screenshots — no infrastructure, no headaches.

The Problem: Text Logs Aren't Proof

Right now, your monitoring setup probably includes:

  • Uptime monitoring (Pingdom, StatusPage) — just HTTP 200/500 codes
  • Performance monitoring (Datadog) — metrics only
  • Error tracking (Sentry) — stack traces

But none of these answer: What did the user actually see?

Text logs miss the visual story. Screenshots prove it.

The Solution: Scheduled Screenshots + Storage

Here's a simple architecture:

  1. Node.js runs a cron job every hour
  2. Takes a screenshot of your production site
  3. Stores it in cloud storage (S3, Cloudflare R2, or even a GitHub repo)
  4. Generates a timestamped archive

A minimal version, using node-fetch and a setInterval loop:

const fetch = require('node-fetch');
const fs = require('fs');
const path = require('path');

async function scheduleHourlyScreenshots() {
  // Run every hour
  setInterval(async () => {
    const timestamp = new Date().toISOString();
    console.log(`[${timestamp}] Taking screenshot...`);

    try {
      const response = await fetch('https://api.pagebolt.com/v1/screenshot', {
        method: 'POST',
        headers: {
          'Authorization': `Bearer ${process.env.PAGEBOLT_API_KEY}`,
          'Content-Type': 'application/json'
        },
        body: JSON.stringify({
          url: 'https://yoursite.com',
          viewport: { width: 1280, height: 720 },
          format: 'png'
        })
      });

      if (!response.ok) {
        console.error(`Screenshot failed: ${response.statusText}`);
        return;
      }

      const buffer = await response.buffer();
      const filename = `screenshot-${timestamp.replace(/[:.]/g, '-')}.png`;

      // Save to a local file (or upload to S3 — see the production setup below)
      fs.mkdirSync('./screenshots', { recursive: true });
      fs.writeFileSync(path.join('./screenshots', filename), buffer);
      console.log(`✓ Saved ${filename}`);

    } catch (error) {
      console.error(`Error: ${error.message}`);
    }
  }, 60 * 60 * 1000); // Every hour
}

scheduleHourlyScreenshots();
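The `replace()` call in the scheduler matters: ISO timestamps contain `:` and `.`, which are awkward in filenames (and `:` is illegal on Windows). Pulled out as a small standalone helper:

```javascript
// Turn an ISO timestamp into a filesystem-safe screenshot filename.
// Mirrors the replace() call used in the scheduler above.
function screenshotFilename(isoTimestamp) {
  return `screenshot-${isoTimestamp.replace(/[:.]/g, '-')}.png`;
}

console.log(screenshotFilename('2024-01-15T03:45:00.000Z'));
// → screenshot-2024-01-15T03-45-00-000Z.png
```

Keeping the original ISO timestamp recoverable from the filename (just reverse the dashes in the time portion) means you never need a separate index to know when a capture happened.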

Production Setup: Upload to S3

For production, store screenshots in cloud storage. Here's an S3 example:

const AWS = require('aws-sdk');
const fetch = require('node-fetch');

const s3 = new AWS.S3({
  accessKeyId: process.env.AWS_ACCESS_KEY_ID,
  secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
  region: 'us-east-1'
});

async function uploadScreenshot(buffer, timestamp) {
  const key = `website-monitoring/${timestamp.split('T')[0]}/${timestamp}.png`;

  const params = {
    Bucket: process.env.S3_BUCKET,
    Key: key,
    Body: buffer,
    ContentType: 'image/png'
  };

  return new Promise((resolve, reject) => {
    s3.upload(params, (err, data) => {
      if (err) reject(err);
      else resolve(data);
    });
  });
}

async function captureAndUpload() {
  const timestamp = new Date().toISOString();

  const response = await fetch('https://api.pagebolt.com/v1/screenshot', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.PAGEBOLT_API_KEY}`,
      'Content-Type': 'application/json'
    },
    body: JSON.stringify({
      url: process.env.WEBSITE_URL,
      viewport: { width: 1280, height: 720 },
      format: 'png'
    })
  });

  const buffer = await response.buffer();
  const s3Data = await uploadScreenshot(buffer, timestamp);

  console.log(`✓ Screenshot uploaded: ${s3Data.Location}`);
  return s3Data;
}

// Capture once at startup, then every hour
captureAndUpload().catch(console.error);
setInterval(() => captureAndUpload().catch(console.error), 60 * 60 * 1000);
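The key scheme in `uploadScreenshot` groups captures by day (`website-monitoring/2024-01-15/...`), which keeps the bucket browsable and makes per-prefix lifecycle rules easy. As a standalone helper:

```javascript
// Build the S3 object key used by uploadScreenshot() above:
// one prefix per day, full ISO timestamp as the object name.
function s3Key(isoTimestamp) {
  return `website-monitoring/${isoTimestamp.split('T')[0]}/${isoTimestamp}.png`;
}

console.log(s3Key('2024-01-15T03:45:00.000Z'));
// → website-monitoring/2024-01-15/2024-01-15T03:45:00.000Z.png
```

With this layout, "show me every screenshot from January 15" is a single prefix listing rather than a scan of the whole bucket.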

Use Case 1: Incident Post-Mortems

When your site goes down, you have visual evidence:

3:45 AM — Screenshot shows error page (500)
3:50 AM — Screenshot shows blank page (timeout)
4:05 AM — Screenshot shows recovery (200 OK)

Post-mortem: "The database connection pool exhausted at 3:45 AM. Recovery took 20 minutes. Evidence: [screenshots linked]."

No guessing. No "we think it was down around 4." Just proof.

Use Case 2: Visual Regression Detection

Automatically detect when your site's appearance changes:

async function detectVisualChanges() {
  // Assumes a captureScreenshot() helper that returns { buffer, s3Url }
  // for the new capture (e.g. a thin wrapper around captureAndUpload)
  const current = await captureScreenshot();
  const previous = fs.readFileSync('./previous-screenshot.png');

  // Pixel comparison — pixelmatch or Resemble.js are common choices;
  // compareImages() stands in for whichever you pick
  const diff = compareImages(current.buffer, previous);

  if (diff.percentChanged > 5) {
    // Alert the engineering team
    console.warn(`⚠️  Visual change detected: ${diff.percentChanged}% different`);
    sendSlackAlert({
      text: `Visual regression detected: ${diff.percentChanged}% of pixels changed`,
      attachments: [
        { image_url: current.s3Url, title: 'Current' }
      ]
    });
  }

  // Save the new screenshot as the baseline for the next run
  fs.writeFileSync('./previous-screenshot.png', current.buffer);
}
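`compareImages` above is a placeholder. A real comparison should decode both PNGs and diff pixels (the pixelmatch library is a common choice), but the core idea — the percentage of differing values — can be sketched over two equal-size raw buffers:

```javascript
// Naive byte-level diff between two equal-size raw image buffers.
// Assumption: identical dimensions and format. A real implementation
// should decode the PNGs first (e.g. with pngjs) and compare pixels,
// ideally with a per-channel tolerance to ignore compression noise.
function percentChanged(a, b) {
  if (a.length !== b.length) return 100; // size change counts as fully changed
  let differing = 0;
  for (let i = 0; i < a.length; i++) {
    if (a[i] !== b[i]) differing++;
  }
  return (differing / a.length) * 100;
}

console.log(percentChanged(Buffer.from([0, 0, 0, 0]), Buffer.from([0, 0, 255, 255])));
// → 50
```

The 5% alert threshold in the detection code is a judgment call: too low and anti-aliasing or ad rotation triggers false alarms, too high and a broken sidebar slips through. Start around 5% and tune against a week of real captures.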

Cost Comparison

Self-hosted approach:

  • EC2 instance for scheduler: $10-20/month
  • Puppeteer overhead: 500MB+ RAM
  • S3 storage (24 screenshots/day): ~$1/month
  • Total: $11-21/month + DevOps overhead

PageBolt approach:

  • Starter plan: $29/month (covers 5,000 screenshots)
  • S3 storage: ~$1/month
  • Total: $30/month, zero infrastructure

For ~720 hourly screenshots/month, PageBolt's Starter plan ($29) costs a few dollars more than self-hosting on paper — but it eliminates the server, the Puppeteer maintenance, and the DevOps overhead.
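Sanity-checking the volume: one capture per hour is 24 screenshots a day, roughly 720 a month — a small fraction of the 5,000-screenshot Starter quota:

```javascript
// Quick arithmetic check on monthly screenshot volume.
const perDay = 24;            // one screenshot per hour
const perMonth = perDay * 30; // ≈ 720 per month
const quota = 5000;           // Starter plan allowance (per the pricing above)

console.log(perMonth, (perMonth / quota * 100).toFixed(1) + '%');
// → 720 14.4%
```

The headroom matters: it leaves room to also capture a staging site, a mobile viewport, or a handful of key landing pages on the same plan.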

Deployment: Render or Railway

Deploy the monitoring script on a cheap platform:

Render (free tier):

git push origin main
# Render auto-deploys and runs your cron service

Railway (pay-as-you-go):

  • Deploy Node.js service
  • Set environment variables
  • Background worker runs hourly schedule

Cost: $0-5/month depending on usage.

Building the Screenshot Archive

After 30 days, you'll have a visual timeline of your website. Create an HTML dashboard:

<html>
<head><title>Website Monitoring Archive</title></head>
<body>
  <h1>Website Monitoring — Last 30 Days</h1>
  <div id="screenshots"></div>
  <script>
    // Load all screenshots from S3
    fetch('/api/screenshots?days=30')
      .then(r => r.json())
      .then(screenshots => {
        document.getElementById('screenshots').innerHTML =
          screenshots.map(s =>
            `<img src="${s.url}" style="max-width: 400px; margin: 10px;" title="${s.timestamp}" />`
          ).join('');
      });
  </script>
</body>
</html>
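The `/api/screenshots` endpoint above is assumed, not built. One way to back it: list the bucket with S3's ListObjectsV2 and map the keys to the `{ url, timestamp }` shape the dashboard expects. The mapping itself is a pure function (the key layout matches the upload code earlier; `toScreenshotEntries` is a hypothetical helper name):

```javascript
// Map S3 ListObjectsV2-style entries ({ Key: ... }) to the { url, timestamp }
// objects the dashboard renders. Assumed key layout, matching the upload code:
//   website-monitoring/<date>/<iso-timestamp>.png
function toScreenshotEntries(objects, bucketUrl) {
  return objects
    .filter(o => o.Key.endsWith('.png'))
    .map(o => ({
      url: `${bucketUrl}/${o.Key}`,
      timestamp: o.Key.split('/').pop().replace(/\.png$/, '')
    }));
}

const entries = toScreenshotEntries(
  [{ Key: 'website-monitoring/2024-01-15/2024-01-15T03:45:00.000Z.png' }],
  'https://my-bucket.s3.amazonaws.com'
);
console.log(entries[0].timestamp);
// → 2024-01-15T03:45:00.000Z
```

Wrap this in a small Express route that calls `listObjectsV2` with the `website-monitoring/` prefix and you have the endpoint the dashboard fetches.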

Now you have a visual audit trail: every hour, your website as it appeared to users.

Getting Started

  1. Sign up free at pagebolt.dev/pricing — 100 screenshots/month
  2. Set up your S3 bucket (or use Cloudflare R2 for cheaper storage)
  3. Deploy the Node.js scheduler to Render or Railway
  4. Within 24 hours, you'll have a complete visual archive

Your website's health — captured every hour. No guessing. No "I think it was down." Just screenshots proving what happened, when.

Start free: pagebolt.dev/pricing. 100 screenshots/month, no credit card required.
