孫昊
I Audit 49 LIVE URLs Every 30 Minutes. Here's the 30-Line Script.

TL;DR: A 30-line Python script that does HEAD requests on every LIVE asset URL, verifies HTTP 200, and reports failures. Catches dead Gumroad SKUs, broken site pages, and 404 dev.to slugs before customers do. Runs every 30 minutes via cron.


Why audit your URLs

If you're an indie hacker with 6+ Gumroad SKUs, 30+ dev.to articles, 12+ Substack issues, 9+ site pages — that's 50+ public URLs that could break:

  • Gumroad SKU unpublished (auto-delisting after 60 days inactive)
  • Substack post taken down (rare but possible)
  • dev.to slug changed (after edit + republish)
  • Site page broken (after refactor / nav change)
  • DNS / SSL issue affecting subdomain

Each broken URL costs you a customer or referral. You can't manually check 50 URLs daily.

The script

"""Verify all LIVE URLs from today's tick. HTTP 200 check + report."""
import requests

URLS = [
    # Gumroad SKUs (6)
    ("Gumroad", "$499 ASC API Toolkit", "https://jiejuefuyou.gumroad.com/l/vszsui"),
    ("Gumroad", "$39 AutoApp Dashboard", "https://jiejuefuyou.gumroad.com/l/hmmzt"),
    # ... etc

    # dev.to articles
    ("dev.to", "#26 Article title", "https://dev.to/snake_sun/..."),
    # ... etc

    # Site pages
    ("Site", "/", "https://jiejuefuyou.github.io/"),
    # ... etc
]


def verify():
    print(f"Verifying {len(URLS)} LIVE URLs...\n")
    ok_count = 0
    fail_count = 0
    by_platform = {}
    for platform, name, url in URLS:
        try:
            r = requests.head(url, timeout=10, allow_redirects=True)
            if r.status_code == 200:
                status = "✓ 200"
                ok_count += 1
            elif r.status_code in (301, 302, 304):
                status = f"✓ {r.status_code} (redirect)"
                ok_count += 1
            else:
                status = f"✗ {r.status_code}"
                fail_count += 1
        except Exception as e:
            status = f"✗ ERR: {str(e)[:30]}"
            fail_count += 1
        by_platform.setdefault(platform, []).append((name, status))
        print(f"  [{platform:8}] {status:18} {name[:40]:40} {url[:60]}")
    print(f"\n=== Summary ===")
    print(f"OK: {ok_count}  Fail: {fail_count}  Total: {len(URLS)}")
    for platform, items in by_platform.items():
        ok = sum(1 for _, s in items if s.startswith("✓") or "redirect" in s)
        print(f"  {platform}: {ok}/{len(items)} OK")


if __name__ == "__main__":
    verify()

30 lines. The URLS list grows over time as I add SKUs, articles, and pages.
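One gotcha worth guarding against: some hosts answer HEAD with 405 (method not allowed), or bot-filter it with 403, even though the page is fine. A hedged sketch of a fallback (`check_url` and `is_ok` are names I'm introducing here, not part of the script above):

```python
def check_url(url, timeout=10):
    """Final status code for url: HEAD first, streamed GET as fallback."""
    import requests  # lazy import so the pure helper below works standalone

    r = requests.head(url, timeout=timeout, allow_redirects=True)
    if r.status_code in (403, 405):
        # some servers reject or bot-filter HEAD; retry with GET but keep
        # stream=True so the response body is never actually downloaded
        r = requests.get(url, timeout=timeout, allow_redirects=True, stream=True)
        r.close()
    return r.status_code


def is_ok(status_code):
    """Treat any 2xx, plus 304 (not modified), as healthy."""
    return 200 <= status_code < 300 or status_code == 304
```

Dropping this into the main loop avoids false alarms from HEAD-hostile hosts without slowing down the common case.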

Running it

Manually

python verify_all_live_urls.py

Output:

Verifying 49 LIVE URLs...
  [Gumroad ] ✓ 200             $499 ASC API Toolkit                ...
  [Gumroad ] ✓ 200             $39 AutoApp Dashboard               ...
  ...
  [Site    ] ✓ 200             /tools.html                         ...
=== Summary ===
OK: 49  Fail: 0  Total: 49
  Gumroad: 6/6 OK
  dev.to: 29/29 OK
  Substack: 4/4 OK
  Site: 10/10 OK

Via cron

*/30 * * * * cd /path/to/project && python verify_all_live_urls.py >> logs/audit.log 2>&1

Every 30 minutes. Failures are appended to the log; review it daily.
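To make the daily review faster, a small sketch that pulls only the failure lines out of the log. It assumes the logs/audit.log path from the crontab line and the ✗ marker the script prints; `failures_in_log` and `review` are names I'm introducing:

```python
from pathlib import Path


def failures_in_log(text):
    """Return only the log lines that recorded a failed check (marked ✗)."""
    return [line.strip() for line in text.splitlines() if "✗" in line]


def review(log_path="logs/audit.log"):
    """Read the cron log and list failures; empty list if no log yet."""
    p = Path(log_path)
    return failures_in_log(p.read_text(encoding="utf-8")) if p.exists() else []


sample = "  [Gumroad ] ✓ 200  ok\n  [dev.to  ] ✗ 404  broken article\n"
print(failures_in_log(sample))  # → ['[dev.to  ] ✗ 404  broken article']
```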

Via Flask endpoint (dashboard integration)

from flask import jsonify
import requests

@app.route('/api/audit')
def audit_urls():
    """HTTP 200 verifier: runs across all LIVE assets."""
    results = []
    for url in scan_all_live_urls():  # project helper that yields every LIVE URL
        try:
            r = requests.head(url, timeout=10, allow_redirects=True)
            results.append({'url': url, 'status': r.status_code, 'ok': r.status_code == 200})
        except Exception as e:
            results.append({'url': url, 'status': 'ERR', 'ok': False, 'error': str(e)[:80]})
    ok = sum(1 for r in results if r['ok'])
    return jsonify({'total': len(results), 'ok': ok, 'fail': len(results) - ok, 'details': results})

Hit the endpoint from your daily briefing script.
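For the briefing-script side, a sketch of a consumer. `format_summary` and `audit_summary` are names I'm introducing, and the base URL is an assumption; the payload keys ('total', 'ok', 'fail', 'details') match the endpoint above:

```python
def format_summary(data):
    """One-line summary of the /api/audit JSON payload."""
    line = f"Audit: {data['ok']}/{data['total']} OK"
    if data["fail"]:
        broken = [d["url"] for d in data["details"] if not d["ok"]]
        line += " | FAILURES: " + ", ".join(broken)
    return line


def audit_summary(base_url="http://localhost:5000"):
    import requests  # lazy import: format_summary itself is pure
    data = requests.get(f"{base_url}/api/audit", timeout=120).json()
    return format_summary(data)
```

The formatter is separated from the HTTP call so the same line can go to stdout, email, or Slack unchanged.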

Caching

For 49 URLs, the audit runs in ~5 seconds. No caching needed for indie scale.

If you scale to 500+ URLs, cache the audit results for 24h:

from datetime import datetime, timedelta
import json
from pathlib import Path

CACHE = Path("data/audit_cache.json")

def audit_with_cache():
    if CACHE.exists():
        cached = json.loads(CACHE.read_text())
        ts = datetime.fromisoformat(cached["ts"])
        if datetime.now() - ts < timedelta(hours=24):
            return cached
    # run_audit() stands in for a plain-dict version of the audit; the Flask
    # route returns a Response object, which json.dumps can't serialize
    fresh = run_audit()
    fresh["ts"] = datetime.now().isoformat()
    CACHE.write_text(json.dumps(fresh, indent=2))
    return fresh
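At 500+ URLs the sequential loop itself gets slow. The checks are I/O-bound, so a thread pool cuts wall time roughly by the worker count. A sketch (`verify_parallel` is my name; the injectable `fetch` parameter is there so it can be exercised without the network):

```python
from concurrent.futures import ThreadPoolExecutor


def _default_fetch(url):
    import requests  # only imported when no fetch override is supplied
    return requests.head(url, timeout=10, allow_redirects=True).status_code


def verify_parallel(urls, fetch=_default_fetch, workers=16):
    """Check (platform, name, url) tuples concurrently.

    Returns (platform, name, status, ok) tuples in input order.
    """
    def check(entry):
        platform, name, url = entry
        try:
            code = fetch(url)
            return (platform, name, code, code == 200)
        except Exception:
            return (platform, name, "ERR", False)

    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(check, urls))
```

`pool.map` preserves input order, so the per-platform grouping from the sequential version still works on the results.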

What I learned

URLs DO break

In 60 days, I caught:

  • 1 Gumroad SKU briefly unpublished (recreated with new slug)
  • 1 dev.to article URL changed after slug edit (broken in 4 places that linked to it)
  • 1 site page after a refactor that removed index.html redirects
  • 0 Substack issues (their archive is solid)

Without the audit, those would have stayed broken for days.

Slug drift is real

Editing a dev.to article's title can change its slug. The old URL usually 302-redirects to the new one, but for a few minutes it can 404. The audit catches this.

Dead links spread

If your just-published.html page lists 30 dev.to articles and one slug changes, one of those 30 links now points to a URL that may eventually 404. The audit flags it.
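The same idea can become a page-level check: scan a page's HTML for outbound hrefs and flag any that aren't in the audited list. A sketch (`unaudited_links` is my name; the regex only catches double-quoted absolute hrefs, which is enough for a page you generate yourself):

```python
import re

HREF_RE = re.compile(r'href="(https?://[^"]+)"')


def unaudited_links(html, audited):
    """Hrefs in html that aren't covered by the audited URL set."""
    return [u for u in HREF_RE.findall(html) if u not in audited]


page = '<a href="https://dev.to/snake_sun/old-slug">post</a>'
print(unaudited_links(page, {"https://dev.to/snake_sun/new-slug"}))
# → ['https://dev.to/snake_sun/old-slug']
```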

Why this lets you sleep at night

If the audit reports "49/49 OK" every 30 min, you know:

  • All your products are buyable
  • All your articles are readable
  • All your site pages are loading
  • All your customer-facing URLs work

You don't have to wonder. You don't have to check manually.

Source

Full audit script + Flask endpoint + caching:

AutoApp Dashboard ($39) includes:

  • verify_all_live_urls.py (this article)
  • dashboard/api/audit endpoint
  • Cron setup template (Windows + Linux + macOS)
  • Failure alerting (email/Slack on fail)

If you have 50+ public URLs and aren't auditing them, you have a churn-shaped problem. 30 lines of Python solves it.
