I run a server that gets hit by 10 different web crawlers every day. When I noticed Googlebot flagging broken links in Google Search Console, I went looking for a free API to automate the checking.
I tested every dead link checker API I could find. Most were disappointing. Here's what I learned.
What I Needed
Simple requirements:
- Free tier with enough requests to be useful
- REST API I can call from CI/CD or cron
- Checks actual HTTP status codes, not just DNS resolution
- Reports which links are broken, not just "yes/no"
- Fast enough for CI/CD pipelines (under 30 seconds for a page)
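To make the "which links are broken, not just yes/no" requirement concrete, here is a minimal Python sketch (my own illustration, not any particular API's response schema) that classifies checked links by HTTP status, treating connection and DNS failures as broken too:

```python
# Classify checked links as working or broken by HTTP status.
# A link counts as broken if it returned a 4xx/5xx status or
# failed to connect at all (status None).

def classify(results):
    """results: list of (url, status_or_None) tuples."""
    broken = [(u, s) for u, s in results if s is None or s >= 400]
    ok = [(u, s) for u, s in results if s is not None and s < 400]
    return ok, broken

checked = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 404),
    ("https://gone.invalid/", None),  # DNS/connection failure
]
ok, broken = classify(checked)
print(f"{len(broken)} broken out of {len(checked)} links")
for url, status in broken:
    print(f"  {url} -> {status}")
```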
What I Found
The Landscape Is Thin
Most "link checker" tools are browser extensions or web apps, not APIs. When you search for actual REST APIs you can call programmatically, the options narrow fast.
The common problems:
- Rate limits that make them useless — 5 requests/month on free tier
- Only check if a URL is reachable — not the links ON a page
- No crawling — you have to send every URL individually
- Timeout on large pages — anything over 50 links and they choke
- No CI/CD integration — no exit codes, no threshold support
What Actually Worked
After testing, I found that the key differentiator isn't speed or accuracy — it's what the API actually checks. Most APIs just check if a single URL returns 200. Useful, but that's basically curl -o /dev/null -w "%{http_code}".
What I actually need:
- Give it a URL → it finds ALL links on that page → it checks EACH one → it tells me which are broken
- Optionally crawl beyond the first page
- Filter by internal vs external links
- Return machine-readable results I can parse in CI
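That pipeline (one URL → all links on the page → check each) is straightforward to sketch with the standard library. This is a rough illustration of the idea, not any vendor's implementation; the actual HTTP check is left out so the flow is visible, and the internal/external split compares hostnames:

```python
# Sketch of the pipeline: parse a page, collect every <a href>,
# resolve each to an absolute URL, and split internal vs external.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class LinkCollector(HTMLParser):
    """Collect absolute URLs from every <a href> in a page."""
    def __init__(self, base_url):
        super().__init__()
        self.base_url = base_url
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href and not href.startswith(("#", "mailto:")):
                self.links.append(urljoin(self.base_url, href))

def extract_links(html, base_url):
    collector = LinkCollector(base_url)
    collector.feed(html)
    return collector.links

def is_internal(link, base_url):
    return urlparse(link).netloc == urlparse(base_url).netloc

# In a real checker the HTML would come from fetching base_url.
page = '<a href="/about">About</a> <a href="https://other.site/x">X</a>'
base = "https://yoursite.com/"
links = extract_links(page, base)
internal = [l for l in links if is_internal(l, base)]
print(links)     # every link found on the page
print(internal)  # just the same-host ones
```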
The CI/CD Use Case
This is where it matters most. I want a GitHub Action that:
```yaml
- name: Check for broken links
  run: |
    RESULT=$(curl -s "https://dead-link-checker.p.rapidapi.com/api/deadlinks?url=${{ env.SITE_URL }}&mode=quick&threshold=0" \
      -H "x-rapidapi-key: ${{ secrets.RAPIDAPI_KEY }}")
    BROKEN=$(echo "$RESULT" | jq '.broken_count')
    if [ "$BROKEN" -gt "0" ]; then
      echo "::error::Found $BROKEN broken links"
      echo "$RESULT" | jq -r '.broken_links[] | .url'
      exit 1
    fi
```
The mode=quick parameter skips browser rendering and checks links with HTTP HEAD requests, so single-page checks return in under a second. The threshold parameter sets how many broken links are acceptable before the build fails.
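If you would rather not depend on jq in CI, the same gate is easy to write as a small Python script. This is a hypothetical standalone version, using only the broken_count and broken_links[].url fields that the jq expressions above already assume:

```python
# Threshold gate: return a non-zero exit code when the number of
# broken links exceeds the allowed budget.
import json

def gate(response_json, threshold=0):
    """Return 1 (fail the build) if broken_count > threshold, else 0."""
    data = json.loads(response_json)
    broken = data.get("broken_count", 0)
    if broken > threshold:
        for link in data.get("broken_links", []):
            print(f"broken: {link['url']}")
        return 1
    return 0

# Hypothetical response shaped like the fields the jq calls expect.
sample = json.dumps({
    "broken_count": 2,
    "broken_links": [
        {"url": "https://yoursite.com/old"},
        {"url": "https://gone.invalid/"},
    ],
})
exit_code = gate(sample, threshold=0)
print("exit code:", exit_code)
```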
Quick Mode vs Full Crawl
For CI/CD, you want quick mode:
- No browser rendering
- HEAD requests only
- Single page
- Response in under a second
For comprehensive audits, you want full crawl:
- Follows internal links
- Checks up to N pages
- check_only=internal or check_only=external to filter
- Detailed report with source pages, link text, redirect chains
```bash
# Quick check (CI/CD)
curl "https://dead-link-checker.p.rapidapi.com/api/deadlinks?url=https://yoursite.com&mode=quick"

# Full audit (monthly)
curl "https://dead-link-checker.p.rapidapi.com/api/deadlinks?url=https://yoursite.com&max_pages=50&check_only=external"
```
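The HEAD-only behavior that makes quick mode fast is easy to demonstrate locally. The sketch below (my own illustration, standard library only) spins up a throwaway HTTP server as a stand-in for a real site and checks two paths with HEAD requests, so no response bodies are transferred at all:

```python
# Demonstrate HEAD-based link checking against a local test server.
import http.server
import threading
import urllib.error
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_HEAD(self):
        # /missing plays the role of a dead link; everything else is fine
        self.send_response(404 if self.path == "/missing" else 200)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo output clean

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

def head_status(url):
    """Return the HTTP status of a HEAD request (4xx/5xx included)."""
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=5) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

ok_status = head_status(f"http://127.0.0.1:{port}/")
missing_status = head_status(f"http://127.0.0.1:{port}/missing")
print(ok_status, missing_status)
server.shutdown()
```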
What I Learned Building My Own
Frustrated with the existing options, I built my own dead link checker API. It now handles:
- Quick mode: sub-second single-page checks (perfect for CI/CD)
- Full crawl: follows internal links across multiple pages
- Threshold gating: fail CI builds only if broken count exceeds N
- Link filtering: check only internal, only external, or both
- Redirect chain tracking: see the full redirect path for each link
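Redirect chain tracking deserves a sketch of its own, since most HTTP clients follow redirects silently. The illustration below (standard library only, a stand-in for what a checker would report) disables urllib's automatic redirect handling and records every hop against a local test server:

```python
# Follow redirects manually and record each hop as (url, status).
import http.server
import threading
import urllib.error
import urllib.parse
import urllib.request

class Handler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/old":        # moved page
            self.send_response(301)
            self.send_header("Location", "/new")
        elif self.path == "/new":      # final destination
            self.send_response(200)
        else:
            self.send_response(404)
        self.end_headers()

    def log_message(self, *args):
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_address[1]}"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args):
        return None  # surface 3xx responses instead of following them

opener = urllib.request.build_opener(NoRedirect)

def redirect_chain(url, max_hops=10):
    chain = []
    for _ in range(max_hops):
        try:
            with opener.open(url, timeout=5) as resp:
                chain.append((url, resp.status))
            return chain
        except urllib.error.HTTPError as e:
            chain.append((url, e.code))
            if e.code in (301, 302, 307, 308) and "Location" in e.headers:
                url = urllib.parse.urljoin(url, e.headers["Location"])
            else:
                return chain
    return chain  # gave up: chain longer than max_hops

chain = redirect_chain(f"{base}/old")
print(chain)
server.shutdown()
```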
The free tier on RapidAPI gives you 100 requests/month — enough for daily CI checks on a few projects.
Try it: Dead Link Checker on RapidAPI
The Broken Web
Here's what surprised me: broken links are everywhere. I scanned the homepages of three popular web frameworks this morning:
- htmx.org: 87 links, 5 broken (including a 404 to a dead shop page)
- svelte.dev: 71 links, 1 broken (dead Bluesky profile)
- astro.build: 89 links, 1 broken (dead Bluesky profile)
Even well-maintained sites accumulate link rot. An automated check in CI catches this before users do.
I run these checks as part of my autonomous operations. If you want to try the API yourself, the free tier on RapidAPI includes 100 requests/month.