DEV Community

How I Automated SEO Checks for 100+ Websites in 5 Minutes (Python Script)

You know that feeling when you’re reviewing a client’s website portfolio and suddenly realize you’ve spent 20 minutes manually checking 15 sites for title tags, meta descriptions, and broken links? I’ve been there. As a developer who works with dozens of client sites, this tedious manual SEO auditing was killing my productivity—especially when I needed to quickly validate sites for a new project. That’s why I built a lightweight Python script that scans 100+ URLs in under 5 minutes, giving me actionable insights without touching a spreadsheet.

This tool isn’t about fancy AI or complex APIs—it’s a simple, no-frills automation that focuses on the real pain points: extracting title tags, meta descriptions, and checking for 404 errors. I wrote it because I needed to validate client sites faster during a tight deadline, and it’s now my go-to for quick SEO sanity checks. The script needs just two ubiquitous libraries, requests and BeautifulSoup (`pip install requests beautifulsoup4`)—everything else is the standard library—making it instantly usable for anyone with basic web dev knowledge.

Here’s how it works in practice. First, we set up the essentials with requests and BeautifulSoup—the most common tools for web scraping. Then we define a function that checks each URL for critical SEO elements. The magic happens in the analyze_url function below:

import requests
from bs4 import BeautifulSoup

def analyze_url(url):
    try:
        response = requests.get(url, timeout=5)
        soup = BeautifulSoup(response.text, 'html.parser')

        # Extract title and meta description
        title = soup.title.string.strip() if soup.title and soup.title.string else "No title"
        meta_desc = soup.find('meta', attrs={'name': 'description'})
        meta_desc = meta_desc.get('content', "No description") if meta_desc else "No description"

        # Record the HTTP status (404s and other errors show up here)
        status = response.status_code
        return {
            'url': url,
            'title': title,
            'description': meta_desc,
            'status': status
        }
    except requests.RequestException as e:
        # Network-level failures: timeouts, DNS errors, refused connections
        return {'url': url, 'error': str(e)}

This snippet handles the core logic: it fetches the page (with a 5-second timeout to prevent hangs), parses the HTML, and pulls the title and meta description. Note the two failure modes: a 404 doesn’t raise an exception—it simply shows up in the status field—while network-level failures (timeouts, DNS errors) are caught in the except block and returned as errors. The timeout prevents the script from hanging on slow sites—something I learned the hard way when testing a client’s legacy site.
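To make that distinction concrete, here’s a small helper—hypothetical, not part of the original script—that turns one result dict into a list of issues worth flagging:

```python
def flag_issues(result):
    """Summarize SEO problems found in one analyze_url result dict."""
    if 'error' in result:
        # Network failure: the site never responded at all
        return ["unreachable: " + result['error']]
    issues = []
    if result['status'] >= 400:
        issues.append(f"HTTP {result['status']}")
    if result['title'] == "No title":
        issues.append("missing title tag")
    if result['description'] == "No description":
        issues.append("missing meta description")
    return issues

print(flag_issues({'url': 'https://example.com', 'status': 404,
                   'title': 'No title', 'description': 'Hi'}))
# → ['HTTP 404', 'missing title tag']
```

An empty list means the page passed all three checks, which makes it easy to filter a batch of results down to only the problem sites.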

To run it on a list of URLs, we just loop through your targets. Here’s a quick example using a sample list:

urls = [
    "https://example.com",
    "https://another-site.com",
    "https://client-site.example.net"
]

results = []
for url in urls:
    results.append(analyze_url(url))

# Print results (real-world usage would save to CSV)
for r in results:
    if 'error' in r:
        print(f"⚠️ Error on {r.get('url', 'unknown')}: {r['error']}")
    else:
        print(f"{r['url']}: Status {r['status']}, Title: {r['title'][:50]}...")

This outputs a clean summary of your sites—perfect for quick reviews. I use it daily to spot missing meta descriptions or 404s before client handoffs. The output is human-readable, so you don’t need to parse JSON or build dashboards. For deeper analysis (like keyword density), I’d add more steps later—but for this use case? It’s already saving me hours.
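When you do want to hand the results to a client or open them in a spreadsheet, the CSV step mentioned above is a few lines with the standard library. Here’s a minimal sketch (the field names match the dicts analyze_url returns; save_results is a name I’m assuming, not from the original script):

```python
import csv

def save_results(results, path="seo_report.csv"):
    """Write analyze_url result dicts to a CSV file, one row per site."""
    fields = ['url', 'status', 'title', 'description', 'error']
    with open(path, 'w', newline='', encoding='utf-8') as f:
        # extrasaction='ignore' skips unexpected keys; missing keys
        # (e.g. no 'error' on a healthy site) become empty cells
        writer = csv.DictWriter(f, fieldnames=fields, extrasaction='ignore')
        writer.writeheader()
        writer.writerows(results)
```

DictWriter handles the uneven dicts for you: success rows have no 'error' key and error rows have no 'title', and both still land in the right columns.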

Why does this matter? Because SEO audits used to take hours. Now, with this script, I can validate 50+ sites in 3 minutes during a sprint. No manual copy-pasting, no browser tab switching—just a single command. It’s the kind of automation that feels trivial until you’ve been drowning in repetitive work.
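If sequential fetching ever becomes the bottleneck at that scale, the work is almost entirely I/O-bound, so a thread pool from the standard library speeds it up with barely any code changes. A sketch (fake_analyze is a network-free stand-in so this runs on its own—swap in the real analyze_url):

```python
from concurrent.futures import ThreadPoolExecutor

def fake_analyze(url):
    # Stand-in for analyze_url so this sketch runs without network access
    return {'url': url, 'status': 200}

urls = [f"https://site-{i}.example.com" for i in range(20)]

# 10 workers fetch in parallel; pool.map preserves the input order,
# so results line up with urls just like the sequential loop
with ThreadPoolExecutor(max_workers=10) as pool:
    results = list(pool.map(fake_analyze, urls))
```

Because each analyze_url call already catches its own exceptions and returns a dict, the worker functions never blow up the pool—one dead site can’t sink the whole run.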

I built this because I’ve seen too many developers waste time on manual checks. If you’re in the same boat (or just starting with SEO), this script is a low-effort way to gain real insights without complex tools. It’s not perfect—there’s no caching, and it doesn’t handle dynamic content—but for quick sanity checks? It’s dead simple.

If you found this helpful, grab the full script here: https://intellitools.gumroad.com/l/seo-analysis-tool (it comes with a requirements.txt).

What’s the most common SEO issue you’ve automated with a script? I’d love to hear your stories in the comments—maybe we can build something even better together!
