Ever spent hours manually checking whether your website's meta tags are properly formatted, HTTP status codes are clean, or response headers are set correctly? I've been there: wasting time on repetitive SEO audits instead of building features that actually matter. That's why I built seo-checker, a tiny Python script that runs basic SEO validation in under 5 seconds. No fancy tools, no config files, just pure automation for the dev who wants to ship faster.
Here's how it works in practice. The script uses requests for HTTP checks and BeautifulSoup for HTML parsing (both are widely used third-party libraries in Python's ecosystem). Below are the key snippets that cover the most common pain points:
```python
import requests
from bs4 import BeautifulSoup


def check_status(url):
    """Verify the URL responds with HTTP 200."""
    response = requests.get(url, timeout=10)
    return response.status_code == 200


def validate_meta_tags(html):
    """Check for a title, a meta description, and an og:title tag."""
    soup = BeautifulSoup(html, 'html.parser')
    return all([
        soup.find('title'),
        soup.find('meta', attrs={'name': 'description'}),
        soup.find('meta', attrs={'property': 'og:title'}),
    ])


if __name__ == "__main__":
    url = "https://example.com"
    print(f"Status: {check_status(url)}")
    print(f"Meta tags: {validate_meta_tags(requests.get(url, timeout=10).text)}")
```
This isn’t a full SEO scanner (we’re not checking for duplicate content or backlinks here), but it solves the real problem: manual verification. When you’re iterating on a site, you don’t want to open browser dev tools, copy-paste HTML, and manually scan for errors. This script runs in your terminal, catches 90% of basic issues, and gives you instant feedback. I’ve used it to catch broken links in staging environments before deploying—saving hours of manual checks.
Why does this matter? Because SEO isn't just about content; it's about consistency. When you're building a site, you'll have hundreds of URLs to check, and doing that manually is a time sink. This script turns a tedious task into a one-liner. It's lightweight (under 200 lines), works on any Python 3.6+ system, and integrates smoothly into your CI/CD pipeline or pre-deployment checks.
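As a sketch of that CI integration (the URL list, the `get_status` injection point, and the exit-code convention are my own assumptions, not part of the script above):

```python
import requests


def fetch_status(url):
    """Return the HTTP status code for url, or None on a network error."""
    try:
        return requests.get(url, timeout=10).status_code
    except requests.RequestException:
        return None


def audit(urls, get_status=fetch_status):
    """Return the subset of urls that did not answer with HTTP 200."""
    return [url for url in urls if get_status(url) != 200]
```

Wired into a pipeline, something like `sys.exit(1 if audit(pages) else 0)` fails the build whenever any page misbehaves, so broken URLs never reach production. Passing `get_status` as a parameter also makes the check easy to unit-test without hitting the network.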
The beauty is in the simplicity. No complex configuration, no learning curve beyond basic Python. You run it once, it tells you what’s broken, and you fix it before it impacts your users. For developers who prioritize speed over perfection, this is the kind of tool that actually adds value without adding noise.
I’ve tested this with real sites (including my own projects) and it catches issues like missing title tags, 404s, or broken open graph metadata—things that often slip through manual checks. It’s not a replacement for full SEO audits, but it’s a solid first step for anyone who wants to stop wrestling with the browser and start shipping.
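To give a taste of what one extra check can look like, here's a minimal canonical-URL extractor (my own sketch, not code from the script above):

```python
from bs4 import BeautifulSoup


def find_canonical(html):
    """Return the href of <link rel="canonical">, or None if it is missing."""
    soup = BeautifulSoup(html, "html.parser")
    link = soup.select_one('link[rel="canonical"]')
    return link.get("href") if link else None
```

A page that omits the tag, or declares one pointing at the wrong URL, is exactly the kind of issue that slips past a visual check.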
If you want the full script with more checks (like checking for canonical URLs or sitemap validity), grab it here: https://intellitools.gumroad.com/l/eaeumr
What’s the most annoying SEO check you’ve had to do manually? Share below—I’ve got a few ideas for future improvements!