Most serious SEO audit tools sit behind a login, a trial period, or a monthly subscription. Ahrefs, SEMrush, Moz — all great, all expensive. The tools that are free are usually watered-down or lead-gen for upselling.
I wanted to build something that gives a genuinely useful technical audit to anyone, immediately, for free. No signup, no email address, no credit card.
That's SEODoc.
What it checks
Paste a URL and SEODoc crawls the page, then generates a report covering:
Technical health:
- HTTP status codes and redirects
- Page speed and Core Web Vitals estimates
- Mobile-friendliness indicators
- HTTPS and security headers
- Canonical tags and hreflang
On-page SEO:
- Title tag length, uniqueness, keyword presence
- Meta description length and quality signals
- Heading hierarchy (H1-H6) analysis
- Image alt text coverage
- Internal/external link audit
Structured data:
- Schema.org markup detection and validation
- Open Graph and Twitter Card tags
- JSON-LD validation
Crawlability:
- Robots.txt analysis
- Sitemap detection and validation
- Noindex/nofollow tag detection
The output is a structured report with pass/fail indicators and specific recommendations for each issue found.
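Most of those checks reduce to a threshold test plus a specific recommendation. As a hedged illustration (not SEODoc's actual code), here is what a single pass/fail check might look like; the `CheckResult` shape and the 30–60 character thresholds are my own assumptions:

```python
from dataclasses import dataclass

@dataclass
class CheckResult:
    name: str
    passed: bool
    recommendation: str = ""

# Common guidance: titles under ~30 chars waste space,
# over ~60 chars risk truncation in search results.
TITLE_MIN, TITLE_MAX = 30, 60

def check_title_length(title: str) -> CheckResult:
    n = len(title.strip())
    if n == 0:
        return CheckResult("title-length", False, "Add a <title> tag; the page has none.")
    if n < TITLE_MIN:
        return CheckResult("title-length", False,
                           f"Title is {n} chars; consider expanding toward {TITLE_MIN}-{TITLE_MAX}.")
    if n > TITLE_MAX:
        return CheckResult("title-length", False,
                           f"Title is {n} chars; it may be truncated beyond {TITLE_MAX}.")
    return CheckResult("title-length", True)
```

A full report is then just a list of these results, each pairing a verdict with an actionable fix.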
The technical approach
The backend is a Python FastAPI application. The crawling is done with a combination of Playwright (for JavaScript-rendered pages) and BeautifulSoup (for static HTML parsing). This means it handles modern SPAs properly, not just server-rendered HTML.
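The static-parse pass is conceptually simple: pull the SEO-relevant tags out of the HTML. SEODoc uses BeautifulSoup for this; purely for illustration, here is the same idea with the standard library's `html.parser` (the collected field names are my own):

```python
from html.parser import HTMLParser

class SEOTagParser(HTMLParser):
    """Collects the title, meta description, and canonical URL from static HTML."""
    def __init__(self):
        super().__init__()
        self.title = ""
        self.meta_description = ""
        self.canonical = ""
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and a.get("name", "").lower() == "description":
            self.meta_description = a.get("content", "")
        elif tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical = a.get("href", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

parser = SEOTagParser()
parser.feed('<html><head><title>Example Page</title>'
            '<meta name="description" content="A demo.">'
            '<link rel="canonical" href="https://example.com/page">'
            '</head><body></body></html>')
```

The Playwright path does the same extraction, but against the DOM after JavaScript has run, which is what makes SPA audits accurate.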
The Core Web Vitals estimation uses Lighthouse programmatically via headless Chrome. The structured data validation uses Google's structured data testing logic.
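A programmatic Lighthouse run (e.g. the `lighthouse` CLI with `--output=json`) produces a large JSON report; the vitals estimates live under its `audits` key. A sketch of pulling them out — the audit IDs follow Lighthouse's report format, but the helper name and the chosen metric subset are my own:

```python
def extract_vitals(report: dict) -> dict:
    """Pull Core Web Vitals estimates out of a Lighthouse JSON report."""
    audits = report.get("audits", {})

    def metric(audit_id: str):
        return audits.get(audit_id, {}).get("numericValue")

    return {
        "lcp_ms": metric("largest-contentful-paint"),
        "cls": metric("cumulative-layout-shift"),
        "tbt_ms": metric("total-blocking-time"),
    }

# Minimal fake report shaped like Lighthouse output, for illustration:
sample = {"audits": {
    "largest-contentful-paint": {"numericValue": 2400.0},
    "cumulative-layout-shift": {"numericValue": 0.05},
    "total-blocking-time": {"numericValue": 180.0},
}}
vitals = extract_vitals(sample)
```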
@app.post("/audit")
async def run_audit(request: AuditRequest):
    crawler = SEOCrawler(request.url)
    results = await crawler.run_full_audit()
    return AuditResponse(
        technical=results.technical,
        onpage=results.onpage,
        structured_data=results.structured_data,
        recommendations=results.prioritized_recommendations()
    )
Why no login
Every "free SEO tool" I've seen uses the no-login experience as a lead funnel. You get the results and then hit a wall: "Sign up to see the full report." I wanted to do the opposite — give the full report upfront, no strings.
The business model is batch auditing and scheduled monitoring, which are paid features for when you want to run audits on 50 pages regularly. The one-off audit is just free.
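Batch auditing is mostly a concurrency problem: run many audits in parallel without overwhelming the crawler, since each audit may spin up a browser. A hedged sketch of the pattern with asyncio (`audit_page` is a stand-in for the real audit coroutine, and the concurrency cap is an assumed knob):

```python
import asyncio

async def audit_page(url: str) -> dict:
    # Stand-in for the real Playwright/Lighthouse audit.
    await asyncio.sleep(0)
    return {"url": url, "status": "ok"}

async def audit_batch(urls: list[str], max_concurrent: int = 5) -> list[dict]:
    sem = asyncio.Semaphore(max_concurrent)  # cap simultaneous browser sessions

    async def bounded(url: str) -> dict:
        async with sem:
            return await audit_page(url)

    return await asyncio.gather(*(bounded(u) for u in urls))

results = asyncio.run(audit_batch([f"https://example.com/p{i}" for i in range(10)]))
```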
What I learned building it
JavaScript-rendered pages are a surprisingly large percentage of the modern web. If you only use requests + BeautifulSoup, you miss entire categories of content and issues. Running Playwright for every audit adds latency but is necessary for accuracy.
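One way to claw back some of that latency (an optimization worth considering, not necessarily what SEODoc ships): fetch the static HTML first and only fall back to Playwright when the page looks client-rendered. A crude heuristic sketch:

```python
import re

def looks_js_rendered(html: str) -> bool:
    """Crude heuristic: script tags present but almost no visible text
    suggests a client-rendered SPA that needs a real browser."""
    # Strip script/style bodies, then tags, to estimate visible text.
    stripped = re.sub(r"(?is)<(script|style)[^>]*>.*?</\1>", "", html)
    text = re.sub(r"(?s)<[^>]+>", " ", stripped)
    visible_chars = len(" ".join(text.split()))
    script_count = len(re.findall(r"(?i)<script\b", html))
    return script_count >= 1 and visible_chars < 200

spa = '<html><body><div id="root"></div><script src="/bundle.js"></script></body></html>'
static = "<html><body><h1>Hello</h1><p>" + "Real content. " * 50 + "</p></body></html>"
```

The trade-off: heuristics misfire, so a wrong guess means auditing a blank shell. Rendering everything, as the post notes, is slower but always accurate.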
Core Web Vitals are genuinely hard to estimate without running real browser measurement. The Lighthouse approach is good enough for directional guidance but not as reliable as field data from CrUX.
Try it at seodoc.site.
Built with: Python, FastAPI, Playwright, BeautifulSoup, Lighthouse, Pydantic