## TL;DR
`@automatelab/ai-seo-mcp` is a 13-tool MCP server that runs AI-SEO analysis, citation scoring, and structured-data audits directly inside Claude or any MCP-compatible client.
## What this is
AI search engines (Perplexity, ChatGPT, Google AI Overviews) pull answers from a small set of sources. Standard SEO tools report backlinks and keyword density; they do not tell you whether a page is structured for AI citation. This MCP server fills that gap: it exposes audit, scoring, rewriting, and crawlability tools that work within your existing Claude context, with no browser extension or SaaS dashboard required.
## Why we built it
The tooling for AI-SEO was fragmented: one tool to check `robots.txt`, another to generate `llms.txt`, another to score citation worthiness, and none of them integrated with the agent you are already using for content work. Claude has no native way to check whether a page will get cited. Wiring it up via MCP was the obvious fix.
## Tool surface
### Audit
- `audit_page` - full on-page AI-SEO audit: headings, FAQ blocks, entity density, structured data
- `audit_schema` - Schema.org validation and deprecation check
- `audit_canonical` - canonical-tag audit and redirect-chain inspection
### Technical
- `check_robots` - `robots.txt` parsing and AI-crawler directive check (GPTBot, ClaudeBot, PerplexityBot)
- `check_sitemap` - sitemap structure, freshness, and coverage check
- `check_technical` - Core Web Vitals summary and crawlability flags
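To make the crawler-access check concrete, here is a minimal sketch of the kind of logic a `robots.txt` directive check implies: group rules by user-agent and test whether a given AI crawler is disallowed from the site root. This is an illustration of the general technique, not the server's actual implementation.

```typescript
type RuleMap = Map<string, string[]>; // user-agent (lowercased) -> Disallow paths

// Parse robots.txt text into per-agent Disallow lists. Consecutive
// User-agent lines share the rule group that follows them.
function parseRobots(txt: string): RuleMap {
  const rules: RuleMap = new Map();
  let agents: string[] = [];
  let lastWasAgent = false;
  for (const raw of txt.split("\n")) {
    const line = raw.split("#")[0].trim(); // strip comments
    const m = line.match(/^(user-agent|disallow)\s*:\s*(.*)$/i);
    if (!m) continue;
    const [, key, value] = m;
    if (key.toLowerCase() === "user-agent") {
      if (!lastWasAgent) agents = []; // new group starts
      agents.push(value.toLowerCase());
      lastWasAgent = true;
    } else {
      lastWasAgent = false;
      if (!value) continue; // empty Disallow means "allow everything"
      for (const a of agents) {
        if (!rules.has(a)) rules.set(a, []);
        rules.get(a)!.push(value);
      }
    }
  }
  return rules;
}

// Is this bot blocked from the site root? A bot-specific group
// overrides the "*" wildcard group, per robots.txt convention.
function rootBlocked(rules: RuleMap, bot: string): boolean {
  const group = rules.get(bot.toLowerCase()) ?? rules.get("*") ?? [];
  return group.includes("/");
}
```

A `robots.txt` that allows `ClaudeBot` site-wide but contains `User-agent: GPTBot` / `Disallow: /` would flag GPTBot as blocked and ClaudeBot as allowed.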
### Scoring
- `score_ai_overview_eligibility` - predicts the likelihood of inclusion in Google AI Overviews
- `score_citation_worthiness` - composite score for LLM citation probability
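A composite score like this is typically a weighted sum of graded on-page signals, clamped to a 0-100 scale. The sketch below shows the shape of such a scorer; the signal names and weights are invented for illustration, and the server's real model is not documented here.

```typescript
// Hypothetical on-page signals feeding a citation-worthiness score.
interface PageSignals {
  hasFaqBlock: boolean;
  hasStructuredData: boolean;
  directAnswerInIntro: boolean;
  entityDensity: number;       // 0..1, graded
  queryShapedHeadings: number; // 0..1, share of H2s phrased like queries
}

// Invented weights summing to 100; a real model would tune these.
const WEIGHTS = {
  hasFaqBlock: 20,
  hasStructuredData: 20,
  directAnswerInIntro: 25,
  entityDensity: 15,
  queryShapedHeadings: 20,
};

function citationScore(s: PageSignals): number {
  const raw =
    (s.hasFaqBlock ? WEIGHTS.hasFaqBlock : 0) +
    (s.hasStructuredData ? WEIGHTS.hasStructuredData : 0) +
    (s.directAnswerInIntro ? WEIGHTS.directAnswerInIntro : 0) +
    s.entityDensity * WEIGHTS.entityDensity +
    s.queryShapedHeadings * WEIGHTS.queryShapedHeadings;
  return Math.round(Math.min(100, raw));
}
```

A page with an FAQ block and valid structured data but no direct-answer intro, moderate entity density (0.5), and 60% query-shaped headings would land in the same neighborhood as the 72/100 example further down.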
### Content
- `generate_llms_txt` - generates a draft `llms.txt` for a site
- `validate_llms_txt` - validates an existing `llms.txt` against the spec
- `extract_entities` - pulls named entities and semantic clusters from page content
- `rewrite_for_aeo` - rewrites a passage for Answer Engine Optimization
- `rewrite_for_geo` - rewrites content for Generative Engine Optimization
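Your MCP client issues tool invocations for you, but for reference, each call is a JSON-RPC `tools/call` request under the hood. A sketch of what invoking `audit_page` looks like on the wire; the `url` argument name is an assumption here, so check the server's actual schema via `tools/list`:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "audit_page",
    "arguments": { "url": "https://example.com/some-post/" }
  }
}
```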
## Install
Add to your `claude_desktop_config.json` (or the equivalent for Cursor / Cline):

```json
{
  "mcpServers": {
    "ai-seo": {
      "command": "npx",
      "args": ["-y", "@automatelab/ai-seo-mcp"]
    }
  }
}
```
Or run it once to test:

```bash
npx -y @automatelab/ai-seo-mcp
```
## Example
After adding the server, ask Claude to audit a page; it runs `audit_page` and returns something like:
```
AI-SEO audit: automatelab.tech/n8n-mcp-server/
Citation worthiness: 72/100
- FAQ block present: yes (4 questions)
- Structured data: HowTo schema detected, no deprecation issues
- Entity density: moderate - consider adding more named tools/versions
- H2 headings: 6 match common query phrasing
- AI-crawler access: GPTBot allowed, PerplexityBot allowed, ClaudeBot allowed
Top recommendation: Add a direct-answer paragraph in the first 150 words. AI Overviews pull from the top of the page first.
```
That output comes back inside your Claude conversation: no tab-switching, no dashboard login. You can then ask Claude to run `rewrite_for_aeo` on the intro block and re-audit in the same session.
## What's next
The next version will add per-tool caching, so repeated audits of the same URL skip re-fetching, and a `batch_audit` tool for running a full site check against a sitemap. Feedback welcome: open an issue on the repo or reply here.
Repo: https://github.com/AutomateLab-tech/ai-seo
Landing: https://automatelab.tech/products/mcp/ai-seo/