You have a robots.txt. You probably have a sitemap.xml. But do you have an llms.txt?
## What is llms.txt?
The llms.txt specification provides a machine-readable summary of your website for large language models. It's not about blocking crawlers — it's about helping AI understand your content.
## Basic Structure

```markdown
# Your Site Name
One-line description of what you do

Main Resources
- Page Title: Brief description
- Another Page: Brief description

Topics You Cover
- Topic 1
- Topic 2
- Topic 3
```

## Real Example

Here's mine:

```markdown
# Seekrates AI
AI consensus platform that queries multiple AI models and synthesises where they agree
Main Resources
- About: Company background and methodology
- Blog: AI-era SEO and content strategy
- Playbook: AI-Era SEO implementation guide
Topics We Cover
- Generative Engine Optimisation (GEO)
- Large Language Model Optimisation (LLMO)
- Answer Engine Optimisation (AEO)
- Multi-model AI consensus methodology
```

## Implementation
1. Create llms.txt in your root directory (same level as robots.txt) — a quick way to verify it's live is sketched after this list
2. Add your site description and key pages
3. Keep it under 500 lines
4. Optional: Create llms-full.txt for detailed documentation
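If you want to confirm the file is actually being served from your root, a short script like the one below does the job. This is a minimal sketch using only Python's standard library; the domain and the `check_llms_txt` helper are placeholders of my own, and the only structural check is that the first non-empty line is a markdown H1 (your site name).

```python
import sys
import urllib.request


def check_llms_txt(domain: str) -> bool:
    """Fetch /llms.txt from the given domain and run a couple of basic checks."""
    url = f"https://{domain}/llms.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except Exception as exc:
        print(f"Could not fetch {url}: {exc}")
        return False

    lines = [line for line in body.splitlines() if line.strip()]
    if not lines:
        print(f"{url} is empty")
        return False
    if not lines[0].startswith("# "):
        print(f"{url}: first line should be a markdown H1 with your site name")
        return False
    if len(body.splitlines()) > 500:
        print(f"{url}: over 500 lines — consider moving detail to llms-full.txt")

    print(f"{url} looks fine ({len(lines)} non-empty lines)")
    return True


if __name__ == "__main__":
    # Replace with your own domain; "example.com" is just a placeholder.
    check_llms_txt(sys.argv[1] if len(sys.argv) > 1 else "example.com")
```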
## Why Bother?
When someone asks Claude or ChatGPT about your domain, the AI draws from training data and accessible web content. A clear llms.txt is a structured signal about what you do.
Not magic. But a signal most competitors don't have.
Took me 15 minutes to create. Highest signal-to-effort ratio I've found for AI-era SEO.
Full implementation guide with schema markup and Rank Math settings: AI-Era SEO Playbook