tags: [ai, seo, webdev, llm]
You've heard of robots.txt. Now meet llms.txt - a markdown file designed to help AI models understand website content more effectively.
## The Problem It Solves
Large language models face challenges when processing websites:
- HTML is messy (nav, ads, scripts mixed with content)
- Websites are enormous (hundreds of pages)
- Every site structures information differently
## What is llms.txt?
Jeremy Howard from Answer.AI proposed this standard in September 2024. It's a markdown file served from your website root (at `/llms.txt`) that provides:
- Site overview
- Section descriptions
- Important links with context
- Information tailored for machine comprehension
## The Format
Plain markdown with sections for site name, tagline, about information, site structure, key pages, and contact details - all designed for machine readability.
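A minimal example following the proposed spec (an H1 for the site name, a blockquote tagline, then H2 sections of annotated links). The site name and URLs here are placeholders:

```markdown
# Example Site

> A one-line tagline that tells an AI model what this site is about.

Optional free-form notes about the site and how its content is organized.

## Docs

- [Getting Started](https://example.com/docs/start): Installation and setup
- [API Reference](https://example.com/docs/api): Endpoints and parameters

## Optional

- [Blog](https://example.com/blog): Occasional deep-dive posts
```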
## Should You Add One?
No major AI crawler has officially committed to consuming llms.txt, but companies like Anthropic, Vercel, and Cloudflare already publish one.
Add one if you:
- Have technical documentation
- Want early positioning in AI-native discovery
- Care about how AI represents your content
## Implementation in Next.js
Create a route handler with the App Router (a `route.ts` file inside an `app/llms.txt/` directory serves the `/llms.txt` path). The content can be a static string or generated programmatically, for example by listing your blog posts.
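A minimal sketch of such a route handler. The site name, URL, and post list are placeholders; in a real app the posts would come from your CMS or filesystem:

```typescript
// app/llms.txt/route.ts — serves /llms.txt from the Next.js App Router.
// Site name, siteUrl, and posts below are illustrative placeholders.

type Post = { title: string; slug: string; summary: string };

// Stand-in data; replace with a real content source.
const posts: Post[] = [
  { title: "Hello llms.txt", slug: "hello-llms-txt", summary: "Intro post" },
];

// Build the markdown body separately so it is easy to test.
export function buildLlmsTxt(siteUrl: string, posts: Post[]): string {
  const lines = [
    "# Example Site",
    "",
    "> A short tagline for AI readers.",
    "",
    "## Blog",
    "",
    ...posts.map(
      (p) => `- [${p.title}](${siteUrl}/blog/${p.slug}): ${p.summary}`
    ),
  ];
  return lines.join("\n") + "\n";
}

export function GET(): Response {
  return new Response(buildLlmsTxt("https://example.com", posts), {
    headers: { "Content-Type": "text/markdown; charset=utf-8" },
  });
}
```

Keeping the markdown builder as a pure function makes the output unit-testable without spinning up a server.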
## Tools and Resources
- llms_txt2ctx CLI
- VitePress and Docusaurus plugins
- GitBook's auto-generation feature
Whether or not llms.txt achieves universal adoption, the need for structured, machine-readable content won't disappear as AI becomes a primary channel for content discovery.