Large Language Models (LLMs) don’t “browse” websites like humans. They fetch targeted context and synthesize answers. That’s where llms.txt comes in: a compact, human‑ and LLM‑readable map that highlights the most important, LLM‑friendly pages of your site.
Think of it as a "sitemap for AI"—but curated. Where robots.txt tells crawlers what not to crawl, llms.txt tells LLMs what is most worth reading and how to read it efficiently.
What llms.txt Is (and What It Isn’t)
It is:
- A Markdown file at the root of your site (usually at /llms.txt).
- A curated index of high‑signal pages, ideally with Markdown mirrors (.md) for clean parsing.
- Structured to be easy for agents and tools to parse quickly.
It isn’t:
- A full sitemap clone or a complete crawl target.
- A ranking/SEO silver bullet.
- A magic switch that all LLMs use automatically today.
The practical value today: it gives your developers and AI tools a stable, predictable way to fetch the right docs, examples, and references—on demand, with fewer tokens and fewer hallucinations.
State of Adoption (Reality Check)
- Many early implementations copy a sitemap into Markdown. This is not LLM‑friendly and defeats the purpose. Curate, don’t dump.
- Major LLM providers vary in how they use llms.txt right now. Adoption is growing in the developer‑tools ecosystem (IDEs, MCP servers, agents) because it immediately improves answer quality and traceability.
- Low effort, immediate benefits: Even a minimal, well‑curated llms.txt helps your team’s agents today, regardless of broad search engine adoption.
Bottom line: Don’t wait for a “universal mandate.” Make your docs better for your own agents and developer workflows now.
A Good llms.txt: Structure and Conventions
Aim for clarity and token efficiency. A typical llms.txt:
- H1 title
- Optional one‑sentence summary (blockquote)
- H2 sections grouping related links
- Bulleted links to the most helpful pages, preferably .md versions
- An “Optional” or “Additional” section for nice‑to‑have pages
Example skeleton:
# Acme Docs
> Official developer documentation for Acme APIs, SDKs, and integrations.
## Getting Started
- [Overview](https://docs.acme.com/overview.md)
- [Quickstart](https://docs.acme.com/quickstart.md)
## API
- [API Overview](https://docs.acme.com/api.md)
- [Authentication](https://docs.acme.com/api/auth.md)
- [Errors](https://docs.acme.com/api/errors.md)
## Webhooks
- [Webhooks](https://docs.acme.com/webhooks.md)
- [Signatures](https://docs.acme.com/webhooks/signature.md)
## Optional
- [Changelog](https://docs.acme.com/changelog.md)
Key tips:
- Prefer .md mirrors where possible.
- Keep sections short and editorially selected (10–50 links beats 500).
- Link canonical pages, not marketing LPs or UI‑heavy content.
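A curated file is also easy to sanity-check mechanically. As a sketch (the parsing rules below are an assumption for illustration, not part of any llms.txt spec), you can extract each bulleted link and flag entries that don't point at a .md mirror:

```javascript
// Quick lint sketch for an llms.txt file: pull out bulleted Markdown links
// and note which ones point at .md mirrors. Regex rules are illustrative.
function lintLlmsTxt(text) {
  const linkRe = /^\s*-\s*\[([^\]]+)\]\((https?:\/\/[^)\s]+)\)/;
  const links = [];
  for (const line of text.split("\n")) {
    const m = line.match(linkRe);
    if (m) links.push({ title: m[1], url: m[2], isMd: m[2].endsWith(".md") });
  }
  return links;
}

const sample = [
  "# Acme Docs",
  "- [Overview](https://docs.acme.com/overview.md)",
  "- [Pricing](https://acme.com/pricing)",
].join("\n");

console.log(lintLlmsTxt(sample));
```

Running something like this before publishing catches non-Markdown targets (and, with a fetch added, dead links) early.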
The Fastest Path: Autogenerate llms.txt with llms.page
If you don’t have llms.txt yet, you can bootstrap one automatically.
- Visit: llms.page
- Or fetch directly from your server, on demand:
https://get.llms.page/{yourdomain}/llms.txt
Example:
https://get.llms.page/github.com/llms.txt
How it works:
- Send a GET request to https://get.llms.page/example.com/llms.txt.
- The service analyzes your homepage/internal links and produces a curated llms.txt‑style Markdown file on the fly.
- You can fetch and serve that response directly from your own domain, or cache it locally if you prefer. No scheduler required.
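If you go the cache-locally route, a minimal sketch (Node 18+ with built-in fetch; the TTL, cache path, and function name are assumptions, not a fixed API) might look like:

```javascript
// Fetch-with-cache sketch: serve a cached copy if it is fresh enough,
// otherwise refetch from get.llms.page and update the cache file.
import fs from "node:fs/promises";

const TTL_MS = 24 * 60 * 60 * 1000; // refresh at most once a day

async function getLlmsTxt(domain, cachePath = "./llms.txt.cache") {
  try {
    const stat = await fs.stat(cachePath);
    if (Date.now() - stat.mtimeMs < TTL_MS) {
      return fs.readFile(cachePath, "utf8"); // cache is fresh: serve it
    }
  } catch {
    // no cache file yet: fall through to fetch
  }
  const res = await fetch(`https://get.llms.page/${domain}/llms.txt`);
  if (!res.ok) throw new Error(`Upstream returned ${res.status}`);
  const text = await res.text();
  await fs.writeFile(cachePath, text, "utf8");
  return text;
}
```

You'd call `getLlmsTxt("yourdomain.com")` from whatever handler serves /llms.txt; the upstream is only hit when the cache is missing or stale.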
Note: Auto‑generated files are a great starting point. For best results, review and prune to emphasize the most useful pages and, when possible, swap links to .md mirrors.
Integration Options
You have two easy ways to publish:
1) Redirect your /llms.txt to llms.page
- Pros: Zero backend changes, always fresh.
- Cons: Depends on a third‑party service.
Nginx example:
location = /llms.txt {
    return 302 https://get.llms.page/$host/llms.txt;
}
Apache example (mod_rewrite, since plain Redirect only supports %{HTTP_HOST} via expression syntax on 2.4.19+):
RewriteEngine On
RewriteRule ^/?llms\.txt$ https://get.llms.page/%{HTTP_HOST}/llms.txt [R=302,L]
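If your site is served by a Node process rather than sitting behind Nginx or Apache, the same 302 is a few lines. A sketch (port and hostname fallback are illustrative):

```javascript
// Plain Node server that 302-redirects /llms.txt to get.llms.page,
// preserving the incoming Host header in the target URL.
import http from "node:http";

const server = http.createServer((req, res) => {
  if (req.url === "/llms.txt") {
    const host = req.headers.host ?? "example.com";
    res.writeHead(302, { Location: `https://get.llms.page/${host}/llms.txt` });
    return res.end();
  }
  res.writeHead(404);
  res.end();
});

server.listen(8080);
```

The same handler drops into most Node frameworks; only the route registration differs.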
2) Fetch and serve from your domain
- Pros: Full control, can cache, works offline, stable.
- Cons: You’re responsible for when/how you refresh.
Simple fetch‑and‑serve (Node, on request or during build/deploy):
#!/usr/bin/env bash
# fetch_llms_txt.sh — pass your domain as the first argument
node fetch_llms_txt.mjs "$1"
// fetch_llms_txt.mjs
import fs from "node:fs/promises";
import https from "node:https";

const domain = process.argv[2] || "example.com";
const url = `https://get.llms.page/${domain}/llms.txt`;

https.get(url, (res) => {
  let data = "";
  res.on("data", (chunk) => (data += chunk));
  res.on("end", async () => {
    if (res.statusCode === 200) {
      await fs.mkdir("./public", { recursive: true }); // ensure output dir exists
      await fs.writeFile("./public/llms.txt", data, "utf8");
      console.log("Wrote ./public/llms.txt");
    } else {
      console.error("Failed:", res.statusCode, data.slice(0, 500));
      process.exit(1);
    }
  });
});
- Serve ./public/llms.txt at https://yourdomain.com/llms.txt.
- Optional: add a periodic refresh (cron) if you want it to stay fresh without redeploys:
0 2 * * * /usr/local/bin/bash /path/fetch_llms_txt.sh yourdomain.com
Why Bother Now? Practical Benefits
- Faster, cleaner answers for your team’s AI tools and IDEs
- Lower token usage by pointing agents to clean Markdown
- Less hallucination, more citations
- Easy to automate and keep up to date
Bonus: Plug llms.txt into Your MCP Workflow
If you already use the Model Context Protocol (MCP), you can let your IDE/agent fetch the right docs automatically.
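Many MCP clients (Claude Desktop, Cursor, and others) are configured via a JSON mcpServers block. A sketch of what that could look like; the "acme-docs" name and the your-llms-txt-mcp-server package are hypothetical placeholders for whichever llms.txt-aware MCP server you actually use:

```json
{
  "mcpServers": {
    "acme-docs": {
      "command": "npx",
      "args": ["-y", "your-llms-txt-mcp-server", "https://docs.acme.com/llms.txt"]
    }
  }
}
```

Once registered, the agent can resolve questions against the pages your llms.txt curates instead of guessing from training data.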

See also: "Integrate Any LLMs.txt into Your MCP (with Stripe as the Example)" (Everyday Dev, Sep 1).
FAQ
Will this help SEO?
- Indirectly at best. The main benefit is better agent performance and more reliable, cited answers for your users and teams.
Do I need perfect Markdown mirrors?
- No, but they help a lot. Start with HTML if you must, then add .md mirrors over time.
What if my site structure changes often?
- Use the llms.page redirect, fetch‑and‑serve on demand, or add an optional scheduled refresh to keep /llms.txt fresh with minimal effort.
Conclusion
llms.txt is low‑effort leverage: a lightweight, curated map that makes your content easier for agents to consume accurately and efficiently. Autogenerate it today, polish the top sections, and wire it into your MCP workflow so your team gets faster, source‑backed answers—no heavy infrastructure required.