Matthew Hou

llms.txt Is the New robots.txt — Why Every Developer Should Care

For 30 years, robots.txt has been the handshake between websites and crawlers. "Here's what you can index. Here's what you can't."

Now there's a new file showing up in website roots: llms.txt. And it flips the script entirely.

Instead of telling bots what not to read, llms.txt tells AI what it should read — a curated, structured summary of your site designed specifically for language models.

This just hit the top of Hacker News with 800+ points, and the implications are bigger than most people realize.

What Is llms.txt?

It's a Markdown file served at your site root (/llms.txt). Here's the basic shape:

```
# Your Site Name

> A one-line description of what this site/project is.

## Docs

- [Getting Started](https://yoursite.com/docs/getting-started.md): Quick setup guide
- [API Reference](https://yoursite.com/docs/api.md): Full API documentation
- [Examples](https://yoursite.com/docs/examples.md): Code examples and tutorials

## Optional

- [Blog](https://yoursite.com/blog/index.md): Latest posts
- [Changelog](https://yoursite.com/changelog.md): Version history
```

That's it. A structured table of contents with descriptions, optimized for LLM context windows.
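Because the format is this simple, consuming it takes only a few lines. Here's a sketch of what an agent-side parser might look like — `parse_llms_txt` is a hypothetical helper of my own, not part of any official tooling:

```python
import re

def parse_llms_txt(text: str) -> dict:
    """Parse an llms.txt document into a title, summary, and link sections."""
    result = {"title": None, "summary": None, "sections": {}}
    current = None
    for line in text.splitlines():
        line = line.strip()
        if line.startswith("# ") and result["title"] is None:
            result["title"] = line[2:].strip()          # the single H1
        elif line.startswith("> ") and result["summary"] is None:
            result["summary"] = line[2:].strip()        # the blockquote summary
        elif line.startswith("## "):
            current = line[3:].strip()                  # a section like "Docs"
            result["sections"][current] = []
        elif line.startswith("- ") and current:
            # Link entries look like: - [Title](url): optional description
            m = re.match(r"- \[(.+?)\]\((.+?)\)(?::\s*(.*))?", line)
            if m:
                title, url, desc = m.groups()
                result["sections"][current].append(
                    {"title": title, "url": url, "description": desc or ""}
                )
    return result
```

An agent can fetch the file, parse it like this, and then pull in only the linked docs relevant to the user's question.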

Why This Exists

Here's the problem it solves:

When you ask Claude or GPT "how do I use Library X?", the AI either:

  1. Hallucinates based on training data (often outdated)
  2. Searches the web and gets buried in SEO spam, Stack Overflow threads, and blog posts from 2019

llms.txt gives AI agents a clean, authoritative entry point. Instead of scraping your entire site and guessing what matters, the AI reads your curated summary and knows exactly where to look.

The Shift: Optimizing for AI, Not Humans

Think about what this means:

robots.txt era (1994-2024):

  • Websites optimized for Google's crawler
  • SEO = make Google understand your content
  • Users find you through search results

llms.txt era (2025+):

  • Websites optimized for AI agents
  • LLMO (LLM Optimization) = make AI understand your content
  • Users find you through AI conversations

This is the beginning of a new kind of SEO — one where your audience isn't humans browsing search results, but AI agents making recommendations.

How to Implement It (5 Minutes)

1. Create the file

```
# My Developer Tool

> A CLI tool that automates database migrations for PostgreSQL.

## Docs

- [Installation](https://mytool.dev/docs/install.md): Setup for Mac, Linux, Windows
- [Quick Start](https://mytool.dev/docs/quickstart.md): Your first migration in 2 minutes
- [Configuration](https://mytool.dev/docs/config.md): All config options explained
- [CLI Reference](https://mytool.dev/docs/cli.md): Every command and flag

## Optional

- [Troubleshooting](https://mytool.dev/docs/troubleshooting.md): Common issues
- [Changelog](https://mytool.dev/CHANGELOG.md): Release history
```

2. Host it

Drop it at your site root: https://yoursite.com/llms.txt
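For a static site this is just a copy into your build output. A sketch, assuming a `public/` directory is what your host serves as the web root (adjust the path for your setup):

```shell
# Create the file and drop it into the directory your host serves
# as the web root ("public/" here is just an example).
mkdir -p public
cat > public/llms.txt <<'EOF'
# My Developer Tool

> A CLI tool that automates database migrations for PostgreSQL.
EOF

# After deploying, it should be reachable at the site root:
#   curl -fsS https://yoursite.com/llms.txt
```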

3. Point to full content

The URLs in llms.txt should point to clean markdown or plain text — not HTML pages full of navigation, ads, and JavaScript. Some sites create a /llms-full.txt with the complete content concatenated.
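If you go the /llms-full.txt route, it's easy to generate at build time. A minimal sketch — `build_llms_full` and the comment-marker convention are my own, not part of the proposal:

```python
from pathlib import Path

def build_llms_full(doc_paths, out_path="llms-full.txt"):
    """Concatenate clean markdown docs into a single llms-full.txt.

    A comment line marks where each source document begins, so an LLM
    reading the concatenated file can tell the documents apart.
    """
    parts = []
    for path in doc_paths:
        text = Path(path).read_text(encoding="utf-8")
        parts.append(f"<!-- source: {path} -->\n{text.strip()}\n")
    Path(out_path).write_text("\n".join(parts), encoding="utf-8")
    return out_path
```

Run it over the same markdown files your llms.txt links to, and serve the result next to it at the site root.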

Who's Already Doing This?

The proposal is gaining traction fast. Some early adopters:

  • Documentation sites for developer tools
  • Open source projects
  • API providers
  • Technical blogs

It's still early — which means if you implement it now, you're ahead of 99% of the web.

The Bigger Picture

We're watching the web bifurcate in real time:

  1. Human web: Visual, interactive, JavaScript-heavy, optimized for engagement
  2. AI web: Structured, clean, text-first, optimized for comprehension

Sites that serve both audiences will win. llms.txt is one of the first proposed standards for the AI web, and it won't be the last.

If you maintain any developer-facing content — docs, APIs, tools, libraries — add an llms.txt today. It takes 5 minutes, and it's the cheapest way to make your project discoverable in the age of AI.


Already added llms.txt to your project? What was your experience? Any tips for structuring it well? Share in the comments.
