The agents.txt IETF draft just expired this morning. If you deployed it, here is what to know.
What agents.txt was
agents.txt was a small spec inspired by robots.txt. The idea: drop a plain-text file at /.well-known/agents.txt declaring which automated agents your site permits, what content endpoints they should hit, and what capabilities you expose. It was the simplest possible answer to "how does an AI agent figure out what to do here without scraping HTML."
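To make that concrete, a file in the spirit of the draft might look something like the sketch below. The field names (Agent, Allow, Capability, Contact) are illustrative guesses modeled on robots.txt, not keywords quoted from the expired draft:

```
# /.well-known/agents.txt (hypothetical example)
Agent: *
Allow: /api/
Capability: search

Agent: example-crawler
Allow: /docs/
Contact: ops@example.com
```

The appeal was exactly this plainness: one HTTP GET, no HTML parsing, no handshake.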
The draft hit the IETF datatracker in late 2025 as draft-agents-txt-00. It was never going to be a heavyweight standard. It was a 14-page document trying to claim a /.well-known slot before anyone else did.
It expired today, April 10, 2026, at the standard six-month mark. No -01 revision was filed.

Why it expired
The honest answer: a working group never coalesced around it. There was a mailing list, two interim meetings, and one long thread arguing over whether User-Agent strings or signed manifests should be the identity primitive. The author moved on. Nobody picked it up.
This is normal for IETF individual submissions. Most expire. The interesting question is what fills the gap.
What is still live
If you deployed a /.well-known/agents.txt file, you do not need to take it down. Several tools, including the validator I help maintain at global-chat.io/validate, still parse it. The format itself is fine; it is the IETF status that lapsed, not the file.
A few practical things you can do today:
- Keep the file deployed. Anything that was reading it yesterday is still reading it.
- Run it through a validator to catch syntax drift before tools start fragmenting their parsers.
- Cross-reference your declared endpoints with whatever agent registry you point to.
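As a sketch of what the validation step involves, here is a minimal parser for a robots.txt-style record format. The syntax it assumes (key: value lines, `#` comments, blank-line record separators) is an illustration of the general shape, not the expired draft's actual grammar:

```python
def parse_agents_txt(text):
    """Parse key: value records separated by blank lines.

    Returns a list of dicts mapping lowercased keys to lists of
    values (a key may repeat within one record, e.g. Allow).
    Raises ValueError on a line with no colon, which is the kind
    of syntax drift a validator should catch early.
    """
    records, current = [], {}
    for raw in text.splitlines():
        line = raw.split("#", 1)[0].strip()  # drop comments
        if not line:
            if current:  # blank line closes the current record
                records.append(current)
                current = {}
            continue
        if ":" not in line:
            raise ValueError(f"malformed line: {raw!r}")
        key, value = (part.strip() for part in line.split(":", 1))
        current.setdefault(key.lower(), []).append(value)
    if current:
        records.append(current)
    return records


sample = """\
# hypothetical example
Agent: *
Allow: /api/
Capability: search

Agent: example-crawler
Allow: /docs/
"""

for record in parse_agents_txt(sample):
    print(record)
```

Even a twenty-line parser like this is enough to catch the common failure mode: hand-edited files drifting into a shape that different tools quietly parse differently.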
How the competing standards compare
A short field guide for the post-agents.txt world.
llms.txt is the Answer.AI proposal, since adopted by Anthropic's docs among others, focused on documentation surface for LLM crawlers. Narrower scope. Adoption is slowly growing on docs sites.
MCP server manifests expose capabilities through a server handshake instead of a static file. Different layer of the stack. Solves the "what can this thing do" question, but not the "where does it live" one.
A2A is Google's draft on inter-agent messaging rather than discovery proper. Useful once you have already found the other agent.
Web bot auth and the signed-agents drafts at IETF are working on cryptographic identity for automated traffic. Closer to what agents.txt should have been if it had ever shipped a security model.
None of these covers the full discovery surface that agents.txt was reaching for. If you want a side-by-side view, I keep one updated at global-chat.io/discovery-landscape.
What I would actually do
If you are a site operator who already shipped agents.txt: leave it. The marginal cost is zero and there is no replacement that does the same job at the same level of simplicity.
If you are starting fresh today: ship llms.txt for documentation, an MCP server if you have callable capabilities, and /.well-known/agents.txt as a fallback declaration. They cover different layers and do not conflict.
The IETF process is slow, and letting drafts expire is part of how it works. The real signal is not that agents.txt died; it is that nobody has yet shipped the thing that replaces it cleanly. Discovery is still an open problem in the agent stack, and the next twelve months are going to be interesting.
If you have an agents.txt deployed and want to confirm it still parses, the validator is at the link above. No login, no signup, paste the file in.