<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Everyday Dev</title>
    <description>The latest articles on DEV Community by Everyday Dev (@everyday_dev).</description>
    <link>https://dev.to/everyday_dev</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2968354%2Fe8ee31c0-f4a7-4fa5-91ed-6ed7c2020f65.jpg</url>
      <title>DEV Community: Everyday Dev</title>
      <link>https://dev.to/everyday_dev</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/everyday_dev"/>
    <language>en</language>
    <item>
      <title>Integrate Any LLMs.txt into Your MCP (with Stripe as the Example)</title>
      <dc:creator>Everyday Dev</dc:creator>
      <pubDate>Mon, 01 Sep 2025 11:50:13 +0000</pubDate>
      <link>https://dev.to/everyday_dev/integrate-any-llmstxt-into-your-mcp-with-stripe-as-the-example-j33</link>
      <guid>https://dev.to/everyday_dev/integrate-any-llmstxt-into-your-mcp-with-stripe-as-the-example-j33</guid>
      <description>&lt;p&gt;If you’re building or using AI agents, you want answers that are current, source‑backed, and token‑efficient. That’s exactly what llms.txt plus the Model Context Protocol (MCP) delivers: a way to discover the right docs, fetch only what you need, and wire it straight into your IDE or agent host.&lt;/p&gt;

&lt;p&gt;In this guide, you’ll learn:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://dev.to/everyday_dev/is-the-llmstxt-a-waste-of-time-59ja"&gt;What llms.txt is and why it matters&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;How to consume any llms.txt (we’ll use Stripe’s as an example)&lt;/li&gt;
&lt;li&gt;How to integrate docs via an MCP server into Cursor, Windsurf, or Claude Desktop&lt;/li&gt;
&lt;li&gt;Best practices for security, performance, and governance&lt;/li&gt;
&lt;li&gt;Troubleshooting tips and advanced patterns&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We’ll use Stripe’s public llms.txt at:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;a href="https://docs.stripe.com/llms.txt" rel="noopener noreferrer"&gt;https://docs.stripe.com/llms.txt&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  What is llms.txt?
&lt;/h2&gt;

&lt;p&gt;llms.txt is a human‑ and LLM‑readable Markdown file at the root of a docs site (for example, &lt;code&gt;/llms.txt&lt;/code&gt;). It’s like a compact, curated “sitemap for AI” that lists the most important, LLM‑friendly pages—often with &lt;code&gt;.md&lt;/code&gt; mirrors for clean parsing and minimal tokens.&lt;/p&gt;

&lt;p&gt;Why it’s useful:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Curated: Points your agent at the best sources, not every page&lt;/li&gt;
&lt;li&gt;Efficient: Markdown mirrors parse cleanly and compress well&lt;/li&gt;
&lt;li&gt;Reliable: Reduces hallucinations by nudging agents to fetch real docs&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Core structure (typical):&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;H1 title&lt;/li&gt;
&lt;li&gt;Optional summary in a blockquote&lt;/li&gt;
&lt;li&gt;H2 sections with bullet‑listed links: &lt;code&gt;[Title](URL) — optional note&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Optional section (commonly titled “Optional” or “Additional”) for nice‑to‑have pages&lt;/li&gt;
&lt;/ul&gt;
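&lt;p&gt;For a concrete feel, here is a tiny, hypothetical llms.txt in that shape (the domain and pages are invented for illustration; a real file like Stripe's is much larger):&lt;/p&gt;

```markdown
# Acme Payments Documentation

> Concise index of the LLM-friendly pages in the Acme Payments docs.

## API Reference

- [API overview](https://docs.example.com/api.md) — endpoints, auth, errors
- [Versioning](https://docs.example.com/api/versioning.md)

## Optional

- [Changelog](https://docs.example.com/changelog.md) — secondary content
```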

&lt;p&gt;Try opening Stripe’s: &lt;a href="https://docs.stripe.com/llms.txt" rel="noopener noreferrer"&gt;https://docs.stripe.com/llms.txt&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  How llms.txt and MCP Work Together
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;llms.txt: Discovery layer. It tells agents where the “good stuff” lives.&lt;/li&gt;
&lt;li&gt;MCP (Model Context Protocol): Execution layer. It standardizes how an agent host (like an IDE) talks to tools/servers that can read resources (docs), call tools (APIs), and apply prompts.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Put simply: Use an MCP “docs server” that knows how to read llms.txt and fetch the linked pages on demand. Your IDE/agent then calls MCP tools to grab only the pages it needs to answer a question.&lt;/p&gt;

&lt;p&gt;We’ll use an off‑the‑shelf server (mcpdoc) to make this easy.&lt;/p&gt;




&lt;h2&gt;
  
  
  Quick Start: Consume Stripe’s llms.txt with an MCP Server
&lt;/h2&gt;

&lt;p&gt;We’ll run a local MCP server that:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Registers a “Stripe Docs” source via its llms.txt&lt;/li&gt;
&lt;li&gt;Exposes tools to list sources and fetch docs&lt;/li&gt;
&lt;li&gt;Speaks stdio (for IDEs) or SSE (for browser tooling/inspection)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Prerequisites
&lt;/h3&gt;

&lt;p&gt;Install uv (a fast Python package manager; its &lt;code&gt;uvx&lt;/code&gt; command runs tools on demand):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;curl &lt;span class="nt"&gt;-LsSf&lt;/span&gt; https://astral.sh/uv/install.sh | sh
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Verify:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;uvx &lt;span class="nt"&gt;--version&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Option A: Run Over SSE (Great for Inspection)
&lt;/h3&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;uvx &lt;span class="nt"&gt;--from&lt;/span&gt; mcpdoc mcpdoc &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--urls&lt;/span&gt; &lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--transport&lt;/span&gt; sse &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--port&lt;/span&gt; 8082 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--host&lt;/span&gt; localhost
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;ul&gt;
&lt;li&gt;The server will fetch and index Stripe’s llms.txt.&lt;/li&gt;
&lt;li&gt;You can point an MCP inspector at it to explore tools/resources.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Optional: Open the MCP Inspector in another terminal:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;npx @modelcontextprotocol/inspector
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then connect to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;URL: &lt;code&gt;http://localhost:8082&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Option B: Run Over stdio (For IDE Integration)
&lt;/h3&gt;

&lt;p&gt;Many IDEs/hosts expect stdio transport. Just switch the flag:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;uvx &lt;span class="nt"&gt;--from&lt;/span&gt; mcpdoc mcpdoc &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--urls&lt;/span&gt; &lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--transport&lt;/span&gt; stdio
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;We’ll wire this into popular IDEs next.&lt;/p&gt;




&lt;h2&gt;
  
  
  Integrate with Your IDE (Cursor, Windsurf, Claude Desktop)
&lt;/h2&gt;

&lt;p&gt;Below are minimal configurations that register the server and nudge the agent to fetch docs before answering.&lt;/p&gt;

&lt;h3&gt;
  
  
  Cursor
&lt;/h3&gt;

&lt;p&gt;Edit or create &lt;code&gt;~/.cursor/mcp.json&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"stripe-docs-mcp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"uvx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--from"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--urls"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--transport"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"stdio"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Optional: Add a simple “use‑the‑docs‑first” rule in Cursor’s User Rules:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;For any questions about Stripe, use the MCP server "stripe-docs-mcp":
1) call list_doc_sources
2) call fetch_docs for Stripe's llms.txt to see curated pages
3) select and fetch the most relevant .md pages
4) answer citing the fetched URLs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  Windsurf (Codeium)
&lt;/h3&gt;

&lt;p&gt;Edit &lt;code&gt;~/.codeium/windsurf/mcp_config.json&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"stripe-docs-mcp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"uvx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--from"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--urls"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--transport"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"stdio"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Add a corresponding “fetch before answer” instruction in Windsurf’s settings.&lt;/p&gt;

&lt;h3&gt;
  
  
  Claude Desktop
&lt;/h3&gt;

&lt;p&gt;On macOS, edit:&lt;br&gt;
&lt;code&gt;~/Library/Application Support/Claude/claude_desktop_config.json&lt;/code&gt;&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="nl"&gt;"mcpServers"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="nl"&gt;"stripe-docs-mcp"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"command"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="s2"&gt;"uvx"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--from"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--urls"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"--transport"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
        &lt;/span&gt;&lt;span class="s2"&gt;"stdio"&lt;/span&gt;&lt;span class="w"&gt;
      &lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
    &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Tip: If Python path issues arise, add an explicit interpreter:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight json"&gt;&lt;code&gt;&lt;span class="nl"&gt;"args"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;&lt;span class="w"&gt; &lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"--python"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"/usr/bin/python3"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"--from"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"mcpdoc"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"--urls"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"--transport"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt;&lt;span class="w"&gt;
  &lt;/span&gt;&lt;span class="s2"&gt;"stdio"&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;&lt;span class="w"&gt;
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;
  
  
  How to Use It in Practice (Agent Flow)
&lt;/h2&gt;

&lt;p&gt;Ask your IDE agent something like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;“How does Stripe recommend handling API version upgrades?”&lt;/li&gt;
&lt;li&gt;“Show me how to verify webhook signatures with Stripe.”&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The ideal flow:&lt;br&gt;
1) list_doc_sources → confirms Stripe source is registered&lt;br&gt;
2) fetch_docs on &lt;code&gt;https://docs.stripe.com/llms.txt&lt;/code&gt; → returns curated links&lt;br&gt;
3) fetch_docs on the most relevant &lt;code&gt;.md&lt;/code&gt; pages (for example, &lt;code&gt;api/versioning.md&lt;/code&gt;, &lt;code&gt;upgrades.md&lt;/code&gt;, &lt;code&gt;webhooks.md&lt;/code&gt;, &lt;code&gt;webhooks/signature.md&lt;/code&gt;)&lt;br&gt;
4) Agent answers and cites the exact URLs it fetched&lt;/p&gt;
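&lt;p&gt;Step 3 is easy to approximate in code. This hedged sketch (illustrative only, not part of mcpdoc) extracts &lt;code&gt;[Title](URL)&lt;/code&gt; links from an llms.txt body and ranks them by keyword overlap with the question:&lt;/p&gt;

```python
import re

# Matches the "[Title](URL)" link format that llms.txt sections use.
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)\s]+)\)")

def rank_links(llms_txt: str, question: str, top_n: int = 3) -> list[str]:
    """Score each [Title](URL) entry by keyword overlap with the question."""
    # Keep only meaningful question words (skip "how", "do", "i", ...).
    words = {w for w in re.findall(r"[a-z]+", question.lower()) if len(w) > 3}
    scored = []
    for title, url in LINK_RE.findall(llms_txt):
        hay = (title + " " + url).lower()
        score = sum(1 for w in words if w in hay)
        scored.append((score, url))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [url for score, url in scored[:top_n] if score > 0]

sample = (
    "- [Webhooks](https://docs.stripe.com/webhooks.md)\n"
    "- [Testing](https://docs.stripe.com/testing.md)"
)
print(rank_links(sample, "How do I verify webhook signatures?"))
# → ['https://docs.stripe.com/webhooks.md']
```

A real agent does this ranking implicitly from the page titles; the point is that llms.txt makes the candidate set small enough to reason over.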

&lt;p&gt;This approach yields:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Current answers directly grounded in Stripe docs&lt;/li&gt;
&lt;li&gt;Minimal token waste&lt;/li&gt;
&lt;li&gt;Clear traceability via MCP tool logs&lt;/li&gt;
&lt;/ul&gt;


&lt;h2&gt;
  
  
  Useful Stripe Pages Typically Found via llms.txt
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;API overview: &lt;code&gt;https://docs.stripe.com/api.md&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Versioning: &lt;code&gt;https://docs.stripe.com/api/versioning.md&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Upgrades: &lt;code&gt;https://docs.stripe.com/upgrades.md&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Testing: &lt;code&gt;https://docs.stripe.com/testing.md&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Webhooks: &lt;code&gt;https://docs.stripe.com/webhooks.md&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Webhook signatures: &lt;code&gt;https://docs.stripe.com/webhooks/signature.md&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Connect: &lt;code&gt;https://docs.stripe.com/connect.md&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Note: Always rely on the live &lt;code&gt;llms.txt&lt;/code&gt; to see the authoritative, current set.&lt;/p&gt;


&lt;h2&gt;
  
  
  Advanced: Combine Multiple llms.txt Sources
&lt;/h2&gt;

&lt;p&gt;Your work might span multiple ecosystems (e.g., Stripe + LangChain + your internal docs). Point the server at several llms.txt files:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;uvx &lt;span class="nt"&gt;--from&lt;/span&gt; mcpdoc mcpdoc &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--urls&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="s2"&gt;"LangChain:https://python.langchain.com/llms.txt"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="s2"&gt;"LangGraph:https://langchain-ai.github.io/langgraph/llms.txt"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--transport&lt;/span&gt; sse &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--port&lt;/span&gt; 8082 &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--host&lt;/span&gt; localhost
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Your agent can now fetch across these sources, still grounded in curated pages.&lt;/p&gt;




&lt;h2&gt;
  
  
  Building Your Own Docs MCP Server (Optional)
&lt;/h2&gt;

&lt;p&gt;If you prefer custom logic or internal/private docs:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Expose resources:

&lt;ul&gt;
&lt;li&gt;One resource for the llms.txt itself&lt;/li&gt;
&lt;li&gt;One resource per linked &lt;code&gt;.md&lt;/code&gt; page (on demand)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Expose tools:

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;list_doc_sources&lt;/code&gt; to enumerate registered llms.txt endpoints&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;fetch_docs(urls: string[])&lt;/code&gt; to retrieve pages as needed&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Add authentication for private sources (API keys, OAuth) and enforce domain allowlists.&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;Skeleton (Python) with a pseudo‑MCP server outline:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight python"&gt;&lt;code&gt;&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;asyncio&lt;/span&gt;
&lt;span class="kn"&gt;from&lt;/span&gt; &lt;span class="n"&gt;typing&lt;/span&gt; &lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;
&lt;span class="kn"&gt;import&lt;/span&gt; &lt;span class="n"&gt;httpx&lt;/span&gt;

&lt;span class="n"&gt;ALLOWED_DOMAINS&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;docs.stripe.com&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;fetch_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;-&amp;gt;&lt;/span&gt; &lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
    &lt;span class="n"&gt;domain&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;split&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;/&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt;
    &lt;span class="k"&gt;if&lt;/span&gt; &lt;span class="n"&gt;domain&lt;/span&gt; &lt;span class="ow"&gt;not&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;ALLOWED_DOMAINS&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;raise&lt;/span&gt; &lt;span class="nc"&gt;ValueError&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="sa"&gt;f&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Domain not allowed: &lt;/span&gt;&lt;span class="si"&gt;{&lt;/span&gt;&lt;span class="n"&gt;domain&lt;/span&gt;&lt;span class="si"&gt;}&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
    &lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;with&lt;/span&gt; &lt;span class="n"&gt;httpx&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nc"&gt;AsyncClient&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;timeout&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="mi"&gt;30&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="n"&gt;r&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="n"&gt;client&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;headers&lt;/span&gt;&lt;span class="o"&gt;=&lt;/span&gt;&lt;span class="p"&gt;{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Accept&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;text/markdown,text/plain,*/*&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;raise_for_status&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt;
        &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;r&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;text&lt;/span&gt;

&lt;span class="c1"&gt;# Pseudo-MCP handlers:
&lt;/span&gt;&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;list_doc_sources&lt;/span&gt;&lt;span class="p"&gt;():&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="p"&gt;[{&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;label&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;Stripe&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;url&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;https://docs.stripe.com/llms.txt&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;}]&lt;/span&gt;

&lt;span class="k"&gt;async&lt;/span&gt; &lt;span class="k"&gt;def&lt;/span&gt; &lt;span class="nf"&gt;fetch_docs&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;urls&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;List&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="nb"&gt;str&lt;/span&gt;&lt;span class="p"&gt;]):&lt;/span&gt;
    &lt;span class="n"&gt;results&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="p"&gt;[]&lt;/span&gt;
    &lt;span class="k"&gt;for&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt; &lt;span class="ow"&gt;in&lt;/span&gt; &lt;span class="n"&gt;urls&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
        &lt;span class="k"&gt;try&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;text&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nf"&gt;fetch_text&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt;
            &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;url&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ok&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="bp"&gt;True&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;content&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;text&lt;/span&gt;&lt;span class="p"&gt;})&lt;/span&gt;
        &lt;span class="k"&gt;except&lt;/span&gt; &lt;span class="nb"&gt;Exception&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt;
            &lt;span class="n"&gt;results&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;append&lt;/span&gt;&lt;span class="p"&gt;({&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;url&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="n"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;ok&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="bp"&gt;False&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="s"&gt;error&lt;/span&gt;&lt;span class="sh"&gt;"&lt;/span&gt;&lt;span class="p"&gt;:&lt;/span&gt; &lt;span class="nf"&gt;str&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="n"&gt;e&lt;/span&gt;&lt;span class="p"&gt;)})&lt;/span&gt;
    &lt;span class="k"&gt;return&lt;/span&gt; &lt;span class="n"&gt;results&lt;/span&gt;

&lt;span class="c1"&gt;# Wire these into your MCP SDK of choice (FastMCP, custom, etc.)
&lt;/span&gt;&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Use an MCP SDK (for example, FastMCP) to register these as tools/resources with stdio/SSE transports.&lt;/p&gt;
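&lt;p&gt;As a rough sketch (check the SDK docs for the exact current API), wiring the two handlers up with the official MCP Python SDK's FastMCP class looks something like this:&lt;/p&gt;

```python
# Sketch: exposing the doc helpers as MCP tools via FastMCP
# (from the official `mcp` Python SDK; details may vary by version).
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("stripe-docs-mcp")

@mcp.tool()
def list_doc_sources() -> list[dict]:
    """Enumerate registered llms.txt endpoints."""
    return [{"label": "Stripe", "url": "https://docs.stripe.com/llms.txt"}]

@mcp.tool()
async def fetch_docs(urls: list[str]) -> list[dict]:
    """Fetch pages on demand (reuse fetch_text from the skeleton above)."""
    ...

if __name__ == "__main__":
    mcp.run(transport="stdio")  # or "sse" for browser tooling
```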




&lt;h2&gt;
  
  
  Security, Performance, and Governance
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Trust but verify: llms.txt is a discovery mechanism, not a trust signal. Enforce domain allowlists in your server and host.&lt;/li&gt;
&lt;li&gt;Prefer &lt;code&gt;.md&lt;/code&gt; mirrors: Faster, cleaner parsing. Many sites expose &lt;code&gt;.md&lt;/code&gt; or content‑negotiated Markdown.&lt;/li&gt;
&lt;li&gt;Rate limits &amp;amp; caching:

&lt;ul&gt;
&lt;li&gt;Cache fetched pages by URL and ETag/Last‑Modified.&lt;/li&gt;
&lt;li&gt;Backoff on 429s; respect robots and publisher guidance if applicable.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Permissions:

&lt;ul&gt;
&lt;li&gt;For private docs, authenticate and log all access.&lt;/li&gt;
&lt;li&gt;Keep IDE/host tool traces enabled for auditing.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;Governance:

&lt;ul&gt;
&lt;li&gt;If you publish docs, add &lt;code&gt;/llms.txt&lt;/code&gt; and consider an “Optional” section for secondary content.&lt;/li&gt;
&lt;li&gt;Review and prune your llms.txt to keep it tight and useful.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
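&lt;p&gt;The ETag caching idea can be sketched as transport-agnostic logic (names here are illustrative, not from mcpdoc): store the body and ETag per URL, send &lt;code&gt;If-None-Match&lt;/code&gt; on revalidation, and reuse the cached body on a 304.&lt;/p&gt;

```python
# Conditional-request cache keyed by URL: (etag, body) pairs.
cache: dict[str, tuple[str, str]] = {}

def request_headers(url: str) -> dict:
    """Headers for the next GET: include If-None-Match when we hold an ETag."""
    if url in cache:
        return {"If-None-Match": cache[url][0]}
    return {}

def handle_response(url: str, status: int, etag: str, body: str) -> str:
    """On 200, refresh the cache; on 304, serve the cached body unchanged."""
    if status == 304 and url in cache:
        return cache[url][1]
    cache[url] = (etag, body)
    return body

# First fetch stores the page; a later 304 revalidation reuses it for free.
page = handle_response("https://docs.stripe.com/webhooks.md", 200, '"abc"', "## Webhooks")
again = handle_response("https://docs.stripe.com/webhooks.md", 304, '"abc"', "")
```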




&lt;h2&gt;
  
  
  Troubleshooting
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Domain blocked errors:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add &lt;code&gt;--allowed-domains&lt;/code&gt; for off‑domain links, e.g.:
&lt;/li&gt;
&lt;/ul&gt;

&lt;pre class="highlight shell"&gt;&lt;code&gt;uvx &lt;span class="nt"&gt;--from&lt;/span&gt; mcpdoc mcpdoc &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--urls&lt;/span&gt; &lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--allowed-domains&lt;/span&gt; docs.stripe.com,anotherdomain.com &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--transport&lt;/span&gt; stdio
&lt;/code&gt;&lt;/pre&gt;




&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Tools not visible in IDE:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Ensure the process is running and using &lt;code&gt;--transport stdio&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Validate your JSON config paths and syntax.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Python/uv path issues:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add &lt;code&gt;--python /path/to/python&lt;/code&gt; to &lt;code&gt;uvx&lt;/code&gt; args (Claude Desktop often needs this).&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Agent ignores the server:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Add a short rule reminding it to use &lt;code&gt;list_doc_sources&lt;/code&gt; and &lt;code&gt;fetch_docs&lt;/code&gt; before answering Stripe questions.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
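&lt;p&gt;For the JSON-syntax check, a tiny stdlib validator catches the most common config mistakes before the IDE silently ignores the file (a sketch; the key names follow the configs shown above):&lt;/p&gt;

```python
import json

def check_mcp_config(text: str) -> list[str]:
    """Return a list of problems found in an mcpServers-style config."""
    problems = []
    try:
        data = json.loads(text)
    except json.JSONDecodeError as e:
        return [f"invalid JSON: {e}"]
    servers = data.get("mcpServers")
    if not isinstance(servers, dict) or not servers:
        return ["missing or empty 'mcpServers' object"]
    for name, spec in servers.items():
        if "command" not in spec:
            problems.append(f"{name}: missing 'command'")
        if not isinstance(spec.get("args", []), list):
            problems.append(f"{name}: 'args' must be a list")
    return problems

ok = '{"mcpServers": {"stripe-docs-mcp": {"command": "uvx", "args": []}}}'
print(check_mcp_config(ok))
# → []
```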




&lt;h2&gt;
  
  
  Example: A Complete Developer Flow
&lt;/h2&gt;

&lt;p&gt;1) Start server (stdio for IDE use):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;uvx &lt;span class="nt"&gt;--from&lt;/span&gt; mcpdoc mcpdoc &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--urls&lt;/span&gt; &lt;span class="s2"&gt;"Stripe:https://docs.stripe.com/llms.txt"&lt;/span&gt; &lt;span class="se"&gt;\&lt;/span&gt;
  &lt;span class="nt"&gt;--transport&lt;/span&gt; stdio
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;2) Configure your IDE (Cursor/Windsurf/Claude Desktop) as shown above.&lt;/p&gt;

&lt;p&gt;3) Ask: “What’s the recommended approach to handle Stripe webhook signature verification?”&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Agent calls &lt;code&gt;list_doc_sources&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Agent fetches &lt;code&gt;https://docs.stripe.com/llms.txt&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Agent fetches &lt;code&gt;https://docs.stripe.com/webhooks.md&lt;/code&gt; and &lt;code&gt;https://docs.stripe.com/webhooks/signature.md&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Agent answers with steps and includes links for verification&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Result: An answer grounded in live Stripe docs, fetched just‑in‑time.&lt;/p&gt;
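&lt;p&gt;The selection step above boils down to parsing the Markdown links out of llms.txt and keeping the relevant ones. A rough sketch, where simple keyword matching stands in for the agent’s own relevance judgment:&lt;/p&gt;

```javascript
// Sketch: extract [title](url) links from llms.txt content and keep the
// ones whose titles mention the topic. Keyword matching is a stand-in
// for the LLM's judgment about which pages to fetch.
function pickDocLinks(llmsTxt, topic) {
  const linkPattern = /\[([^\]]+)\]\((https?:\/\/[^)]+)\)/g;
  const links = [];
  let m;
  while ((m = linkPattern.exec(llmsTxt)) !== null) {
    if (m[1].toLowerCase().includes(topic.toLowerCase())) {
      links.push(m[2]);
    }
  }
  return links;
}
```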




&lt;h2&gt;
  
  
  Wrap‑Up
&lt;/h2&gt;

&lt;p&gt;llms.txt gives your agents a curated map; MCP turns that map into action. By plugging Stripe’s &lt;code&gt;llms.txt&lt;/code&gt; (or any other) into an MCP docs server, you get:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Source‑backed answers&lt;/li&gt;
&lt;li&gt;Token efficiency and faster runs&lt;/li&gt;
&lt;li&gt;Auditable, composable workflows across multiple documentation sets&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;From here, you could:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Generate a ready‑to‑paste IDE config for your team&lt;/li&gt;
&lt;li&gt;Add rules that enforce “fetch before answer” for your key vendors&lt;/li&gt;
&lt;li&gt;Scaffold a custom MCP server that merges Stripe docs with your internal guides behind auth&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Happy building—and happy fetching.&lt;/p&gt;

</description>
      <category>mcp</category>
      <category>llm</category>
      <category>ai</category>
      <category>tutorial</category>
    </item>
    <item>
      <title>llms.txt: From “Nice to Have” to Practical Advantage</title>
      <dc:creator>Everyday Dev</dc:creator>
      <pubDate>Mon, 01 Sep 2025 07:37:41 +0000</pubDate>
      <link>https://dev.to/everyday_dev/is-the-llmstxt-a-waste-of-time-59ja</link>
      <guid>https://dev.to/everyday_dev/is-the-llmstxt-a-waste-of-time-59ja</guid>
      <description>&lt;p&gt;Large Language Models (LLMs) don’t “browse” websites like humans. They fetch targeted context and synthesize answers. That’s where llms.txt comes in: a compact, human‑ and LLM‑readable map that highlights the most important, LLM‑friendly pages of your site.&lt;/p&gt;

&lt;p&gt;Think of it as a “sitemap for AI”—but curated. Where robots.txt tells crawlers what not to crawl, llms.txt tells LLMs what is most worth reading and how to read it efficiently.&lt;/p&gt;

&lt;h2&gt;
  
  
  What llms.txt Is (and What It Isn’t)
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;It is:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A Markdown file at the root of your site (usually at /llms.txt).&lt;/li&gt;
&lt;li&gt;A curated index of high‑signal pages, ideally with Markdown mirrors (.md) for clean parsing.&lt;/li&gt;
&lt;li&gt;Structured to be easy for agents and tools to parse quickly.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;It isn’t:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A full sitemap clone or a complete crawl target.&lt;/li&gt;
&lt;li&gt;A ranking/SEO silver bullet.&lt;/li&gt;
&lt;li&gt;A magic switch that all LLMs use automatically today.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;The practical value today: it gives your developers and AI tools a stable, predictable way to fetch the right docs, examples, and references—on demand, with fewer tokens and fewer hallucinations.&lt;/p&gt;

&lt;h2&gt;
  
  
  State of Adoption (Reality Check)
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Many early implementations copy a sitemap into Markdown. This is not LLM‑friendly and defeats the purpose. Curate, don’t dump.&lt;/li&gt;
&lt;li&gt;Major LLM providers vary in how they use llms.txt right now. Adoption is growing in the developer‑tools ecosystem (IDEs, MCP servers, agents) because it immediately improves answer quality and traceability.&lt;/li&gt;
&lt;li&gt;Low effort, immediate benefits: Even a minimal, well‑curated llms.txt helps your team’s agents today, regardless of broad search engine adoption.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Bottom line: Don’t wait for a “universal mandate.” Make your docs better for your own agents and developer workflows now.&lt;/p&gt;

&lt;h2&gt;
  
  
  A Good llms.txt: Structure and Conventions
&lt;/h2&gt;

&lt;p&gt;Aim for clarity and token efficiency. A typical llms.txt:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;H1 title&lt;/li&gt;
&lt;li&gt;Optional one‑sentence summary (blockquote)&lt;/li&gt;
&lt;li&gt;H2 sections grouping related links&lt;/li&gt;
&lt;li&gt;Bulleted links to the most helpful pages, preferably .md versions&lt;/li&gt;
&lt;li&gt;An “Optional” or “Additional” section for nice‑to‑have pages&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example skeleton:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Acme Docs

&amp;gt; Official developer documentation for Acme APIs, SDKs, and integrations.

## Getting Started
- [Overview](https://docs.acme.com/overview.md)
- [Quickstart](https://docs.acme.com/quickstart.md)

## API
- [API Overview](https://docs.acme.com/api.md)
- [Authentication](https://docs.acme.com/api/auth.md)
- [Errors](https://docs.acme.com/api/errors.md)

## Webhooks
- [Webhooks](https://docs.acme.com/webhooks.md)
- [Signatures](https://docs.acme.com/webhooks/signature.md)

## Optional
- [Changelog](https://docs.acme.com/changelog.md)

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Key tips:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Prefer .md mirrors where possible.&lt;/li&gt;
&lt;li&gt;Keep sections short and editorially selected (10–50 links beats 500).&lt;/li&gt;
&lt;li&gt;Link canonical pages, not marketing landing pages or UI‑heavy content.&lt;/li&gt;
&lt;/ul&gt;
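&lt;p&gt;A quick way to act on these tips is a lint that flags links in your llms.txt that don’t point at .md mirrors. A heuristic sketch, not part of any spec:&lt;/p&gt;

```javascript
// Sketch: flag llms.txt links that don't end in .md, so you can swap
// them for Markdown mirrors where possible.
function findNonMarkdownLinks(llmsTxt) {
  const linkPattern = /\((https?:\/\/[^)]+)\)/g;
  const flagged = [];
  let m;
  while ((m = linkPattern.exec(llmsTxt)) !== null) {
    if (!m[1].endsWith(".md")) {
      flagged.push(m[1]);
    }
  }
  return flagged;
}
```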
&lt;h2&gt;
  
  
  The Fastest Path: Autogenerate llms.txt with llms.page
&lt;/h2&gt;

&lt;p&gt;If you don’t have llms.txt yet, you can bootstrap one automatically.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Visit: &lt;a href="https://llms.page" rel="noopener noreferrer"&gt;llms.page&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Or fetch directly from your server, on demand:

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;https://get.llms.page/{yourdomain}/llms.txt&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Example:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;https://get.llms.page/github.com/llms.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;How it works:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Send a GET request to &lt;a href="https://get.llms.page/example.com/llms.txt" rel="noopener noreferrer"&gt;https://get.llms.page/example.com/llms.txt&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;The service analyzes your homepage/internal links and produces a curated llms.txt‑style Markdown file on the fly.&lt;/li&gt;
&lt;li&gt;You can fetch and serve that response directly from your own domain, or cache it locally if you prefer. No scheduler required.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Note: Auto‑generated files are a great starting point. For best results, review and prune to emphasize the most useful pages and, when possible, swap links to .md mirrors.&lt;/p&gt;
&lt;h2&gt;
  
  
  Integration Options
&lt;/h2&gt;

&lt;p&gt;You have two easy ways to publish:&lt;/p&gt;

&lt;p&gt;1) Redirect your /llms.txt to llms.page&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Pros: Zero backend changes, always fresh.&lt;/li&gt;
&lt;li&gt;Cons: Depends on a third‑party service.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Nginx example:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight nginx"&gt;&lt;code&gt;&lt;span class="k"&gt;location&lt;/span&gt; &lt;span class="p"&gt;=&lt;/span&gt; &lt;span class="n"&gt;/llms.txt&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kn"&gt;return&lt;/span&gt; &lt;span class="mi"&gt;302&lt;/span&gt; &lt;span class="s"&gt;https://get.llms.page/&lt;/span&gt;&lt;span class="nv"&gt;$host&lt;/span&gt;&lt;span class="n"&gt;/llms.txt&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="p"&gt;}&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;Apache example:&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight apache"&gt;&lt;code&gt;&lt;span class="nc"&gt;Redirect&lt;/span&gt; 302 /llms.txt https://get.llms.page/%{HTTP_HOST}/llms.txt
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;p&gt;2) Fetch and serve from your domain&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Pros: Full control, can cache, works offline, stable.&lt;/li&gt;
&lt;li&gt;Cons: You’re responsible for when/how you refresh.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Simple fetch‑and‑serve (Node, on request or during build/deploy):&lt;br&gt;
&lt;/p&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;&lt;span class="c"&gt;# fetch_llms_txt.sh&lt;/span&gt;
node fetch_llms_txt.mjs
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight javascript"&gt;&lt;code&gt;&lt;span class="c1"&gt;// fetch_llms_txt.mjs&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node:fs/promises&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="k"&gt;import&lt;/span&gt; &lt;span class="nx"&gt;https&lt;/span&gt; &lt;span class="k"&gt;from&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;node:https&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;domain&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;argv&lt;/span&gt;&lt;span class="p"&gt;[&lt;/span&gt;&lt;span class="mi"&gt;2&lt;/span&gt;&lt;span class="p"&gt;]&lt;/span&gt; &lt;span class="o"&gt;||&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;example.com&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;span class="kd"&gt;const&lt;/span&gt; &lt;span class="nx"&gt;url&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="s2"&gt;`https://get.llms.page/&lt;/span&gt;&lt;span class="p"&gt;${&lt;/span&gt;&lt;span class="nx"&gt;domain&lt;/span&gt;&lt;span class="p"&gt;}&lt;/span&gt;&lt;span class="s2"&gt;/llms.txt`&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;

&lt;span class="nx"&gt;https&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;get&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;url&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
  &lt;span class="kd"&gt;let&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;=&lt;/span&gt; &lt;span class="dl"&gt;""&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;data&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;data&lt;/span&gt; &lt;span class="o"&gt;+=&lt;/span&gt; &lt;span class="nx"&gt;chunk&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
  &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;on&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;end&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;async &lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="o"&gt;=&amp;gt;&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
    &lt;span class="k"&gt;if &lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;statusCode&lt;/span&gt; &lt;span class="o"&gt;===&lt;/span&gt; &lt;span class="mi"&gt;200&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="k"&gt;await&lt;/span&gt; &lt;span class="nx"&gt;fs&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;writeFile&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;./public/llms.txt&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;utf8&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;log&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Wrote ./public/llms.txt&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt; &lt;span class="k"&gt;else&lt;/span&gt; &lt;span class="p"&gt;{&lt;/span&gt;
      &lt;span class="nx"&gt;console&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;error&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="s2"&gt;Failed:&lt;/span&gt;&lt;span class="dl"&gt;"&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;res&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nx"&gt;statusCode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="nx"&gt;data&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;slice&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;0&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="mi"&gt;500&lt;/span&gt;&lt;span class="p"&gt;));&lt;/span&gt;
      &lt;span class="nx"&gt;process&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="nf"&gt;exit&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="mi"&gt;1&lt;/span&gt;&lt;span class="p"&gt;);&lt;/span&gt;
    &lt;span class="p"&gt;}&lt;/span&gt;
  &lt;span class="p"&gt;});&lt;/span&gt;
&lt;span class="p"&gt;});&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;


&lt;ul&gt;
&lt;li&gt;Serve &lt;code&gt;./public/llms.txt&lt;/code&gt; at &lt;code&gt;https://yourdomain.com/llms.txt&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;Optional: add a periodic refresh (cron) if you want it to stay fresh without redeploys:
&lt;/li&gt;
&lt;/ul&gt;
&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight shell"&gt;&lt;code&gt;0 2 &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; &lt;span class="k"&gt;*&lt;/span&gt; /usr/local/bin/bash /path/fetch_llms_txt.sh yourdomain.com
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;

&lt;h2&gt;
  
  
  Why Bother Now? Practical Benefits
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;Faster, cleaner answers for your team’s AI tools and IDEs&lt;/li&gt;
&lt;li&gt;Lower token usage by pointing agents to clean Markdown&lt;/li&gt;
&lt;li&gt;Less hallucination, more citations&lt;/li&gt;
&lt;li&gt;Easy to automate and keep up to date&lt;/li&gt;
&lt;/ul&gt;
&lt;h2&gt;
  
  
  Bonus: Plug llms.txt into Your MCP Workflow
&lt;/h2&gt;

&lt;p&gt;If you already use the Model Context Protocol (MCP), you can let your IDE/agent fetch the right docs automatically.&lt;/p&gt;


&lt;div class="ltag__link--embedded"&gt;
  &lt;div class="crayons-story "&gt;
  &lt;a href="https://dev.to/everyday_dev/integrate-any-llmstxt-into-your-mcp-with-stripe-as-the-example-j33" class="crayons-story__hidden-navigation-link"&gt;Integrate Any LLMs.txt into Your MCP (with Stripe as the Example)&lt;/a&gt;


  &lt;div class="crayons-story__body crayons-story__body-full_post"&gt;
    &lt;div class="crayons-story__top"&gt;
      &lt;div class="crayons-story__meta"&gt;
        &lt;div class="crayons-story__author-pic"&gt;

          &lt;a href="/everyday_dev" class="crayons-avatar  crayons-avatar--l  "&gt;
            &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2968354%2Fe8ee31c0-f4a7-4fa5-91ed-6ed7c2020f65.jpg" alt="everyday_dev profile" class="crayons-avatar__image"&gt;
          &lt;/a&gt;
        &lt;/div&gt;
        &lt;div&gt;
          &lt;div&gt;
            &lt;a href="/everyday_dev" class="crayons-story__secondary fw-medium m:hidden"&gt;
              Everyday Dev
            &lt;/a&gt;
            &lt;div class="profile-preview-card relative mb-4 s:mb-0 fw-medium hidden m:inline-block"&gt;
              
                Everyday Dev
                
              
              &lt;div id="story-author-preview-content-2812595" class="profile-preview-card__content crayons-dropdown branded-7 p-4 pt-0"&gt;
                &lt;div class="gap-4 grid"&gt;
                  &lt;div class="-mt-4"&gt;
                    &lt;a href="/everyday_dev" class="flex"&gt;
                      &lt;span class="crayons-avatar crayons-avatar--xl mr-2 shrink-0"&gt;
                        &lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F2968354%2Fe8ee31c0-f4a7-4fa5-91ed-6ed7c2020f65.jpg" class="crayons-avatar__image" alt=""&gt;
                      &lt;/span&gt;
                      &lt;span class="crayons-link crayons-subtitle-2 mt-5"&gt;Everyday Dev&lt;/span&gt;
                    &lt;/a&gt;
                  &lt;/div&gt;
                  &lt;div class="print-hidden"&gt;
                    
                      Follow
                    
                  &lt;/div&gt;
                  &lt;div class="author-preview-metadata-container"&gt;&lt;/div&gt;
                &lt;/div&gt;
              &lt;/div&gt;
            &lt;/div&gt;

          &lt;/div&gt;
          &lt;a href="https://dev.to/everyday_dev/integrate-any-llmstxt-into-your-mcp-with-stripe-as-the-example-j33" class="crayons-story__tertiary fs-xs"&gt;&lt;time&gt;Sep 1 '25&lt;/time&gt;&lt;span class="time-ago-indicator-initial-placeholder"&gt;&lt;/span&gt;&lt;/a&gt;
        &lt;/div&gt;
      &lt;/div&gt;

    &lt;/div&gt;

    &lt;div class="crayons-story__indention"&gt;
      &lt;h2 class="crayons-story__title crayons-story__title-full_post"&gt;
        &lt;a href="https://dev.to/everyday_dev/integrate-any-llmstxt-into-your-mcp-with-stripe-as-the-example-j33" id="article-link-2812595"&gt;
          Integrate Any LLMs.txt into Your MCP (with Stripe as the Example)
        &lt;/a&gt;
      &lt;/h2&gt;
        &lt;div class="crayons-story__tags"&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/mcp"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;mcp&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/llm"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;llm&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/ai"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;ai&lt;/a&gt;
            &lt;a class="crayons-tag  crayons-tag--monochrome " href="/t/tutorial"&gt;&lt;span class="crayons-tag__prefix"&gt;#&lt;/span&gt;tutorial&lt;/a&gt;
        &lt;/div&gt;
      &lt;div class="crayons-story__bottom"&gt;
        &lt;div class="crayons-story__details"&gt;
          &lt;a href="https://dev.to/everyday_dev/integrate-any-llmstxt-into-your-mcp-with-stripe-as-the-example-j33" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left"&gt;
            &lt;div class="multiple_reactions_aggregate"&gt;
              &lt;span class="multiple_reactions_icons_container"&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/exploding-head-daceb38d627e6ae9b730f36a1e390fca556a4289d5a41abb2c35068ad3e2c4b5.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/multi-unicorn-b44d6f8c23cdd00964192bedc38af3e82463978aa611b4365bd33a0f1f4f3e97.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
                  &lt;span class="crayons_icon_container"&gt;
                    &lt;img src="https://assets.dev.to/assets/sparkle-heart-5f9bee3767e18deb1bb725290cb151c25234768a0e9a2bd39370c382d02920cf.svg" width="18" height="18"&gt;
                  &lt;/span&gt;
              &lt;/span&gt;
              &lt;span class="aggregate_reactions_counter"&gt;5&lt;span class="hidden s:inline"&gt; reactions&lt;/span&gt;&lt;/span&gt;
            &lt;/div&gt;
          &lt;/a&gt;
            &lt;a href="https://dev.to/everyday_dev/integrate-any-llmstxt-into-your-mcp-with-stripe-as-the-example-j33#comments" class="crayons-btn crayons-btn--s crayons-btn--ghost crayons-btn--icon-left flex items-center"&gt;
              Comments


              &lt;span class="hidden s:inline"&gt;Add Comment&lt;/span&gt;
            &lt;/a&gt;
        &lt;/div&gt;
        &lt;div class="crayons-story__save"&gt;
          &lt;small class="crayons-story__tertiary fs-xs mr-2"&gt;
            6 min read
          &lt;/small&gt;
            
              &lt;span class="bm-initial"&gt;
                

              &lt;/span&gt;
              &lt;span class="bm-success"&gt;
                

              &lt;/span&gt;
            
        &lt;/div&gt;
      &lt;/div&gt;
    &lt;/div&gt;
  &lt;/div&gt;
&lt;/div&gt;


&lt;/div&gt;
&lt;br&gt;


&lt;h2&gt;
  
  
  FAQ
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;Will this help SEO?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Indirectly at best. The main benefit is better agent performance and more reliable, cited answers for your users and teams.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;Do I need perfect Markdown mirrors?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;No, but they help a lot. Start with HTML if you must, then add .md mirrors over time.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;What if my site structure changes often?&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Use the llms.page redirect, fetch‑and‑serve on demand, or add an optional scheduled refresh to keep /llms.txt fresh with minimal effort.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;llms.txt is low‑effort leverage: a lightweight, curated map that makes your content easier for agents to consume accurately and efficiently. Autogenerate it today, polish the top sections, and wire it into your MCP workflow so your team gets faster, source‑backed answers—no heavy infrastructure required.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>webdev</category>
      <category>programming</category>
      <category>llm</category>
    </item>
    <item>
      <title>Automate Emails with AI: Building an Email Agent in n8n</title>
      <dc:creator>Everyday Dev</dc:creator>
      <pubDate>Sun, 23 Mar 2025 15:15:39 +0000</pubDate>
      <link>https://dev.to/everyday_dev/automate-emails-with-ai-building-an-email-agent-in-n8n-2ajk</link>
      <guid>https://dev.to/everyday_dev/automate-emails-with-ai-building-an-email-agent-in-n8n-2ajk</guid>
      <description>&lt;p&gt;In this tutorial, we'll walk through building an AI-powered email assistant using &lt;strong&gt;n8n&lt;/strong&gt;, &lt;strong&gt;OpenAI&lt;/strong&gt;, and &lt;strong&gt;Gmail&lt;/strong&gt;. This agent will send, retrieve, and reply to emails—entirely driven by natural language.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2gre1zu2yidywz8usqn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2gre1zu2yidywz8usqn.png" alt="n8n AI email agent with OpenAI and Gmail integration." width="800" height="507"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  ⚙️ What You'll Build
&lt;/h2&gt;

&lt;p&gt;A conversational email agent that can:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Send emails via AI prompts&lt;/li&gt;
&lt;li&gt;Read your inbox&lt;/li&gt;
&lt;li&gt;Reply to messages&lt;/li&gt;
&lt;li&gt;Maintain context across chats&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let’s dive in.&lt;/p&gt;




&lt;h2&gt;
  
  
  🧠 Prerequisites
&lt;/h2&gt;

&lt;ul&gt;
&lt;li&gt;An &lt;a href="https://n8n.io" rel="noopener noreferrer"&gt;n8n&lt;/a&gt; instance (self-hosted or cloud)&lt;/li&gt;
&lt;li&gt;Gmail OAuth2 credentials set up in n8n&lt;/li&gt;
&lt;li&gt;OpenAI API credentials&lt;/li&gt;
&lt;li&gt;Basic familiarity with n8n workflows&lt;/li&gt;
&lt;/ul&gt;




&lt;h2&gt;
  
  
  🗺️ Workflow Blueprint
&lt;/h2&gt;

&lt;p&gt;Here's the high-level flow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;Trigger&lt;/strong&gt; – Waits for a chat message&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI Agent&lt;/strong&gt; – Parses intent (send, get, reply)&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;OpenAI Chat Model&lt;/strong&gt; – Powers the agent's brain&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Gmail Tool&lt;/strong&gt; – Performs email actions&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Memory Buffer&lt;/strong&gt; – Maintains chat context&lt;/li&gt;
&lt;/ol&gt;
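&lt;p&gt;Conceptually, step 2 maps free‑form text to one of three Gmail operations. In the workflow the LLM does this; as a sketch, simple keyword rules convey the idea:&lt;/p&gt;

```javascript
// Sketch: route a chat message to a Gmail operation. In the real workflow
// the OpenAI model decides; these keyword rules just illustrate the mapping.
function routeIntent(message) {
  const text = message.toLowerCase();
  if (text.startsWith("send")) return "send";
  if (text.startsWith("reply")) return "reply";
  if (text.includes("show") || text.includes("latest") || text.includes("inbox")) {
    return "get";
  }
  return "unknown";
}
```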




&lt;h2&gt;
  
  
  🔧 Step-by-Step Setup
&lt;/h2&gt;

&lt;h3&gt;
  
  
  1. &lt;strong&gt;Trigger: Start When a Chat Message Is Received&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Use the &lt;code&gt;When chat message received&lt;/code&gt; node. This will simulate a user sending instructions to the agent, like:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Send an email to alice@example.com saying “Meeting moved to 3 PM.”
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxlcomr8sdb8464ahyh9p.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxlcomr8sdb8464ahyh9p.png" alt="n8n trigger node for receiving chat messages." width="337" height="274"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  2. &lt;strong&gt;AI Agent: Interpret the Command&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Add an &lt;code&gt;AI Agent&lt;/code&gt; node from the Advanced AI collection.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Prompt Type&lt;/strong&gt;: &lt;code&gt;Define Below&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Text&lt;/strong&gt;: &lt;code&gt;{{ $json.chatInput }}&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Connect this to:

&lt;ul&gt;
&lt;li&gt;Language Model (next step)&lt;/li&gt;
&lt;li&gt;Memory (Simple Memory node)&lt;/li&gt;
&lt;li&gt;Tools (Gmail nodes)&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx69jvuswqs2e0ck1qh99.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx69jvuswqs2e0ck1qh99.png" alt="n8n AI Agent node setup using Tools Agent." width="800" height="331"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  3. &lt;strong&gt;LLM: Add the OpenAI Chat Model&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Use the &lt;code&gt;OpenAI Chat Model&lt;/code&gt; node.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Model&lt;/strong&gt;: &lt;code&gt;gpt-4o-mini&lt;/code&gt; (or another OpenAI model)&lt;/li&gt;
&lt;li&gt;Provide your OpenAI API credentials&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Connect this to the &lt;code&gt;AI Agent&lt;/code&gt; via &lt;code&gt;Chat Model&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkvrf0pkxz1g7rj759faj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkvrf0pkxz1g7rj759faj.png" alt="OpenAI Chat Model node configured in n8n." width="800" height="545"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  4. &lt;strong&gt;Memory: Store Chat Context&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Add a &lt;code&gt;Simple Memory&lt;/code&gt; node.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Connect it to the AI Agent via &lt;code&gt;Memory&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This allows your agent to remember previous commands and carry conversation context.&lt;/p&gt;
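&lt;p&gt;Under the hood, this kind of memory is just a sliding window over recent messages. A rough sketch (the window size and message shape are assumptions; n8n’s Simple Memory node manages this for you):&lt;/p&gt;

```javascript
// Sketch: a windowed chat memory. Keeps only the most recent N messages,
// which is roughly what n8n's Simple Memory node maintains per session.
class SimpleMemory {
  constructor(windowSize) {
    this.windowSize = windowSize;
    this.messages = [];
  }
  add(role, content) {
    this.messages.push({ role: role, content: content });
    // Drop the oldest entry once the window is full.
    if (this.messages.length > this.windowSize) {
      this.messages.shift();
    }
  }
  context() {
    return this.messages.slice();
  }
}
```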

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxm4dsqyyr3d4gopf6vwc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxm4dsqyyr3d4gopf6vwc.png" alt="Simple Memory node in n8n for storing chat context." width="800" height="514"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;
  
  
  📬 Gmail Tool Integration
&lt;/h2&gt;

&lt;p&gt;You’ll need to authenticate with Gmail via OAuth2 in n8n.&lt;/p&gt;

&lt;h3&gt;
  
  
  a) &lt;strong&gt;Send Emails&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Add a &lt;code&gt;Gmail Tool&lt;/code&gt; node:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Operation&lt;/strong&gt;: &lt;code&gt;Send&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;To&lt;/strong&gt;: &lt;code&gt;{{ $fromAI('To') }}&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Subject&lt;/strong&gt;: &lt;code&gt;{{ $fromAI('Subject') }}&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Message&lt;/strong&gt;: &lt;code&gt;{{ $fromAI('Message') }}&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Connect this to the AI Agent via &lt;code&gt;Tool&lt;/code&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  b) &lt;strong&gt;Get Recent Emails&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Add another &lt;code&gt;Gmail Tool&lt;/code&gt; node:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Operation&lt;/strong&gt;: &lt;code&gt;Get Many&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Limit&lt;/strong&gt;: &lt;code&gt;10&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;Use the same credentials and connect it to the AI Agent via &lt;code&gt;Tool&lt;/code&gt;.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;c) &lt;strong&gt;Reply to Emails&lt;/strong&gt;&lt;/h3&gt;

&lt;p&gt;Add one more &lt;code&gt;Gmail Tool&lt;/code&gt; node:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Operation&lt;/strong&gt;: &lt;code&gt;Reply&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Message ID&lt;/strong&gt;: &lt;code&gt;{{ $fromAI('Message_ID') }}&lt;/code&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Message&lt;/strong&gt;: &lt;code&gt;{{ $fromAI('Message') }}&lt;/code&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Same connection and credential setup.&lt;/p&gt;
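
&lt;p&gt;One practical note (this describes how the agent tends to chain these tools, not something you configure explicitly): the model can only supply &lt;code&gt;Message_ID&lt;/code&gt; for a message it has already seen, typically from an earlier &lt;code&gt;Get Many&lt;/code&gt; call kept in context by the Simple Memory node. A typical exchange looks like:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;You:   Show me my latest emails.
Agent: (calls Get Many and lists results -- message IDs are now in context)
You:   Reply to the one from Jane saying "Thanks, sounds good!"
Agent: (calls Reply, filling Message_ID from the earlier results)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;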

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2gre1zu2yidywz8usqn.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv2gre1zu2yidywz8usqn.png" alt="n8n AI email agent with OpenAI and Gmail integration." width="800" height="507"&gt;&lt;/a&gt;&lt;/p&gt;




&lt;h2&gt;🚀 Example Prompts&lt;/h2&gt;

&lt;p&gt;Try any of these in your chat interface:&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Send an email to john@example.com with subject “Invoice” and say “Please find the attached invoice.”
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Show me my latest 5 emails.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;





&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Reply to the email from Jane saying “Thanks, sounds good!”
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;🧪 Test &amp;amp; Deploy&lt;/h2&gt;

&lt;p&gt;Once everything is connected:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Activate the workflow&lt;/li&gt;
&lt;li&gt;Send chat prompts to trigger the workflow&lt;/li&gt;
&lt;li&gt;Watch your AI assistant handle your emails ✨&lt;/li&gt;
&lt;/ol&gt;




&lt;h2&gt;📦 Final Thoughts&lt;/h2&gt;

&lt;p&gt;This email agent pairs an LLM with n8n's automation to streamline your inbox. You can extend it further with file attachments, smart filtering, or even calendar scheduling.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;💡 Pro tip: You can extend this to other email platforms like Outlook with minimal changes.&lt;/p&gt;
&lt;/blockquote&gt;

</description>
      <category>ai</category>
      <category>tutorial</category>
      <category>opensource</category>
      <category>learning</category>
    </item>
  </channel>
</rss>
