anhmtk
I Built a Website Not for Humans: Optimizing for 80% AI Agent Traffic

Most developers obsess over SEO to attract human clicks. I did the opposite. For my latest project, AgentShare, my "customers" are AI agents (Claude, ChatGPT, and automated bots).

When I checked my Cloudflare dashboard, I saw a "weird" stat: 80% of my traffic comes from data centers in the US and the Netherlands. For a regular blog, that's a red flag. For me, it's a successful launch.

Here is how I optimized for the "Agent Era" without compromising security.

## 1. AIO (AI Optimization) over SEO

Instead of fighting for keywords, I focused on being "readable" for an LLM in under 3 seconds:

- **The llm.txt standard:** I provide a dedicated /llm.txt file. No messy HTML scraping; just a clean, markdown-based "menu" of my API and documentation.
- **MCP (Model Context Protocol) first:** AgentShare acts as a context server. If you use Claude Desktop or Cursor, you can connect directly to my data layer. No more copy-pasting.

## 2. Security: Opening Doors, But Not to Everyone

Opening your doors to bots is risky. You want the "good" AI, not the "bad" scrapers. I implemented a few hard-learned lessons (shoutout to Claude for the security audit!):

- **Stripping sensitive data:** All raw payloads from partners (like AliExpress) go through a strip_sensitive function. We filter out internal IDs and tokens before the AI even sees them.
- **Stealth mode for admin:** I removed my /admin/login from robots.txt (to avoid Google's "blocked but indexed" warnings) and replaced it with a hard 404. If you aren't the owner, you shouldn't find the door.
- **Blocking "commercial" scrapers:** I blocked aggressive bots that only drained my server resources without providing value.

## 3. The Honest Truth: Building with AI

I'll be honest: my first spec was a mess. I accidentally exposed debug endpoints in my public documentation.

I used Cursor and Claude to audit my own codebase. We found the leaks, fixed the webhook ownership logic, and refined the robots.txt together.
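To make the /llm.txt idea concrete, here is a hedged sketch of what such a file might look like, following the community llms.txt convention (an H1 with the site name, a one-line summary, and a markdown link list). The paths and descriptions below are illustrative, not AgentShare's actual file:

```markdown
# AgentShare

> A context server that lets AI agents query product and documentation data directly.

## Docs

- [API reference](https://agentshare.dev/docs/api): REST endpoints an agent can call
- [MCP setup](https://agentshare.dev/docs/mcp): how to connect Claude Desktop or Cursor
```

An LLM can fetch this one small markdown file instead of scraping and parsing the full HTML site.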
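Connecting a client like Claude Desktop to a remote MCP server typically means adding an entry to its `mcpServers` config. A minimal sketch, assuming a hypothetical `https://agentshare.dev/mcp` endpoint and using the `mcp-remote` bridge package:

```json
{
  "mcpServers": {
    "agentshare": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://agentshare.dev/mcp"]
    }
  }
}
```

Once the client restarts, the agent can call the server's tools directly instead of you copy-pasting context.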
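For the "good bots in, bad scrapers out" policy, a robots.txt can allow well-behaved AI crawlers while disallowing aggressive commercial ones. A sketch using real crawler user-agents (GPTBot, ClaudeBot, Bytespider, CCBot) as examples; which bots you allow is your call, and robots.txt is only honored by compliant crawlers, so pair it with server-side blocking:

```
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Bytespider
Disallow: /

User-agent: CCBot
Disallow: /
```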
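The strip_sensitive idea above can be sketched in a few lines of Python. This is my illustration, not AgentShare's actual code: it recursively walks a partner payload and drops any key that matches a (hypothetical) denylist of credential-like names before the data reaches an LLM:

```python
# Hypothetical denylist; a real one would be tuned to the partner's payload shape.
SENSITIVE_KEYS = {"token", "api_key", "secret", "internal_id", "session"}

def strip_sensitive(payload):
    """Recursively remove sensitive fields from a raw partner payload."""
    if isinstance(payload, dict):
        return {
            key: strip_sensitive(value)
            for key, value in payload.items()
            if key.lower() not in SENSITIVE_KEYS
        }
    if isinstance(payload, list):
        return [strip_sensitive(item) for item in payload]
    return payload
```

Running every payload through a single choke point like this means one audit covers all data the AI can ever see.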
This "human-AI collaboration" is what made the current version stable.

## Conclusion

Optimizing for AI isn't just about APIs; it's about trust and structure. If you are building in the Agent space, stop building for eyes and start building for "brains."

Check it out at: agentshare.dev (It's better if you ask your AI about it!)
