If you are reading this post, you most likely already understand that AI Search is going to be huge.
I have been studying exactly how to get ranked in AI Search, and even though there is no single correct answer, I have found that every website you create should do the following three things to get recommended faster.
Publish a Clean, Structured Sitemap
- AI models often discover and interpret your site through sitemaps. If you don’t have one (or it’s outdated), you’re invisible.
- Make sure it’s updated and submitted to Bing Webmaster Tools; a minimal example is sketched below.
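Here is a rough sketch of what a minimal sitemap.xml could look like (the domain, paths, and dates are placeholders, not real URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per public page you want AI crawlers to discover -->
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2025-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/how-ai-search-works</loc>
    <lastmod>2025-01-10</lastmod>
  </url>
</urlset>
```

Once it lives at https://example.com/sitemap.xml, reference it from robots.txt with a `Sitemap:` line and submit it in Bing Webmaster Tools.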
Create an llms.txt File
This is similar to robots.txt, but for AI: a new (and still evolving) file that tells LLM crawlers how they should crawl and cite your content.
If you are using Cursor or Windsurf, you can use this prompt to generate one.
"Generate a properly formatted llms.txt file for the website [INSERT YOUR DOMAIN HERE]. The file should specify crawling and citation preferences for major AI crawlers like GPTBot, ClaudeBot, PerplexityBot, and others.
Assume the goal is to:
- Allow citation of content
- Allow crawling of public pages (excluding admin, login, and private areas)
- Include contact information
- Include a policy link if available
- Be future-proof and follow the evolving llms.txt structure

Make sure it’s clean, readable, and production-ready."
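To give you an idea of the output, here is a rough sketch of the kind of file that prompt might produce for a hypothetical example.com. Keep in mind llms.txt is still an evolving, unofficial convention, so the directive names below (Citation, Attribution, Contact, Policy) are illustrative assumptions rather than a ratified standard:

```text
# llms.txt for https://example.com (illustrative example)

# AI crawlers this file addresses
User-agent: GPTBot
User-agent: ClaudeBot
User-agent: PerplexityBot

# Crawling preferences: public pages only
Allow: /
Disallow: /admin/
Disallow: /login/
Disallow: /private/

# Citation preferences
Citation: allowed
Attribution: link back to the original page when quoting

# Contact and policy
Contact: webmaster@example.com
Policy: https://example.com/ai-policy
```

Whatever shape you settle on, keep it short and human-readable so it stays easy to update as the convention matures.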
Write With LLMs in Mind
Forget fluff. AI prefers well-structured content with clear facts, citations, and semantic HTML (it's why my blog is a plain HTML site). Focus on:
- Clear answers to common user questions
- Short, factual statements (we call them “fact nuggets”)
- Proper heading tags and schema markup (JSON-LD); see the snippet below
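As a starting point, here is a minimal JSON-LD snippet using the schema.org Article type that you could drop into a page's `<head>`; the headline, author name, URL, and date are placeholders for your own values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "How to Get Recommended by AI Search",
  "author": { "@type": "Person", "name": "Your Name" },
  "datePublished": "2025-01-15",
  "mainEntityOfPage": "https://example.com/blog/ai-search"
}
</script>
```

Pair this with one `<h1>` per page and `<h2>`/`<h3>` headings that mirror the questions users actually ask.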
This is just the tip of the iceberg of what I've been learning over the past few months. My blog is fully dedicated to this topic because I feel there is no single, definitive source for everything related to LLMs. You can read the full blog at LLM Logs.