Hello, I'm Maneshwar. I’m building LiveReview, a private AI code review tool that runs on your LLM key (OpenAI, Gemini, etc.) with highly competitive pricing -- built for small teams. Do check it out and give it a try!
In the first part of this guide, we covered the basics of SEO: helping Google find your site, creating great content, optimizing links, titles, images, and promoting your site.
In this second part, we’re going a step further. We’ll focus on how to make your site discoverable not only by search engines, but also by AI-powered tools and large language models (LLMs) like ChatGPT, Perplexity, or Claude.
Why LLM Discovery Matters
These AI systems often pull data from the web to answer user queries.
If your site is structured properly, it’s more likely to be surfaced when someone asks: “Where can I find free developer tools?”
So the goal isn’t just to rank in Google—it’s to give AI systems clear signals about what your page contains.
1. Sitemaps: Your Website’s Map for Crawlers
A sitemap is an XML file listing all the important pages on your site.
It helps search engines and AI crawlers find content quickly.
Here’s an example for your free dev tools section (Free DevTools):
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
<url>
<loc>https://hexmos.com/freedevtools/t/</loc>
<lastmod>2025-08-23</lastmod>
<changefreq>weekly</changefreq>
<priority>1.0</priority>
</url>
<url>
<loc>https://hexmos.com/freedevtools/t/json-prettifier/</loc>
<changefreq>weekly</changefreq>
<priority>0.9</priority>
</url>
<url>
<loc>https://hexmos.com/freedevtools/t/password-generator/</loc>
<changefreq>weekly</changefreq>
<priority>0.9</priority>
</url>
<url>
<loc>https://hexmos.com/freedevtools/t/dockerfile-linter/</loc>
<changefreq>weekly</changefreq>
<priority>0.9</priority>
</url>
<url>
<loc>https://hexmos.com/freedevtools/t/date-time-converter/</loc>
<changefreq>weekly</changefreq>
<priority>0.9</priority>
</url>
</urlset>
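If your tools pages come from a list in code, you can generate this sitemap instead of maintaining it by hand. Here's a minimal sketch using Python's standard library; the URLs and priorities mirror the example above, and the helper is hypothetical, not part of any existing build script:

```python
import xml.etree.ElementTree as ET
from datetime import date

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
ET.register_namespace("", NS)  # serialize without a namespace prefix

# (url, priority) pairs for the pages to include
tools = [
    ("https://hexmos.com/freedevtools/t/", "1.0"),
    ("https://hexmos.com/freedevtools/t/json-prettifier/", "0.9"),
    ("https://hexmos.com/freedevtools/t/password-generator/", "0.9"),
]

urlset = ET.Element(f"{{{NS}}}urlset")
for loc, priority in tools:
    url = ET.SubElement(urlset, f"{{{NS}}}url")
    ET.SubElement(url, f"{{{NS}}}loc").text = loc
    ET.SubElement(url, f"{{{NS}}}lastmod").text = date.today().isoformat()
    ET.SubElement(url, f"{{{NS}}}changefreq").text = "weekly"
    ET.SubElement(url, f"{{{NS}}}priority").text = priority

xml = ET.tostring(urlset, encoding="unicode")
# Prepend the XML declaration when writing the file:
sitemap = '<?xml version="1.0" encoding="UTF-8"?>\n' + xml
```

Regenerating the file on each deploy keeps lastmod honest without manual edits.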
Tips:
- Update <lastmod> whenever a tool is updated.
- Use <changefreq> to signal how often pages change.
- <priority> indicates which pages are most important.
Once you have this, make sure it's referenced in your main sitemap index (sitemap.xml) and in your robots.txt so crawlers know it exists.
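For reference, the robots.txt line that advertises a sitemap is a single Sitemap directive (the sitemap URL here is an assumed path; use wherever your index actually lives):

```
User-agent: *
Allow: /

Sitemap: https://hexmos.com/sitemap.xml
```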
2. Structured Data: Tell AI Exactly What Your Page Contains
Structured data uses Schema.org markup (usually JSON-LD) to describe your page’s content. This gives crawlers and LLMs a “cheat sheet” about your page.
Example for the Free Dev Tools page:
<script type="application/ld+json">
{
"@context": "https://schema.org",
"@type": "ItemList",
"name": "Free Developer Tools",
"description": "A collection of free developer tools like JSON Prettifier, Password Generator, Dockerfile Linter, and Date Time Converter.",
"url": "https://hexmos.com/freedevtools/t/",
"itemListElement": [
{
"@type": "ListItem",
"position": 1,
"url": "https://hexmos.com/freedevtools/t/json-prettifier/",
"name": "JSON Prettifier",
"description": "Format and beautify JSON instantly."
},
{
"@type": "ListItem",
"position": 2,
"url": "https://hexmos.com/freedevtools/t/password-generator/",
"name": "Password Generator",
"description": "Generate strong random passwords securely."
},
{
"@type": "ListItem",
"position": 3,
"url": "https://hexmos.com/freedevtools/t/dockerfile-linter/",
"name": "Dockerfile Linter",
"description": "Lint and validate your Dockerfiles."
},
{
"@type": "ListItem",
"position": 4,
"url": "https://hexmos.com/freedevtools/t/date-time-converter/",
"name": "Date Time Converter",
"description": "Convert dates and times across formats and zones."
}
]
}
</script>
Why it matters:
- Google can show rich snippets for your page.
- LLMs can better understand that your page is a list of developer tools, improving the chance they’ll cite it.
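A broken JSON-LD block silently loses you these benefits, so it's worth validating that the markup actually parses. Here's a minimal sketch using only Python's standard library, with a shortened copy of the markup above standing in for a fetched page (in practice you'd run this against your rendered HTML, or use Google's Rich Results Test):

```python
import json
import re

# Stand-in for the rendered page HTML (abbreviated example markup)
html = """<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "ItemList",
  "name": "Free Developer Tools",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "JSON Prettifier"}
  ]
}
</script>"""

# Extract the JSON-LD payload and confirm it is valid JSON
match = re.search(r'<script type="application/ld\+json">(.*?)</script>', html, re.S)
data = json.loads(match.group(1))

print(data["@type"], "-", data["name"])  # ItemList - Free Developer Tools
```

If json.loads raises an error, crawlers will most likely discard the block entirely.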
3. Make Your Site AI-Friendly
Some AI crawlers, like GPTBot, respect robots.txt. You can explicitly allow them to index your content. Example (Hexmos):
User-agent: GPTBot
Allow: /
User-agent: PerplexityBot
Allow: /
User-agent: ClaudeBot
Allow: /
This signals to AI crawlers that your dev tools page is intended to be indexed and referenced.
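You can sanity-check these rules with Python's built-in robots.txt parser. This sketch parses a rules string inline (the extra catch-all group with a /admin/ disallow is an added illustration, not from the example above):

```python
from urllib.robotparser import RobotFileParser

ROBOTS = """\
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: *
Disallow: /admin/
"""

rp = RobotFileParser()
rp.parse(ROBOTS.splitlines())

# GPTBot matches its own group, so the catch-all Disallow doesn't apply to it.
print(rp.can_fetch("GPTBot", "https://hexmos.com/freedevtools/t/"))  # True
print(rp.can_fetch("SomeOtherBot", "https://hexmos.com/admin/"))     # False
```

The same check works against a live site by calling set_url() and read() instead of parse().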
4. Promote Your Tools Page
Even with perfect SEO and structured data, promotion is still key. AI models often learn from popular or highly linked pages.
- Share your dev tools page on Reddit, Hacker News, Dev.to, Hashnode.
- Mention it in GitHub READMEs of relevant repositories.
- Engage communities where developers hang out.
Word-of-mouth, backlinks, and social signals help both search engines and AI discover your content faster.
5. Keep Optimizing Content for Both Humans and AI
- Use clear headings, concise text, and descriptive links.
- Write your content to answer likely search questions: “best free developer tools,” “JSON formatter online,” etc.
- Update your structured data whenever you add new tools.
Key Takeaways
- Sitemaps = tell crawlers where your pages are.
- Structured data = tell crawlers and LLMs what your pages actually contain.
- AI-friendly robots.txt = explicitly allow major AI crawlers.
- Promotion & backlinks = increase visibility and LLM citation likelihood.
- Content quality = still the most important factor for both humans and AI.
By combining traditional SEO with these LLM-focused techniques, your free dev tools page is more likely to appear:
- In Google search results.
- When someone asks AI systems like ChatGPT or Perplexity about free developer tools.
LiveReview helps you get great feedback on your PR/MR in a few minutes, saving hours on every PR with fast, automated first-pass reviews.
If you're tired of waiting for a peer to review your code, or aren't confident they'll provide valid feedback, give LiveReview a try.