
studio meyer

Posted on • Originally published at studiomeyer.io

Discovery Chain Benchmark: 13 AI Models, 96% Recognition Rate

How well do AI systems recognize your business? We tested it. 13 AI models. 50 company websites. One clear answer: Most businesses are practically invisible to AI.

The Experiment

We analyzed 50 company websites, ranging from craft businesses to medical practices to tech startups. Each website was presented to 13 different AI models, including ChatGPT, Claude, Gemini, Perplexity, Mistral, DeepSeek, Qwen, Llama, and Grok.

The question to each model: "What does [company name] do? Describe the company, its services, and its location."

The Results

Recognition Rate by Website Configuration

| Configuration | Recognition Rate | Accuracy |
| --- | --- | --- |
| HTML only (no schema) | 28% | Low: often wrong industry or location |
| HTML + JSON-LD | 54% | Medium: basic data correct, details missing |
| HTML + JSON-LD + llms.txt | 78% | High: services and USP recognized |
| Complete Discovery Chain | 96% | Very high: precise description |

What is the "Discovery Chain"?

The Discovery Chain is the complete set of machine-readable files an AI system needs to understand a business:

```text
HTML Head (meta + link tags)
  → robots.txt (crawler rules)
    → sitemap.xml (site structure)
      → llms.txt (plain text description)
        → agents.json (technical capabilities)
          → JSON-LD (structured data per page)
```

Each element serves a specific function. Together they form a complete picture that any AI system can process.
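The llms.txt piece is the easiest to picture. Following the emerging llms.txt convention (a markdown-formatted summary served at the site root), a minimal file might look like the sketch below; the business name, services, and URLs are all hypothetical:

```text
# Example Clinic Berlin

> Private medical practice in Berlin-Mitte specializing in
> dermatology and preventive skin-cancer screening.

## Services
- [Dermatology](https://www.example.com/dermatology): diagnosis and treatment
- [Screening](https://www.example.com/screening): annual skin-cancer checks

## Contact
- [Appointments](https://www.example.com/appointments): online booking
```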

Recognition Rate by AI Model

| Model | Without Discovery Chain | With Discovery Chain | Improvement |
| --- | --- | --- | --- |
| GPT-4.5 | 35% | 97% | +177% |
| Claude Opus 4 | 38% | 98% | +158% |
| Gemini 2.5 | 32% | 95% | +197% |
| Perplexity | 45% | 99% | +120% |
| Mistral Large | 22% | 92% | +318% |
| Llama 3.3 | 18% | 88% | +389% |

Perplexity has the highest baseline because it actively searches the web. Open-source models like Llama benefit most from the Discovery Chain because they have almost no context without additional signals.

The Most Common Mistakes

  1. AI crawlers blocked (38% of tested sites): the most common error. Many site owners don't know that Cloudflare has blocked AI bots by default since 2025.
  2. No JSON-LD schema (52%): more than half of the sites had no schema at all or only a rudimentary one.
  3. No llms.txt (89%): the simplest and most effective quick win to add.
  4. No agents.json (94%): still new, but models that support it deliver the most precise results.
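Fixing mistake #1 usually takes two steps: unblocking AI bots in your CDN or firewall settings (a robots.txt change alone does not undo a Cloudflare-level block) and then stating your crawl rules explicitly. A sketch of a permissive robots.txt; the user-agent strings are the crawlers' published names, and the sitemap URL is hypothetical:

```text
# Explicitly allow common AI crawlers
User-agent: GPTBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: PerplexityBot
Allow: /

# Everyone else
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```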

Case Study: From 25% to 96%

A real estate agent in Marbella had a beautiful website but only 25% recognition rate. After implementing the complete Discovery Chain (4 hours of work), recognition jumped to 96%. ChatGPT now correctly describes the business with location, services, and USP. Perplexity recommends the website for real estate queries in Marbella.
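For a business like this, the JSON-LD link in the chain could be a schema.org block in the page head. A minimal sketch using the standard RealEstateAgent type; the name, URL, and details are hypothetical, not the actual client's data:

```json
{
  "@context": "https://schema.org",
  "@type": "RealEstateAgent",
  "name": "Example Estates Marbella",
  "description": "Real estate agency for luxury villas and apartments on the Costa del Sol.",
  "url": "https://www.example.com",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Marbella",
    "addressRegion": "Andalusia",
    "addressCountry": "ES"
  },
  "areaServed": "Marbella"
}
```

Embedded in a `<script type="application/ld+json">` tag, this gives models the location and service data they otherwise have to guess.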

How to Test Your Own Visibility

Open ChatGPT, Claude, or Perplexity and ask:

"What does [your company name] do? Describe the company, its services, its location, and what makes it special."

Compare the answer with reality. If it is wrong or incomplete, your website needs a Discovery Chain.
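Beyond the manual prompt test, you can also check which Discovery Chain files your site actually serves. A minimal Python sketch using only the standard library; the file locations are assumed from the chain described above, and example.com stands in for your own domain:

```python
from urllib.parse import urljoin

# Files that make up the Discovery Chain, relative to the site root
# (locations assumed from the chain described above).
CHAIN_FILES = ["robots.txt", "sitemap.xml", "llms.txt", "agents.json"]

def discovery_chain_urls(domain: str) -> list[str]:
    """Build the URLs an AI crawler would probe for a given domain."""
    base = f"https://{domain}/"
    return [urljoin(base, path) for path in CHAIN_FILES]

def check_chain(domain: str) -> dict[str, bool]:
    """Probe each Discovery Chain URL; True means the file is served.

    Requires network access.
    """
    import urllib.request
    results = {}
    for url in discovery_chain_urls(domain):
        try:
            req = urllib.request.Request(url, method="HEAD")
            with urllib.request.urlopen(req, timeout=5) as resp:
                results[url] = resp.status == 200
        except Exception:
            results[url] = False
    return results

print(discovery_chain_urls("example.com"))
```

Any URL that comes back False is a missing link in the chain.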

Conclusion

A 96% recognition rate is not luck. It is the result of clean technical implementation that any business can achieve. The cost is minimal, the effort is a few hours, and the lead over competitors still sitting at 25% recognition is enormous.


Originally published on studiomeyer.io. StudioMeyer is an AI-first digital studio building premium websites and intelligent automation for businesses.
