hajdup889
Why Your Website Needs a Robot Trust Certificate (And How to Get One for Free)

AI agents are fast becoming a primary way people access web content, but websites have no standard way to say "I'm AI-friendly, here's my structured data."

robots.txt tells crawlers what they can't access. But nothing tells them what you want them to read, how to trust your content, or whether you have a machine-readable version of your site.

I built Robot Trust Hub to solve this.

How it works

1. Register your domain at robot-trust.org.
2. Download the generated robots-trust.json certificate.
3. Place it on your server at /.well-known/robots-trust.json.
4. AI agents that check for it will treat your site as a verified, preferred source.

The Basic plan is free: you fill in your details, download the JSON, and upload it to your server.
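Once the certificate is in place, an agent can discover it with a single request to the well-known path. A minimal Python sketch (the field names follow the robots-trust.json example later in this post; the client code itself is my illustration, not an official SDK):

```python
# Sketch of how an AI agent might look up a site's trust certificate.
import json
import urllib.request

WELL_KNOWN_PATH = "/.well-known/robots-trust.json"


def certificate_url(domain: str) -> str:
    """Build the well-known URL where the certificate is expected to live."""
    return f"https://{domain}{WELL_KNOWN_PATH}"


def fetch_certificate(domain: str, timeout: float = 5.0):
    """Fetch and parse the certificate; return None if the site has none."""
    try:
        with urllib.request.urlopen(certificate_url(domain), timeout=timeout) as resp:
            return json.load(resp)
    except (OSError, ValueError):  # network error or malformed JSON
        return None
```

Sites without the file simply return None here, so checking for the certificate is cheap and fails safe.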

The Pro plan ($9/mo) auto-generates a clean ai/index.json from your existing HTML: a machine-readable shadow of your human site that robots can read instead of scraping it.
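For context, a purely hypothetical sketch of what such an ai/index.json could contain (the actual format the Pro plan emits isn't shown in this post, so every field below is an assumption):

```json
{
  "site": "https://yourdomain.com",
  "generated_from": "html",
  "pages": [
    {
      "url": "https://yourdomain.com/pricing",
      "title": "Pricing",
      "summary": "Plans and prices in machine-readable form."
    }
  ]
}
```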

The standard

```json
{
  "robot_trust_version": "1.0",
  "trust_status": { "certificate_status": "verified" },
  "access_points": { "preferred_entry": "https://yourdomain.com/ai/" }
}
```
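An agent consuming this document only needs a couple of field checks before trusting a site. A small Python sketch (which fields count as required is my reading of the example above, not a published schema):

```python
def is_verified(cert: dict) -> bool:
    """Return True only for a known version with a 'verified' status.

    The required fields here are an assumption based on the example
    certificate in this post.
    """
    return (
        cert.get("robot_trust_version") == "1.0"
        and cert.get("trust_status", {}).get("certificate_status") == "verified"
    )


def preferred_entry(cert: dict):
    """Return the machine-readable entry point the agent should prefer."""
    return cert.get("access_points", {}).get("preferred_entry")
```

Using `.get()` with defaults means a partial or malformed certificate degrades to "not verified" rather than raising.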

Would love feedback from the dev community — does this solve a real problem, or is it too early?

robot-trust.org
