Peter
I Built a Validator for a Protocol That Didn't Exist 8 Weeks Ago

In January 2026, Google and Shopify announced the Universal Commerce Protocol (UCP) at NRF - an open standard that lets AI agents like ChatGPT, Gemini, and Copilot discover products, browse catalogs, and complete purchases on e-commerce stores.

Within hours, I knew there'd be a tooling gap. Merchants would need to validate their UCP profiles. Developers would need to test their implementations. And nobody was building it yet.

So I did. Here's what 8 weeks of building on an emerging standard actually looks like.

Week 1: Racing the Spec

The UCP spec was published alongside the announcement. I read the whole thing that weekend. The core concept is elegant: serve a JSON manifest at /.well-known/ucp that tells AI agents what your store can do.

{
  "version": "2026-01-11",
  "capabilities": [
    { "name": "checkout", "version": "1.0" }
  ],
  "payment_handlers": [
    { "type": "stripe" }
  ]
}

The first validator was 200 lines of TypeScript. Parse JSON, check required fields, validate URLs. Ship it.
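That week-one version can be sketched in a few dozen lines. The required field names below follow the example manifest above; everything else (the issue format, the URL heuristic) is my own guess, not the actual ucptools.dev code:

```typescript
// Sketch of a minimal UCP profile validator: parse JSON, check required
// fields, validate URLs. Field names follow the example manifest; the
// issue shape is hypothetical.
interface Issue {
  path: string;
  message: string;
}

function validateProfile(raw: string): Issue[] {
  const issues: Issue[] = [];
  let profile: any;
  try {
    profile = JSON.parse(raw);
  } catch {
    return [{ path: "$", message: "not valid JSON" }];
  }
  // Required top-level fields, per the example manifest
  for (const field of ["version", "capabilities", "payment_handlers"]) {
    if (!(field in profile)) {
      issues.push({ path: `$.${field}`, message: "required field missing" });
    }
  }
  // Heuristic: any string field ending in "_url" must parse as a URL
  for (const [key, value] of Object.entries(profile)) {
    if (key.endsWith("_url") && typeof value === "string") {
      try {
        new URL(value);
      } catch {
        issues.push({ path: `$.${key}`, message: "not a valid URL" });
      }
    }
  }
  return issues;
}
```

The entire feature set is a pure function from string to issue list - which is exactly why it shipped in a weekend.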

Lesson learned: Perfect is the enemy of shipped. The first version had exactly one feature: paste JSON, see errors. That was enough.

Week 2: Four Levels of Validation

The simple JSON checker wasn't enough. Merchants had profiles that looked valid but failed in practice - wrong URL formats, HTTP instead of HTTPS, missing signing keys.

I built a 4-level validation pipeline:

  1. Structural - Is the JSON valid? Are required fields present?
  2. Rules - Do namespaces match origins? Are endpoints HTTPS?
  3. Network - Can we actually fetch the referenced schemas?
  4. SDK - Does it pass the official @ucp-js/sdk compliance check?

Each level catches errors the previous one can't. A profile can pass structural validation but fail at the network level because its schema URLs are dead.
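The pipeline shape can be sketched like this, with the structural level implemented and later levels injected as callbacks. The level names come from the list above; the check logic is an assumption (and the real network and SDK levels are async):

```typescript
// Sketch of the 4-level pipeline: each level runs only if the previous
// one passed, since e.g. rule checks are meaningless on malformed JSON.
// Level names follow the post; the check logic is hypothetical.
interface LevelResult {
  level: string;
  ok: boolean;
  errors: string[];
}

type Check = (profile: any) => LevelResult;

function structural(profile: any): LevelResult {
  const errors: string[] = [];
  if (typeof profile !== "object" || profile === null) {
    errors.push("profile is not a JSON object");
  } else {
    if (!profile.version) errors.push("missing version");
    if (!Array.isArray(profile.capabilities)) {
      errors.push("capabilities must be an array");
    }
  }
  return { level: "structural", ok: errors.length === 0, errors };
}

function runPipeline(profile: any, levels: Check[]): LevelResult[] {
  const results: LevelResult[] = [];
  for (const check of levels) {
    const result = check(profile);
    results.push(result);
    if (!result.ok) break; // later levels assume earlier ones passed
  }
  return results;
}
```

Short-circuiting between levels is the important design choice: a dead schema URL at the network level is only worth reporting once you know the JSON underneath it is sound.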

Week 3: First Users from Google Search

This surprised me. Within 3 weeks of launching, Google was sending organic traffic for "ucp validator" and "ucp checker."

The lesson: when you build tooling for an emerging standard before anyone else, SEO is almost free. There's no competition for keywords nobody has targeted yet.

Traffic grew from 0 to ~230 sessions/week in the first month. Not viral, but consistent.

Week 4: The Agent Simulator

Validation tells you if your profile is correct. But does your store actually work for AI agents?

I built an agent simulator that tests the full workflow:

  • Discovery - fetch /.well-known/ucp like a real agent would
  • Capability inspection - parse capabilities and verify endpoints
  • Checkout simulation - test the checkout flow end-to-end

This caught a whole category of bugs that static validation missed: endpoints returning HTML instead of JSON, timeouts on product APIs, CORS blocking agent requests.
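The discovery step can be sketched like this, with the HTTP layer injected so the flow is testable offline. The response shape and specific checks are my assumptions, not the actual simulator:

```typescript
// Sketch of the simulator's discovery step: fetch /.well-known/ucp the
// way an agent would, and flag the failure modes static validation
// misses, such as HTML served where JSON is expected.
interface HttpResponse {
  status: number;
  contentType: string;
  body: string;
}

type Fetcher = (url: string) => HttpResponse;

interface DiscoveryResult {
  ok: boolean;
  reason?: string;
  profile?: any;
}

function discover(origin: string, fetchFn: Fetcher): DiscoveryResult {
  const res = fetchFn(`${origin}/.well-known/ucp`);
  if (res.status !== 200) {
    return { ok: false, reason: `expected 200, got ${res.status}` };
  }
  if (!res.contentType.includes("application/json")) {
    // catches endpoints returning HTML instead of JSON
    return { ok: false, reason: `expected JSON, got ${res.contentType}` };
  }
  try {
    return { ok: true, profile: JSON.parse(res.body) };
  } catch {
    return { ok: false, reason: "body is not valid JSON" };
  }
}
```

A real implementation would use fetch with a timeout and surface CORS and latency failures the same way; keeping the transport injectable is what makes the whole workflow testable.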

Week 5-6: The Monetization Experiment

Free tools get traffic. But can they generate revenue?

I added a $9/month monitoring tier:

  • Weekly auto-validation of your UCP profile
  • Email alerts when your score drops
  • Validation history and trend tracking
  • AI agent traffic analytics

The result so far: 9 trial signups, 0 paid conversions. Users explore the dashboard for 5-10 minutes, find value, and... never come back.

The problem isn't the product - it's the trial experience. Users get full value in one session. There's no reason to return until something breaks. And by then, they've forgotten about us.

I'm currently building better retention hooks: personalized Day 2 emails showing their actual scan results, "next scan" countdowns on the dashboard, and more visible upgrade prompts.

What I'd Do Differently

1. Build the monitoring first, not the validator.

The one-time validator is useful but commoditizable. Three other validators launched within weeks. The monitoring - weekly scans, alerts, history - is the defensible moat.

2. Ship the email sequence on day one.

I launched without trial nurture emails. For 3 weeks, every trial user signed up and... silence. No welcome email, no quick start guide, no reminder. By the time I fixed it, those users were gone.

3. Don't underestimate competitors building on the same trend.

Within 8 weeks, at least 3 other UCP validators appeared: UCPChecker.com (with a browser extension), Nextwaves (40+ test claims), and several WooCommerce plugins. First-mover advantage is real but temporary.

The Numbers (8 Weeks In)

  • Weekly sessions: ~120
  • Total trial signups: 9
  • Paid conversions: 0
  • Monthly cost: $5 (Hetzner VPS)
  • Revenue: $0
  • Tools built: 7 (validator, simulator, generator, security scan, feed analyzer, ACP checker, schema generator)
  • Platform guides: 6 (Shopify, WooCommerce, Magento, BigCommerce, Wix, Squarespace)

Am I profitable? No. Am I building something people use? Yes. The gap between those two is where the real work happens.

What's Next

The agentic commerce space is heating up. Visa launched Intelligent Commerce. Mastercard has Agent Pay. Meta is building shopping into its AI. McKinsey projects $1 trillion in AI-driven commerce by 2030.

UCP is still early. Most e-commerce stores don't have a profile yet. But the ones that do will have a head start when AI agents become the primary shopping interface.

If you want to check where your store stands: ucptools.dev - free, no signup required.


I'm building UCPtools in public. Ask me anything in the comments.

UCPtools is an independent community tool. UCP is an open standard by Google, Shopify, and partners.
