Jonathan Bulkeley

Your Nginx Logs Are Your Sales Pipeline (In the Agentic Economy)

I've been running a pay-per-query financial data oracle for about six weeks — cryptographically signed price feeds and economic indicators, payable directly by AI agents. No accounts, no API keys, no subscriptions. Agents hit an endpoint, get an HTTP 402 Payment Required response, pay via Lightning Network or USDC on Base, and receive a signed attestation.

Quick context on payment protocols: L402 is a Lightning Network-based micropayment standard for APIs. x402 is an emerging standard from Coinbase that uses USDC on Base. Both allow AI agents to pay for API calls autonomously — no human in the loop, no OAuth dance, no billing portal.
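To make the L402 flow concrete: the 402 response carries a `WWW-Authenticate` challenge the agent must parse before it can pay. Here's a minimal sketch of extracting the macaroon and Lightning invoice from that header — the `L402 macaroon="...", invoice="..."` shape is my assumption (older servers use the `LSAT` scheme name), and the token values are placeholders:

```python
import re

def parse_l402_challenge(www_authenticate: str) -> dict:
    """Extract the macaroon and Lightning invoice from an L402 challenge.

    Assumes the header shape `L402 macaroon="...", invoice="..."`;
    real servers may use the older `LSAT` scheme name instead.
    """
    # Pull out every key="value" pair from the challenge header.
    fields = dict(re.findall(r'(\w+)="([^"]*)"', www_authenticate))
    return {
        "scheme": www_authenticate.split(" ", 1)[0],
        "macaroon": fields.get("macaroon"),
        "invoice": fields.get("invoice"),
    }

# Placeholder challenge, not a real macaroon or invoice.
challenge = 'L402 macaroon="opaque-token", invoice="lnbc10n1pexample"'
print(parse_l402_challenge(challenge)["invoice"])  # lnbc10n1pexample
```

The agent pays the invoice, then retries the request with the macaroon plus payment preimage as its credential — no account, no human.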

I built it solo. No marketing. No announcements beyond a GitHub repo and a directory listing.

And then I started watching the logs.

What I Expected

Silence, mostly. Maybe a few curious developers. Hopefully a paying client or two eventually.

What Actually Happened

Bots found me. Paying bots. Within days of going live.

A Hetzner client running python-requests started hitting BTC/USD and BTC/USD VWAP every 30 seconds. It paid. Real micropayments, both rails verified. It eventually ran out of funds and queued for more. I have no idea who operates it or how it found me — it just appeared in the logs one day and started paying.

A GCP-hosted service started doing systematic HEAD+GET probes on all 56 endpoints. Every endpoint, in sequence, multiple times a day. It's been doing this for two weeks. It hasn't paid yet — but that pattern looks exactly like an engineering team doing pre-integration due diligence.

OpenAI's GPTBot crawled my API root. Meta's external agent crawler found specific endpoint URLs and hit them directly — BTC/USD, ETH/USD, SOL/USD, XAU/EUR. Both found me through backlinks from payment API directories I didn't even know I was listed on.

None of this came from outreach. It came from watching the logs.

The Moment That Changed How I Think About This

Two days ago, at 17:43 UTC, this appeared in my nginx access log:
```
2a02:8424:... "GET /openapi.json HTTP/1.1" 404
2a02:8424:... "GET /llms.txt HTTP/1.1" 404
2a02:8424:... "GET /llms.txt HTTP/1.1" 404
2a02:8424:... "GET /openapi.json HTTP/1.1" 404
2a02:8424:... "GET / HTTP/1.1" 200
2a02:8424:... "GET / HTTP/1.1" 200
2a02:8424:... "POST / HTTP/1.1" 405
2a02:8424:... "POST / HTTP/1.1" 405
```
User agent: node. Eight requests, then gone.

This was an autonomous agent doing API discovery. The pattern is textbook: try standard machine-readable manifests first (openapi.json, llms.txt), fall back to root when those fail, try to interact with what it found (POST).
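That probe order — manifests first, root as fallback — can be sketched as a tiny loop. This is my reconstruction of the behavior in the log excerpt, not any published agent's actual code; `fetch` stands in for whatever HTTP client the agent uses:

```python
from typing import Callable, Optional

# Probe order reconstructed from the log excerpt: machine-readable
# manifests first, then fall back to the root document.
DISCOVERY_PATHS = ["/openapi.json", "/llms.txt", "/"]

def discover(fetch: Callable[[str], Optional[int]]) -> Optional[str]:
    """Return the first path that answers 200, mimicking the probe order.

    `fetch` is any callable mapping a path to an HTTP status code.
    """
    for path in DISCOVERY_PATHS:
        if fetch(path) == 200:
            return path
    return None

# Simulated server matching the log excerpt: manifests 404, root 200.
statuses = {"/openapi.json": 404, "/llms.txt": 404, "/": 200}
print(discover(statuses.get))  # /
```

In the log excerpt, the agent landed on `/`, got an HTML default page it couldn't use, tried a POST anyway, and gave up.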

It got 404s on the discovery files because those live on myceliasignal.com, not api.myceliasignal.com. It read my API root but got the nginx default page — completely useless to a machine. It tried to POST to root, got 405, and left.

It came, tried to integrate, failed, and left. Silently. Without ever telling me.

If I hadn't been watching the logs I never would have known. I fixed both issues within the hour:

```nginx
# Add to your API subdomain nginx config
location = /openapi.json {
    return 301 https://yourdomain.com/openapi.json;
}

location = /llms.txt {
    return 301 https://yourdomain.com/llms.txt;
}

location = / {
    default_type application/json;
    return 200 '{"name":"Your Service","description":"What you do","docs":"https://yourdomain.com","openapi":"https://yourdomain.com/openapi.json","health":"https://api.yourdomain.com/health"}';
}
```
The next morning, the same agent came back, hit GET /, got the JSON manifest, and left with everything it needed.

The Next Morning

A different IP hit /.well-known/agent.json at 04:30 UTC and got a 404. I'd never heard of agent.json. Twenty minutes of research later: it's Google's Agent2Agent (A2A) protocol discovery standard. Agents that implement A2A look for this file at startup to understand what capabilities a service offers.
Here's the structure:
```json
{
  "name": "Your Service",
  "description": "What your API does",
  "url": "https://api.yourdomain.com",
  "version": "1.0.0",
  "capabilities": {
    "streaming": false,
    "pushNotifications": false
  },
  "authentication": {
    "schemes": ["Bearer"],
    "description": "How agents authenticate"
  },
  "skills": [
    {
      "id": "your_skill",
      "name": "Skill Name",
      "description": "What this skill does",
      "examples": ["GET /api/resource"]
    }
  ],
  "docs": "https://yourdomain.com/docs",
  "openapi": "https://yourdomain.com/openapi.json"
}
```
Serve it at /.well-known/agent.json:
```nginx
location = /.well-known/agent.json {
    alias /var/www/well-known/agent.json;
    default_type application/json;
}
```
I built the file and deployed it to both geographic nodes. The whole loop — log anomaly → research → fix → deploy — took about an hour. It took longer to research the standard than to implement it.
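Before deploying a hand-written agent card, it's worth sanity-checking it. Here's a small sketch that verifies the top-level fields are present; the required-field list is an assumption based on the structure above, not an exhaustive reading of the A2A spec:

```python
import json

# Assumed required fields, taken from the agent card structure above.
REQUIRED = ["name", "description", "url", "version", "skills"]

def validate_agent_card(raw: str) -> list:
    """Return the list of missing top-level fields in an A2A agent card."""
    card = json.loads(raw)  # raises ValueError on malformed JSON
    return [f for f in REQUIRED if f not in card]

card = (
    '{"name": "Mycelia Signal", "description": "Signed price feeds",'
    ' "url": "https://api.example.com", "version": "1.0.0", "skills": []}'
)
print(validate_agent_card(card))  # []
```

Running this in a deploy script catches the silent failure mode this whole post is about: an agent fetching a card that parses but is missing the field it needs.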

What the Logs Have Taught Me

Agents don't browse. They probe. Human users explore: they click around, read docs, maybe send an email. Agents execute a discovery protocol, check a few well-known paths, and either find what they need or move on. If your infrastructure isn't right, they don't complain — they just leave.

Your discovery stack is your storefront.

In six weeks I've added: llms.txt, openapi.json redirect, /.well-known/x402, /.well-known/agent.json (A2A), a root JSON manifest, and multiple directory listings. Every single one came from a log entry showing an agent looking for something I didn't have.

The silence is misleading.

Before I started watching the logs I thought I had maybe 5-10 external visitors a day. The actual number was closer to 10,000 automated hits. Most were 402s — unpaid probes. But the signal was there in the noise: who was looking, what they were looking for, and what they couldn't find.

Pre-integration evaluation looks like traffic.

The GCP service doing HEAD+GET on all 56 endpoints isn't noise — it's potentially a team deciding whether to integrate. That's a sales conversation happening entirely in my access logs. I can't talk to them. I can't send them a pitch deck. The only thing I can do is make sure every probe returns the right response.

The Practical Checklist

If you're running any API you want agents to discover and use:
Serve these at your API domain:
| Path | Purpose |
|------|---------|
| `/.well-known/agent.json` | A2A agent card — increasingly standard |
| `/.well-known/openid-configuration` | If you use OAuth |
| `/openapi.json` | Or redirect to where it lives |
| `/llms.txt` | Machine-readable capability description for LLMs |
Fix your root response:
GET / should return a machine-readable JSON manifest, not an nginx default page or HTML. Include your name, description, docs URL, OpenAPI URL, and a health endpoint. This is the fallback when everything else fails — and agents do fall back to it.
Watch your logs for these patterns:
```bash
# 404s on well-known paths = an agent looking for something you don't have
sudo grep " 404 " /var/log/nginx/access.log | grep -E "well-known|openapi|llms"

# Systematic HEAD+GET sweeps = evaluation traffic
sudo grep '"HEAD' /var/log/nginx/access.log | awk '{print $1}' | sort | uniq -c | sort -rn

# Unknown user agents = new frameworks and directories
sudo awk -F'"' '{print $6}' /var/log/nginx/access.log | sort | uniq -c | sort -rn | grep -Ev "Mozilla|curl|python"
```
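The same checks are easy to script once the greps outgrow one-liners. Here's a hedged sketch of the first pattern in Python: count 404s on discovery paths from combined-format access log lines. The regex is a loose match for the common nginx log format, not a full parser:

```python
import re
from collections import Counter

# Loose match for nginx combined log format: client, method, path, status.
LOG_RE = re.compile(r'^(\S+) .*? "(\w+) (\S+) [^"]*" (\d{3})')

def wellknown_404s(lines) -> Counter:
    """Count 404s on discovery paths: agents looking for files you lack."""
    hits = Counter()
    for line in lines:
        m = LOG_RE.match(line)
        if not m:
            continue
        ip, method, path, status = m.groups()
        if status == "404" and re.search(r"well-known|openapi|llms", path):
            hits[path] += 1
    return hits

# Synthetic sample lines in the shape of the log excerpt above.
sample = [
    '2a02:8424:: - - [01/Jan/2025] "GET /llms.txt HTTP/1.1" 404 0',
    '2a02:8424:: - - [01/Jan/2025] "GET / HTTP/1.1" 200 512',
]
print(wellknown_404s(sample))  # Counter({'/llms.txt': 1})
```

Point it at `open("/var/log/nginx/access.log")` and any nonzero counter entry is a discovery file you should be serving.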

The Bigger Point

The agentic economy changes the discovery model completely. Human customers find you through search, word of mouth, developer communities. Agents find you through protocol — well-known URLs, directory listings, OpenAPI specs, capability manifests.
If you're building APIs that you want AI agents to consume, your nginx logs aren't ops hygiene. They're your sales pipeline. The next customer might already be in there, probing your endpoints at 3am, deciding whether to integrate.

Watch the logs. Carefully.

I'm building Mycelia Signal — a sovereign cryptographic oracle with 56 signed endpoints, payable by AI agents via Lightning (L402) or USDC on Base (x402). Happy to answer questions about the payment protocol implementation or agent discovery stack in the comments.
