A few weeks ago I launched patchBay, an API directory with 3,100+ entries. Think ProgrammableWeb but not dead.
The SEO play is obvious: good content, structured data, wait for Google to compound. But there's a second distribution channel that
didn't exist two years ago: AI coding assistants. When a developer asks Claude Code or Cursor "what's the best geocoding API with
a free tier," something has to answer that question. I wanted it to be patchBay.
That's where MCP comes in.
What MCP actually is
MCP (Model Context Protocol) is the open standard that lets AI assistants call external tools mid-conversation. When you ask Claude Code
to help you pick a library and it goes off and queries some external data source, MCP is usually what makes that possible. It's how you
wire your own data into the conversation context of tools like Claude, Cursor, and Cline.
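Under the hood it's JSON-RPC 2.0 over a transport (stdio for local servers, HTTP for remote ones). A tool invocation is just a message like this sketch — the tool name and arguments here are illustrative, not patchBay's actual wire traffic:

```typescript
// Sketch of the JSON-RPC 2.0 request an MCP client sends to invoke a tool.
// "tools/call" is the method name from the MCP spec; the arguments are made up.
const toolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_apis",
    arguments: { query: "geocoding", category: "geo" },
  },
};

console.log(JSON.stringify(toolCall));
```

The server replies with a matching JSON-RPC response whose result carries the tool output; the SDK handles all of this framing for you.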
For patchBay, this meant: instead of waiting for a developer to Google "best free weather API" and maybe find my site, their AI
assistant could query patchBay directly and return a real, structured answer.
What I built
A standalone Node.js MCP server, separate from the Next.js app, exposing four tools:
search_apis - keyword search across names and descriptions, with optional category filter. The bread and butter.
get_api - full detail lookup by slug or name, for when the assistant wants to dig into a specific entry.
list_categories - returns all 56 categories with counts, useful for narrowing a vague query.
filter_apis - structured filtering by category, auth type, HTTPS support, CORS, no-auth requirement. Useful when a developer has
specific constraints.
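The filtering behind filter_apis is nothing exotic. Here's an in-memory sketch of the predicate — the field names are my assumed shape of a directory entry, and on the real server these checks presumably become Supabase query clauses rather than a JS function:

```typescript
// Hypothetical entry shape; field names are assumptions, not the actual schema.
interface ApiEntry {
  name: string;
  category: string;
  auth: "apiKey" | "oauth" | "none";
  https: boolean;
  cors: boolean;
}

interface Filters {
  category?: string;
  auth?: ApiEntry["auth"];
  https?: boolean;
  cors?: boolean;
  noAuth?: boolean; // the "no-auth requirement": only APIs usable without credentials
}

// Every filter that is set must match; unset filters are ignored.
function matchesFilters(api: ApiEntry, f: Filters): boolean {
  if (f.category !== undefined && api.category !== f.category) return false;
  if (f.auth !== undefined && api.auth !== f.auth) return false;
  if (f.https !== undefined && api.https !== f.https) return false;
  if (f.cors !== undefined && api.cors !== f.cors) return false;
  if (f.noAuth && api.auth !== "none") return false;
  return true;
}
```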
All queries go straight to Supabase. No duplication of data access logic, no separate DB.
The setup
I used the official @modelcontextprotocol/sdk package. The server runs over stdio locally, which is what Claude Code and Cursor
expect for local MCP servers.
import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({
  name: "patchbay",
  version: "1.0.0",
});
Each tool is registered with a Zod schema for input validation:
server.registerTool(
  "search_apis",
  {
    description: "Search the patchBay directory for APIs by keyword",
    inputSchema: {
      query: z.string().describe("Search term"),
      category: z.string().optional().describe("Filter by category slug"),
      limit: z.number().optional().default(10),
    },
  },
  async ({ query, category, limit }) => {
    // Supabase query here
    const results: unknown[] = []; // placeholder: real results come from Supabase
    // MCP tools return their output as an array of content blocks
    return { content: [{ type: "text", text: JSON.stringify(results) }] };
  }
);

// Wire the server to stdio so Claude Code and Cursor can talk to it
const transport = new StdioServerTransport();
await server.connect(transport);
To connect it in Claude Code, you add it to your MCP settings with the Supabase env vars:
{
  "mcpServers": {
    "patchbay": {
      "command": "node",
      "args": ["/path/to/patchbay/mcp/dist/server.js"],
      "env": {
        "SUPABASE_URL": "...",
        "SUPABASE_ANON_KEY": "..."
      }
    }
  }
}
After that, you can ask Claude Code "what are the best free weather APIs?" and it queries patchBay in real time to answer.
Why this matters more than SEO
SEO compounds over 12 to 18 months. An MCP server works the moment a developer installs it. These are different distribution
channels with different timelines.
More importantly: a developer who has patchBay wired into their AI assistant will query it every time they need an API. That's not
a page view, that's a dependency. The stickiness is completely different.
The ecosystem is still early. Most directories and data sources have not built MCP servers yet. That gap closes quickly.
What's next
The server is live and documented at https://patchbay.cc/developers. Install instructions for Claude Code and Cursor are in the repo:
https://github.com/jeremieLouvaert/patchbay.
If you build something with it, or find APIs missing from the directory, submissions are open at https://patchbay.cc/submit.