The GitHub MCP Registry exists. It is a real piece of infrastructure now — vendors publish into it, enterprises mirror it, IDEs are starting to point at it for discovery.
For a developer, the problem with the registry today is that it is a webpage you scroll. Your agent reads it badly. Your security team queries it manually. Your platform automation cannot really call it as a tool because it does not look like one — it looks like a JSON catalog of metadata, with search-as-keyword.
This post is the engineering version of the argument: wrap the registry as a Naftiko capability, expose three surfaces from one spec, and turn it into something every consumer (humans, agents, pipelines) can call the same way.
## The shape of a Naftiko capability
A Naftiko capability YAML is a single artifact that declares two things:
- `consumes` — what upstream APIs the capability calls
- `exposes` — what surfaces the capability serves outward (REST, MCP, Agent Skills)
The Naftiko Engine runs the spec. The same artifact is the deployable unit, the documentation, and the governance contract. Specs do not drift from runtime, because the spec is the runtime artifact. The framework wiki has the long-form argument under Spec-Driven Integration.
For the GitHub MCP Registry, the capability shape is roughly:
```yaml
naftiko: "1.0.0-alpha2"

info:
  name: github-mcp-registry-discovery
  description: Conversational discovery surface over the GitHub MCP Registry
  version: "1.0.0"

consumes:
  github-registry:
    type: http
    baseUrl: https://api.github.com
    auth: github_app_installation
    operations:
      list-registry-entries:
        method: GET
        path: /repos/{{ .registry_owner }}/{{ .registry_repo }}/contents/servers
      get-registry-entry:
        method: GET
        path: /repos/{{ .registry_owner }}/{{ .registry_repo }}/contents/servers/{{ .entry_path }}
```
That is the consume side. The capability now knows how to read the registry through GitHub's Contents API. The next block is where the conversational discovery comes in — the exposes side.
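The templated paths above get their values from engine configuration at request time. Here is a minimal Python sketch of that substitution step; the helper name and the owner/repo values are illustrative, not Naftiko internals:

```python
import re

def resolve_path(template: str, config: dict) -> str:
    """Substitute {{ .key }} placeholders in an operation path with
    config values, mimicking what a spec engine does before issuing
    the upstream GitHub Contents API request."""
    return re.sub(
        r"\{\{\s*\.(\w+)\s*\}\}",
        lambda m: str(config[m.group(1)]),
        template,
    )

# Placeholder values; point these at whichever repo mirrors your registry.
config = {"registry_owner": "your-org", "registry_repo": "mcp-registry"}
path = resolve_path(
    "/repos/{{ .registry_owner }}/{{ .registry_repo }}/contents/servers",
    config,
)
print(path)  # /repos/your-org/mcp-registry/contents/servers
```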
## Three exposures, one spec
```yaml
exposes:
  rest:
    operations:
      find-mcp-servers:
        method: GET
        path: /v1/mcp-servers
        parameters:
          - name: purpose
            in: query
            schema: { type: string }
          - name: vendor
            in: query
            schema: { type: string }
          - name: review_status
            in: query
            schema: { type: string, enum: [approved, pending, revoked] }
        outputParameters:
          - name: servers
            type: array
            jsonpath: $.results[*]
            items:
              - { name: urn, jsonpath: $.urn }
              - { name: install_url, jsonpath: $.install.url }
              - { name: scopes, jsonpath: $.install.scopes[*] }
              - { name: review_status, jsonpath: $.governance.review.status }
              - { name: review_date, jsonpath: $.governance.review.date }

  mcp:
    transport: streamable-http
    tools:
      find-mcp-servers:
        description: Find MCP servers in the registry filtered by purpose, vendor, or review status
        inputSchema:
          type: object
          properties:
            purpose: { type: string, description: "What task the agent needs the server for" }
            vendor: { type: string, description: "Vendor name filter (optional)" }
            review_status: { type: string, enum: [approved, pending, revoked] }
        outputSchema:
          # shaped output, same projection as REST

  skills:
    package: github-mcp-registry-discovery
    skills:
      - find-mcp-servers
      - get-mcp-server-details
```
That single block emits three runtime surfaces:
- A REST endpoint your platform team's existing portal can render
- An MCP server your agent can call directly from the IDE
- A skill folder you can install into any agent that supports Agent Skills
Same source of truth. The Naftiko Framework Wiki documents the underlying Compose AI Context and Rightsize AI Context use cases that explain why the typed outputParameters matter — you are not handing a verbose registry envelope to the model, you are handing it task-ready fields.
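To make the projection concrete, here is a small Python sketch of what those `outputParameters` JSONPaths do to one raw registry entry. The envelope fields beyond the declared paths are made up for illustration, and real entries carry far more metadata:

```python
def project_entry(entry: dict) -> dict:
    """Shape one raw registry entry into the task-ready fields the
    capability's outputParameters declare (illustrative, not engine code)."""
    return {
        "urn": entry["urn"],                                        # $.urn
        "install_url": entry["install"]["url"],                     # $.install.url
        "scopes": list(entry["install"]["scopes"]),                 # $.install.scopes[*]
        "review_status": entry["governance"]["review"]["status"],   # $.governance.review.status
        "review_date": entry["governance"]["review"]["date"],       # $.governance.review.date
    }

raw = {  # hypothetical verbose envelope
    "urn": "urn:mcp:example:jira",
    "install": {"url": "https://example.com/mcp/jira", "scopes": ["read:issues"]},
    "governance": {"review": {"status": "approved", "date": "2025-11-01"}},
    "docs": {"readme": "...thousands of tokens the model never needs..."},
}
print(project_entry(raw)["review_status"])  # approved
```

The verbose `docs` blob never reaches the model; only the five declared fields do.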
## Why this matters for agents
The agent change is the one developers feel first.
Today, an agent that needs an MCP server fetches the registry page and parses it. The model burns tokens reading entries it will not use. The accuracy of selection depends on whether the model parsed the JSON correctly under context-window pressure.
After wrapping the registry, the agent calls a typed tool. find-mcp-servers accepts structured inputs (purpose, vendor, review_status) and returns a structured output. The model is no longer reading a webpage. It is making a tool call.
That is what right-sized AI context looks like in production. Not better prompts. Better tools that return better-shaped data, so the model has less to do.
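Concretely, a tool call over MCP is a JSON-RPC 2.0 request, not a page fetch. A sketch of what a client sends for the `find-mcp-servers` tool, with illustrative argument values:

```python
import json

# The MCP "tools/call" request an agent client issues on the model's behalf:
# typed arguments in, shaped result out, no registry page in the context window.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "find-mcp-servers",
        "arguments": {"purpose": "jira issue triage", "review_status": "approved"},
    },
}
print(json.dumps(request, indent=2))
```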
## Why this matters for vendor-neutrality
Two extensions hit the developer-portal market this year — Zuplo x-mcp and Redocly x-mcp. Both let you describe MCP server tools as OpenAPI extensions. Both are useful. Both lock you to whichever portal vendor ships them.
A Naftiko capability YAML is open. It emits valid OpenAPI for any portal that already speaks OpenAPI. The MCP catalog ships as a sibling artifact. Your custom Swagger UI client (or whatever you have) renders the REST half today, and the MCP half feeds whatever MCP-aware tooling you wire up — without picking either Zuplo or Redocly's Redoc fork.
The contract is portable. The vendor extension is not.
## Three places this shows up at runtime
Once the capability is shipped, three different consumers call it three different ways:
- A developer in VS Code asks Copilot "what MCP servers are approved for Jira?" — Copilot calls the MCP tool, returns a typed list, the developer installs the right server in two clicks
- An automation pipeline queries the REST endpoint to check whether a server is allow-listed before installing it in a sandbox — programmatic discovery, no HTML scraping
- A security reviewer uses the same MCP tool but filters by `review_status: pending` and `review_date` within the last 30 days — the same surface, a different question, and it returns the audit slice
The capability is not a single tool. It is a single contract that all three workflows query.
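The reviewer's query from the list above can be sketched as a small filter over the shaped output. Field names follow the capability's output shape; the helper and sample data are hypothetical:

```python
from datetime import date, timedelta

def audit_slice(servers: list[dict], status: str, max_age_days: int = 30) -> list[dict]:
    """Filter projected registry entries by review status and recency,
    the reviewer's question asked programmatically."""
    cutoff = date.today() - timedelta(days=max_age_days)
    return [
        s for s in servers
        if s["review_status"] == status
        and date.fromisoformat(s["review_date"]) >= cutoff
    ]

servers = [
    {"urn": "urn:mcp:example:jira", "review_status": "pending",
     "review_date": date.today().isoformat()},
    {"urn": "urn:mcp:example:old", "review_status": "pending",
     "review_date": "2020-01-01"},
]
print([s["urn"] for s in audit_slice(servers, "pending")])
# ['urn:mcp:example:jira']
```

The allow-list check the pipeline runs is the same call with `status="approved"`.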
## How to actually ship it
Steps:

1. Write the capability YAML — declare the GitHub Contents API as `consumes`, declare the three `exposes` surfaces
2. Add governance metadata — tags on `consumes` for vendor risk, tags on `exposes` for agent safety and scope
3. Run it under the Naftiko Engine (`docker run naftiko/framework:latest`)
4. Point your IDE at the MCP endpoint, point your portal at the REST endpoint, and install the skill package into agents
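Wiring a client is then a config entry. As one illustration, a VS Code-style `.vscode/mcp.json` pointing at the capability's MCP endpoint might look like this; the host, port, and `/mcp` path are assumptions, and the exact config schema depends on your client, so check its docs:

```json
{
  "servers": {
    "github-mcp-registry-discovery": {
      "type": "http",
      "url": "http://localhost:8080/mcp"
    }
  }
}
```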
The Naftiko Framework runs the spec. The framework wiki Tutorial Part 1 walks through the basic engine setup if you have not used Naftiko before.
## The closer
A registry is a list. A capability is a tool. The first thing every enterprise running MCP at any meaningful scale is going to need is a way to query its registry without scrolling — for humans, for agents, and for the automation in between.
Naftiko ships that contract today. One YAML, three surfaces, vendor-neutral.
Stop scrolling. Start calling.
Naftiko is the API integration framework for the AI era. Wrap your APIs as capabilities. Expose REST + MCP + Agent Skills from one spec. Ship governance, observability, and identity as defaults. → naftiko.io