MCP crossed 97 million monthly SDK downloads, making it the universal standard for AI tool integration
Anthropic donated MCP to the Linux Foundation's new Agentic AI Foundation alongside OpenAI and Block
10,000+ active public MCP servers exist across databases, APIs, browsers, file systems, and dev tools
Every major AI platform now supports MCP: Claude, ChatGPT, Gemini, Copilot, Cursor, VS Code
The protocol went from Anthropic experiment to industry standard in 12 months
New spec features include async operations, server identity, and a community-driven registry
MCP turns every API into a tool any AI agent can use without custom integration code
MCP Hit 97 Million Downloads. The Protocol War Is Over Before It Started.
A year ago, Anthropic released a protocol for connecting AI models to external tools. It felt like another standard competing for attention. One more spec in a world drowning in specs.
97 million monthly SDK downloads later, the competition is not competing anymore. They are adopting.
From Experiment to Infrastructure in 12 Months
The Model Context Protocol started as Anthropic's answer to a specific problem: AI models are smart but isolated. They can reason about code, answer questions, write documents. But they cannot check your database, read your files, or call your APIs without custom glue code for every single integration.
MCP standardized that glue. One protocol. Any tool. Any AI model.
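Under the hood, "one protocol" means JSON-RPC 2.0. A minimal sketch of what a tool invocation looks like on the wire — the envelope and the tools/call method follow the MCP specification, but the specific tool name and arguments here are made up for illustration:

```python
import json

# A minimal MCP tool invocation as it travels from client to server.
# The envelope is standard JSON-RPC 2.0; "tools/call" with name/arguments
# params follows the MCP spec. The tool ("query_database") and its
# arguments are illustrative, not a real server's schema.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_database",
        "arguments": {"sql": "SELECT count(*) FROM users"},
    },
}

print(json.dumps(request, indent=2))
```

Because every client and server speaks this same shape, a tool written once works with any model that implements the protocol.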
The adoption curve was steep. Claude supported it first, obviously. Then Cursor added it. Then VS Code. Then ChatGPT. Then Gemini. Then Microsoft Copilot. At some point, every major AI platform decided the same thing: building a proprietary tool protocol was not worth the fight.
The numbers in March 2026: 97 million monthly SDK downloads across Python and TypeScript. Over 10,000 active public MCP servers. Official SDKs in every major programming language.
Those are not vanity metrics. 10,000 servers means 10,000 tools that any MCP-compatible AI can use out of the box. Your database. Your CI pipeline. Your monitoring stack. Your browser. Connected to whichever AI you prefer, through the same protocol.
What the Agentic AI Foundation Actually Is
In December 2025, Anthropic did something unusual for a company with a dominant protocol: they gave it away. MCP was donated to the Linux Foundation's new Agentic AI Foundation (AAIF).
The founding members:
Anthropic (donating MCP)
Block (donating goose, their open-source AI agent)
OpenAI (donating AGENTS.md, their agent definition spec)
Supporting organizations: Google, Microsoft, AWS, Cloudflare, Bloomberg.
Read that list again. Anthropic and OpenAI are direct competitors. Google and Microsoft are direct competitors. All of them backing the same foundation, the same protocol, the same governance structure.
That does not happen because everyone suddenly got generous. It happens because the cost of fragmenting the ecosystem became higher than the cost of cooperating. When every platform builds its own tool protocol, every tool developer has to write integrations for each one. Nobody wins that game except whoever has the most developer relations headcount.
MCP under Linux Foundation governance means neutral ground. No single company controls the spec. Maintainers operate independently. The foundation provides infrastructure, not technical direction.
Why 97 Million Matters
To understand why this number is significant, compare it to other developer infrastructure adoption curves.
Docker Hub hit 100 billion pulls total, but that took years of gradual adoption. npm serves billions of downloads per month now, but the early growth was measured in thousands.
MCP reached 97 million monthly downloads in its first year. Monthly. That pace puts it in the same category as foundational developer tooling, not a niche AI library.
The reason is mechanical. Every AI application that wants to interact with the real world needs tool integration. Before MCP, that meant building custom connectors. After MCP, you install an SDK and connect to existing servers. The value proposition is not "this is cool." The value proposition is "this saves you three weeks of integration work per tool."
Multiply that across 10,000 available servers and you start to see why the downloads compound. Each new server makes the protocol more valuable. Each new client makes building servers more worthwhile. Classic network effect, except the network is AI tools instead of social connections.
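The compounding is easy to quantify. Without a shared protocol, every client-tool pair needs its own connector; with one, each side implements the protocol once. A back-of-the-envelope sketch (the client count is illustrative):

```python
clients = 6        # AI platforms: Claude, ChatGPT, Gemini, Copilot, ...
servers = 10_000   # public MCP servers

# Without a standard: one bespoke connector per client-server pair.
bespoke_integrations = clients * servers

# With MCP: each client and each server implements the protocol once.
mcp_implementations = clients + servers

print(bespoke_integrations)  # 60000
print(mcp_implementations)   # 10006
```

The gap widens with every new client or server added, which is why adoption accelerates rather than plateaus.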
What MCP Servers Actually Look Like
If you have not built or used an MCP server, here is the practical picture.
An MCP server exposes tools, resources, and prompts to any connected AI. A Postgres MCP server lets your AI query your database. A GitHub MCP server lets it read issues and create PRs. A Stripe MCP server lets it check payment status. A file system server lets it read and write local files.
The server runs locally or remotely. The AI connects through the protocol. No custom code on the AI side. No API-specific integration work.
claude mcp add postgres -- npx -y @modelcontextprotocol/server-postgres
claude mcp add github -- npx -y @modelcontextprotocol/server-github
Two commands. Your AI can now query your database and manage your GitHub repos. Before MCP, that required building custom tool definitions, handling auth, parsing responses, and managing errors for each integration separately.
The newest spec additions make it even more practical. Async operations mean long-running tools do not block the conversation. Server identity provides authentication and trust. The community-driven registry makes discovering servers as easy as searching npm.
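The async behavior follows the familiar start-then-collect shape. A stdlib-only sketch of the idea — this models the pattern, not MCP's exact wire format for async operations:

```python
import asyncio

async def slow_tool() -> str:
    """Stand-in for a long-running tool, e.g. a full test-suite run."""
    await asyncio.sleep(0.1)
    return "all checks passed"

async def conversation() -> str:
    # Kick off the tool without blocking: the conversation can keep
    # going while the work runs in the background.
    task = asyncio.create_task(slow_tool())
    # ... handle other exchanges here ...
    # Collect the result once it is ready.
    return await task

print(asyncio.run(conversation()))  # all checks passed
```
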
What Changed for Claude Code Users
If you use Claude Code daily, MCP is already part of your workflow whether you configured it manually or not. Claude Code ships with built-in MCP support. Every tool you add through claude mcp add uses the protocol.
But the ecosystem growing to 10,000+ servers changes the game. A year ago, you had maybe a dozen useful MCP servers. Now there are servers for Slack, Linear, Jira, Figma, Notion, Sentry, every major database, most cloud services, and hundreds of niche tools.
The practical impact: tasks that used to require leaving your terminal now happen inside your conversation. Check deployment status, review a PR, query production logs, update a ticket. All through MCP servers, all without switching context.
The Agentic AI Foundation governance also matters for trust. Enterprise teams that were hesitant about depending on a protocol controlled by a single AI company now have Linux Foundation governance backing it. That changes procurement conversations.
The Three Founding Projects
The AAIF launched with three projects, each solving a different piece of the agentic AI puzzle.
MCP handles tool integration. How does an AI use external tools? Through MCP servers.
goose (from Block) is an open-source AI agent framework. How do you build an agent that uses those tools autonomously? With goose.
AGENTS.md (from OpenAI) defines agent behavior and capabilities. How do you describe what an agent can do, so other systems can interact with it? Through AGENTS.md.
Together, they cover the stack: define the agent, connect it to tools, let it act. Three projects from three competing companies, unified under one foundation.
Security and Trust at Scale
10,000 servers is great for productivity. It is also 10,000 potential attack surfaces. MCP's new server identity feature addresses this directly. Servers can now authenticate themselves, and clients can verify who they are talking to before sending data.
This matters because MCP servers often get access to sensitive systems. A Postgres server can read your production database. A GitHub server can push code. A Slack server can send messages as you. If a malicious server impersonates a legitimate one, the damage is real.
The Linux Foundation governance helps here too. A community-driven registry with verification adds a layer of trust that a single company's registry cannot. When enterprise security teams evaluate MCP adoption, "governed by the Linux Foundation" carries weight that "maintained by Anthropic" does not, no matter how good Anthropic's security practices are.
The async operations in the latest spec also reduce risk. Long-running tools no longer block the entire conversation. If a server hangs or misbehaves, the client can timeout and move on instead of waiting indefinitely. Small feature, big resilience improvement.
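The client-side resilience described above is standard bounded-wait logic. A stdlib sketch of "timeout and move on" — the timeout value and the hanging tool are illustrative, not MCP SDK API:

```python
import asyncio

async def hanging_tool() -> str:
    """Simulates a misbehaving MCP server that never responds."""
    await asyncio.sleep(3600)
    return "never reached"

async def call_with_timeout() -> str:
    try:
        # Give the tool a bounded window instead of waiting forever.
        return await asyncio.wait_for(hanging_tool(), timeout=0.1)
    except asyncio.TimeoutError:
        return "tool timed out; moving on"

print(asyncio.run(call_with_timeout()))  # tool timed out; moving on
```
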
What to Watch Next
The 97 million number will keep climbing. The more interesting metrics are:
How many enterprise-grade MCP servers ship in 2026? Consumer and developer tools are well covered. Enterprise integrations (SAP, Salesforce, ServiceNow, internal tools) are still early.
Does the registry become the npm of AI tools? A searchable, versioned registry of MCP servers would accelerate adoption dramatically. The community-driven registry is in early stages.
Do competing protocols emerge anyway? Google and OpenAI both backed AAIF, but that does not prevent them from adding proprietary extensions. If the foundation governance is strong, extensions flow back to the spec. If not, fragmentation creeps in through the side door.
The protocol war that people expected never materialized. Instead of five competing standards, the industry coalesced around one in 12 months. That almost never happens in tech. Usually standards wars drag on for years (USB-C, web standards, container formats).
MCP got there fast because the alternative was worse for everyone. That is the strongest possible foundation for a standard: not that everyone loves it, but that nobody wants to live without it.
Building Your First MCP Server
If you have not built one yet, the barrier is lower than you think. An MCP server is a Python or TypeScript program that registers tools and handles requests. Here is the skeleton:
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("my-tool")

@mcp.tool()
async def check_status(service: str) -> str:
    """Report whether a service is running."""
    # Your logic here: ping the service, query an endpoint, etc.
    return f"{service} is running"

if __name__ == "__main__":
    mcp.run()
Register it with your AI client, and the tool is available in every conversation. The protocol handles discovery, parameter validation, and response formatting. You write the business logic. MCP handles the plumbing.
The practical advice: start with something you check manually every day. Deployment status, database health, queue depth, whatever pulls you out of your editor. Build an MCP server for it. That first server teaches you the protocol. The second one takes half the time. By the third, you are connecting things you never thought to automate.
The ecosystem is a year old and already has 10,000 servers. The next 10,000 will include the weird, specific, internal tools that only make sense for your team. That is where MCP stops being infrastructure and starts being a competitive advantage.
The Standard Nobody Planned
A year ago, nobody predicted this. Anthropic released a protocol. Competitors could have ignored it, built alternatives, or waited it out. Instead, within months, every major platform adopted it. Not because Anthropic is special, but because the problem MCP solves is universal enough that one standard serves everyone better than five.
The Agentic AI Foundation formalizes what was already true in practice: MCP is not Anthropic's protocol anymore. It belongs to the ecosystem. The 97 million monthly downloads are a trailing indicator. The leading indicator is that nobody is building an alternative.
The protocol war ended. The building starts now.