Unlock the power of your existing APIs — make them AI-agent friendly, discoverable, and usable by intelligent applications with just a few clicks.
Modern applications aren’t just about serving data to front-end clients or backend services anymore. With the rise of AI agents and powerful LLM-driven tools, the expectations for how systems should expose and consume APIs are shifting fast. APIs that once served developers now need to serve AI agents. And that’s where the Model Context Protocol (MCP) comes in.
In this post, you’ll learn why MCP matters, how Azure API Management (APIM) can act as an AI Gateway, and how you can turn any REST API into an MCP server in minutes, without writing any backend logic.
If you found this useful, I share deeper dives and additional articles on Substack and LinkedIn. Feel free to follow me there for more content.
What Is Azure API Management?
Azure API Management (APIM) is a fully managed service from Microsoft that sits in front of your APIs and acts as a gateway between clients and backend services.
At its core, APIM lets you:
- Publish existing APIs without changing backend code
- Secure APIs with authentication, authorization, and rate limits
- Transform requests and responses (headers, paths, payloads)
- Monitor usage, performance, and failures
Traditionally, APIM is used to expose APIs to developers in a controlled, scalable way. But the same gateway capabilities also make it ideal for AI-driven use cases. With APIM, you can shape how APIs are discovered, described, and invoked, without touching the underlying service.
That’s exactly why APIM works so well as an AI Gateway: it already understands APIs at the contract level and can enforce policies consistently at the edge.
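To make the gateway role concrete, here is a minimal sketch of what a client call through an APIM-fronted API looks like. APIM identifies callers with the `Ocp-Apim-Subscription-Key` header; the hostname and key below are placeholders, not real values.

```python
import urllib.request

def gateway_request(host, path, subscription_key):
    """Build a request to an API fronted by APIM.

    APIM authenticates callers via the Ocp-Apim-Subscription-Key header.
    The hostname and key used below are placeholders for illustration.
    """
    req = urllib.request.Request(f"https://{host}/{path}")
    req.add_header("Ocp-Apim-Subscription-Key", subscription_key)
    return req

# Example: a call routed through a (hypothetical) gateway hostname.
req = gateway_request("contoso.azure-api.net", "swapi/people", "YOUR-KEY")
print(req.full_url)
```

The backend never sees the key handling, throttling, or transformation logic; all of that happens at the gateway.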
What Is an MCP Server?
MCP (Model Context Protocol) is a protocol designed to make tools and APIs natively usable by AI agents. Instead of treating APIs as opaque HTTP endpoints, MCP defines a structured way to expose:
- What capabilities are available
- What inputs each operation expects
- What outputs it returns
- How an AI agent should call it safely and correctly
An MCP server is simply a service that exposes these capabilities in an MCP-compatible way. You can think of it like USB for AI tools.
Just as USB provides a standard interface that lets any compatible device plug into any computer without custom drivers, MCP provides a standard way for AI agents to discover, understand, and use tools, databases, APIs, etc. Once an API is available via MCP, AI agents can discover it, reason about it, and invoke it as a tool, without custom glue code or hard-coded prompts.
The key idea is this:
MCP turns APIs into first-class tools for AI agents.
By combining MCP with Azure API Management, you can wrap existing REST APIs and instantly make them AI-ready, without rewriting services, adding new backends, or maintaining custom adapters.
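Under the hood, MCP messages are JSON-RPC 2.0: a host discovers a server's capabilities with `tools/list` and invokes one with `tools/call`. A minimal sketch of building those two messages (the tool name `get_people` is illustrative, not something the protocol prescribes):

```python
import json

def mcp_message(method, params=None, msg_id=1):
    """Serialize a JSON-RPC 2.0 message as used by MCP."""
    msg = {"jsonrpc": "2.0", "id": msg_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# A host first asks the server which tools it exposes...
list_tools = mcp_message("tools/list")

# ...then calls one by name with structured arguments.
call_tool = mcp_message(
    "tools/call",
    {"name": "get_people", "arguments": {}},  # tool name is illustrative
    msg_id=2,
)
```

The point is that the agent never needs to know the underlying HTTP route; it only sees named tools with typed inputs and outputs.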
From Zero to MCP in Azure API Management
Now here’s the practical part: how I turned a normal REST API (the Star Wars API) into an MCP server using Azure API Management (APIM), then connected it to ChatGPT and verified the calls end to end.
If you want to follow along, you only need:
- An Azure subscription
- Permission to create an API Management instance
- A public REST API (I used SWAPI because it’s free: https://swapi.info/)
Step 1) Create a new Azure API Management instance
Go to the Azure Portal → Create a resource → search for API Management.
Fill in the basics:

- Resource group: create new (e.g., rg-mcp-demo)
- Region: pick whatever’s closest
- Name: must be globally unique (this becomes part of your gateway hostname)
- Organization name / admin email: required
Choose a pricing tier that fits your demo:

- For experiments, pick a developer-friendly option (but not Consumption).
Click Create and wait until the APIM instance is provisioned.
Note: MCP server export is currently not available when using the Consumption pricing tier in Azure API Management.
Step 2) Create a new HTTP API in APIM (using SWAPI)
In your APIM instance, go to APIs → + Add API.
Choose HTTP.
Set:

- Display name: SWAPI
- Web service URL: https://swapi.info/api/
- API URL suffix (optional): something like swapi (this becomes /swapi/... on your gateway)

Click Create.
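It helps to be clear about how the pieces of the URL compose: the gateway receives requests under your API URL suffix and forwards them to the web service URL. A small sketch of that mapping, using this demo's values (the gateway hostname itself would be your APIM instance's, which is omitted here):

```python
def to_backend_url(operation_path, suffix="swapi", backend="https://swapi.info/api"):
    """Map a gateway request path like '/swapi/people' to the backend URL
    APIM forwards to. The suffix and backend mirror this demo's setup."""
    assert operation_path.startswith(f"/{suffix}/")
    resource = operation_path[len(suffix) + 2:]  # strip '/<suffix>/'
    return f"{backend}/{resource}"

print(to_backend_url("/swapi/people"))
```

So a call to `/swapi/people` on your gateway becomes a call to `https://swapi.info/api/people` at the backend.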
Step 3) Add the GET operations you want to expose
Inside your SWAPI API in APIM:
Click + Add operation.

Create one operation per resource (GET):
- GET /people
- GET /planets
- GET /species
- GET /vehicles
- GET /starships
Click Save
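Once the operations are saved, each one is reachable as a URL on your gateway. A small sketch that enumerates them, plus an optional smoke test you could run against a live instance (the hostname is a placeholder, so the smoke test is not executed here):

```python
import urllib.request

GATEWAY_BASE = "https://contoso.azure-api.net/swapi"  # placeholder hostname
RESOURCES = ["people", "planets", "species", "vehicles", "starships"]

def operation_urls(base=GATEWAY_BASE):
    """One gateway URL per GET operation added above."""
    return [f"{base}/{r}" for r in RESOURCES]

def smoke_test(urls, key):
    """GET each operation and collect status codes. Requires a live gateway
    and a valid subscription key, so it is defined but not called here."""
    statuses = {}
    for url in urls:
        req = urllib.request.Request(
            url, headers={"Ocp-Apim-Subscription-Key": key}
        )
        with urllib.request.urlopen(req) as resp:
            statuses[url] = resp.status
    return statuses

for url in operation_urls():
    print("GET", url)
```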
Step 4) Test the API from APIM
- In an operation (for example, GET /people), open the Test tab.
- Click Send.
- Confirm you get a valid JSON response.
Step 5) Create an MCP server from the existing API (APIM → MCP)
In your APIM instance menu, go to APIs → MCP servers.
Click + Create MCP server.
Choose the option to Expose an API as an MCP server.
Select:

- The API you created (SWAPI)
- The operations you want to expose as tools (select all the GET operations you added)

Click Create.
APIM will generate an MCP endpoint that describes your tools and supports agent-style invocation.
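When an MCP host queries that endpoint with `tools/list`, it gets back one tool per exposed operation. The sketch below shows the rough shape of such a response; the exact tool names depend on how APIM derives them from your operation names, so `get_people` here is an assumption, not the guaranteed generated name.

```python
import json

# Illustrative shape of a tools/list response from the generated MCP server.
sample_response = json.loads("""
{
  "jsonrpc": "2.0",
  "id": 1,
  "result": {
    "tools": [
      {
        "name": "get_people",
        "description": "GET /people",
        "inputSchema": {"type": "object", "properties": {}}
      }
    ]
  }
}
""")

# A host reads the tool catalog from the result envelope.
tool_names = [t["name"] for t in sample_response["result"]["tools"]]
print(tool_names)
```

Each tool carries a JSON Schema describing its inputs, which is what lets an agent decide when and how to call it without custom prompting.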
Step 6) Use the MCP server in an AI host
That’s it! Your API is now MCP-enabled.
You can now use the MCP server in:
- ChatGPT
- Visual Studio Code
- and any MCP-compatible host or agent
The host connects to the MCP server URL, discovers the available tools, and can immediately start calling your API, with no custom prompts, glue code, or backend changes.
Where This Pattern Really Shines
At a glance, turning a REST API into an MCP server might look like a convenience feature. In practice, it’s a shift in how APIs participate in modern systems.
By putting Azure API Management in front of your services and exporting them as MCP servers, you’re creating a stable, contract-driven interface not just for developers, but for AI agents. That matters because agents don’t behave like traditional clients:
- They discover tools dynamically
- They reason about capabilities instead of endpoints
- They chain calls together without hard-coded flows
APIM already solves the hardest parts of this problem: versioning, security, throttling, and observability. MCP simply gives those capabilities a language AI agents understand.
This pattern is especially powerful in a few scenarios:
- Enterprise APIs that can’t be easily modified but need to be AI-accessible
- Legacy systems where adding agent logic directly would be risky or slow
- Platform teams that want a single, governed way to expose tools to AI across the organization
Instead of every team building custom “AI adapters,” you centralize the responsibility at the gateway, where it belongs.
Important Things to Keep in Mind
Before you go all-in, a few practical considerations are worth calling out:
- Pricing tier matters: MCP export is not supported on the Consumption tier. Plan accordingly.
- Tool design still matters: MCP doesn’t fix poorly designed APIs. Clear operation names, sensible inputs, and predictable outputs make a huge difference for agents.
- Security is amplified: AI agents can call tools more frequently and creatively than humans. Rate limits, authentication, and scopes aren’t optional; they’re essential.
- Observability becomes critical: MCP makes it easier to call APIs; APIM makes it easier to see who called what and why. Use that data.
Most importantly, treat MCP exposure as a product decision, not just a technical switch. You’re defining how intelligent systems interact with your business logic.






