A few weeks ago, I shared an insight on LinkedIn that sparked a conversation. I've since built a POC to test the feasibility - here's what I discovered.
TL;DR
✅ Make existing REST APIs AI-accessible — No rebuilding required
✅ Protocol adapter pattern — One service translates MCP ↔ REST
✅ Minimal overhead — Standard HTTP
✅ Working POC available — repo linked in "Check Out the Code" below
The Problem That Started Everything
After analyzing the Model Context Protocol (MCP), I had a nagging feeling that we were making AI integration harder than it needed to be. Everyone was talking about building "MCP servers" from scratch, but I kept thinking:
What if MCP Servers are just microservices with a protocol wrapper?
So I put the idea to the test. The result is a working example that brings the concept to life and could change the way we think about connecting AI to existing systems.
We're Solving the Wrong Problem
Everyone was asking: "How do we build MCP servers?"
But the better question is: "How do we make existing systems MCP-compatible?"
The Real Comparison
Let me show you what I discovered:
Without MCP (Traditional):
AI Agent → Custom Integration Code → REST API → Tool
With MCP (Current Approach):
AI Agent → MCP Client → MCP Server → Tool
With MCP → REST Pattern (My Approach):
AI Agent → MCP Client → MCP Adapter → Existing REST APIs → Existing Microservices
The insight: An MCP Server is often just a protocol wrapper around an existing REST-based microservice.
What I Built: The MCP → REST Pattern
Instead of a theoretical discussion, let me show you what happened when I took this seriously.
The Architecture
The MCP Adapter acts as a bidirectional protocol translator that enables seamless communication between AI Agents (speaking MCP/JSON-RPC) and your existing microservices (speaking REST/HTTP).
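To make the idea concrete, here is a minimal sketch of what that translation core could look like. It is not the POC's actual code: the FastAPI and httpx stack, the TOOL_REGISTRY contents, and the service URL are all assumptions.

import httpx
from fastapi import FastAPI, Request

app = FastAPI()

# Maps MCP tool names to the REST endpoints they wrap (hypothetical entries)
TOOL_REGISTRY = {
    "customer_list_customers_customers_get": {
        "method": "GET",
        "url": "http://customer-service:8001/customers",
    },
}

@app.post("/mcp")
async def handle_mcp(request: Request):
    rpc = await request.json()  # JSON-RPC 2.0 envelope from the MCP client
    if rpc.get("method") != "tools/call":
        return {"jsonrpc": "2.0", "id": rpc.get("id"),
                "error": {"code": -32601, "message": "Method not found"}}

    params = rpc["params"]
    tool = TOOL_REGISTRY[params["name"]]

    # Forward the MCP arguments as a plain REST call to the existing service
    async with httpx.AsyncClient() as client:
        resp = await client.request(tool["method"], tool["url"],
                                    params=params.get("arguments", {}))

    # Wrap the REST response in the MCP tool-result shape
    return {
        "jsonrpc": "2.0",
        "id": rpc["id"],
        "result": {
            "content": [{"type": "text", "text": resp.text}],
            "isError": resp.is_error,
        },
    }

Everything interesting happens in two places: looking up which REST endpoint a tool name maps to, and reshaping the response into the MCP result envelope.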
Inter-Process Communication Flow
Each process runs independently, communicating through standard HTTP protocols. The adapter translates between MCP and REST without any process coupling.
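To emphasize the decoupling, here is a hypothetical stand-in for one of the existing services: a plain REST endpoint with no MCP awareness at all. Everything in it is illustrative.

from fastapi import FastAPI

customer_service = FastAPI()

@customer_service.get("/customers/search")
def search_customers(name: str):
    # A real service would query its own database; hard-coded for illustration
    return [{"id": "cust-456", "name": name, "status": "active"}]

The adapter only needs this service's base URL; the service itself never changes.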
Request Flow Example: Real AI in Action
Let me show you exactly how this works with a real user request:
User Query: "Show me all orders for customer John Doe and their shipping status"
Step 1: Auto-Discovery
The adapter discovers your services and generates tools automatically — zero configuration required.
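One plausible way to do this, assuming each service publishes an OpenAPI spec at /openapi.json (as FastAPI does by default), is to walk the spec and derive one tool per operation. The service registry, URLs, and naming scheme below are assumptions, though the scheme mirrors tool names like customer_list_customers_customers_get used in the curl example later on:

import httpx

SERVICES = {  # hypothetical registry of existing services
    "customer": "http://customer-service:8001",
    "order": "http://order-service:8002",
}

def discover_tools() -> dict:
    tools = {}
    for service, base_url in SERVICES.items():
        spec = httpx.get(f"{base_url}/openapi.json").json()
        for path, operations in spec.get("paths", {}).items():
            for method, op in operations.items():
                # e.g. "customer" + "_" + "list_customers_customers_get"
                name = f"{service}_{op.get('operationId', method + path.replace('/', '_'))}"
                tools[name] = {"method": method.upper(), "url": base_url + path}
    return tools

The result is exactly the kind of tool-to-endpoint registry the adapter sketch above keyed its lookups on.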
Step 2: AI Agent's Sequential Tool Execution
1. search_customers(name="John Doe")
→ MCP Adapter → GET /customers/search?name=John+Doe
→ Customer Service → Result: customer_id: "cust-456"
2. get_customer_orders(customer_id="cust-456")
→ MCP Adapter → GET /orders?customer_id=cust-456
→ Order Service → Result: orders: [1001, 1002, 1003]
3. get_shipping_status(order_ids=[1001, 1002, 1003])
→ MCP Adapter → GET /orders/shipping?ids=1001,1002,1003
→ Order Service → Result: shipping: ["shipped", "in_transit", "processing"]
4. Final synthesis:
→ "John Doe has 3 orders: Order #1001 shipped, Order #1002 in transit, Order #1003 processing"
Step 3: The Magic — No Changes Needed
Your existing services handle these requests exactly as they always have. Zero modifications required.
Try It Yourself: Real curl Example
Want to see the actual API calls? Here's a real example from the working POC:
# Call the MCP adapter to list customers
curl -X POST http://localhost:8000/mcp \
-H "Content-Type: application/json" \
-d '{
"jsonrpc": "2.0",
"id": 3,
"method": "tools/call",
"params": {
"name": "customer_list_customers_customers_get",
"arguments": {}
}
}'
Response:
{
"jsonrpc": "2.0",
"id": 3,
"result": {
"content": [{
"type": "text",
"text": "[\n {\n \"id\": \"cust-001\",\n \"name\": \"John Doe\",\n \"email\": \"john@example.com\",\n \"phone\": \"+1-555-0123\",\n \"status\": \"active\",\n \"created_at\": \"2024-01-15T10:30:00Z\"\n },\n {\n \"id\": \"cust-002\",\n \"name\": \"Jane Smith\",\n \"email\": \"jane@example.com\",\n \"phone\": null,\n \"status\": \"active\",\n \"created_at\": \"2024-02-20T14:45:00Z\"\n }\n]"
}],
"isError": false,
"_meta": {
"status_code": 200,
"tool_name": "customer_list_customers_customers_get",
"service": "customer"
}
}
}
Behind the scenes: the adapter translated this MCP call to GET /customers on your existing customer service, then wrapped the response in MCP format.
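For completeness, here is a sketch of that wrapping step, mirroring the envelope shown above. The helper itself is illustrative; only the _meta fields (status_code, tool_name, service) come from the POC's actual response.

import json

def wrap_rest_response(tool_name: str, service: str, status_code: int, body) -> dict:
    # Serialize the REST payload as text and attach MCP-style metadata
    return {
        "content": [{"type": "text", "text": json.dumps(body, indent=2)}],
        "isError": status_code >= 400,
        "_meta": {
            "status_code": status_code,
            "tool_name": tool_name,
            "service": service,
        },
    }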
Key Takeaways
The MCP Adapter is a focused, single-purpose component with clearly defined boundaries:
✅ Does: Protocol translation between MCP and REST
❌ Does NOT: Business logic, data storage, authentication, service mesh functions
This focused scope is what makes the pattern so powerful — it does one thing exceptionally well: making existing REST APIs accessible to AI agents.
Avoid the Rebuild: Leverage What You Already Have
Enterprises Already Have Everything They Need
Your existing infrastructure provides the foundation:
✅ Business logic — Already implemented in microservices
✅ Security foundation — AuthN/AuthZ systems provide the base
✅ Monitoring — Observability stack can extend to AI interactions
✅ Reliability — Services are battle-tested in production
✅ Compliance — Audit trails and governance frameworks exist
The MCP → REST Pattern leverages these proven foundations rather than rebuilding them.
MCP Adapter: Just Another Service
The beauty of this approach is that the MCP Adapter is just like any other microservice in your stack:
✅ Same operational patterns — Docker, Kubernetes, whatever you use
✅ Same monitoring tools — Prometheus, Grafana, your existing APM
✅ Same scaling strategies — Horizontal scaling, load balancers
✅ Same security models — Network policies, service mesh, OAuth
Your ops team already knows how to manage this. Your monitoring already covers this. Your security policies already apply to this.
This pattern excels at making existing business logic AI-accessible, not replacing specialized AI infrastructure.
Check Out the Code
This is just the beginning. What you've seen here is a proof of concept that validates the core insight, but there's so much more to explore.
You can check the code at: https://github.com/deepwissen/mcp_rest_adapter
Feel free to clone and play with it because:
✅ The pattern works — Real AI agents, real microservices, real results
✅ The community needs this — Too many teams are rebuilding what already exists
✅ We need diverse perspectives — Different use cases will uncover new insights
✅ Better together — This becomes more powerful with community contributions
What's Next
This POC is a starting point, and improving the pattern will be a community effort. As people experiment with different use cases and environments, we'll learn what works, what doesn't, and what needs to be built next, and the pattern will evolve from that real-world feedback and those contributions.
The Bottom Line
In many cases, MCP servers are just microservices with a protocol wrapper.
For teams that already have REST APIs, this insight can unlock fast, scalable AI integration with minimal effort.
Of course, this is not a one-size-fits-all solution. Use this pattern where it fits best:
✅ Ideal for: Exposing existing business logic to AI agents
✅ Ideal for: Leveraging proven infrastructure and security
❌ Not ideal for: AI-native workflows, real-time streaming, or vector data
When the use case fits, the MCP → REST Pattern can replace months of effort with a single, focused translation layer.
I hope you'll explore it, share your feedback, and help evolve the pattern. Better yet, show me what you build with it!
What are your thoughts on this approach? Have you faced similar challenges integrating AI with existing systems? Let me know in the comments below!