DEV Community

Om Shree

Posted on • Originally published at glama.ai

Cloud-Native AI: Leveraging MCP for Scalable Integrations

Cloud-native environments demand flexible and scalable AI integrations. Traditionally, this has meant writing a custom connector for each service and managing deployment logic by hand, a tedious, redundant, and error-prone process. The Model Context Protocol (MCP) solves this by offering a single, structured interface for AI agents to interact with cloud services like AWS Lambda, Google Cloud Run, and BigQuery.


With MCP, agents use natural language to trigger complex cloud operations such as provisioning infrastructure, querying data, or calling APIs. MCP handles schema validation, authentication, error reporting, and discovery, eliminating the glue code between models and tools [1][2].
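Under the hood, every such interaction is a structured JSON-RPC 2.0 exchange. A minimal sketch of the request/response shape follows; the tool name and arguments are illustrative, not taken from a real server:

```python
import json

# Illustrative MCP-style tools/call request an agent might send.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "query_bigquery",          # hypothetical tool name
        "arguments": {"sql": "SELECT 1"},  # validated against the tool's schema
    },
}

# The server replies with a structured result keyed to the same id.
response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {"content": [{"type": "text", "text": "1"}]},
}

wire = json.dumps(request)  # what actually travels over stdio or HTTP
```

Because both sides agree on this envelope, the same request works whether the server runs locally over stdio or remotely behind an HTTPS endpoint.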

Hosting MCP Servers on AWS

AWS now supports MCP servers on Lambda, ECS, EKS, and Finch. These servers allow AI agents to request deployments, monitor infrastructure, or perform other cloud-native tasks, all through natural language. MCP ensures best practices around IAM roles, scaling, and logging are applied automatically [3].

Here’s an example using the AWS Labs adapter to run a traditional stdio MCP server as a Lambda function:

from mcp.client.stdio import StdioServerParameters
from mcp_lambda import stdio_server_adapter

def handler(event, context):
    # Describe how to launch the stdio MCP server inside the Lambda sandbox.
    server_params = StdioServerParameters(
        command="python", args=["-m", "mcp_server_time"]
    )
    # The adapter bridges the Lambda event to the server's stdio transport.
    return stdio_server_adapter(server_params, event, context)

This turns your local MCP server into a serverless tool that is automatically invoked by the AI agent when needed [4].
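From the calling side, the Lambda simply receives a JSON-RPC payload as its event. A hedged sketch of packaging such a request for invocation; the function name and tool are hypothetical, and the actual boto3 invoke call is left commented out:

```python
import json

def build_mcp_event(tool: str, arguments: dict, req_id: int = 1) -> bytes:
    """Serialize an MCP tools/call request as a Lambda invocation payload."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    }).encode()

payload = build_mcp_event("get_time", {"timezone": "UTC"})  # hypothetical tool

# With boto3 (assuming the function is deployed as "mcp-time-server"):
# import boto3
# resp = boto3.client("lambda").invoke(
#     FunctionName="mcp-time-server", Payload=payload)
```

In practice the agent framework builds and sends this payload for you; the sketch just shows that nothing cloud-specific leaks into the protocol itself.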

Deploying MCP on Google Cloud Run

Google Cloud enables remote deployment of MCP servers via Cloud Run. This makes it easy to serve AI tools over HTTPS endpoints, with Cloud Run handling auto-scaling, stateless execution, and IAM-based authentication [5].

Example using FastMCP to build a math tool:

from fastmcp import FastMCP

mcp = FastMCP("MathMCP")

@mcp.tool()
def add(a: int, b: int) -> int:
    """Add two integers; the signature becomes the tool's input schema."""
    return a + b

if __name__ == "__main__":
    # Serve over HTTP so Cloud Run can route requests to the container.
    mcp.run(transport="http", host="0.0.0.0", port=8080)

Once deployed, tools like add() are available to agents through secure cloud endpoints. This architecture eliminates the need to manage custom REST APIs manually [6].

Integrating with Bedrock and Vertex AI

On AWS, Bedrock’s Converse API supports tool calling by embedding MCP-compatible tools into its prompt system. The model automatically routes structured tool requests (like querying sales data) to an MCP server, then integrates the result into its response [7].

{
  "toolResult": {
    "json": {
      "total_sales": 12450000,
      "growth": 0.12
    }
  }
}
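On the application side, your orchestration code unpacks that toolResult block before the model folds it into its answer. A minimal sketch, assuming the JSON shape above:

```python
# The toolResult structure returned by the Converse API, as shown above.
tool_result = {
    "toolResult": {
        "json": {"total_sales": 12450000, "growth": 0.12}
    }
}

def summarize_sales(result: dict) -> str:
    """Turn a structured tool result into text the model can cite."""
    data = result["toolResult"]["json"]
    return (f"Total sales: ${data['total_sales']:,} "
            f"({data['growth']:.0%} growth)")

summary = summarize_sales(tool_result)  # "Total sales: $12,450,000 (12% growth)"
```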

Similarly, Google Cloud’s Vertex AI allows LLMs to orchestrate workflows by invoking MCP tools connected to BigQuery, Cloud SQL, or Cloud Storage. This allows end-to-end pipelines like “get marketing campaign results, analyze trends, and visualize outcome” to be executed via language, without manual scripting [2].

Security and Governance in MCP Deployments

As MCP grows in adoption, securing its tool ecosystem is essential. Key threats include:

  • Tool Squatting: Malicious tools masquerading under legitimate names.
  • Rug Pulls: Trusted tools being swapped for malicious ones.
  • Unauthorized Access: Tools with open endpoints or leaked credentials.


To address this:

  • ETDI (Extended Tool Declaration Identity) adds cryptographic tool signing, verification, and policy constraints, ensuring only trusted tools are executed [8].
  • MCP Guardian wraps your MCP server with auth enforcement, WAF protection, rate limiting, and logging, ideal for cloud deployments [9].

These tools turn MCP from a flexible protocol into a trustworthy cloud-native runtime.
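The core idea behind signed tool declarations can be illustrated with stdlib primitives. This is only a sketch of the pattern, not ETDI's actual format, and it uses HMAC where a real deployment would use asymmetric signatures:

```python
import hashlib
import hmac
import json

SIGNING_KEY = b"provider-secret"  # illustrative; real systems use key pairs

def sign_declaration(decl: dict) -> str:
    """Sign a canonical serialization of the tool declaration."""
    blob = json.dumps(decl, sort_keys=True).encode()
    return hmac.new(SIGNING_KEY, blob, hashlib.sha256).hexdigest()

def verify_declaration(decl: dict, signature: str) -> bool:
    """Reject any declaration whose contents no longer match the signature."""
    return hmac.compare_digest(sign_declaration(decl), signature)

decl = {"name": "add", "version": "1.0", "schema": {"a": "int", "b": "int"}}
sig = sign_declaration(decl)

assert verify_declaration(decl, sig)           # trusted tool passes
tampered = {**decl, "name": "add_evil"}        # tool-squatting attempt
assert not verify_declaration(tampered, sig)   # rejected
```

Any change to the declaration, including a swapped name or schema, invalidates the signature, which is exactly what defeats tool squatting and rug pulls.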

Behind the Scenes

MCP is built on JSON-RPC 2.0, using structured input schemas and output formats. Each tool has a name, schema, and transport type (stdio, HTTP, SSE). Cloud-hosted MCP servers also support:

  • list_tools: Dynamic discovery of available operations.
  • Secure transports: HTTPS + IAM (Cloud Run), or authenticated Lambda endpoints.
  • Stateless execution: Suitable for ephemeral containers and on-demand use.


This allows agents to reason about tool availability and schema requirements—dynamically integrating cloud tools into LLM workflows.

My Thoughts

The shift toward MCP-powered cloud-native agents is profound. Developers can now publish cloud tools that AI agents drive through natural language, all running on secure, autoscaled infrastructure.

But this power demands discipline. Just as you wouldn’t expose unprotected APIs to the public, you must secure and audit MCP endpoints. Using ETDI, signed declarations, and tools like MCP Guardian will become standard in production deployments.

The result? LLMs that do more than talk: they act. With MCP, AI agents become true operators across the cloud stack.

References


  1. Model Context Protocol (MCP) – Wikipedia 

  2. What is Model Context Protocol (MCP)? – Google Cloud guide 

  3. Announcing new MCP Servers for AWS Serverless and Containers – AWS News 

  4. Run Model Context Protocol servers with AWS Lambda – GitHub 

  5. Build and deploy a remote MCP Server to Google Cloud Run in under 10 minutes – Google Cloud Blog 

  6. Host MCP servers on Cloud Run – Cloud Run Documentation 

  7. Unlocking the power of MCP on AWS (Amazon Bedrock integration) 

  8. ETDI: Mitigating Tool Squatting and Rug Pull Attacks in MCP – arXiv 

  9. MCP Guardian: A Security‑First Layer for MCP‑Based AI Systems – arXiv 

Top comments (6)

Anna kowoski

Nice article, Om! But why does MCP use JSON-RPC instead of simply using REST or gRPC? Wouldn't those be much simpler?

Om Shree

Thanks Anna, glad you liked it! JSON-RPC is lightweight and schema-driven, which makes it ideal for tool chaining in AI workflows. Unlike plain REST, it supports bidirectional method calls and structured payloads without needing complex route definitions. Hence JSON-RPC is the better option for MCP.

Anna kowoski

Thanks, but can you elaborate more on this?

Om Shree

Of course, Anna. My next article will focus on it.

Anik Sikder

This is a fantastic deep dive into how MCP is revolutionizing AI integrations in cloud-native environments! The idea of a unified, natural language interface that handles everything from schema validation to security and scaling really addresses the biggest pain points in AI ops today.

I especially appreciate the focus on security with ETDI and MCP Guardian; as AI workflows grow, guarding against tool squatting and unauthorized access will be critical. Plus, MCP’s compatibility with serverless platforms like Lambda and Cloud Run makes it incredibly accessible for modern architectures.

Looking forward to seeing how this protocol matures and how developers start building more autonomous, secure AI-driven cloud operations. Thanks for sharing such a comprehensive overview!

Om Shree

Thanks Anik, glad you liked it!