DEV Community

xbill for Google Developer Experts

MCP Development with Gemini CLI and Amazon Fargate

Leveraging Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications with Python, from a local development environment to a deployment on the Fargate serverless container service on AWS.

Aren’t There a Billion Python MCP Demos?

Yes, there are.

Python has long been the primary language for ML and AI tooling. The goal of this article is to provide a minimal, working MCP server that can be run locally and then deployed to the cloud, without any unneeded extra code or extensions.

What Is Python?

Python is an interpreted language that enables rapid development and testing, with a deep ecosystem of libraries for ML and AI:

Welcome to Python.org

Python Version Management

One of the downsides of the wide deployment of Python has been managing the language versions across platforms and maintaining a supported version.

The pyenv tool enables deploying consistent versions of Python:

GitHub - pyenv/pyenv: Simple Python version management

As of this writing, the mainstream Python version is 3.13. To check your current Python version:

admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-fargate-python-aws$ python --version
Python 3.13.12

Gemini CLI

If it is not pre-installed, you can install the Gemini CLI to interact with the source files and provide real-time assistance:

npm install -g @google/gemini-cli

Testing the Gemini CLI Environment

Once you have all the tools and the correct Node.js version in place, you can test starting Gemini CLI. You will need to authenticate with an API key or your Google account:

gemini

admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-fargate-python-aws$ gemini

▝▜▄ Gemini CLI v0.33.1
    ▝▜▄
   ▗▟▀ Logged in with Google /auth
  ▝▀ Gemini Code Assist Standard /upgrade

? for shortcuts 
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 shift+tab to accept edits 3 GEMINI.md files | 1 MCP server
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 > Type your message or @path/to/file
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 ~/.../mcp-fargate-python-aws (main*) no sandbox (see /docs) /model Auto (Gemini 3) | 239.8 MB

Node Version Management

Gemini CLI needs a consistent, up-to-date version of Node.js. The nvm tool can be used to set up a standard Node environment:

GitHub - nvm-sh/nvm: Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions

Python MCP Documentation

The official GitHub Repo provides samples and documentation for getting started:

GitHub - modelcontextprotocol/python-sdk: The official Python SDK for Model Context Protocol servers and clients

The most common MCP Python deployment path uses the FastMCP library:

Welcome to FastMCP - FastMCP

Docker Version Management

The AWS CLI tools and Fargate extensions need a current version of Docker. If your environment does not provide a recent Docker release, the Docker Version Manager can be used to download the latest supported version:

Install

Amazon Fargate

AWS Fargate is a serverless, pay-as-you-go compute engine for containers that works with Amazon Elastic Container Service (ECS) or Elastic Kubernetes Service (EKS). It eliminates the need to manage, patch, or scale underlying EC2 virtual machines. Fargate automatically allocates, scales, and manages compute infrastructure, allowing developers to focus solely on designing and operating applications.

Details are here:

Serverless Compute - AWS Fargate - AWS

AWS CLI

The AWS CLI provides a command line tool to directly access AWS services from your current environment. Full details on the CLI are available here:

Install Docker, AWS CLI, and the Lightsail Control plugin for containers

Where do I start?

The strategy for starting MCP development is an incremental, step-by-step approach.

First, the basic development environment is set up with the required system variables and a working Gemini CLI configuration.

Then, a minimal Hello World-style Python MCP server is built with HTTP transport. This server is validated with Gemini CLI in the local environment.

This setup validates the connection from Gemini CLI to the local process via MCP. The MCP client (Gemini CLI) and the Python MCP server both run in the same local environment.

Next, the MCP server is wrapped in a Docker container and deployed to Amazon Fargate. This remote deployment is validated with Gemini CLI running as an MCP client.

Setup the Basic Environment

At this point you should have a working Python interpreter and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:

cd ~
git clone https://github.com/xbill9/gemini-cli-aws

Then run init.sh from the cloned directory.

The script will attempt to determine your shell environment and set the correct variables:

cd gemini-cli-aws
source init.sh

If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:

cd gemini-cli-aws
source set_env.sh

Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env.sh script can be used to reset the environment if your session times out.
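Because the build scripts fail in confusing ways when these variables are missing, a quick preflight check can help. A minimal sketch (the variable names other than PROJECT_ID are assumptions about what the scripts expect):

```python
import os

# Variables the build scripts rely on; names other than PROJECT_ID are assumptions.
REQUIRED_VARS = ["PROJECT_ID", "AWS_REGION"]

def missing_vars(required=REQUIRED_VARS):
    """Return the names of required environment variables that are unset or empty."""
    return [name for name in required if not os.environ.get(name)]

# If missing_vars() is non-empty, re-run `source set_env.sh` before deploying.
```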

Hello World with HTTP Transport

One of the key features that the standard MCP libraries provide is abstracting various transport methods.

The high-level MCP tool implementation is the same regardless of the low-level transport that the MCP client uses to connect to an MCP server.

The simplest transport the SDK supports is stdio (stdin/stdout), which connects to a locally running process. Both the MCP client and MCP server must run in the same environment.

The HTTP transport allows the MCP client and server to run in the same environment or be distributed over the Internet.

The connection over HTTP will look similar to this:

mcp.run(
    transport="http",
    host="0.0.0.0",
    port=port,
)

Running the Python Code

First, switch to the directory with the Python MCP sample code:

cd ~/gemini-cli-aws/mcp-fargate-python-aws

Refresh the AWS credentials:

xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$ aws login --remote
Browser will not be automatically opened.

xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$ source save-aws-creds.sh 
Exporting AWS credentials...
Successfully saved credentials to .aws_creds
The Makefile will now automatically use these for deployments.
xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$

Run the deployment from the local system:

xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$ make deploy
Ensuring CloudWatch log group /ecs/mcp-fargate-task exists...
Ensuring ECS cluster mcp-fargate-cluster exists...
Building the Docker image... 0.0s

You can validate the final result by checking the messages:

Login Succeeded
Tagging and pushing the Docker image to ECR...
{
    "repositories": [
        {
            "repositoryArn": "arn:aws:ecr:us-east-1:106059658660:repository/mcp-server-image",
            "registryId": "106059658660",
            "repositoryName": "mcp-server-image",

Once the container is deployed, check the service status:

xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$ make status
Checking AWS ECS service status for mcp-fargate-service in cluster mcp-fargate-cluster...
---------------------------------------------------------------------------------------------
|                                     DescribeServices                                      |
+----------------+--------------------------------------------------------------------------+
|  Desired       |  1                                                                       |
|  Running       |  1                                                                       |
|  ServiceName   |  mcp-fargate-service                                                     |
|  Status        |  ACTIVE                                                                  |
|  TaskDefinition|  arn:aws:ecs:us-east-1:106059658660:task-definition/mcp-fargate-task:4   |
+----------------+--------------------------------------------------------------------------+
xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$ 

You can then get the endpoint:

xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$ make endpoint
Querying ECS tasks for mcp-fargate-service...
3.94.10.15
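ECS health-checks the container with a GET to /health, so once you have the public IP from make endpoint, you can run the same probe from Python. A sketch (the /health path is an assumption based on the task's configured health check):

```python
import urllib.request

def check_health(host, port=8080, timeout=5):
    """Probe the container's /health endpoint; True if it returns HTTP 200."""
    url = f"http://{host}:{port}/health"
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.status == 200

# Usage: check_health("3.94.10.15")  # IP from `make endpoint`
```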

Gemini CLI settings.json

Once you have the deployed endpoint, update the Gemini CLI MCP settings:

{
  "mcpServers": {
    "mcp-fargate-python-aws": {
      "httpUrl": "http://3.94.10.15:8080/mcp"
    }
  }
}
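If you redeploy often, the task's public IP changes, so scripting the settings update can be handy. A sketch (assumes the user-level settings file lives at ~/.gemini/settings.json; pass a different path for a project-level .gemini/settings.json):

```python
import json
import pathlib

def write_mcp_endpoint(ip, path="~/.gemini/settings.json",
                       name="mcp-fargate-python-aws", port=8080):
    """Merge an MCP server entry pointing at the Fargate endpoint into Gemini CLI settings."""
    p = pathlib.Path(path).expanduser()
    settings = json.loads(p.read_text()) if p.exists() else {}
    settings.setdefault("mcpServers", {})[name] = {"httpUrl": f"http://{ip}:{port}/mcp"}
    p.parent.mkdir(parents=True, exist_ok=True)
    p.write_text(json.dumps(settings, indent=2))
```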

Remote MCP Server Testing

Restart Gemini CLI and check for the new MCP tools:

xbill@penguin:~/gemini-cli-aws/mcp-fargate-python-aws$ gemini

 > /mcp list
Configured MCP servers:

🟢 mcp-fargate-python-aws - Ready (1 tool)
  Tools:
  - mcp_mcp-fargate-python-aws_greet


Then the remote MCP tool running in Fargate can be called:

> greet Fargate1
✦ I will call the greet tool with the argument Fargate1.

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Action Required │
│ │
│ ? greet (mcp-fargate-python-aws MCP Server) {"param":"Fargate1"} │
│ │
│ ... first 3 lines hidden (Ctrl+O to show) ... │
│ MCP Tool Details: │
│ (press Ctrl+O to expand MCP tool details) │
│ Allow execution of MCP tool "greet" from server "mcp-fargate-python-aws"? │
│ │
│ 1. Allow once │
│ 2. Allow tool for this session │
│ 3. Allow all server tools for this session │
│ ● 4. Allow tool for all future sessions │
│ 5. No, suggest changes (esc) │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯

✦ I will call the greet tool with the argument Fargate1.

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ greet (mcp-fargate-python-aws MCP Server) {"param":"Fargate1"} │
│ │
│ Fargate1 │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ OK. I've greeted Fargate1.

Gemini CLI can also summarize the deployment:

 > summarize the AWS Fargate deployment for the Greet MCP tool
✦ I will examine the Makefile and task-definition.json.template to provide a detailed summary of the AWS Fargate
  deployment process.

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ ReadFile Makefile │
│ │
│ │
│ ✓ ReadFile task-definition.json.template │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ The AWS Fargate deployment for the Greet MCP tool is a streamlined process managed by a Makefile and
  containerized via Docker.

  Architecture Overview
   * Service Type: AWS ECS with Fargate launch type (Serverless).
   * Networking: Uses the default VPC, subnets, and security groups with a public IP enabled.
   * Container Port: Runs on port 8080.
   * Health Checks: Integrated with ECS, performing a GET request to /health within the container.
   * Logging: Centralized in AWS CloudWatch Logs under the /ecs/mcp-fargate-task group.
   * Security: Uses the ecsTaskExecutionRole for task and execution permissions.

  Deployment Workflow (make deploy)
   1. Infrastructure Prep: Ensures the CloudWatch log group and ECS cluster exist.
   2. Containerization: Builds a Docker image using the local Dockerfile.
   3. ECR Management: Authenticates with Amazon ECR, creates the repository if missing, and pushes the tagged
      image.
   4. Task Registration: Generates a task-definition.json from a template, injecting environment variables
      (Account ID, ECR URI, etc.), and registers it with ECS.
   5. Service Update/Creation: Checks if the ECS service exists; if so, it updates it to the new task definition.
      If not, it creates a new service, automatically discovering default VPC networking settings.

  Key Commands
   * make status: Displays the current running/desired count and status of the ECS service.
   * make endpoint: Retrieves the public IP of the running Fargate task.
   * make logs: Tails real-time logs from CloudWatch.
   * make delete-service: Safely shuts down and deletes the Fargate service.

Summary

The strategy for using Python for MCP development with Gemini CLI on Amazon Fargate was validated with an incremental, step-by-step approach.

A minimal HTTP MCP server was started from Python source code and validated with Gemini CLI running as an MCP client in the same local environment.

This MCP server was then wrapped in a Docker container and deployed to Amazon Fargate. The local copy of Gemini CLI was used as an MCP client to validate the connection.

This approach can be extended to more complex deployments using other MCP transports and cloud-based options.
