Leveraging Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications in Python, developed in a local environment and deployed to AWS ECS Express.
Aren’t There a Billion Python MCP Demos?
Yes, there are.
Python has traditionally been the main coding language for ML and AI tools. The goal of this article is to provide a minimal, working MCP server that can be run locally and deployed to the cloud, without any unneeded extra code or extensions.
What Is Python?
Python is an interpreted language that allows for rapid development and testing, and it has deep libraries for working with ML and AI.
Python Version Management
One of the downsides of the wide deployment of Python has been managing the language versions across platforms and maintaining a supported version.
The pyenv tool enables deploying consistent versions of Python:
GitHub - pyenv/pyenv: Simple Python version management
As of this writing, the mainstream Python version is 3.13. To validate your current Python version:
admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-stdio-python-aws$ python --version
Python 3.13.12
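Inside a build or setup script, the same check can be done programmatically; a small sketch using only the standard library:

```python
import sys

# Programmatic equivalent of `python --version`, useful in build scripts
# that need to enforce a minimum interpreter version.
major, minor = sys.version_info[:2]
print(f"Python {major}.{minor}")
```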
Amazon ECS Express Configuration
Amazon ECS Express Mode (announced Nov 2025) is a simplified deployment feature for Amazon Elastic Container Service (ECS) designed to rapidly launch containerized applications, APIs, and web services on AWS Fargate. It automates infrastructure setup — including load balancing, networking, scaling, and HTTPS endpoints — allowing developers to deploy from container image to production in a single step.
More details are available here:
Gemini CLI
If it is not pre-installed, you can download the Gemini CLI to interact with the source files and provide real-time assistance:
npm install -g @google/gemini-cli
Testing the Gemini CLI Environment
Once you have all the tools and the correct Node.js version in place, you can test the startup of Gemini CLI. You will need to authenticate with an API key or your Google Account:
gemini
admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-stdio-python-aws$ gemini
▝▜▄ Gemini CLI v0.33.1
▝▜▄
▗▟▀ Logged in with Google /auth
▝▀ Gemini Code Assist Standard /upgrade
? for shortcuts
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
shift+tab to accept edits 3 GEMINI.md files | 1 MCP server
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
> Type your message or @path/to/file
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
~/.../mcp-stdio-python-aws (main*) no sandbox (see /docs) /model Auto (Gemini 3) | 239.8 MB
Node Version Management
Gemini CLI needs a consistent, up-to-date version of Node.js. The nvm command can be used to set up a standard Node environment:
Python MCP Documentation
The official GitHub Repo provides samples and documentation for getting started:
The most common MCP Python deployment path uses the FastMCP library:
Where do I start?
The strategy for starting MCP development is an incremental, step-by-step approach.
First, the basic development environment is set up with the required system variables and a working Gemini CLI configuration.
Then, a minimal Hello World-style Python MCP server is built with stdio transport. This server is validated with Gemini CLI in the local environment.
This setup validates the connection from Gemini CLI to the local process via MCP. The MCP client (Gemini CLI) and the Python MCP server both run in the same local environment.
Next, the basic MCP server is extended with Gemini CLI to add several new tools in standard Python code.
Setup the Basic Environment
At this point you should have a working Python interpreter and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:
cd ~
git clone https://github.com/xbill9/gemini-cli-aws
Then run init.sh from the cloned directory.
The script will attempt to determine your shell environment and set the correct variables:
cd gemini-cli-aws
source init.sh
If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:
cd gemini-cli-aws
source set_env.sh
Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env.sh script can be used to reset the environment if your session times out.
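After sourcing the script, you can quickly confirm the variables are present. PROJECT_ID is the variable named in this article; any other names are project-specific assumptions:

```shell
# Confirm the variables the build scripts rely on are set after
# sourcing set_env.sh; prints "<unset>" if the variable is missing.
echo "PROJECT_ID=${PROJECT_ID:-<unset>}"
```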
Hello World with HTTP Transport
One of the key features that the standard MCP libraries provide is abstracting various transport methods.
The high-level MCP tool implementation is the same regardless of the low-level transport channel the MCP client uses to connect to an MCP server.
The simplest transport the SDK supports is stdio (stdin/stdout), which connects to a locally running process. Both the MCP client and MCP server must be running in the same environment.
The HTTP transport allows the MCP Client and Server to be in the same environment or distributed over the Internet.
The connection over HTTP will look similar to this:
mcp.run(
    transport="http",
    host="0.0.0.0",
    port=port,
)
Running the Python Code
First, switch to the directory with the Python MCP sample code. Then, run the release version on the local system:
cd ~/gemini-cli-aws/mcp-https-python-ecs-express
make release
If everything is running correctly, you will see the FastMCP banner:
╭──────────────────────────────────────────────────────────────────────────────╮
│ │
│ │
│ ▄▀▀ ▄▀█ █▀▀ ▀█▀ █▀▄▀█ █▀▀ █▀█ │
│ █▀ █▀█ ▄▄█ █ █ ▀ █ █▄▄ █▀▀ │
│ │
│ │
│ FastMCP 3.2.0 │
│ https://gofastmcp.com │
│ │
│ 🖥 Server: hello-world-server, 3.2.0 │
│ 🚀 Deploy free: https://horizon.prefect.io │
│ │
╰──────────────────────────────────────────────────────────────────────────────╯
[04/04/26 19:10:50] INFO Starting MCP server 'hello-world-server' with transport 'http' on transport.py:299
http://0.0.0.0:8080/mcp
INFO: Started server process [27502]
INFO: Waiting for application startup.
{"message": "StreamableHTTP session manager started"}
INFO: Application startup complete.
INFO: Uvicorn running on http://0.0.0.0:8080 (Press CTRL+C to quit)
At this point, press Ctrl-C to exit the local HTTP server.
Deploy to ECS Express
Now that the server has been tested locally, start the remote deployment to ECS Express.
Use the deploy target:
> make deploy
✦ I will execute the full deployment cycle, which includes building the Docker image, pushing it to ECR, and
deploying to ECS Express Mode.
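The containerization step builds a Python 3.13-slim image (as described in the deployment summary later in the article). A hypothetical Dockerfile for this server might look like the following sketch; the file names (server.py, requirements.txt) are assumptions, not the repository's actual layout:

```dockerfile
# Sketch of a container image for the FastMCP server (assumed file names).
FROM python:3.13-slim

WORKDIR /app

# Install Python dependencies first so Docker can cache this layer.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the MCP server source and expose the HTTP transport port.
COPY server.py .
EXPOSE 8080

CMD ["python", "server.py"]
```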
Then check the status:
> make status
✦ I will check the status of the ECS Express Mode service.
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ Shell make status [in /home/xbill/gemini-cli-aws/mcp-ecsexpress-python-aws] │
│ │
│ Checking AWS ECS service status for mcp-express-python-aws... │
│ Service: mcp-express-python-aws │
│ Status: ACTIVE │
│ Endpoint: mc-8d69e7dfa87344be98fdb3f7a8fbbbba.ecs.us-east-1.on.aws │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ The ECS Express Mode service mcp-express-python-aws is ACTIVE and available at:
mc-8d69e7dfa87344be98fdb3f7a8fbbbba.ecs.us-east-1.on.aws
From the make status output, the endpoint is:
mc-8d69e7dfa87344be98fdb3f7a8fbbbba.ecs.us-east-1.on.aws
Gemini CLI settings.json
The default Gemini CLI settings.json has an entry pointing at the deployed Python MCP server:
{
  "mcpServers": {
    "aws-ecs-express-python": {
      "httpUrl": "https://mc-8d69e7dfa87344be98fdb3f7a8fbbbba.ecs.us-east-1.on.aws/mcp"
    }
  }
}
Validation with Gemini CLI
With the service running on ECS Express, open a new terminal.
Restart Gemini CLI to validate the MCP connection over HTTP to the deployed Python server. The full Gemini CLI session will start:
xbill@penguin:~/gemini-cli-aws/mcp-ecsexpress-python-aws$ gemini
▝▜▄ Gemini CLI v0.36.0
▝▜▄
▗▟▀ Signed in with Google: xbill@glitnir.com /auth
▝▀ Plan: Gemini Code Assist Standard /upgrade
> /mcp list
Configured MCP servers:
🟢 aws-ecs-express-python - Ready (1 tool)
Tools:
- mcp_aws-ecs-express-python_greet
You can then call the greet tool on the MCP server over HTTP:
> greet ECS Express!
✦ I will call the greet tool with "ECS Express!".
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ greet (aws-ecs-express-python MCP Server) {"param":"ECS Express!"} │
│ │
│ ECS Express! │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ The greet tool returned: ECS Express!
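Under the hood, the greet call above travels to the server as a JSON-RPC 2.0 message, since MCP uses JSON-RPC for all transports. A sketch of the approximate request shape, with field values taken from the session output above:

```python
import json

# Approximate wire format of the greet call: MCP clients send JSON-RPC 2.0
# messages; the tools/call method names the tool and passes its arguments.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "greet",
        "arguments": {"param": "ECS Express!"},
    },
}
print(json.dumps(request))
```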
Validate Project Setup with Gemini CLI
Gemini CLI was used to summarize the project:
> summarize the project deployment to Amazon ECS Express
✦ The deployment of the mcp-express-python-aws project to Amazon ECS Express Mode is a streamlined process that
automates the container lifecycle from build to public endpoint.
Deployment Workflow (make deploy)
1. Containerization: Builds a Python 3.13-slim Docker image and pushes it to an Amazon ECR repository
(mcp-server-repo).
2. Identity & Access: Automatically provisions necessary IAM roles (ecsTaskExecutionRole and
ecsInfrastructureRoleForExpressServices) if they do not already exist.
3. Orchestration: Deploys the container to ECS Express Mode, which simplifies AWS Fargate deployments by
automatically managing the Application Load Balancer, Security Groups, and Networking.
Current Infrastructure State
* Service Name: mcp-express-python-aws
* Status: ACTIVE
* Compute: AWS Fargate (Task size: 1 vCPU / 2GB RAM)
* Scaling: Configured for 1 to 20 tasks, triggered at 60% average CPU usage.
* Public Access: mc-8d69e7dfa87344be98fdb3f7a8fbbbba.ecs.us-east-1.on.aws (Port 8080)
The service is currently running and reachable via its assigned ECS Express Gateway endpoint.
Summary
The strategy of using Python for MCP development with Gemini CLI and AWS was validated with an incremental, step-by-step approach.
A minimal HTTP-transport MCP server was started from Python source code and validated with Gemini CLI running as an MCP client in the same local environment.
The server was then deployed to AWS ECS Express and validated remotely over HTTP.
This approach can be extended to more complex deployments using other MCP transports and cloud-based options.
