DEV Community

xbill for AWS Community Builders

Posted on • Originally published at xbill999.Medium on

Local MCP Development with Python, Gemini CLI, and Amazon AWS EC2

Leveraging Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications in Python, using a local development environment hosted on AWS EC2.

Aren’t There a Billion Python MCP Demos?

Yes there are.

Python has traditionally been the main coding language for ML and AI tools. The goal of this article is to provide a minimal, working MCP stdio server that can be run locally, without any unneeded extra code or extensions.

What Is Python?

Python is an interpreted language that allows for rapid development and testing, and it has deep library support for ML and AI work:

Welcome to Python.org

Python Version Management

One of the downsides of Python's wide deployment is managing language versions across platforms and keeping a supported version installed.

The pyenv tool enables deploying consistent versions of Python:

GitHub - pyenv/pyenv: Simple Python version management

As of this writing, the mainstream Python version is 3.13. To validate your current Python version:

admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-stdio-python-aws$ python --version
Python 3.13.12
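The same check can be done programmatically from inside the interpreter; a minimal sketch (the `(3, 13)` floor simply matches the mainstream version mentioned above and can be adjusted):

```python
import sys

# Warn if the interpreter is older than the version this article targets.
if sys.version_info < (3, 13):
    print(f"Warning: Python {sys.version.split()[0]} is older than 3.13")
else:
    print(f"Python {sys.version.split()[0]} OK")
```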

Amazon EC2 Instance Configuration

To configure your EC2 instance with the base system tools, this article provides a reference:

Cross Cloud Agent Development with the Google ADK and Amazon EC2

Gemini CLI

If it is not pre-installed, you can install the Gemini CLI to interact with the source files and provide real-time assistance:

npm install -g @google/gemini-cli

Testing the Gemini CLI Environment

Once you have all the tools and the correct Node.js version in place, you can test the startup of Gemini CLI. You will need to authenticate with an API key or your Google Account:

gemini

admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-stdio-python-aws$ gemini

  ▝▜▄ Gemini CLI v0.33.1
    ▝▜▄
   ▗▟▀ Logged in with Google /auth
  ▝▀ Gemini Code Assist Standard /upgrade

                                                                                                                              ? for shortcuts 
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 shift+tab to accept edits 3 GEMINI.md files | 1 MCP server
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 > Type your message or @path/to/file
──────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────
 ~/.../mcp-stdio-python-aws (main*) no sandbox (see /docs) /model Auto (Gemini 3) | 239.8 MB

Node Version Management

Gemini CLI needs a consistent, up-to-date version of Node.js. The nvm tool can be used to set up a standard Node environment:

GitHub - nvm-sh/nvm: Node Version Manager - POSIX-compliant bash script to manage multiple active node.js versions

Python MCP Documentation

The official GitHub Repo provides samples and documentation for getting started:

GitHub - modelcontextprotocol/python-sdk: The official Python SDK for Model Context Protocol servers and clients

The most common MCP Python deployment path uses the FastMCP library:

Welcome to FastMCP - FastMCP

Where do I start?

The strategy for starting MCP development is an incremental, step-by-step approach.

First, the basic development environment is set up with the required environment variables and a working Gemini CLI configuration.

Then, a minimal Hello World-style Python MCP server is built with stdio transport. This server is validated with Gemini CLI in the local environment.

This setup validates the connection from Gemini CLI to the local process via MCP. The MCP client (Gemini CLI) and the Python MCP server both run in the same local environment.

Next, the basic MCP server is extended with Gemini CLI, adding several new tools in standard Python code.

Setup the Basic Environment

At this point you should have a working Python interpreter and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:

cd ~
git clone https://github.com/xbill9/gemini-cli-aws

Then run init.sh from the cloned directory.

The script will attempt to determine your shell environment and set the correct variables:

cd gemini-cli-aws
source init.sh

If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:

cd gemini-cli-aws
source set_env.sh

Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env.sh script can be used to reset the environment if your session times out.
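A quick way to confirm the environment is ready before running the build scripts is to check for the variable from Python (PROJECT_ID is the variable name the repo's scripts rely on):

```python
import os

# PROJECT_ID is exported by init.sh / set_env.sh in the sample repo;
# a missing value usually means the session timed out and the script
# needs to be sourced again.
project_id = os.environ.get("PROJECT_ID")
if project_id is None:
    print("PROJECT_ID not set -- run: source set_env.sh")
else:
    print(f"PROJECT_ID = {project_id}")
```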

Hello World with STDIO Transport

One of the key features that the standard MCP libraries provide is abstracting various transport methods.

The high-level MCP tool implementation is the same no matter which low-level transport the MCP client uses to connect to an MCP server.

The simplest transport the SDK supports is the stdio (stdin/stdout) transport, which connects to a locally running process. Both the MCP client and the MCP server must be running in the same environment.

The connection over stdio will look similar to this:

if __name__ == "__main__":
    # Explicitly use stdio transport
    mcp.run(transport="stdio")

Running the Python Code

First, switch to the directory containing the Python MCP sample code:

cd ~/gemini-cli-aws/mcp-stdio-python-aws

Run the release version on the local system:

cd ~/gemini-cli-aws/mcp-stdio-python-aws
make release

You can validate the final result by checking the messages:

admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-stdio-python-aws$ make release
Installing dependencies...
Requirement already satisfied: mcp==1.20.0 in /home/admin/.pyenv/versions/3.13.12/lib/python3.13/site-packages (from -r requirements.txt (line 1)) (1.20.0)

Gemini CLI settings.json

The default Gemini CLI settings.json has an entry for the Python source:

{
  "mcpServers": {
    "aws-stdio-python": {
      "command": "python",
      "args": [
        "$HOME/gemini-cli-aws/mcp-stdio-python-aws/main.py"
      ]
    }
  }
}

Validation with Gemini CLI

Finally, Gemini CLI is restarted and the MCP connection over stdio to the Python code is validated. The full Gemini CLI session will start:

admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-stdio-python-aws$ gemini

  ▝▜▄ Gemini CLI v0.33.1
    ▝▜▄
   ▗▟▀ Logged in with Google /auth
  ▝▀ Gemini Code Assist Standard /upgrade

 > /mcp list
Configured MCP servers:

🟢 aws-stdio-python - Ready (1 tool)
  Tools:
  - mcp_aws-stdio-python_greet

> /mcp schema
Configured MCP servers:

🟢 aws-stdio-python - Ready (1 tool)
  Tools:
  - mcp_aws-stdio-python_greet
    Get a greeting from a local stdio server.
      Parameters:
      {
        "type": "object",
        "properties": {
          "param": {
            "title": "Param",
            "type": "string"
          }
        },
        "required": [
          "param"
        ],
        "title": "greetArguments"
      } ? for shortcuts                            

And you can then connect to the MCP server over stdio:

 > greet two snakes
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ Action Required │
│ │
│ ? greet (aws-stdio-python MCP Server) {"param":"two snakes"} │
│ │
│ MCP Server: aws-stdio-python │
│ Tool: greet │
│ │
│ MCP Tool Details: │
│ (press Ctrl+O to expand MCP tool details) │
│ Allow execution of MCP tool "greet" from server "aws-stdio-python"? │
│ │
│ ● 1. Allow once │
│ 2. Allow tool for this session │
│ 3. Allow all server tools for this session │
│ 4. Allow tool for all future sessions │
│ 5. No, suggest changes (esc) │
│ │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
 > greet two snakes
╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ greet (aws-stdio-python MCP Server) {"param":"two snakes"} │
│ │
│ two snakes │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ The server says: two snakes


Extending the Python MCP Server with Gemini CLI

Now that the basic MCP functionality has been tested and verified with Gemini CLI, the Python server code can be extended with new MCP tools:

> with @main.py add 2 new MCP tools - one that returns the current system time and the other returns detailled information on the current
   system info like OS, memory, etc include detailled AWS EC2 information
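The generated tool bodies will vary from run to run. Here is a hedged, stdlib-only sketch of what the two tools might look like (the article's actual output also includes memory statistics via psutil and EC2 metadata, which are omitted to keep the sketch self-contained):

```python
import os
import platform
import shutil
from datetime import datetime

# In main.py these functions are registered with the @mcp.tool() decorator,
# matching the get_system_time / get_system_info names shown in /mcp list.

def get_system_time() -> str:
    """Return the current system time in ISO 8601 format."""
    return datetime.now().isoformat()

def get_system_info() -> dict:
    """Return basic OS, CPU, and disk information for the local machine."""
    disk = shutil.disk_usage("/")
    return {
        "os": platform.system(),
        "os_release": platform.release(),
        "machine": platform.machine(),
        "cpu_count": os.cpu_count(),
        "disk": {"total": disk.total, "used": disk.used, "free": disk.free},
    }
```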

Restart Gemini CLI and check for the new MCP tools:

 > /mcp list
Configured MCP servers:

🟢 aws-stdio-python - Ready (3 tools)
  Tools:
  - mcp_aws-stdio-python_get_system_info
  - mcp_aws-stdio-python_get_system_time
  - mcp_aws-stdio-python_greet

 > get_system_time
✦ I will call the get_system_time tool to retrieve the current system time.

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ get_system_time (aws-stdio-python MCP Server) {} │
│ │
│ 2026-03-13T15:54:12.825507 │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ The current system time is Friday, March 13, 2026, at 15:54:12.


✦ I will call the get_system_info tool to retrieve detailed information about the current system.

╭────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ get_system_info (aws-stdio-python MCP Server) {} │
│ │
│ { │
│ "os": "Linux", │
│ "os_release": "6.12.48+deb13-cloud-amd64", │
│ "os_version": "#1 SMP PREEMPT_DYNAMIC Debian 6.12.48-1 (2025-09-20)", │
│ "machine": "x86_64", │
│ "processor": "", │
│ "cpu_count": 2, │
│ "memory": { │
│ "total": 2030292992, │
│ "available": 1225879552, │
│ "percent": 39.6 │
│ }, │
│ "disk": { │
│ "total": 16665681920, │
│ "used": 9401159680, │
│ "free": 6481428480, │
│ "percent": 59.2 │
│ }, │
│ "aws_status": "AWS credentials not configured or not an AWS environment" │
│ } │
╰────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ The system is running Linux (Debian 6.12.48) on an x86_64 architecture with 2 CPUs. It has approximately 2GB of total RAM (about 1.2GB
  available) and 16GB of disk space (about 6.5GB free). AWS credentials are not currently configured.


The information from the MCP tool execution comes back into the current Gemini CLI context. You can then combine the information returned from the tools with the underlying Gemini models:

> can i play Cyberpunk 2077 on this machine

✦ No, you cannot run Cyberpunk 2077 on this machine. It falls significantly below the minimum requirements in every category:

   * Memory: You have 2 GB of RAM, while the game requires at least 12 GB.
   * Storage: You have 6.5 GB of free space, but the game needs 70 GB (and recommends an SSD).
   * CPU: You have 2 cores, which is insufficient for a modern AAA title that typically requires a minimum of 4–6 high-performance cores.
   * Graphics: This appears to be a lightweight Linux environment (likely a cloud instance or container) without a dedicated gaming GPU.

  This system is designed for development and running lightweight services, not high-end gaming.

But you can still play Doom!

> can I play Classic Doom on this machine?

✦ Yes, you can play Classic Doom on this machine, but with some conditions:

   1. Hardware Requirements: This machine (2 GB RAM, 2 CPU cores) far exceeds the original requirements (486 CPU, 8 MB RAM). It could easily
      run thousands of instances of the original game simultaneously.
   2. Display Environment: Currently, this is a headless CLI-only environment. You cannot run graphical versions of Doom (like GZDoom or
      Chocolate Doom) without a windowing system (X11 or Wayland).
   3. Terminal Options: If you want to play right now, you would need to use a terminal-based source port like doom-ascii or fbdoom.
   4. Missing Files: The game data files (.wad files) are not present on this system, so you would need to download or upload them first.

  In short: Hardware-wise, absolutely. Environment-wise, only if you use a text-based version.



Summary

The strategy of using Python for MCP development with Gemini CLI was validated with an incremental, step-by-step approach.

A minimal stdio transport MCP server was started from Python source code and validated with Gemini CLI running as an MCP client in the same local environment.

Gemini CLI was then used to extend the sample Python code with several new MCP tools, and those tools were used inside the context of the underlying LLM.

This approach can be extended to more complex deployments using other MCP transports and cloud-based options.
