This article leverages the Gemini CLI and the underlying Gemini LLM to build Model Context Protocol (MCP) AI applications in Python, developed locally and deployed to the Amazon Bedrock AgentCore service on AWS.
Is this yet another Python MCP Demo?
Yes it is.
Python has traditionally been the main coding language for ML and AI tools. The goal of this article is to provide a minimal, working MCP server that can be run locally, without any unnecessary extra code or extensions.
What Is Python?
Python is an interpreted language that allows for rapid development and testing, and it has deep libraries for working with ML and AI.
Python Version Management
One of the downsides of the wide deployment of Python has been managing the language versions across platforms and maintaining a supported version.
The pyenv tool enables deploying consistent versions of Python:
GitHub - pyenv/pyenv: Simple Python version management
As of writing, the mainstream Python version is 3.13. To validate your current Python:
admin@ip-172-31-70-211:~/gemini-cli-aws/mcp-lms-python-aws$ python --version
Python 3.13.13
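Build scripts can also verify the interpreter programmatically instead of parsing `python --version` output. A minimal sketch using only the standard library:

```python
# Check that the running interpreter meets a minimum version -- the kind
# of guard a build script can use before installing dependencies.
import sys


def meets_minimum(required=(3, 13)) -> bool:
    """Return True when the interpreter is at least `required` (major, minor)."""
    return sys.version_info[:2] >= required


# Example: meets_minimum() is True on the Python 3.13 interpreter shown above.
```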
Gemini CLI
If not pre-installed, you can download the Gemini CLI to interact with the source files and provide real-time assistance:
npm install -g @google/gemini-cli
Testing the Gemini CLI Environment
Once you have all the tools and the correct Node.js version in place, you can test the startup of Gemini CLI. You will need to authenticate with an API key or your Google Account:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ gemini
▝▜▄ Gemini CLI v0.41.2
▝▜▄
▗▟▀ Signed in with Google /auth
▝▀ Plan: Gemini Code Assist Standard /upgrade
Python MCP Documentation
The official GitHub Repo provides samples and documentation for getting started:
The most common MCP Python deployment path uses the FastMCP library:
Docker Version Management
The AWS CLI tools and Lightsail extensions need a current version of Docker. If your environment does not provide a recent docker tool, the Docker Version Manager (dvm) can be used to download the latest supported Docker:
To check the version of Docker:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ dvm --version
Docker Version Manager version 1.0.3 (e59ba7b)
Amazon Bedrock AgentCore
Amazon Bedrock AgentCore is a fully managed AWS platform designed for building, deploying, and operating AI agents securely at enterprise scale. It removes the “undifferentiated heavy lifting” of infrastructure management, allowing developers to focus on agent logic while providing enterprise-grade security and observability.
Full details are here:
Amazon Bedrock AgentCore - AWS
AWS CLI
The AWS CLI provides a command line tool to directly access AWS services from your current environment. Full details on the CLI are available here:
Install Docker, AWS CLI, and the Lightsail Control plugin for containers
You can version check the tool after installation:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ aws --version
aws-cli/2.34.44 Python/3.14.4 Linux/6.6.99-09128-g14e87a8a9b71 exe/x86_64.debian.12
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$
Where do I start?
The strategy for starting MCP development is an incremental, step-by-step approach.
First, the basic development environment is set up with the required system variables and a working Gemini CLI configuration.
Then, a minimal Hello World-style Python MCP server is built with HTTP transport. This server is validated with Gemini CLI in the local environment.
This setup validates the connection from Gemini CLI to the local process via MCP. The MCP client (Gemini CLI) and the Python MCP server both run in the same local environment.
Next, the MCP server is wrapped in a container with Docker and deployed to Amazon Bedrock AgentCore. This remote deployment is validated with Gemini CLI running as an MCP client.
Set Up the Basic Environment
At this point you should have a working Python interpreter and a working Gemini CLI installation. The next step is to clone the GitHub samples repository with support scripts:
cd ~
git clone https://github.com/xbill9/gemini-cli-aws
Then run init.sh from the cloned directory.
The script will attempt to determine your shell environment and set the correct variables:
cd gemini-cli-aws
source init.sh
If your session times out or you need to re-authenticate, you can run the set_env.sh script to reset your environment variables:
cd gemini-cli-aws
source set_env.sh
Variables like PROJECT_ID need to be set up for use in the various build scripts, so the set_env.sh script can be used to reset the environment if your session times out.
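A sketch of the kind of fail-fast check the build scripts rely on: verify that required variables such as PROJECT_ID are set before attempting a deployment. The helper name is illustrative, not part of the repository:

```python
# Fail fast when a required environment variable (e.g. PROJECT_ID) is
# missing, pointing the user back at set_env.sh.
import os


def require_env(*names: str) -> dict:
    """Return the requested variables, raising if any are unset or empty."""
    missing = [n for n in names if not os.environ.get(n)]
    if missing:
        raise RuntimeError(
            f"Missing environment variables: {', '.join(missing)}; "
            "run `source set_env.sh` to reset them"
        )
    return {n: os.environ[n] for n in names}
```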
Hello World with HTTP Transport
One of the key features of the standard MCP libraries is that they abstract away the various transport methods.
The high-level MCP tool implementation is the same no matter which low-level transport the MCP client uses to connect to an MCP server.
The simplest transport that the SDK supports is stdio (stdin/stdout), which connects to a locally running process. Both the MCP client and MCP server must be running in the same environment.
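For reference, a stdio server is registered in the Gemini CLI settings.json as a command the CLI spawns locally. This is a sketch; the server name and filename are illustrative:

```json
{
  "mcpServers": {
    "hello-local": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```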
The HTTP transport allows the MCP client and server to run in the same environment or distributed over the Internet.
The connection over HTTP will look similar to this:
# Initialize FastMCP server
mcp = FastMCP(
    "hello_world_server", host="0.0.0.0", port=port, stateless_http=True
)
Running the Python Code
First, switch to the directory with the Python MCP sample code:
cd ~/gemini-cli-aws/mcp-bedrock-python-aws
Refresh the AWS credentials:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ aws login --remote
Browser will not be automatically opened.
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ source save-aws-creds.sh
Exporting AWS credentials...
Successfully saved credentials to .aws_creds
The Makefile will now automatically use these for deployments.
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$
Run the install target on the local system:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ make install
Installing dependencies...
cd app/hello_world_server && pip install -r requirements.txt
Then Lint:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ make lint
Running linting (flake8)...
flake8 app/hello_world_server app/testagent --exclude .venv
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$
and test:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ make test
Running tests (pytest)...
PYTHONPATH=app/hello_world_server:app/testagent pytest app/hello_world_server app/testagent
=============================================================== test session starts ===============================================================
platform linux -- Python 3.13.11, pytest-9.0.2, pluggy-1.6.0
rootdir: /home/xbill
configfile: pyproject.toml
plugins: anyio-4.11.0, asyncio-1.3.0, langsmith-0.8.0
asyncio: mode=Mode.STRICT, debug=False, asyncio_default_fixture_loop_scope=None, asyncio_default_test_loop_scope=function
collected 3 items
app/hello_world_server/tests/test_main.py ... [100%]
================================================================ 3 passed in 0.39s ================================================================
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$
Then deploy:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ make deploy
Deploying to AgentCore...
agentcore deploy
AgentCore Deploy
Project: McpBedrockPythonAws
Target: us-east-1:106059658660
[done] Validate project
[done] Check dependencies
Build CDK project
Synthesize CloudFormation
Check stack status
Computing diff changes...
Publish assets
Deploy to AWS
Log: agentcore/.cli/logs/deploy/deploy-20260507-105552.log
Esc back · Ctrl+C quit
Once everything is deployed:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ make status
Showing AgentCore status...
agentcore status
AgentCore Status (target: default, us-east-1)
Gateways
hello-world-gateway: Deployed (1 target) (mcpbedrockpythonaws-hello-world-gateway-3fielpjqo4)
You can then get the endpoint:
xbill@penguin:~/gemini-cli-aws/mcp-bedrock-python-aws$ make endpoint
Retrieving gateway endpoints...
https://mcpbedrockpythonaws-hello-world-gateway-3fielpjqo4.gateway.bedrock-agentcore.us-east-1.amazonaws.com/mcp
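Under the hood, an MCP client calls a tool on this endpoint by POSTing a JSON-RPC 2.0 `tools/call` request. A standard-library sketch of the payload shape (real requests to the AgentCore gateway also require authentication headers, omitted here):

```python
# Build the JSON-RPC 2.0 body an MCP client POSTs to the /mcp endpoint.
# The tool and argument names match the greet call made from Gemini CLI
# later in this article.
import json


def tools_call_payload(tool: str, arguments: dict, request_id: int = 1) -> bytes:
    """Serialize an MCP `tools/call` request body."""
    return json.dumps(
        {
            "jsonrpc": "2.0",
            "id": request_id,
            "method": "tools/call",
            "params": {"name": tool, "arguments": arguments},
        }
    ).encode("utf-8")


body = tools_call_payload(
    "hello-world-target___greet", {"param": "Amazon Bedrock Agentcore"}
)
```

In practice, Gemini CLI handles this exchange for you; the sketch just shows what travels over the wire.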
Gemini CLI settings.json
Once you have the deployed endpoint, add the server under "mcpServers" in the Gemini CLI settings.json:
"mcp-bedrock-remote": {
"httpUrl": "https://mcpbedrockpythonaws-hello-world-gateway-3fielpjqo4.gateway.bedrock-agentcore.us-east-1.amazonaws.com/mcp"
}
Remote MCP Server Testing
Restart Gemini CLI and check for the new MCP tools:
gemini
/mcp list
🟢 mcp-bedrock-remote - Ready (2 tools)
Tools:
- mcp_mcp-bedrock-remote_hello-world-target___greet
- mcp_mcp-bedrock-remote_x_amz_bedrock_agentcore_search
Then the remote MCP tool in AWS Bedrock can be called:
> greet Amazon Bedrock Agentcore!
Topic: Greeting Amazon Bedrock Agentcore using the hello-world-target greet tool.
╭─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╮
│ ✓ hello-world-target___greet (mcp-bedrock-remote MCP Server) {"param":"Amazon Bedrock Agentcore"} │
│ │
│ Amazon Bedrock Agentcore │
╰─────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────────╯
✦ Hello, Amazon Bedrock Agentcore!
Project Review
Gemini CLI can also summarize the deployment:
✦ I have completed the documentation updates for the entire project.
Final Summary of Changes:
1. Root Documentation (GEMINI.md & README.md):
* New Features: Documented the /health endpoint and the add_numbers local tool.
* Hybrid Architecture: Explained how the testagent combines discovered MCP tools with local tools.
* Authentication: Added instructions for save-aws-creds.sh to facilitate Makefile-based deployments.
* Utility Commands: Documented make endpoint for retrieving the gateway URL.
2. Server Documentation (app/hello_world_server/README.md):
* Completely rewrote the previously empty README to include implementation details (FastMCP, Streamable HTTP), tool definitions, and local
execution instructions.
3. Agent Documentation (app/testagent/README.md):
* Updated local development instructions to emphasize the no-venv rule, aligning with the project's managed container approach.
4. Verification:
* Ran make test to ensure code integrity (3/3 tests passed).
* Verified the agentcore.json configuration matches the documented endpoints.
The project is now fully documented, providing clear guidance for both developers and AI assistants.
Project Documentation Complete: Final documentation review and cleanup completed.
and a code review:
✦ Based on my review of the codebase, project configuration, and infrastructure code, here is a project review with best practices and recommended
improvements for your Model Context Protocol (MCP) ecosystem on Amazon Bedrock AgentCore.
1. Architecture & Design Patterns
* Gateway-Target Decoupling: You have correctly implemented the "Gateway" pattern using AgentCore. This decouples your Strands-based agent from
the Python-based tool provider, allowing them to scale and be updated independently.
* Stateless Tool Runtimes: Your use of FastMCP(..., stateless_http=True) and transport="streamable-http" in server.py is perfectly aligned with
the requirements for managed runtimes in Amazon Bedrock AgentCore.
* Dynamic Tool Discovery: The implementation in app/testagent/mcp_client/client.py using injected environment variables for gateway URLs is the
idiomatic way to handle discovery within the AgentCore ecosystem.
2. Code Quality & Implementation
* Structured Logging: Using python-json-logger is excellent. It ensures that logs from your managed runtimes are easily searchable in
CloudWatch.
* Improvement: In server.py, transition from configuring the root logger to using a named logger: logger = logging.getLogger( __name__ ). This
avoids conflicts with library loggers.
* Robust Streaming in Agents: In app/testagent/agent.py, the invoke entrypoint currently only yields event["data"] if it is a string.
* Best Practice: Update this to handle other event types (like tool calls or usage metadata) or use a more robust streaming utility from the
strands SDK to ensure the agent's full reasoning process is captured or correctly filtered.
3. Infrastructure & Security (CDK)
* IAM Least Privilege: Your CDK code in agentcore/cdk/lib/cdk-stack.ts uses a wildcard for invocation permissions: resources:
[arn:aws:bedrock-agentcore:${this.region}:${this.account}:runtime/*].
* Best Practice: For production, narrow this scope to only the specific runtime ARNs that the Gateway needs to access.
* Auth Patches: The patch in your CDK to enable GATEWAY_IAM_ROLE for SigV4 authentication is a sophisticated and correct way to secure the
"internal" communication between the Gateway and your MCP Server.
4. Dependency Management & DX
* Unified Tooling: You have a uv.lock and pyproject.toml files, which are modern and efficient. However, your Makefile still uses pip install -r
requirements.txt.
* Recommendation: Transition the Makefile and installation workflows fully to uv (e.g., uv sync or uv pip install .). This ensures
deterministic builds and faster setup.
* Testing Strategy:
* Mocking: Your current agent tests are limited because they require a live model. Use pytest-mock to mock the BedrockModel and MCPClient
responses. This allows your CI/CD pipeline to verify agent logic without AWS credentials.
* Protocol Testing: In app/hello_world_server/tests/test_server.py, add a test that sends a raw JSON-RPC payload to the / endpoint to verify
the MCP protocol layer, ensuring your server handles the protocol correctly, not just the tool logic.
5. Observability
* OTEL Integration: You have enableOtel: true in agentcore.json. This is a top-tier best practice. Ensure you are viewing these traces in AWS
X-Ray or Amazon CloudWatch ServiceLens to identify latency bottlenecks in your tool calls.
Summary
The strategy of using Python for MCP development with Gemini CLI on Amazon Bedrock AgentCore was validated with an incremental, step-by-step approach.
A minimal HTTP-transport MCP server was started from Python source code and validated with Gemini CLI running as an MCP client in the same local environment.
This MCP server was then deployed to Amazon Bedrock AgentCore. The local copy of Gemini CLI was used as an MCP client to validate the connection.
This approach can be extended to more complex deployments using other MCP transports and cloud-based options.
