Amartya Gaur

Building a Dynamic MCP Proxy Server in Python

The Model Context Protocol (MCP) is rapidly becoming the standard for connecting AI models to external tools and data. As you start building or using more MCP servers—one for your database, one for web search, another for local files—you might find yourself juggling multiple connections.

What if you could have a single entry point? A "Proxy" server that sits in the middle and dynamically routes your AI's requests to the right place?

In this post, I'll show you how to build a simple, dynamic MCP Proxy Server using Python and the fastmcp library.

What We're Building

We are going to build a server that:

  1. Acts as an MCP Server itself: Your AI client (like Claude Desktop or an IDE) connects to this one server.
  2. Manages other Servers: It has tools to add_proxy and list_proxies.
  3. Persists Configuration: It saves your proxy list to a JSON file so registrations survive restarts.
  4. Mounts Dynamically: It automatically connects to your registered proxies on startup.

Prerequisites

You'll need Python installed. I highly recommend using uv for modern Python package management, but pip works too.

uv init mcp-proxy
cd mcp-proxy
uv add "fastmcp>=2.14.5" "uvicorn>=0.40.0"

Step 1: The Configuration

First, we need a way to save our proxy configurations. We'll use a simple proxies.json file.

import json
import os

# Store the registry next to this script so it is found regardless of
# the working directory the server is launched from.
CONFIG_FILE = os.path.join(os.path.dirname(os.path.abspath(__file__)), "proxies.json")

def load_proxies():
    """Return the list of saved proxy entries, or [] if none exist yet."""
    if not os.path.exists(CONFIG_FILE):
        return []
    try:
        with open(CONFIG_FILE, "r") as f:
            data = json.load(f)
            return data.get("proxies", [])
    except Exception as e:
        print(f"Error loading config: {e}")
        return []

def save_proxy(name, url):
    """Add a proxy entry, or update its URL if the name already exists."""
    proxies = load_proxies()
    # for/else: the else block runs only if the loop finished without break,
    # i.e. no existing entry matched the name.
    for p in proxies:
        if p["name"] == name:
            p["url"] = url
            break
    else:
        proxies.append({"name": name, "url": url})

    with open(CONFIG_FILE, "w") as f:
        json.dump({"proxies": proxies}, f, indent=4)
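After registering a couple of servers, proxies.json might look like this (the names and URLs below are illustrative):

```json
{
    "proxies": [
        { "name": "weather", "url": "http://localhost:8100/sse" },
        { "name": "search", "url": "http://localhost:8200/sse" }
    ]
}
```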

Step 2: The Proxy Server

Now for the magic. We use FastMCP to create our server and FastMCPProxy to wrap remote servers.

from fastmcp import FastMCP
from fastmcp.server.proxy import ProxyClient as _ProxyClient
from fastmcp.server.proxy import FastMCPProxy

# Initialize our main server
mcp = FastMCP("MCP Proxy")

@mcp.tool()
async def add_proxy(name: str, url: str) -> str:
    """Register a new MCP server to proxy."""
    try:
        save_proxy(name, url)

        # Create a factory that returns a client for the URL
        def client_factory():
            return _ProxyClient(url)

        # Create a proxy wrapper
        proxy_server = FastMCPProxy(client_factory=client_factory, name=name)

        # Mount it to our main server
        mcp.mount(proxy_server)

        return f"Added and mounted {name}"
    except Exception as e:
        return f"Error adding proxy: {str(e)}"

@mcp.tool()
def list_proxies() -> str:
    """List all registered proxy servers."""
    proxies = load_proxies()
    if not proxies:
        return "No proxies registered."
    return "\n".join([f"{p['name']}: {p['url']}" for p in proxies])

Step 3: Startup Logic

We want our saved proxies to be available immediately when we restart the server.

def mount_existing_proxies():
    """Mount proxies defined in the config file on startup."""
    proxies = load_proxies()
    for p in proxies:
        try:
            # Bind the URL through a helper function: a bare lambda created
            # inside the loop would capture the loop variable by reference,
            # and every factory would end up pointing at the last URL.
            def make_factory(u):
                return lambda: _ProxyClient(u)

            ps = FastMCPProxy(client_factory=make_factory(p["url"]), name=p["name"])
            mcp.mount(ps)
            print(f"Mounted proxy: {p['name']}")
        except Exception as e:
            print(f"Startup mount error for {p['name']}: {e}")

# Run this before starting the server
mount_existing_proxies()

if __name__ == "__main__":
    mcp.run()
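The make_factory helper in the loop is not just style: Python closures capture variables by reference, so a bare lambda created inside the loop would see the loop variable's final value. A quick stdlib-only demonstration:

```python
urls = ["http://a", "http://b"]

# Naive: every lambda closes over the same variable 'u', which holds
# the last URL once the loop has finished.
naive = [lambda: u for u in urls]

# Bound: calling a helper (here an immediately-invoked lambda) copies
# the current value into a fresh scope, one per iteration.
bound = [(lambda v: (lambda: v))(u) for u in urls]

print([f() for f in naive])  # ['http://b', 'http://b']
print([f() for f in bound])  # ['http://a', 'http://b']
```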

Implementation Details

The key component here is mcp.mount(). FastMCP allows you to mount one server onto another.

When we use FastMCPProxy, we are essentially saying: "Create a virtual MCP server that just forwards everything to this URL." By mounting that virtual server onto our main mcp instance, our main server instantly gains all the tools and resources of the remote server.
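To make the idea concrete, here is a toy, stdlib-only sketch of what mounting amounts to conceptually: a parent server absorbing a child's tool table, so callers only ever talk to the parent. This illustrates the pattern, not fastmcp's actual implementation:

```python
class ToyServer:
    """A minimal stand-in for an MCP server: just a named tool registry."""

    def __init__(self, name):
        self.name = name
        self.tools = {}

    def tool(self, fn):
        """Register a function as a tool (usable as a decorator)."""
        self.tools[fn.__name__] = fn
        return fn

    def mount(self, child):
        """Absorb the child's tools into this server's registry."""
        for tool_name, fn in child.tools.items():
            self.tools[tool_name] = fn

proxy = ToyServer("proxy")
weather = ToyServer("weather")

@weather.tool
def forecast(city: str) -> str:
    return f"Sunny in {city}"

proxy.mount(weather)
print(proxy.tools["forecast"]("Paris"))  # Sunny in Paris
```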

Running It

Run your server using uv:

uv run server.py

Now, connect your MCP client (like Claude Desktop) to this server. You can ask Claude:

"Add a proxy named 'weather' pointing to http://localhost:8100/sse"

And just like that, Claude now has access to the weather tools through your proxy.
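If you're using Claude Desktop, you would register the proxy as a stdio server in claude_desktop_config.json — something along these lines (the directory path is illustrative):

```json
{
    "mcpServers": {
        "mcp-proxy": {
            "command": "uv",
            "args": ["run", "--directory", "/path/to/mcp-proxy", "server.py"]
        }
    }
}
```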

Conclusion

This pattern is incredibly powerful. It allows you to build a "Router" for your AI agents, decoupling the implementation of specific tools from the main connection. You can add, remove, or update backend services without ever disconnecting your AI client.

Check out the full code on GitHub!
