chunxiaoxx · Posted on DEV Community

Build a Production MCP Server with Docker: A Step-by-Step Guide

The Model Context Protocol (MCP) is revolutionizing how AI agents connect to external tools and data sources. In this guide, I'll show you how to build a production-ready MCP server using Docker.

What is MCP?

MCP (Model Context Protocol) is an open protocol that enables seamless integration between AI models and external tools. Think of it as "USB for AI agents" - a standardized way to connect AI systems to data sources and tools.

Why Docker?

Docker provides:

  • Isolation: Run multiple MCP servers without conflicts
  • Portability: Deploy anywhere Docker is supported
  • Reproducibility: Consistent environments across machines

Step 1: Create the MCP Server Structure

mkdir nautilus-mcp-server && cd nautilus-mcp-server
mkdir src tests
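The Dockerfile in Step 3 installs dependencies from a requirements.txt, so create one in the project root now. A minimal sketch (pin exact versions for production builds):

```
mcp
pydantic
```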

Step 2: Create the MCP Server Code

The official `mcp` Python SDK provides a FastMCP class that infers each tool's input schema from the function signature and docstring, so you don't have to write JSON Schema by hand. The tool bodies below are placeholders; wire them to the real Nautilus API in your implementation:

# src/server.py
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("nautilus-mcp-server")

@mcp.tool()
def platform_health() -> dict:
    """Get Nautilus platform health metrics."""
    # Placeholder: call the real Nautilus health endpoint here.
    return {"status": "ok"}

@mcp.tool()
def list_tasks(limit: int = 5) -> list[dict]:
    """List available tasks on Nautilus."""
    # Placeholder: query the Nautilus API; returns stub data for now.
    return [{"id": i, "title": f"Task {i}"} for i in range(1, limit + 1)]

@mcp.tool()
def create_task(title: str, description: str = "", reward: int = 0) -> dict:
    """Create a new task on Nautilus."""
    # Placeholder: POST the task to the Nautilus API.
    return {"title": title, "description": description, "reward": reward}

if __name__ == "__main__":
    mcp.run()  # stdio transport by default
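Under the hood, MCP messages are JSON-RPC 2.0 exchanged over the server's stdin/stdout. As a rough sketch of what crosses the wire, this is approximately the request a client sends to invoke `create_task` (the field values here are illustrative):

```python
import json

# Roughly the JSON-RPC 2.0 message an MCP client writes to the server's
# stdin to invoke the create_task tool (MCP names this method "tools/call").
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "create_task",
        "arguments": {"title": "Write docs", "reward": 10},
    },
}
wire = json.dumps(request)
print(wire)
```

The server replies with a matching-`id` JSON-RPC response containing the tool's result.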

Step 3: Create Dockerfile

FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached across code changes
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

COPY src/ ./src/

# The server speaks MCP over stdio, so no port needs to be exposed
CMD ["python", "src/server.py"]
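A .dockerignore keeps tests and local artifacts out of the build context and the image. A minimal sketch:

```
tests/
__pycache__/
*.pyc
.env
```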

Step 4: Create Docker Compose

Since the server communicates over stdio, no port mapping is needed; the top-level version key is also obsolete in Compose v2:

services:
  mcp-server:
    build: .
    image: nautilus-mcp-server
    stdin_open: true  # MCP stdio transport talks over stdin/stdout
    environment:
      - NAUTILUS_API_KEY=${NAUTILUS_API_KEY}
    restart: unless-stopped

Step 5: Run Your MCP Server

# Build and run
docker compose up --build -d

# Check logs
docker compose logs -f

Connecting to AI Clients

Most MCP-compatible clients (Claude Desktop, for example) launch stdio servers from a JSON config. Build the image first with `docker build -t nautilus-mcp-server .`, then point the client at it:

{
  "mcpServers": {
    "nautilus": {
      "command": "docker",
      "args": ["run", "-i", "--rm", "-e", "NAUTILUS_API_KEY", "nautilus-mcp-server"]
    }
  }
}

The -i flag keeps stdin open, which the stdio transport requires, and -e NAUTILUS_API_KEY forwards the key from the client's environment into the container.

Conclusion

Building an MCP server with Docker is straightforward and provides a production-ready foundation for AI agent integrations. The standardization of MCP means your server can work with any MCP-compatible AI client.

Next Steps:

  1. Add more tools to your MCP server
  2. Implement authentication
  3. Add monitoring and logging
  4. Deploy to cloud platforms
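For the authentication step above, one minimal sketch is to fail fast at startup when the API key is missing. `require_api_key` is a hypothetical helper for illustration, not part of the MCP SDK:

```python
import os

def require_api_key() -> str:
    """Return NAUTILUS_API_KEY or raise, so the server fails fast at startup.

    Hypothetical helper -- call it before mcp.run() in src/server.py.
    """
    key = os.environ.get("NAUTILUS_API_KEY", "")
    if not key:
        raise RuntimeError("NAUTILUS_API_KEY is not set")
    return key
```

Failing at startup surfaces a missing key in the container logs immediately, instead of as confusing tool-call errors later.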

If you'd like a follow-up on integrating this server with AI frameworks like LangChain or AutoGen, let me know in the comments.


Published on Dev.to | #mcp #docker #aiagents #tutorial
