Drop this into a file and run it:
```python
from mcp.server.fastmcp import FastMCP
import httpx

mcp = FastMCP("mini-tools")

@mcp.tool()
def read_file(path: str) -> str:
    """Read the contents of a file."""
    with open(path) as f:
        return f.read()

@mcp.tool()
async def fetch_url(url: str) -> str:
    """Fetch a URL and return the response text."""
    async with httpx.AsyncClient() as client:
        resp = await client.get(url)
        return resp.text

if __name__ == "__main__":
    mcp.run()
```
Save it as server.py, install the two dependencies, and start it:
```shell
pip install mcp httpx
python server.py
```
You now have an MCP server that gives any AI agent the ability to read files and fetch web pages. In under 20 lines of code.
## How the FastMCP API works
The mcp package ships two APIs. The older one uses @app.list_tools() and @app.call_tool() decorators that receive raw JSON. The FastMCP API (what you just used) is the Pythonic way — you write plain functions with typed parameters and the framework handles the JSON-RPC plumbing.
Three decorators cover everything you need:
| Decorator | Purpose |
|---|---|
| `@mcp.tool()` | Expose a function as a tool the agent can call |
| `@mcp.resource()` | Expose a static or computed resource (file, data, config) |
| `@mcp.prompt()` | Expose a prompt template the agent can inject |
## Extending it
Adding a third tool is one function:
```python
@mcp.tool()
def list_dir(directory: str = ".") -> str:
    """List files in a directory."""
    import os
    return "\n".join(os.listdir(directory))
```
That's it. FastMCP uses the function's type hints and docstring to build the tool schema automatically. The parameter name becomes the JSON Schema key, the type hint determines the schema type, and the docstring becomes the tool description. No manual schema writing.
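To make that mapping concrete, here is a toy version of the derivation using only the standard library. This is not FastMCP's actual implementation (which builds Pydantic models under the hood), just the idea:

```python
import inspect
from typing import get_type_hints

# Rough mapping from Python type hints to JSON Schema types.
TYPE_MAP = {str: "string", int: "integer", float: "number", bool: "boolean"}

def build_tool_schema(fn) -> dict:
    """Derive a tool schema from a function's signature and docstring."""
    hints = get_type_hints(fn)
    sig = inspect.signature(fn)
    properties, required = {}, []
    for name, param in sig.parameters.items():
        # Parameter name -> JSON Schema key; type hint -> schema type.
        properties[name] = {"type": TYPE_MAP.get(hints.get(name), "string")}
        # Parameters without defaults are required.
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {
        "name": fn.__name__,
        "description": inspect.getdoc(fn) or "",
        "inputSchema": {
            "type": "object",
            "properties": properties,
            "required": required,
        },
    }

def read_file(path: str) -> str:
    """Read the contents of a file."""
    ...

schema = build_tool_schema(read_file)
```

Run it on `read_file` and you get a schema with one required string property named `path`, described by the docstring.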
## Connecting to Claude Desktop
Add this to your claude_desktop_config.json:
```json
{
  "mcpServers": {
    "mini-tools": {
      "command": "python",
      "args": ["/absolute/path/to/server.py"]
    }
  }
}
```
Restart Claude Desktop. The read_file and fetch_url tools become available immediately: Claude can inspect local files and pull live web data through your custom tools.
## Connecting to other platforms
Any platform that supports the MCP protocol can consume your server. The transport is stdio — the platform spawns your process and communicates over stdin/stdout. No HTTP endpoints, no webhooks, no API keys to rotate.
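Concretely, when an agent invokes one of your tools, something like this crosses stdin (simplified from the MCP spec's `tools/call` exchange; the id and file path are made up):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "read_file",
    "arguments": { "path": "/tmp/notes.txt" }
  }
}
```

Your server replies with a JSON-RPC response carrying the tool's result on stdout. FastMCP handles all of this framing for you.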
Platforms like Nebula let you register MCP tools as agent skills. Your server registers tools, and Nebula's agent runtime discovers them automatically — no per-tool configuration needed.
## What to build next
- SQLite MCP server — let agents query your local database
- GitHub MCP server — expose repo operations as tools
- Slack MCP server — let agents search messages and post updates
- CLI wrapper MCP server — wrap any command-line tool as an MCP tool
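The SQLite idea, for instance, is one more function on the same server. A sketch, where the `db_path` parameter and the SELECT-only guard are design choices you would adapt:

```python
import sqlite3

# In server.py you would register this with @mcp.tool(); it is shown
# undecorated here so the sketch stands alone.
def query_db(db_path: str, sql: str) -> str:
    """Run a read-only SQL query and return rows as tab-separated text."""
    # Crude safety guard: agents get read access, nothing else.
    if not sql.lstrip().lower().startswith("select"):
        return "error: only SELECT statements are allowed"
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(sql).fetchall()
    return "\n".join("\t".join(str(col) for col in row) for row in rows)
```

Returning errors as strings rather than raising keeps the failure visible to the agent, which can then rephrase its query instead of crashing the call.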
The MCP protocol makes tool-building boring in the best sense. One file, typed functions, and your AI agent just gained a capability it didn't have before. That's the whole idea.