Mayank Gupta

FastMCP: Simplifying AI Context Management with the Model Context Protocol

As AI applications grow more sophisticated, integrating external data sources and tools efficiently becomes crucial. The Model Context Protocol (MCP) offers a standardized way to connect AI applications with local and remote resources. FastMCP, a Python SDK, simplifies the implementation of MCP, allowing developers to build MCP clients and servers seamlessly. In this blog, we explore what FastMCP is, how it works, and why it’s an essential tool for AI-powered applications.


Understanding the Model Context Protocol (MCP)

MCP is designed to standardize the way AI models access external data and tools. Instead of manually copying and pasting information, AI applications can now use MCP to dynamically fetch relevant context, ensuring more accurate and informed responses.

*MCP architecture diagram (credit: modelcontextprotocol.io)*

Key Components of MCP:

  • MCP Hosts: Applications like Claude Desktop, IDEs, or AI tools that use MCP to access data.
  • MCP Clients: Protocol clients that establish 1:1 connections with servers.
  • MCP Servers: Lightweight programs that expose specific capabilities via MCP.
  • Local Data Sources: Files, databases, and services on your machine that MCP servers can securely access.
  • Remote Services: APIs and external data sources accessible over the internet.

By defining a universal way for LLMs to interact with structured data, MCP eliminates the need for fragmented, custom-built integrations.
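Concretely, MCP messages are JSON-RPC 2.0. Here is a sketch of the request a client sends to invoke a tool (the `get_weather` tool and its argument values are illustrative, not part of the spec):

```python
import json

# An MCP tool invocation is a JSON-RPC 2.0 request; the "arguments"
# payload is whatever the tool's declared input schema expects.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Paris"}},
}

wire = json.dumps(request)   # what actually travels over the transport
decoded = json.loads(wire)
print(decoded["method"])     # tools/call
```

Because every host and server agrees on this envelope, a tool written once works with any MCP-aware application.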


What is FastMCP?

FastMCP is a Python SDK that implements the full MCP specification, making it easier to:

  • Build MCP clients that can connect to any MCP server.
  • Create MCP servers that expose prompts, tools, and data sources.
  • Use standard transports like Stdio and SSE.
  • Handle all MCP protocol messages and lifecycle events seamlessly.

With FastMCP, developers can focus on extending AI capabilities rather than managing low-level communication protocols.


Why Use FastMCP?

1. Seamless AI Integrations

FastMCP enables AI tools to pull in relevant data dynamically, eliminating manual context loading.

2. Scalability

With a standardized approach, MCP-powered applications can scale efficiently across multiple data sources.

3. Interoperability

Since MCP follows a universal protocol, it ensures seamless communication between different tools and AI platforms.

4. Security

FastMCP allows secure access to local and remote data sources, preventing unnecessary data exposure.


Example: AI-Powered Weather Information System

To demonstrate the power of FastMCP, we'll build a weather information system that uses all three primitives:

  • @tool → Fetches live weather data
  • @resource → Stores historical weather reports
  • @prompt → Provides a structured response format for LLMs



Step 1: @tool - Fetching Live Weather Data

The @tool decorator is used when LLMs need to call an external function, such as fetching live weather from an API.

from mcp.server.fastmcp import FastMCP
import requests

mcp = FastMCP("WeatherAssistant")

@mcp.tool()
def get_weather(city: str) -> dict:
    """Fetches the current weather for a given city using the OpenWeather API."""
    API_KEY = "your_openweather_api_key"  # replace with your own key
    url = (
        "https://api.openweathermap.org/data/2.5/weather"
        f"?q={city}&appid={API_KEY}&units=metric"
    )
    response = requests.get(url, timeout=10)

    if response.status_code == 200:
        data = response.json()
        return {
            "temperature": data["main"]["temp"],
            "weather": data["weather"][0]["description"]
        }
    return {"error": f"Could not fetch weather for {city!r}"}
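The field extraction above can be exercised without hitting the network. Here is the same parsing logic run against a stubbed payload (the `main`/`weather` field names match the real OpenWeather response; the values are made up):

```python
def parse_weather(payload: dict) -> dict:
    # Extracts the same fields get_weather returns from an
    # OpenWeather-style response body.
    return {
        "temperature": payload["main"]["temp"],
        "weather": payload["weather"][0]["description"],
    }

sample = {"main": {"temp": 21.3}, "weather": [{"description": "clear sky"}]}
print(parse_weather(sample))  # {'temperature': 21.3, 'weather': 'clear sky'}
```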

Step 2: @resource - Fetching Historical Weather Data

The @resource decorator is used to expose stored data, such as historical weather reports from a file or database.

import json

@mcp.resource("weather://{city}")
def get_weather_history(city: str):
    """Fetches past weather records from a local database (JSON file)."""
    with open("weather_data.json", "r") as file:
        weather_db = json.load(file)

    return weather_db.get(city, {"error": "No historical data available"})
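The lookup itself is plain dictionary access. A self-contained version backed by an in-memory dict (standing in for `weather_data.json`) behaves the same way, including the fallback for unknown cities:

```python
# In-memory stand-in for weather_data.json
weather_db = {
    "london": [{"date": "2024-01-01", "temp": 4.2, "weather": "fog"}],
}

def get_weather_history(city: str):
    # Same fallback behaviour as the resource handler above.
    return weather_db.get(city, {"error": "No historical data available"})

print(get_weather_history("london"))
print(get_weather_history("atlantis"))  # {'error': 'No historical data available'}
```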

Step 3: @prompt - Structuring the Response Format

The @prompt decorator registers predefined response templates that clients can retrieve and fill in.

@mcp.prompt()
def weather_report_template():
    """Provides a structured format for displaying weather reports."""
    return (
        "🌍 City: {city}\n"
        "🌡️ Temperature: {temperature}°C\n"
        "🌦️ Condition: {weather}\n"
        "📅 Historical Data: {history}\n"
    )
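Filling the template is ordinary `str.format` substitution. A quick check with hypothetical values shows what the rendered report looks like:

```python
template = (
    "🌍 City: {city}\n"
    "🌡️ Temperature: {temperature}°C\n"
    "🌦️ Condition: {weather}\n"
    "📅 Historical Data: {history}\n"
)

# Illustrative values only
report = template.format(
    city="Paris", temperature=18.5, weather="light rain", history="n/a"
)
print(report)
```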

LLM Integration: Calling MCP Tools, Resources, and Prompts

import requests

# NOTE: MCP itself speaks JSON-RPC over transports like stdio and SSE.
# The plain-HTTP endpoints below assume a simple REST wrapper in front of
# the MCP server, used here only to keep the example readable.
MCP_SERVER_URL = "http://localhost:5000"

def fetch_live_weather(city):
    response = requests.post(
        f"{MCP_SERVER_URL}/tool/get_weather", json={"city": city}, timeout=10
    )
    return response.json()

def fetch_historical_weather(city):
    response = requests.get(
        f"{MCP_SERVER_URL}/resource/weather/{city}", timeout=10
    )
    return response.json()

def fetch_weather_template():
    response = requests.get(
        f"{MCP_SERVER_URL}/prompt/weather_report_template", timeout=10
    )
    return response.text

def generate_weather_report(city):
    live_weather = fetch_live_weather(city)
    history = fetch_historical_weather(city)
    template = fetch_weather_template()

    return template.format(
        city=city,
        temperature=live_weather["temperature"],
        weather=live_weather["weather"],
        history=history
    )

# Example usage:
print(generate_weather_report("New York"))
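You can trace the flow end-to-end without a running server by stubbing the three fetches; the assembly step is the same, and the data values here are made up:

```python
def generate_weather_report_offline(city, live_weather, history, template):
    # Same assembly step as generate_weather_report, with the three
    # network calls replaced by pre-fetched values.
    return template.format(
        city=city,
        temperature=live_weather["temperature"],
        weather=live_weather["weather"],
        history=history,
    )

template = "City: {city} | {temperature}C, {weather} | history: {history}"
report = generate_weather_report_offline(
    "New York",
    {"temperature": 7.0, "weather": "overcast clouds"},
    "3 records",
    template,
)
print(report)  # City: New York | 7.0C, overcast clouds | history: 3 records
```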

How Everything Works Together

| Component | Functionality | Example Usage |
| --- | --- | --- |
| `@tool` | Calls external APIs dynamically | Fetching live weather |
| `@resource` | Retrieves stored data securely | Fetching historical weather |
| `@prompt` | Formats responses for better readability | Displaying a weather report |

Conclusion

FastMCP is a game-changer for developers looking to integrate AI models with structured data efficiently. By simplifying the implementation of the Model Context Protocol, it paves the way for more dynamic and context-aware AI applications. Whether you're building AI-powered search tools, smart assistants, or enterprise solutions, FastMCP provides the flexibility and scalability needed to enhance AI interactions.


Ready to Build Smarter AI Applications?

FastMCP is your ticket to building context-aware, scalable AI solutions. With its user-friendly SDK, you can redefine the way your applications interact with data.

