toolfreebie
Posted on • Originally published at toolfreebie.com

MCP (Model Context Protocol): Connect AI Agents to Any Tool or API

What is MCP (Model Context Protocol)?

Model Context Protocol (MCP) is an open standard developed by Anthropic that defines how AI models communicate with external tools, APIs, and data sources. Think of it as a universal adapter — instead of building custom integrations for every AI model and every tool, MCP gives you one standard protocol that works everywhere.

Launched in late 2024 and now widely adopted across the AI ecosystem, MCP is quickly becoming the foundation for building production-grade AI agents. It’s supported by Claude, and increasingly integrated into frameworks like LangChain, LlamaIndex, CrewAI, and Dify.

Why MCP Matters for Developers

Before MCP, every AI agent framework had its own way of connecting to tools. If you switched from LangChain to CrewAI, you had to rewrite all your tool integrations. MCP solves this by standardizing the interface between AI models and tools:

  • Write a tool once, use it anywhere — any MCP-compatible AI client can use your MCP server
  • No vendor lock-in — swap models or frameworks without rebuilding integrations
  • Better security — tools run in isolated servers, AI models can’t directly execute arbitrary code
  • Composable architecture — mix and match MCP servers like building blocks

Core Concepts

MCP Architecture

MCP follows a client-server architecture with three components:

| Component | Role | Example |
| --- | --- | --- |
| MCP Host | AI application that needs tools | Claude Desktop, your custom AI agent |
| MCP Client | Manages connections to servers | Built into the host application |
| MCP Server | Provides tools and resources | GitHub server, database server, file system server |

What MCP Servers Can Provide

  • Tools — functions the AI can call (search web, run code, query database)
  • Resources — data the AI can read (files, database records, API responses)
  • Prompts — reusable prompt templates for specific tasks
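Under the hood, all of these are exchanged as JSON-RPC 2.0 messages. As a rough sketch of the wire format, here is what a `tools/list` response and a `tools/call` request look like (field names follow the MCP spec; the `get_weather` tool is a made-up example):

```python
import json

# What a server might return for a "tools/list" request: each tool has a
# name, a description, and a JSON Schema describing its input.
tools_list_response = {
    "jsonrpc": "2.0",
    "id": 1,
    "result": {
        "tools": [
            {
                "name": "get_weather",  # hypothetical example tool
                "description": "Get current weather for a city",
                "inputSchema": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            }
        ]
    },
}

# The host then invokes a tool with a "tools/call" request.
tools_call_request = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "tools/call",
    "params": {"name": "get_weather", "arguments": {"city": "Berlin"}},
}

# Messages are serialized as JSON on the wire.
wire = json.dumps(tools_call_request)
print(json.loads(wire)["params"]["name"])  # prints get_weather
```

Because the schema travels with the tool, any MCP host can present the tool to its model without prior knowledge of the server.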

Official MCP Servers (Free to Use)

Anthropic provides a growing library of official MCP servers you can use immediately:

| MCP Server | What It Does | Free? |
| --- | --- | --- |
| filesystem | Read/write local files and directories | Yes |
| github | Search repos, create issues, manage PRs | Yes (with GitHub token) |
| brave-search | Web search via Brave Search API | Yes (2,000 free queries/month) |
| sqlite | Query and modify SQLite databases | Yes |
| postgresql | Connect to PostgreSQL databases | Yes (bring your own DB) |
| slack | Send messages, search channels | Yes (with Slack token) |
| google-maps | Search places, get directions | Yes (free tier API) |
| puppeteer | Control headless Chrome for web automation | Yes |
| memory | Persistent key-value memory for AI agents | Yes |
| fetch | Fetch any web URL and return content | Yes |

Getting Started: Install MCP in 10 Minutes

Prerequisites

# Install Node.js (required for most MCP servers)
# Download from nodejs.org or use a package manager

# Install uv (for Python-based MCP servers)
pip install uv

# Run the MCP Inspector (optional, for testing servers during development)
npx @modelcontextprotocol/inspector

Using MCP with Claude Desktop

Claude Desktop has built-in MCP support. Add servers to ~/Library/Application Support/Claude/claude_desktop_config.json (macOS) or %APPDATA%\Claude\claude_desktop_config.json (Windows):

{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": [
        "-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/yourname/Documents"
      ]
    },
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "your_token_here"
      }
    },
    "brave-search": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-brave-search"],
      "env": {
        "BRAVE_API_KEY": "your_brave_api_key"
      }
    }
  }
}

Restart Claude Desktop and you’ll see the tools appear automatically in your conversations.

Building Your First MCP Server (Python)

Install the MCP Python SDK:

pip install mcp

Create a simple weather tool server:

from mcp.server import Server
from mcp.server.stdio import stdio_server
from mcp.types import Tool, TextContent
import httpx

server = Server("weather-server")

@server.list_tools()
async def list_tools():
    return [
        Tool(
            name="get_weather",
            description="Get current weather for a city",
            inputSchema={
                "type": "object",
                "properties": {
                    "city": {
                        "type": "string",
                        "description": "City name"
                    }
                },
                "required": ["city"]
            }
        )
    ]

@server.call_tool()
async def call_tool(name: str, arguments: dict):
    if name != "get_weather":
        raise ValueError(f"Unknown tool: {name}")

    city = arguments["city"]
    # Using Open-Meteo (free, no API key needed)
    async with httpx.AsyncClient() as client:
        # First geocode the city
        geo = await client.get(
            f"https://geocoding-api.open-meteo.com/v1/search?name={city}&count=1"
        )
        results = geo.json().get("results")
        if not results:
            return [TextContent(type="text", text=f"City not found: {city}")]
        lat = results[0]["latitude"]
        lon = results[0]["longitude"]

        # Then get the current weather
        weather = await client.get(
            f"https://api.open-meteo.com/v1/forecast"
            f"?latitude={lat}&longitude={lon}"
            f"&current=temperature_2m,wind_speed_10m"
        )
        data = weather.json()
        temp = data["current"]["temperature_2m"]
        wind = data["current"]["wind_speed_10m"]

        return [TextContent(
            type="text",
            text=f"{city}: {temp}°C, Wind: {wind} km/h"
        )]

async def main():
    async with stdio_server() as (read, write):
        await server.run(read, write, server.create_initialization_options())

if __name__ == "__main__":
    import asyncio
    asyncio.run(main())
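To use this server from Claude Desktop, register it in claude_desktop_config.json like the official servers above. This assumes the file is saved as weather_server.py and that a Python with the mcp and httpx packages is on your PATH; adjust the command and path for your setup:

```json
{
  "mcpServers": {
    "weather": {
      "command": "python",
      "args": ["/path/to/weather_server.py"]
    }
  }
}
```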

Building Your First MCP Server (TypeScript)

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";
import { z } from "zod";

const server = new McpServer({
  name: "calculator",
  version: "1.0.0"
});

server.tool(
  "calculate",
  "Perform basic arithmetic",
  {
    expression: z.string().describe("Math expression like '2 + 2'")
  },
  async ({ expression }) => {
    try {
      // Note: the Function constructor still evaluates arbitrary JS,
      // so validate or whitelist the input before using this in production
      const result = new Function(`"use strict"; return (${expression})`)();
      return {
        content: [{ type: "text", text: `Result: ${result}` }]
      };
    } catch (error) {
      const message = error instanceof Error ? error.message : String(error);
      return {
        content: [{ type: "text", text: `Error: ${message}` }],
        isError: true
      };
    }
  }
);

const transport = new StdioServerTransport();
await server.connect(transport);

Using MCP with Python AI Frameworks

Using MCP in LangChain

from langchain_mcp_adapters.tools import load_mcp_tools
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client
from langchain_anthropic import ChatAnthropic
import asyncio

model = ChatAnthropic(model="claude-3-5-sonnet-20241022")

# Connect to an MCP server
server_params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-brave-search"],
    env={"BRAVE_API_KEY": "your_key"}
)

async def main():
    async with stdio_client(server_params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Load MCP tools as LangChain tools
            tools = await load_mcp_tools(session)
            model_with_tools = model.bind_tools(tools)

            response = await model_with_tools.ainvoke(
                "Search for the latest news about Model Context Protocol"
            )
            print(response.content)

asyncio.run(main())

Using MCP in CrewAI

from crewai import Agent, Task, Crew
from crewai_tools import MCPServerAdapter
from mcp import StdioServerParameters

# Connect to MCP server
server_params = StdioServerParameters(
    command="npx",
    args=["-y", "@modelcontextprotocol/server-github"],
    env={"GITHUB_PERSONAL_ACCESS_TOKEN": "your_token"}
)

# Wrap MCP server as CrewAI tools
with MCPServerAdapter(server_params) as mcp_tools:
    developer = Agent(
        role="GitHub Developer",
        goal="Research and analyze GitHub repositories",
        backstory="An experienced open-source developer",
        tools=mcp_tools,
        llm="anthropic/claude-3-5-sonnet-20241022"
    )

    task = Task(
        description="Search for trending Python AI repositories on GitHub",
        expected_output="A short list of trending repositories with summaries",
        agent=developer
    )

    crew = Crew(agents=[developer], tasks=[task])
    result = crew.kickoff()

MCP vs Traditional Tool Calling

| Feature | Traditional Tool Calling | MCP |
| --- | --- | --- |
| Reusability | Framework-specific | Universal — works across all MCP hosts |
| Security | Code runs inline | Isolated server process |
| Discovery | Manually defined | Auto-discovery via protocol |
| Ecosystem | Per-framework libraries | Shared MCP server registry |
| Transport | In-process function calls | stdio, HTTP with SSE |
| Maintenance | Update for each framework | Update once, works everywhere |
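The discovery row is the key practical difference: an MCP host doesn't hardcode tool bindings, it asks the server what it offers and builds callables from the answer. A toy stdlib-only sketch of that idea (the tool list and the dispatcher are stand-ins for a real session, not the actual SDK):

```python
# Pretend this came back from a server's tools/list response.
discovered = [
    {"name": "add", "description": "Add two numbers"},
    {"name": "upper", "description": "Uppercase a string"},
]

# Stand-in for the server side: dispatch a tools/call by name.
def dispatch(name, arguments):
    impls = {
        "add": lambda a: a["x"] + a["y"],
        "upper": lambda a: a["text"].upper(),
    }
    return impls[name](arguments)

def make_proxy(tool):
    """Build a local callable for a remotely discovered tool."""
    def proxy(**kwargs):
        return dispatch(tool["name"], kwargs)
    proxy.__doc__ = tool["description"]
    return proxy

# The host binds whatever the server advertises -- no per-tool glue code.
toolbox = {t["name"]: make_proxy(t) for t in discovered}
print(toolbox["add"](x=2, y=3))      # prints 5
print(toolbox["upper"](text="mcp"))  # prints MCP
```

If the server adds a tool tomorrow, the host picks it up on the next `tools/list` without a code change; that is what makes MCP servers composable.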

Popular Community MCP Servers

Beyond official servers, the community has built hundreds of MCP servers. Top picks:

| Server | Function | Install |
| --- | --- | --- |
| mcp-server-qdrant | Vector database search (RAG) | `uvx mcp-server-qdrant` |
| mcp-server-docker | Manage Docker containers | `uvx mcp-server-docker` |
| mcp-server-redis | Redis key-value operations | `npx @gptscript-ai/redis-mcp` |
| mcp-server-notion | Read/write Notion pages | `npx @modelcontextprotocol/server-notion` |
| mcp-server-playwright | Browser automation | `npx @playwright/mcp` |
| mcp-server-linear | Project management (Linear) | `uvx mcp-server-linear` |

Browse the full ecosystem at github.com/modelcontextprotocol/servers and mcp.so.

MCP Transport Methods

stdio (Standard Input/Output)

The most common transport for local MCP servers. The host process spawns the MCP server as a child process and communicates via stdin/stdout. Best for desktop applications and local development.
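The framing is simple: each JSON-RPC message travels as a single line of JSON over stdin/stdout. Here is a stdlib-only simulation of that exchange, using a throwaway Python child process in place of a real MCP server (the child just echoes a canned `tools/list` response):

```python
import json
import subprocess
import sys

# A stand-in "server": reads one JSON line from stdin, answers on stdout.
# A real MCP server speaks the same newline-delimited JSON-RPC framing.
child_code = """
import json, sys
req = json.loads(sys.stdin.readline())
resp = {"jsonrpc": "2.0", "id": req["id"], "result": {"tools": []}}
sys.stdout.write(json.dumps(resp) + "\\n")
sys.stdout.flush()
"""

proc = subprocess.Popen(
    [sys.executable, "-c", child_code],
    stdin=subprocess.PIPE,
    stdout=subprocess.PIPE,
    text=True,
)

# Host side: send a tools/list request as one line, read one line back.
request = {"jsonrpc": "2.0", "id": 1, "method": "tools/list"}
proc.stdin.write(json.dumps(request) + "\n")
proc.stdin.flush()
response = json.loads(proc.stdout.readline())
proc.wait()

print(response["result"])  # prints {'tools': []}
```

This is why the Claude Desktop config above only needs a `command` and `args`: the host spawns the process and the pipes are the transport.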

HTTP with SSE (Server-Sent Events)

For remote MCP servers accessible over the network. Allows hosting your MCP server as a web service that multiple clients can connect to. Best for production deployments and cloud-hosted tools.

# Start an HTTP-based MCP server
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("My API Server")

@mcp.tool()
def search_database(query: str) -> str:
    """Search the product database"""
    # Your DB logic here
    return f"Results for: {query}"

# Runs on http://localhost:8000/sse
mcp.run(transport="sse")

Combine MCP with OpenClaw for Autonomous Agents

OpenClaw is an open-source AI agent tool that works seamlessly with MCP servers, letting you build autonomous workflows without writing orchestration code.

A powerful pattern: use OpenClaw as your MCP host and connect multiple MCP servers to give your agent access to a full suite of tools:

// OpenClaw agent with multiple MCP servers (illustrative config)
{
  "agent": "research_agent",
  "mcp_servers": [
    "brave-search",     // Web search
    "github",           // Code repository access
    "filesystem",       // Local file read/write
    "memory"            // Persistent memory between runs
  ],
  "task": "Research the top 5 open-source AI frameworks by GitHub stars, summarize their pros/cons, and save the report to research.md"
}

OpenClaw automatically discovers the tools from each MCP server and routes tasks to the appropriate tools — no manual wiring required.

Security Best Practices

  • Principle of least privilege — only grant MCP servers access to what they need (e.g., limit filesystem server to specific directories)
  • Review server code — before running community MCP servers, inspect the source code
  • Use environment variables — never hardcode API keys in config files
  • Audit tool calls — log what tools your AI is calling in production
  • Sandbox untrusted servers — run third-party MCP servers in Docker containers
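The audit point is easy to retrofit: wrap whatever tool dispatcher you use so every call is recorded before it executes. A minimal stdlib sketch (the dispatcher here is a placeholder standing in for a real MCP tool call, not SDK code):

```python
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("mcp.audit")

calls = []  # in-memory audit trail; use persistent storage in production

def audited(dispatch):
    """Wrap a tool dispatcher so every call is recorded before it runs."""
    def wrapper(name, arguments):
        entry = {
            "ts": datetime.now(timezone.utc).isoformat(),
            "tool": name,
            "arguments": arguments,
        }
        calls.append(entry)
        log.info("tool call: %s", json.dumps(entry))
        return dispatch(name, arguments)
    return wrapper

# Placeholder dispatcher standing in for a real MCP tool call.
@audited
def dispatch(name, arguments):
    return f"ran {name}"

dispatch("get_weather", {"city": "Berlin"})
print(calls[0]["tool"])  # prints get_weather
```

The same wrapper is a natural place to enforce an allowlist: reject the call before dispatching if the tool name isn't one you expect.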

MCP Ecosystem in 2026

MCP adoption has exploded in early 2026. Key milestones:

  • Claude Desktop, Claude Code: Full MCP support built-in
  • Cursor, Windsurf: MCP integration for AI-powered coding
  • LangChain, LlamaIndex: Official MCP adapters available
  • CrewAI, Dify: Native MCP server support
  • 1,000+ community MCP servers published on GitHub and mcp.so

Who Should Use MCP?

| Use Case | Recommended Setup |
| --- | --- |
| Personal productivity | Claude Desktop + filesystem + brave-search servers |
| Developer tooling | Cursor/Claude Code + GitHub + sqlite servers |
| AI agents (Python) | LangChain or CrewAI + MCP adapters |
| No-code AI apps | Dify + MCP server integration |
| Enterprise automation | Custom MCP servers + OpenClaw orchestration |


Conclusion

Model Context Protocol is rapidly becoming the standard for connecting AI models to the real world. Whether you’re building a personal AI assistant or a production multi-agent system, MCP gives you a clean, secure, and portable way to add tool access to any AI application.

The best part: the official MCP servers from Anthropic are completely free and open-source. Start with the filesystem and brave-search servers in Claude Desktop, then build your own custom MCP servers as your needs grow.

Resources to get started:

  • Official MCP documentation: modelcontextprotocol.io
  • Official server repository: github.com/modelcontextprotocol/servers


