DEV Community

alapati suryapruthvi

A Guide to Multi-Server MCP with React & FastMCP

The Model Context Protocol (MCP) is rapidly becoming the "universal connector" for AI. But while many tutorials focus on using MCP with Claude Desktop, the real power lies in building your own custom client-server architecture.

Today, we’re building a full-stack local agentic hub:

  • MCP Server (Python): A tools-heavy server using fastmcp to fetch real-time news and weather.
  • MCP Client (Node.js): A backend gateway using mcp-use-ts to manage multiple server connections.
  • Frontend (React): A sleek chat interface for interacting with your AI agent.
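
The request flow the three pieces form, end to end (a sketch; transport details depend on your config):

```
React UI ──HTTP POST /chat──▶ Node gateway (mcp-use agent + LLM)
                                   │  MCP over stdio
                                   ├──▶ Python FastMCP server (weather, news)
                                   └──▶ other MCP servers (filesystem, GitHub, …)
```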

Part 1: The Python MCP Server
We’ll use FastMCP, a high-level framework that makes tool definition as easy as writing a Python function.

  • Setup
pip install fastmcp httpx
  • Server
from fastmcp import FastMCP
import httpx

mcp = FastMCP("LocalInsights")

@mcp.tool()
async def get_weather(latitude: float, longitude: float) -> str:
    """Get the weather forecast for specific coordinates."""
    # api.weather.gov requires a User-Agent header identifying your app
    headers = {"User-Agent": "local-insights-demo (contact@example.com)"}
    async with httpx.AsyncClient(headers=headers) as client:
        # Step 1: resolve the coordinates to a gridpoint forecast URL
        resp = await client.get(f"https://api.weather.gov/points/{latitude},{longitude}")
        resp.raise_for_status()
        forecast_url = resp.json()["properties"]["forecast"]
        # Step 2: fetch the forecast and return the soonest period
        forecast = await client.get(forecast_url)
        forecast.raise_for_status()
        return forecast.json()["properties"]["periods"][0]["detailedForecast"]

@mcp.tool()
async def get_news(category: str = "technology") -> str:
    """Get latest news headlines for a category."""
    # Example using a mock/free news aggregator
    return f"Latest {category} news: MCP is taking over the world! (Mocked response)"

if __name__ == "__main__":
    mcp.run()
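
Before wiring this into an agent, it's worth sanity-checking the two-step NWS parsing on its own. A minimal sketch against stub payloads shaped like the real API responses (the field names match the calls above; the stub values are made up):

```python
# Stub payloads mimicking the NWS API shapes used by get_weather (values are fake).
points_payload = {
    "properties": {"forecast": "https://api.weather.gov/gridpoints/OKX/33,35/forecast"}
}
forecast_payload = {
    "properties": {"periods": [{"detailedForecast": "Sunny, with a high near 75."}]}
}

def extract_forecast_url(points_json: dict) -> str:
    """Step 1: the /points response carries the gridpoint forecast URL."""
    return points_json["properties"]["forecast"]

def extract_first_period(forecast_json: dict) -> str:
    """Step 2: the forecast response lists periods; take the soonest one."""
    return forecast_json["properties"]["periods"][0]["detailedForecast"]

print(extract_first_period(forecast_payload))  # Sunny, with a high near 75.
```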

Part 2: The Node.js Client Gateway
Next, we build the "orchestrator" using mcp-use-ts. This layer connects to our Python server (and others) and exposes a unified API for our React frontend.

  • Setup
npm install @mcp-use/mcp-use-ts @langchain/openai express cors
  • The Node Client (client.ts)
import { MCPClient, MCPAgent } from "@mcp-use/mcp-use-ts";
import { ChatOpenAI } from "@langchain/openai";
import express from "express";
import cors from "cors";

const app = express();
app.use(cors());
app.use(express.json());

// Configure MCP servers
const client = MCPClient.fromDict({
  mcpServers: {
    filesystem: {
      command: 'npx',
      // the filesystem server needs at least one allowed directory argument
      args: ['-y', '@modelcontextprotocol/server-filesystem', '.']
    },
    github: {
      command: 'npx',
      args: ['-y', '@modelcontextprotocol/server-github'],
      env: { GITHUB_PERSONAL_ACCESS_TOKEN: process.env.GITHUB_TOKEN ?? '' }
    },
    insights: {
      command: 'python',
      args: ['server.py']
    }
  }
});

// Create an AI agent
const agent = new MCPAgent({
  llm: new ChatOpenAI({ model: 'gpt-4' }),
  client,
  maxSteps: 10
});

app.post('/chat', async (req, res) => {
  try {
    const { message } = req.body;

    // Process message with MCP servers
    const response = await agent.run(message);

    // Send AI response back to client
    res.json({
      type: 'assistant',
      message: response,
      timestamp: new Date().toISOString(),
      userId: 'assistant',
    });
  } catch (error) {
    console.error('Error processing message:', error);
    res.status(500).json({
      type: 'error',
      message: 'Sorry, I encountered an error processing your request.',
      timestamp: new Date().toISOString(),
    });
  }
});

app.listen(3001, () => console.log("Node Gateway running on port 3001"));
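
With the gateway running, any HTTP client can exercise it. The contract, with shapes taken from the handler above (the message and timestamp values are illustrative):

```
POST http://localhost:3001/chat
Content-Type: application/json

{ "message": "What's the weather at 40.7, -74.0?" }

→ 200 OK
{
  "type": "assistant",
  "message": "...the agent's reply...",
  "timestamp": "2025-01-01T00:00:00.000Z",
  "userId": "assistant"
}
```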

Part 3: The React Chat Interface

Finally, we need a UI. We’ll build a simple React chat that sends messages to our Node.js gateway.

  • The Chat Component (Chat.jsx): a clean, agent-style interface.
import React, { useState } from 'react';

const ChatInterface = () => {
  const [messages, setMessages] = useState([]);
  const [input, setInput] = useState("");

  const handleSend = async () => {
    if (!input.trim()) return;
    const userMsg = { role: 'user', text: input };
    setMessages(prev => [...prev, userMsg]);
    setInput("");

    const response = await fetch("http://localhost:3001/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message: userMsg.text })
    });

    // The gateway returns the reply in the `message` field
    const data = await response.json();
    setMessages(prev => [...prev, { role: 'bot', text: data.message }]);
  };

  return (
    <div className="p-4 max-w-2xl mx-auto border rounded shadow">
      <div className="h-64 overflow-y-auto mb-4 border-b">
        {messages.map((m, i) => (
          <div key={i} className={`p-2 ${m.role === 'user' ? 'text-right' : 'text-left'}`}>
            <span className={`inline-block p-2 rounded ${m.role === 'user' ? 'bg-blue-500 text-white' : 'bg-gray-200'}`}>
              {m.text}
            </span>
          </div>
        ))}
      </div>
      <div className="flex gap-2">
        <input 
          className="flex-1 p-2 border rounded" 
          value={input} 
          onChange={(e) => setInput(e.target.value)} 
          placeholder="Ask about the weather..."
        />
        <button onClick={handleSend} className="bg-blue-600 text-white px-4 py-2 rounded">Send</button>
      </div>
    </div>
  );
};

export default ChatInterface;

Why This Architecture Wins
By decoupling the Client (Node) from the Server (Python), you gain two massive advantages:

  • A polyglot stack: Your server can be Python (great for data/ML), while your client stays TypeScript (great for web tooling and async I/O).
  • Scalability: You can connect your Node.js gateway to 10 different MCP servers—one for GitHub, one for Postgres, one for your local Weather tool—and the React frontend only needs to talk to one endpoint.
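
Adding a server is just another entry in the gateway's mcpServers map; the frontend never changes. For example, a Postgres server could sit alongside the existing ones (the package name and connection string below are illustrative; check the server's own docs):

```
mcpServers: {
  insights: { command: 'python', args: ['server.py'] },
  postgres: {
    command: 'npx',
    args: ['-y', '@modelcontextprotocol/server-postgres', 'postgresql://localhost/mydb']
  }
  // ...filesystem, github, etc. as before
}
```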
