lazymac

MCP Servers: The Missing Link Between AI Tools and Your APIs

MCP (Model Context Protocol) lets AI assistants like Claude and Cursor call your APIs directly. Instead of copying curl commands, your AI just calls the function. Here is how it works and why you should care.

What is MCP?

MCP is a protocol that connects AI tools to external services. Think of it as "plugins for AI" but with a standard interface.

User → AI Assistant → MCP Server → Your API → Response → AI → User

Instead of telling Claude "go to this URL and parse the JSON," you just ask it to do the task. The MCP server handles the API call.
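Under the hood, MCP traffic is JSON-RPC 2.0: the assistant first asks the server what tools it has, then invokes one by name. A rough sketch of those two messages (the tool name and arguments here are hypothetical, not from a real server):

```javascript
// Sketch of the two core JSON-RPC 2.0 messages an MCP client sends.
// First: discover the tool catalog.
const listTools = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list" // server replies with the tools it exposes
};

// Then: invoke a tool by name with structured arguments.
const callTool = {
  jsonrpc: "2.0",
  id: 2,
  method: "tools/call",
  params: {
    name: "calculate_cost", // hypothetical tool name
    arguments: { model: "gpt-4", input_tokens: 5000, output_tokens: 1000 }
  }
};

console.log(callTool.method); // "tools/call"
```

You never write these messages yourself; the AI client and MCP server exchange them for you. This is why one config entry is all the setup you need.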

Setting Up an MCP Server

1. Add to Claude Desktop Config

Add this to your claude_desktop_config.json:
{
  "mcpServers": {
    "ai-spend": {
      "url": "https://api.lazy-mac.com/ai-spend/mcp"
    }
  }
}

That is it. Claude can now call the AI Spend API directly.
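One caveat: some Claude Desktop builds only accept stdio servers (command/args entries) in claude_desktop_config.json rather than remote url entries. If yours rejects the config above, the mcp-remote npm package can bridge a remote server over stdio; check its docs for current usage, but the shape looks roughly like this:

{
  "mcpServers": {
    "ai-spend": {
      "command": "npx",
      "args": ["mcp-remote", "https://api.lazy-mac.com/ai-spend/mcp"]
    }
  }
}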

2. Add to Cursor / Windsurf

Same config format. Drop it into your .cursor/mcp.json:

{
  "mcpServers": {
    "ai-spend": {
      "url": "https://api.lazy-mac.com/ai-spend/mcp"
    },
    "tech-stack": {
      "url": "https://api.lazy-mac.com/tech-stack/mcp"
    }
  }
}

Now your AI coding assistant can detect tech stacks and calculate AI costs mid-conversation.

Building Your Own MCP Server

An MCP server exposes "tools" that AI can call. Here is a minimal, simplified example (the real MCP spec uses JSON-RPC 2.0 messages like tools/list and tools/call; this sketch just shows the core idea of a discovery endpoint plus a tool-execution endpoint):

// mcp-server.js (Cloudflare Worker)
export default {
  async fetch(request, env) {
    const url = new URL(request.url);

    // MCP discovery endpoint
    if (url.pathname === '/mcp' && request.method === 'GET') {
      return Response.json({
        name: "my-api",
        version: "1.0.0",
        tools: [{
          name: "calculate_cost",
          description: "Calculate AI API cost for a given model and token count",
          inputSchema: {
            type: "object",
            properties: {
              model: { type: "string", description: "AI model name" },
              input_tokens: { type: "integer" },
              output_tokens: { type: "integer" }
            },
            required: ["model", "input_tokens", "output_tokens"]
          }
        }]
      });
    }

    // MCP tool execution
    if (url.pathname === '/mcp' && request.method === 'POST') {
      const { tool, arguments: args } = await request.json();

      if (tool === 'calculate_cost') {
        // calculateCost is a pricing helper you define elsewhere in the worker
        const cost = calculateCost(args.model, args.input_tokens, args.output_tokens);
        return Response.json({ result: cost });
      }
    }

    return Response.json({ error: 'Not found' }, { status: 404 });
  }
};
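The worker above leans on a calculateCost helper that is not shown. One possible sketch, using a per-million-token rate table (the rates below are placeholders for illustration, not real prices):

```javascript
// Example cost calculator. Rates are USD per 1M tokens as
// [input, output] pairs -- placeholder values, not real pricing.
const PRICES = {
  "gpt-4": [30, 60],
  "gpt-4o-mini": [0.15, 0.6]
};

function calculateCost(model, inputTokens, outputTokens) {
  const rates = PRICES[model];
  if (!rates) throw new Error(`Unknown model: ${model}`);
  const [inRate, outRate] = rates;
  // Scale both token counts by their per-million rates.
  return (inputTokens * inRate + outputTokens * outRate) / 1_000_000;
}

console.log(calculateCost("gpt-4", 5000, 1000)); // 0.21
```

Keeping the rate table as plain data makes it easy to update when providers change pricing, without touching the request-handling code.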

Why MCP Matters

For Developers

  • Your API gets discovered by AI assistants automatically
  • Users interact with natural language instead of reading docs
  • Higher engagement — lower friction means more API calls

For AI Users

  • No more copy-pasting curl commands
  • AI can chain multiple API calls intelligently
  • Works across Claude, Cursor, Windsurf, and more

Real-World Example

# Traditional way: manually construct curl
curl "https://api.lazy-mac.com/ai-spend/calculate?model=gpt-4&input_tokens=5000&output_tokens=1000"

# MCP way: just ask Claude
# "What would it cost to process 5000 input tokens with GPT-4?"
# Claude calls the MCP tool automatically and gives you the answer.

# Python: use the API directly or let your AI assistant handle it
import requests

# Direct API call
resp = requests.get("https://api.lazy-mac.com/ai-spend/calculate", params={
    "model": "gpt-4",
    "input_tokens": 5000,
    "output_tokens": 1000
})
print(resp.json())

Getting Started

  1. Pick an API you want AI to access
  2. Add the MCP endpoint to your Claude/Cursor config
  3. Start asking your AI assistant to use it

The AI Spend API and Tech Stack API both support MCP out of the box.

AI FinOps API on Gumroad | Tech Stack API on Gumroad
