JSGuruJobs

7 MCP Server Patterns That Turn JavaScript APIs Into AI-Callable Tools

MCP lets AI tools call your code directly. Not your UI. Not your REST client. Your functions. Here are 7 patterns that convert existing JavaScript services into MCP servers you can ship this week.

1. Wrap a REST API as an MCP tool

Most teams already have REST APIs. MCP sits on top of them.

Before (Express route)

// routes/jobs.ts
app.get("/jobs", async (req, res) => {
  const { tech, location } = req.query;
  const jobs = await db.jobs.findMany({
    where: {
      tech,
      location
    }
  });

  res.json(jobs);
});

After (MCP tool wrapper)

import { McpServer } from "@modelcontextprotocol/sdk/server/mcp.js";
import { z } from "zod";

const server = new McpServer({ name: "jobs", version: "1.0.0" });

server.tool(
  "search_jobs",
  "Search jobs by tech and location",
  {
    tech: z.string().optional(),
    location: z.string().optional()
  },
  async ({ tech, location }) => {
    // Only append params that were provided, so undefined values
    // don't serialize into the URL as the string "undefined"
    const params = new URLSearchParams();
    if (tech) params.set("tech", tech);
    if (location) params.set("location", location);

    const res = await fetch(`https://api.myapp.com/jobs?${params}`);
    const data = await res.json();

    return {
      content: [{ type: "text", text: JSON.stringify(data) }]
    };
  }
);

You did not rewrite logic. You exposed it to AI. That is usually under 50 lines.

2. Add strict validation with Zod to prevent prompt injection

AI will send weird inputs. Assume hostile strings.

Before (no validation)

server.tool(
  "query_jobs",
  "Run SQL query",
  { query: z.string() },
  async ({ query }) => {
    const rows = await db.query(query);
    return { content: [{ type: "text", text: JSON.stringify(rows) }] };
  }
);

After (safe validation)

server.tool(
  "query_jobs",
  "Run read-only SQL query",
  {
    query: z.string()
      .max(500)
      .refine(
        (q) => q.trim().toLowerCase().startsWith("select"),
        "Only SELECT queries allowed"
      )
  },
  async ({ query }) => {
    const safe = query.toLowerCase().includes("limit")
      ? query
      : `${query} LIMIT 100`;

    const rows = await db.query(safe);

    return {
      content: [{ type: "text", text: JSON.stringify(rows) }]
    };
  }
);

This is not optional. Without it, your MCP server becomes a production incident generator.
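The `startsWith("select")` refine above still lets through strings like `select 1; drop table jobs`. A stricter guard is worth the extra lines. This is a sketch, not part of the MCP SDK; `isSafeSelect` is a hypothetical helper you would pass into the Zod `.refine()`:

```typescript
// Sketch: reject multi-statement strings and mutation keywords
// that can hide inside something that starts with "SELECT".
function isSafeSelect(query: string): boolean {
  const q = query.trim().toLowerCase();
  if (!q.startsWith("select")) return false;
  if (q.includes(";")) return false; // blocks "select 1; drop table jobs"
  const forbidden = ["insert", "update", "delete", "drop", "alter", "truncate"];
  return !forbidden.some((kw) => new RegExp(`\\b${kw}\\b`).test(q));
}
```

Then the schema becomes `z.string().max(500).refine(isSafeSelect, "Only single SELECT statements allowed")`. A denylist is still weaker than a real read-only database role, so use both if you can.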

3. Separate read-only data as resources instead of tools

Not everything should be callable. Some data should just be readable.

Before (tool for everything)

server.tool(
  "latest_jobs",
  "Get latest jobs",
  {},
  async () => {
    const jobs = await getLatestJobs();
    return {
      content: [{ type: "text", text: JSON.stringify(jobs) }]
    };
  }
);

After (resource)

server.resource(
  "latest_jobs",
  "jobs://latest",
  async (uri) => {
    const jobs = await getLatestJobs();

    return {
      contents: [
        {
          uri: uri.href,
          mimeType: "application/json",
          text: JSON.stringify(jobs)
        }
      ]
    };
  }
);

Resources reduce accidental mutations and improve discovery. Models browse them like files.
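Once you expose a few resources, the response-shaping boilerplate repeats. A small helper keeps handlers to one line; `toResourceContents` is a hypothetical utility, not part of the SDK:

```typescript
// Sketch: shape any JSON-serializable value into the contents
// array an MCP resource handler is expected to return.
function toResourceContents(uri: string, data: unknown) {
  return {
    contents: [
      {
        uri,
        mimeType: "application/json",
        text: JSON.stringify(data)
      }
    ]
  };
}
```

The handler body then shrinks to `return toResourceContents(uri.href, jobs);`.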

4. Use stdio correctly or your server breaks silently

This one bites everyone once.

Before (broken stdio)

console.log("Server started");

After (correct logging)

import { StdioServerTransport } from "@modelcontextprotocol/sdk/server/stdio.js";

const transport = new StdioServerTransport();
await server.connect(transport);

// Log to stderr only -- stdout carries the protocol stream
console.error("Server started");

stdio uses stdout for JSON-RPC. One stray console.log corrupts the protocol. Zero errors. Just a dead server.
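To make the rule hard to break, route all logging through a helper that can only write to stderr. This is a minimal sketch; `formatLine` and `log` are hypothetical names:

```typescript
// Sketch: a tiny logger for stdio-based MCP servers.
// The only hard rule: write to process.stderr, never stdout.
function formatLine(level: string, msg: string): string {
  return `[${level.toUpperCase()}] ${msg}`;
}

const log = {
  info: (msg: string) => process.stderr.write(formatLine("info", msg) + "\n"),
  error: (msg: string) => process.stderr.write(formatLine("error", msg) + "\n")
};
```

Use `log.info("Server started")` everywhere and ban raw `console.log` in your lint config.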

5. Move to HTTP transport for production

stdio is local only. Real deployments need HTTP.

Before (local only)

const transport = new StdioServerTransport();
await server.connect(transport);

After (production HTTP server)

import express from "express";
import { randomUUID } from "node:crypto";
import { StreamableHTTPServerTransport } from "@modelcontextprotocol/sdk/server/streamableHttp.js";

const app = express();
app.use(express.json()); // the transport expects a parsed JSON body

const transport = new StreamableHTTPServerTransport({
  sessionIdGenerator: () => randomUUID()
});

await server.connect(transport);

app.post("/mcp/messages", async (req, res) => {
  await transport.handleRequest(req, res, req.body);
});

app.listen(3001);

Now you can put it behind a load balancer and scale horizontally.

This pattern compounds with AI-augmented JavaScript developer workflows: once your tools are MCP-compatible, every AI IDE can call them.
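An HTTP-reachable MCP server also needs auth, which stdio never did. A minimal sketch of a shared-secret check in front of the endpoint; `isAuthorized` and the `MCP_TOKEN` env var are assumptions here, swap in whatever auth your platform already uses:

```typescript
// Sketch: bearer-token gate for the MCP endpoint.
function isAuthorized(header: string | undefined, expected: string): boolean {
  return header === `Bearer ${expected}`;
}

// Usage as Express middleware (assumes `app` and MCP_TOKEN exist):
// app.post("/mcp/messages", (req, res, next) => {
//   if (!isAuthorized(req.headers.authorization, process.env.MCP_TOKEN!)) {
//     return res.status(401).json({ error: "Unauthorized" });
//   }
//   next();
// });
```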

6. Add rate limiting because models spam tools

Humans click once. Models call tools 20 times.

Before (no limits)

server.tool("search_jobs", "Search jobs", { q: z.string() }, async ({ q }) => {
  const results = await searchJobs(q);
  return { content: [{ type: "text", text: JSON.stringify(results) }] };
});

After (rate limited)

const calls = new Map<string, number[]>();

function allow(name: string, limit: number) {
  const now = Date.now();
  const window = 60000;

  const arr = calls.get(name) || [];
  const filtered = arr.filter((t) => now - t < window);

  if (filtered.length >= limit) return false;

  filtered.push(now);
  calls.set(name, filtered);
  return true;
}

server.tool("search_jobs", "Search jobs", { q: z.string() }, async ({ q }) => {
  if (!allow("search_jobs", 10)) {
    return {
      content: [{ type: "text", text: "Rate limit exceeded" }],
      isError: true
    };
  }

  const results = await searchJobs(q);

  return {
    content: [{ type: "text", text: JSON.stringify(results) }]
  };
});

Without this, one conversation can DDoS your own database.
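The map-of-timestamps approach above works, but it is hard to unit-test against the real clock. The same sliding-window idea with an injectable clock is deterministic; `SlidingWindowLimiter` is a hypothetical helper, not part of the MCP SDK:

```typescript
// Sketch: sliding-window rate limiter with an injectable clock.
class SlidingWindowLimiter {
  private calls = new Map<string, number[]>();

  constructor(
    private limit: number,
    private windowMs: number,
    private now: () => number = Date.now
  ) {}

  allow(key: string): boolean {
    const t = this.now();
    // Keep only timestamps still inside the window
    const recent = (this.calls.get(key) ?? []).filter(
      (ts) => t - ts < this.windowMs
    );
    if (recent.length >= this.limit) return false;
    recent.push(t);
    this.calls.set(key, recent);
    return true;
  }
}
```

In production you construct it as `new SlidingWindowLimiter(10, 60_000)`; in tests you pass a fake clock and advance it by hand.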

7. Wrap internal APIs for real business value

This is what companies actually pay for.

Before (manual internal calls)

async function getCustomer(email: string) {
  return fetch(`${API}/customers?email=${email}`).then((r) => r.json());
}

After (AI-callable internal tool)

server.tool(
  "get_customer",
  "Lookup customer by email",
  {
    email: z.string().email()
  },
  async ({ email }) => {
    const res = await fetch(
      `${API}/customers?email=${encodeURIComponent(email)}`,
      { headers: { Authorization: `Bearer ${process.env.API_KEY}` } }
    );

    if (!res.ok) {
      return {
        content: [{ type: "text", text: "Customer not found" }],
        isError: true
      };
    }

    const data = await res.json();

    return {
      content: [{ type: "text", text: JSON.stringify(data) }]
    };
  }
);

Now support, sales, and engineers can query customer data through AI without writing code.
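One caveat: everything the tool returns lands in the model's context, and often in third-party logs. It is worth stripping sensitive fields before serializing. A sketch, with a hypothetical field list you would adapt to your own schema:

```typescript
// Sketch: drop sensitive fields before the payload reaches the model.
// The field names here are example assumptions, not a real schema.
const SENSITIVE = new Set(["ssn", "password", "creditCard"]);

function redact<T extends Record<string, unknown>>(record: T) {
  return Object.fromEntries(
    Object.entries(record).filter(([key]) => !SENSITIVE.has(key))
  );
}
```

Inside the tool handler, return `JSON.stringify(redact(data))` instead of the raw record.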


If you already have a Node.js API, you are one thin wrapper away from MCP. Start with one tool. Then add validation. Then deploy over HTTP. While others are still reading the docs, you will already have a working AI integration layer.
