
NeuroLink AI


MCP is the Future of AI Tools: Here's How to Use It in TypeScript


MCP (Model Context Protocol) is how AI agents talk to the real world. If you're building AI tools in TypeScript, you need to understand it — and here's the fastest way to get started.

Think of MCP as USB for AI. Before USB, every device had its own connector. Before MCP, every AI tool integration was custom code. MCP standardizes how AI models discover, call, and receive results from external tools.

This isn't theoretical. Anthropic released MCP in late 2024, and the ecosystem already has production servers covering GitHub, PostgreSQL, Slack, Google Drive, file systems, and more.


What is MCP in 30 Seconds

MCP defines a protocol for AI-tool communication:

  1. AI model wants to do something (read a file, query a database, create a GitHub issue)
  2. MCP server exposes tools the AI can call
  3. MCP client (your SDK) handles the connection and protocol
  4. Tools are described with JSON schemas so the AI knows how to use them

The AI doesn't need custom code for each tool. It reads the tool descriptions, decides when to use them, and calls them through MCP. New tools appear automatically — no code changes needed.
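Step 4 is the key piece: a tool description is just a name, a human-readable description, and a JSON Schema for its input. A minimal sketch of the shape (illustrative field values, not copied from a specific server):

```typescript
// Sketch of an MCP tool description (illustrative example)
interface MCPToolDescription {
  name: string;
  description: string;
  inputSchema: {
    type: "object";
    properties: Record<string, { type: string; description?: string }>;
    required?: string[];
  };
}

const readFileTool: MCPToolDescription = {
  name: "read_file",
  description: "Read the contents of a file from an allowed directory",
  inputSchema: {
    type: "object",
    properties: {
      path: { type: "string", description: "Relative path to the file" },
    },
    required: ["path"],
  },
};
```

The model never sees your implementation; it only sees descriptions like this. That's why well-written description strings matter so much in practice.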


Getting Started with MCP in TypeScript

NeuroLink has built-in MCP support with 58+ server integrations. Here's how to connect your first MCP server:

import { NeuroLink } from "@juspay/neurolink";

const ai = new NeuroLink({
  provider: "anthropic",
  model: "claude-sonnet-4-6",
  apiKey: process.env.ANTHROPIC_KEY,
});

// Connect a local MCP server (stdio transport)
await ai.addExternalMCPServer("filesystem", {
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-filesystem", "./data"],
  transport: "stdio",
});

// The AI can now read and write files in ./data
const result = await ai.generate({
  input: {
    text: "Read the file sales-q1.csv and tell me the top 3 products by revenue",
  },
});

console.log(result.content);
// "Based on the CSV data, the top 3 products are:
//  1. Enterprise License — $45,200
//  2. Pro Subscription — $32,100
//  3. API Credits — $28,750"

That's it. The AI automatically discovers the filesystem tools, reads the CSV, and analyzes it. No custom file-reading code. No CSV parsing library. The MCP server handles it.


Three Practical MCP Integrations

1. GitHub: AI-Powered Issue Management

await ai.addExternalMCPServer("github", {
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-github"],
  transport: "stdio",
  env: { GITHUB_PERSONAL_ACCESS_TOKEN: process.env.GITHUB_TOKEN },
});

// Create issues, read PRs, search code — all through natural language
const result = await ai.generate({
  input: {
    text: 'Create a GitHub issue in juspay/neurolink titled "Add WebSocket MCP transport" with a description of the feature and label it as enhancement',
  },
});

The AI calls create_issue with the right parameters. No Octokit setup, no REST API wrangling.
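Under the hood, the model emits a structured tool call that the MCP client forwards as a JSON-RPC tools/call request. Roughly like this (argument names follow the GitHub server's create_issue tool, but treat the exact payload as an illustration):

```typescript
// Sketch of the tools/call message the client sends (illustrative payload)
const toolCall = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "create_issue",
    arguments: {
      owner: "juspay",
      repo: "neurolink",
      title: "Add WebSocket MCP transport",
      body: "Feature description generated by the model...",
      labels: ["enhancement"],
    },
  },
};
```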

2. PostgreSQL: Natural Language Database Queries

await ai.addExternalMCPServer("database", {
  command: "npx",
  args: ["-y", "@modelcontextprotocol/server-postgres"],
  transport: "stdio",
  env: {
    // Load from the environment; avoid hardcoding credentials
    POSTGRES_URL: process.env.POSTGRES_URL ?? "postgresql://user:pass@localhost:5432/mydb",
  },
});

const result = await ai.generate({
  input: {
    text: "Which customers signed up in the last 7 days and haven't made a purchase yet?",
  },
});

// The AI writes and executes the SQL query, then summarizes results

3. Remote MCP Servers via HTTP

Not all MCP servers run locally. NeuroLink supports HTTP transport for remote servers:

await ai.addExternalMCPServer("remote-tools", {
  transport: "http",
  url: "https://mcp.example.com/v1",
  headers: { Authorization: "Bearer " + process.env.MCP_TOKEN },
  retries: 3,
  timeout: 15000,
});

This opens the door to SaaS-hosted MCP servers — shared tool infrastructure your entire team can use.
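Over HTTP, MCP messages are JSON-RPC 2.0 requests POSTed to the server's endpoint. A sketch of what a tools/list request looks like on the wire (endpoint and token are placeholders):

```typescript
// Sketch: listing a remote server's tools over HTTP (placeholder endpoint)
const listRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/list",
  params: {},
};

// In real code, roughly:
// await fetch("https://mcp.example.com/v1", {
//   method: "POST",
//   headers: {
//     "Content-Type": "application/json",
//     Authorization: `Bearer ${process.env.MCP_TOKEN}`,
//   },
//   body: JSON.stringify(listRequest),
// });
```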


Advanced: Tool Routing

When you have multiple MCP servers, you need to decide which server handles which tool call. NeuroLink's ToolRouter supports 6 routing strategies:

import { ToolRouter, ToolCache, RequestBatcher } from "@juspay/neurolink";

const router = new ToolRouter({
  strategy: "capability-based", // Route by what each server can do
  servers: [
    { name: "github", url: "https://mcp-github.example.com" },
    { name: "database", url: "https://mcp-postgres.example.com" },
    { name: "filesystem", command: "npx", args: ["@modelcontextprotocol/server-filesystem", "."] },
  ],
});

Available strategies:

  • capability-based — Route based on tool names and descriptions
  • round-robin — Distribute across servers evenly
  • priority — Always try the first server, fall back to others
  • latency — Route to the fastest responding server
  • cost — Route to the cheapest option
  • random — Random selection (useful for load testing)
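The capability-based strategy can be sketched in a few lines (a simplified illustration, not NeuroLink's internal implementation): match the requested tool name against what each server advertises.

```typescript
// Simplified capability-based routing (illustrative, not library internals)
interface ServerInfo {
  name: string;
  tools: string[]; // tool names this server advertises via tools/list
}

function routeByCapability(
  servers: ServerInfo[],
  toolName: string
): ServerInfo | undefined {
  // First server that advertises the tool wins
  return servers.find((server) => server.tools.includes(toolName));
}

const servers: ServerInfo[] = [
  { name: "github", tools: ["create_issue", "search_code"] },
  { name: "database", tools: ["query"] },
];

routeByCapability(servers, "query"); // resolves to the "database" server
```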

Advanced: Caching and Batching

MCP tool calls can be expensive (database queries, API calls). NeuroLink provides built-in optimization:

// Cache repeated tool results
const cache = new ToolCache({
  strategy: "lru",    // Least Recently Used eviction
  maxSize: 500,       // Max cached results
  ttl: 60_000,        // Cache for 60 seconds
});

// Batch concurrent tool calls
const batcher = new RequestBatcher({
  maxBatchSize: 10,   // Batch up to 10 calls
  maxWaitMs: 50,      // Wait max 50ms to form a batch
});

In production, caching alone can reduce MCP server load by 40-60% for read-heavy workloads.
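The idea behind ToolCache can be sketched with a Map, keyed by tool name plus serialized arguments (a simplified illustration, not the library's implementation):

```typescript
// Simplified LRU + TTL cache for tool results (illustrative sketch)
class SimpleToolCache<V> {
  // Map preserves insertion order, which gives us LRU ordering for free
  private entries = new Map<string, { value: V; expires: number }>();

  constructor(private maxSize: number, private ttlMs: number) {}

  get(key: string): V | undefined {
    const entry = this.entries.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.entries.delete(key); // expired
      return undefined;
    }
    // Re-insert to mark as most recently used
    this.entries.delete(key);
    this.entries.set(key, entry);
    return entry.value;
  }

  set(key: string, value: V): void {
    if (this.entries.size >= this.maxSize) {
      // Evict the least recently used entry (first key in insertion order)
      const oldest = this.entries.keys().next().value;
      if (oldest !== undefined) this.entries.delete(oldest);
    }
    this.entries.set(key, { value, expires: Date.now() + this.ttlMs });
  }
}

// Key a lookup by tool name + arguments, e.g.:
// cache.get(`read_file:${JSON.stringify({ path: "sales-q1.csv" })}`)
```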


Building Your Own MCP Server

Want to expose your own API as MCP tools? NeuroLink provides a base class:

import { MCPServerBase } from "@juspay/neurolink";

class WeatherServer extends MCPServerBase {
  name = "weather";

  tools = [
    {
      name: "get_weather",
      description: "Get current weather for a city",
      inputSchema: {
        type: "object",
        properties: {
          city: { type: "string", description: "City name" },
        },
        required: ["city"],
      },
      handler: async (input: { city: string }) => {
        const response = await fetch(
          `https://api.weather.com/v1/current?city=${encodeURIComponent(input.city)}`
        );
        return await response.json();
      },
    },
  ];
}

// Register and use
const weather = new WeatherServer();
ai.registerMCPServer(weather);

const result = await ai.generate({
  input: { text: "What's the weather like in Tokyo right now?" },
});

Your custom tools are now available to any AI model through the same MCP protocol.
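One detail worth adding to a custom server: validate input against the tool's inputSchema before the handler runs. A minimal required-fields check (a sketch; production servers typically use a full JSON Schema validator such as Ajv):

```typescript
// Sketch: check that all schema-required fields are present in the input
function missingRequiredFields(
  schema: { required?: string[] },
  input: Record<string, unknown>
): string[] {
  return (schema.required ?? []).filter((key) => !(key in input));
}

missingRequiredFields({ required: ["city"] }, {}); // ["city"]
missingRequiredFields({ required: ["city"] }, { city: "Tokyo" }); // []
```

Rejecting a malformed call with a clear error gives the model a chance to retry with corrected arguments instead of surfacing a cryptic runtime failure.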


The MCP Ecosystem

The ecosystem is growing fast. Here are some notable MCP servers you can use today:

| Server | What It Does | Transport |
| --- | --- | --- |
| @modelcontextprotocol/server-filesystem | Read/write local files | stdio |
| @modelcontextprotocol/server-github | GitHub issues, PRs, code search | stdio |
| @modelcontextprotocol/server-postgres | PostgreSQL queries | stdio |
| @modelcontextprotocol/server-slack | Slack messages, channels | stdio |
| @modelcontextprotocol/server-gdrive | Google Drive files | stdio |
| @modelcontextprotocol/server-brave-search | Web search | stdio |
| @modelcontextprotocol/server-puppeteer | Browser automation | stdio |
| @modelcontextprotocol/server-sqlite | SQLite databases | stdio |

NeuroLink supports all of these out of the box. Just addExternalMCPServer() and the tools appear.

For the full list, check awesome-mcp-servers — currently tracking 1,400+ servers.


Why MCP Matters for TypeScript Developers

Three reasons:

1. Composability. MCP servers are modular. Need GitHub + Slack + your custom API? Add three servers. Each is independent, testable, and reusable.

2. Ecosystem leverage. Someone already built the MCP server for PostgreSQL. You don't need to write database tool integration code — just connect the server.

3. Future-proofing. MCP is becoming the standard. Claude, GPT, Gemini — they all support tool calling. MCP standardizes the tool side. Build once, work everywhere.


Getting Started

# Install NeuroLink
npm install @juspay/neurolink

# Interactive setup (configures providers + MCP servers)
npx @juspay/neurolink setup

# Or use the CLI directly
npx @juspay/neurolink generate "List my recent GitHub issues" \
  --mcp github


Are you using MCP in your projects? What servers have you found most useful? Let me know in the comments.
