DEV Community

Abhishek Gupta

Posted on • Originally published at devblogs.microsoft.com

Build AI Tooling in Go with the MCP SDK – Connecting AI Apps to Databases

A hands‑on walkthrough of building MCP servers that can plug AI applications into Azure Cosmos DB

The Model Context Protocol (MCP) has established itself as the ubiquitous standard for connecting AI applications to external systems. Since its release, there have been implementations across various programming languages and frameworks, enabling developers to build solutions that expose data sources, tools, and workflows to AI applications.

For Go developers, however, the journey to an official MCP SDK took longer than it did for other languages like Python and TypeScript. Discussions and design/implementation work on the official Go implementation began in early-to-mid 2025. At the time of writing (January 2026), it stands at version 1.2.0. As a Gopher, I'm excited (and relieved!) to finally have a stable, official MCP Go SDK that the Go community can rely on.

To explore its capabilities, I built an MCP server for Azure Cosmos DB. This blog post dives into the fundamentals of the MCP Go SDK by walking through that implementation, exploring concepts such as tools, servers, and transports. By the end, you'll understand how to use the MCP Go SDK to build your own MCP servers, with Azure Cosmos DB serving as a practical example.

Note: This project is not intended to replace the Azure MCP Server or Azure Cosmos DB MCP Toolkit. Rather, it serves as an experimental learning tool that demonstrates how to combine the Azure and MCP Go SDKs to build AI tooling for Azure Cosmos DB.

MCP Basics

Let's briefly cover what MCP is and how the MCP Go SDK works.

What is the Model Context Protocol?

The Model Context Protocol (MCP) is an open-source standard for connecting AI applications to external systems. It's often referred to as a USB-C port for AI applications — just as USB-C provides a standardized way to connect devices, MCP provides a standardized way to connect AI applications to data sources, tools, and workflows.

With MCP, AI applications (ranging from IDEs like VS Code, to CLI coding tools like GitHub Copilot CLI, to apps like Claude on the web or desktop) can:

  • Access data sources (local files, databases, APIs)
  • Use tools (search engines, calculators, external services)
  • Execute workflows (specialized prompts, multi-step operations)

This standardization means developers can build MCP servers once and have them work with any MCP-compatible AI application, rather than creating custom integrations for each platform.

MCP Go SDK

The official Go MCP SDK provides the building blocks to create MCP servers and clients in Go. Here's a minimal example of an MCP server with a simple tool:

package main

import (
    "context"
    "log"

    "github.com/modelcontextprotocol/go-sdk/mcp"
)

type ReverseInput struct {
    Text string `json:"text" jsonschema:"the text to reverse"`
}

type ReverseOutput struct {
    Reversed string `json:"reversed" jsonschema:"the reversed text"`
}

func ReverseText(ctx context.Context, req *mcp.CallToolRequest, input ReverseInput) (
    *mcp.CallToolResult,
    ReverseOutput,
    error,
) {
    runes := []rune(input.Text)
    for i, j := 0, len(runes)-1; i < j; i, j = i+1, j-1 {
        runes[i], runes[j] = runes[j], runes[i]
    }
    return nil, ReverseOutput{Reversed: string(runes)}, nil
}

func main() {
    // Create server
    server := mcp.NewServer(&mcp.Implementation{
        Name:    "text-tools",
        Version: "v1.0.0",
    }, nil)

    // Add a tool
    mcp.AddTool(server, &mcp.Tool{
        Name:        "reverse",
        Description: "reverses the input text",
    }, ReverseText)

    // Run over stdio
    if err := server.Run(context.Background(), &mcp.StdioTransport{}); err != nil {
        log.Fatal(err)
    }
}

This example demonstrates the key concepts:

  • Tool definition: A mcp.Tool with a name and description
  • Input/Output types: Structs with JSON schema tags that define the tool's interface
  • Handler function: The actual logic that executes when the tool is called
  • Server: Created with mcp.NewServer() and configured with tools
  • Transport: How the server communicates (here using stdio)

These concepts will be covered in more detail later in this post.
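One detail worth calling out in the handler above: reversing over `[]rune` rather than raw bytes keeps multi-byte UTF-8 characters intact. Here's a small stdlib-only check of that logic (the `reverseRunes` helper mirrors the body of `ReverseText`):

```go
package main

import "fmt"

// reverseRunes mirrors the logic inside the ReverseText handler:
// converting to []rune first keeps multi-byte UTF-8 characters intact.
func reverseRunes(s string) string {
	runes := []rune(s)
	for i, j := 0, len(runes)-1; i < j; i, j = i+1, j-1 {
		runes[i], runes[j] = runes[j], runes[i]
	}
	return string(runes)
}

func main() {
	// "héllo" contains a two-byte rune; byte-wise reversal would garble it.
	fmt.Println(reverseRunes("héllo")) // olléh
}
```

A byte-wise swap would split the two bytes of `é` and produce invalid UTF-8, which is why the handler converts to `[]rune` first.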

MCP Server in Action

▶️ To get a sense of what the server can do, take a look at this short demo of using the MCP server with Agent Mode in Visual Studio Code:

This server exposes several tools that enable AI applications to interact with Azure Cosmos DB:

  • list_databases - List all databases in a Cosmos DB account
  • list_containers - List all containers in a specific database
  • read_item - Read a specific item using its ID and partition key
  • execute_query - Execute SQL queries against containers
  • create_container - Create new containers with partition keys
  • add_item_to_container - Add items to containers
  • read_container_metadata - Retrieve container configuration details

If you want to set up and configure the server, check out the GitHub repository.

Alright, let's dive into how it's built.

Understanding the Implementation

Tools are the building blocks of an MCP server. Each tool represents a specific operation that the server can perform.

Let's use the read_item tool as an example to understand the fundamental concepts of the MCP Go SDK and how it integrates with the Azure Cosmos DB Go SDK.

MCP Tools: Definition, Handler, and Execution Flow

An MCP tool consists of these components:

Tool Definition

The tool definition describes the tool to the AI application. Here's how we define the read_item tool:

func ReadItem() *mcp.Tool {
    return &mcp.Tool{
        Name:        "read_item",
        Description: "Read a specific item from a container in an Azure Cosmos DB database using the item ID and partition key",
    }
}

The Tool struct contains:

  • Name: A unique identifier for the tool
  • Description: Helps the AI understand when to use this tool

The SDK can automatically infer input and output schemas from your handler function's types, which we'll see next.

Input and Output Types

Type-safe input and output structures define the tool's interface:

type ReadItemToolInput struct {
    Account      string `json:"account" jsonschema:"Azure Cosmos DB account name"`
    Database     string `json:"database" jsonschema:"Name of the database"`
    Container    string `json:"container" jsonschema:"Name of the container to read data from"`
    ItemID       string `json:"itemID" jsonschema:"ID of the item to read"`
    PartitionKey string `json:"partitionKey" jsonschema:"Partition key value of the item"`
}

type ReadItemToolResult struct {
    Item string `json:"item" jsonschema:"The item data as JSON string"`
}

The SDK uses these types to automatically generate JSON schemas and handle validation. JSON tags define how fields are serialized, and jsonschema tags provide descriptions that help AI applications understand what each field represents.
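To see the effect of the json tags concretely, here's a stdlib-only sketch. The struct mirrors ReadItemToolInput from above, and marshaling it shows the exact field names a client would send as tool-call arguments:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// readItemInput mirrors ReadItemToolInput; the json tags control
// the field names that appear in tool-call arguments on the wire.
type readItemInput struct {
	Account      string `json:"account"`
	Database     string `json:"database"`
	Container    string `json:"container"`
	ItemID       string `json:"itemID"`
	PartitionKey string `json:"partitionKey"`
}

// marshalInput serializes the input the same way encoding/json
// (and, by extension, the SDK's schema-driven marshaling) would.
func marshalInput(in readItemInput) string {
	b, _ := json.Marshal(in)
	return string(b)
}

func main() {
	fmt.Println(marshalInput(readItemInput{
		Account:      "my-account",
		Database:     "appdb",
		Container:    "orders",
		ItemID:       "42",
		PartitionKey: "tenant-1",
	}))
}
```

Note how the Go field `ItemID` becomes `itemID` on the wire; the AI application only ever sees the tag names, which is why clear jsonschema descriptions matter.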

Tool Handler

The handler is where the actual work happens. The MCP Go SDK provides a generic AddTool function that can bind tools to functions with this signature:

func(ctx context.Context, req *mcp.CallToolRequest, input In) (*mcp.CallToolResult, Out, error)

Here's the read_item handler:

func ReadItemToolHandler(ctx context.Context, _ *mcp.CallToolRequest, input ReadItemToolInput) (*mcp.CallToolResult, ReadItemToolResult, error) {
    // 1. Validate inputs
    if input.Account == "" {
        return nil, ReadItemToolResult{}, errors.New("cosmos db account name missing")
    }
    if input.Database == "" {
        return nil, ReadItemToolResult{}, errors.New("database name missing")
    }
    // ... more validation

    // 2. Get Cosmos DB client
    client, err := GetCosmosClientFunc(input.Account)
    if err != nil {
        return nil, ReadItemToolResult{}, err
    }

    // 3. Navigate to the container
    databaseClient, err := client.NewDatabase(input.Database)
    if err != nil {
        return nil, ReadItemToolResult{}, fmt.Errorf("error creating database client: %v", err)
    }

    containerClient, err := databaseClient.NewContainer(input.Container)
    if err != nil {
        return nil, ReadItemToolResult{}, fmt.Errorf("error creating container client: %v", err)
    }

    // 4. Read the item using Cosmos DB SDK
    partitionKey := azcosmos.NewPartitionKeyString(input.PartitionKey)
    itemResponse, err := containerClient.ReadItem(ctx, partitionKey, input.ItemID, nil)
    if err != nil {
        return nil, ReadItemToolResult{}, fmt.Errorf("error reading item: %v", err)
    }

    // 5. Return the result
    return nil, ReadItemToolResult{Item: string(itemResponse.Value)}, nil
}

The handler handles (pun intended!) several things:

  • Validates input parameters
  • Interacts with Azure Cosmos DB
  • Returns structured output

Notice we return nil for the *mcp.CallToolResult. The SDK automatically handles the response marshaling for us. If we return an error, the SDK sets IsError: true in the result automatically.
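The repeated empty-string checks at the top of each handler can be factored into a small helper. Here's a stdlib-only sketch of that idea (`requireFields` is a hypothetical name for illustration, not part of the server):

```go
package main

import "fmt"

// requireFields returns an error naming the first required field
// that is empty. Each pair is (label, value).
func requireFields(pairs ...[2]string) error {
	for _, p := range pairs {
		if p[1] == "" {
			return fmt.Errorf("%s missing", p[0])
		}
	}
	return nil
}

func main() {
	// Mimics the validation at the top of ReadItemToolHandler.
	err := requireFields(
		[2]string{"cosmos db account name", "my-account"},
		[2]string{"database name", ""}, // empty, so this error is returned
	)
	fmt.Println(err) // database name missing
}
```

A handler would then start with a single `if err := requireFields(...); err != nil { return nil, ReadItemToolResult{}, err }` instead of one block per field, keeping the error messages consistent across tools.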

Authenticating with Azure Cosmos DB

The MCP server uses NewDefaultAzureCredential from the Azure Identity SDK, which automatically handles multiple authentication methods, such as Azure CLI credentials (for local development), Managed Identity (for production), environment variables, and more:

func GetCosmosDBClient(accountName string) (*azcosmos.Client, error) {
    endpoint := fmt.Sprintf("https://%s.documents.azure.com:443/", accountName)

    cred, err := azidentity.NewDefaultAzureCredential(nil)
    if err != nil {
        return nil, fmt.Errorf("error creating credential: %v", err)
    }

    client, err := azcosmos.NewClient(endpoint, cred, nil)
    if err != nil {
        return nil, fmt.Errorf("error creating Cosmos client: %v", err)
    }

    return client, nil
}

Once we have the client, we use the standard Azure Cosmos DB SDK patterns:

  • client.NewDatabase() to get a database client
  • databaseClient.NewContainer() to get a container client
  • containerClient.ReadItem() to perform the actual read operation

MCP Server: Bringing Tools Together

The beauty here is that MCP provides the standardized interface for AI interactions, while the Azure Cosmos DB SDK handles all the database operations – the handler acts as the bridge between these two worlds.

Now that we understand individual tools, let's see how they're organized within an MCP server. An MCP server exposes specific capabilities (tools, resources, prompts) to AI applications through the standardized MCP protocol.

Creating the Server

Here's how we create and configure the MCP server in main.go:

func main() {
    // Create the server with metadata
    server := mcp.NewServer(&mcp.Implementation{
        Name:       "mcp_azure_cosmosdb_go",
        Title:      "Go based MCP server for Azure Cosmos DB using the Azure SDK for Go and the MCP Go SDK",
        Version:    "0.0.1",
        WebsiteURL: "https://github.com/abhirockzz/mcp_cosmosdb_go",
    }, nil)

    // Register all tools with their handlers
    mcp.AddTool(server, tools.ListDatabases(), tools.ListDatabasesToolHandler)
    mcp.AddTool(server, tools.ListContainers(), tools.ListContainersToolHandler)
    mcp.AddTool(server, tools.ReadContainerMetadata(), tools.ReadContainerMetadataToolHandler)
    mcp.AddTool(server, tools.CreateContainer(), tools.CreateContainerToolHandler)
    mcp.AddTool(server, tools.AddItemToContainer(), tools.AddItemToContainerToolHandler)
    mcp.AddTool(server, tools.ReadItem(), tools.ReadItemToolHandler)
    mcp.AddTool(server, tools.ExecuteQuery(), tools.ExecuteQueryToolHandler)

    // ... transport setup (covered next)
}

Breaking this down:

  1. mcp.NewServer() creates a new server instance with:

    • Implementation metadata: Name, title, and version that identify the server
    • ServerOptions: Additional configuration (we use nil for defaults)
  2. mcp.AddTool() registers each tool with the server, taking:

    • The server instance
    • The tool definition (from functions like tools.ReadItem())
    • The handler function (like tools.ReadItemToolHandler)

When the server connects to a client, it automatically advertises the tools capability, making all registered tools discoverable by the AI application.

Transport: Connecting Server to Client

A transport defines how the server communicates with clients. It's the communication channel that carries JSON-RPC messages between the server and client. The MCP Go SDK supports multiple transport types.
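Concretely, a tool invocation travels as a JSON-RPC 2.0 request. This stdlib-only sketch builds the shape of a tools/call message as described by the MCP specification; the SDK constructs and parses these for you, and the struct here is illustrative rather than an SDK type:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// rpcRequest is an illustrative JSON-RPC 2.0 request envelope;
// the MCP Go SDK defines its own internal message types.
type rpcRequest struct {
	JSONRPC string         `json:"jsonrpc"`
	ID      int            `json:"id"`
	Method  string         `json:"method"`
	Params  map[string]any `json:"params"`
}

// callToolRequest renders a tools/call request the way an MCP
// client would send it over any transport.
func callToolRequest(id int, tool string, args map[string]any) string {
	b, _ := json.Marshal(rpcRequest{
		JSONRPC: "2.0",
		ID:      id,
		Method:  "tools/call",
		Params:  map[string]any{"name": tool, "arguments": args},
	})
	return string(b)
}

func main() {
	fmt.Println(callToolRequest(1, "read_item", map[string]any{"itemID": "42"}))
}
```

The transport's only job is to carry messages like this back and forth; whether they ride over stdio pipes or HTTP streams, the payload is the same.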

HTTP Streamable Transport

The server supports the streamable HTTP transport, which is ideal for web-based AI applications. Here's how we set it up:

// Create the streamable HTTP handler
handler := mcp.NewStreamableHTTPHandler(func(req *http.Request) *mcp.Server {
    return server
}, nil)

// Start the HTTP server
if err := http.ListenAndServe(":9090", handler); err != nil {
    log.Fatalf("Server failed: %v", err)
}

The NewStreamableHTTPHandler creates an HTTP handler that accepts incoming HTTP requests from MCP clients and returns the appropriate server instance for each request. It handles the streamable transport protocol automatically and supports multiple concurrent client sessions.

This transport is ideal when you want to support web-based AI applications and the server needs to be accessible over HTTP/HTTPS. This allows multiple clients to connect simultaneously.

Stdio Transport

Another common MCP transport is stdio, used when the server runs as a subprocess:

err := server.Run(context.Background(), &mcp.StdioTransport{})
if err != nil {
    log.Fatal(err)
}

The stdio transport runs as a subprocess started by the client and communicates via standard input/output streams. It's perfect for local MCP clients like GitHub Copilot CLI, Claude Code (or Desktop), etc.

Both transports implement the same MCP protocol, so the server's tools work identically regardless of which transport you choose. The difference is purely in how the server connects to and communicates with clients.

With the server created, tools registered, and transport configured, the MCP server is ready to accept connections from AI applications and execute operations against Azure Cosmos DB.

Testing the MCP Server

Testing an MCP server involves verifying functionality at different layers of the stack. This server uses integration tests at two levels: tests that verify the MCP protocol aspects, and tests that focus on handler logic and database interactions. Let's explore both approaches.

Before diving into testing, let's briefly understand what an MCP client is.

Understanding MCP Clients

An MCP client is the component that connects to an MCP server to consume its capabilities. In the context of the MCP server:

  • In production: The client is typically an AI application (like Claude Desktop or VS Code) that discovers and calls our tools
  • In testing: We create programmatic clients to verify our server works correctly

The MCP Go SDK provides a Client type that we can use to connect to our server and call its tools, simulating how a real AI application would interact with it.

Handler-Level Integration Testing with Azure Cosmos DB vNext Emulator

Let's start by looking at the tests that focus on handler logic and database interactions. They use the Azure Cosmos DB vNext emulator with testcontainers-go.

From tools_test.go:

func TestListDatabases(t *testing.T) {
    tests := []struct {
        name           string
        input          ListDatabasesToolInput
        expectError    bool
        expectedResult string
        expectedErrMsg string
    }{
        {
            name: "valid account name",
            input: ListDatabasesToolInput{
                Account: "dummy_account_does_not_matter",
            },
            expectError:    false,
            expectedResult: testOperationDBName,
        },
        {
            name: "empty account name",
            input: ListDatabasesToolInput{
                Account: "",
            },
            expectError:    true,
            expectedErrMsg: "cosmos db account name missing",
        },
    }

    for _, test := range tests {
        t.Run(test.name, func(t *testing.T) {
            _, response, err := ListDatabasesToolHandler(
                context.Background(), 
                nil, 
                test.input,
            )

            if test.expectError {
                require.Error(t, err)
                assert.Contains(t, err.Error(), test.expectedErrMsg)
                return
            }

            require.NoError(t, err)
            assert.Contains(t, response.Databases, test.expectedResult)
        })
    }
}

These tests call handlers directly (bypassing the MCP protocol layer) and use table-driven tests to cover input validation, error handling, business-logic correctness, database operations, and edge cases.

func setupCosmosEmulator(ctx context.Context) (testcontainers.Container, error) {
    req := testcontainers.ContainerRequest{
        Image:        "mcr.microsoft.com/cosmosdb/linux/azure-cosmos-emulator:vnext-preview",
        ExposedPorts: []string{"8081:8081", "8080:8080"},
        WaitingFor:   wait.ForListeningPort(nat.Port("8080")),
        Env: map[string]string{
            "PROTOCOL": "http",
        },
    }

    container, err := testcontainers.GenericContainer(ctx, testcontainers.GenericContainerRequest{
        ContainerRequest: req,
        Started:          true,
    })
    // ... error handling

    return container, nil
}

The testcontainers-go library automatically pulls the emulator image, starts the container, and cleans up after tests complete. This setup happens once, in the TestMain function, and is shared across all tests.

MCP Protocol Integration Testing

Beyond handler testing, we also verify the complete MCP protocol stack—from client request through server processing to response. Here's an example from mcp_integration_test.go:

func TestMCPIntegration_ReadItem(t *testing.T) {
    ctx := context.Background()

    // 1. Create MCP server and register the read_item tool
    server := mcp.NewServer(&mcp.Implementation{
        Name:    "test-cosmosdb-server",
        Version: "0.0.1",
    }, nil)

    mcp.AddTool(server, ReadItem(), ReadItemToolHandler)

    // 2. Create in-memory transports for testing
    serverTransport, clientTransport := mcp.NewInMemoryTransports()

    // 3. Connect server
    serverSession, err := server.Connect(ctx, serverTransport, nil)
    require.NoError(t, err)
    defer serverSession.Close()

    // 4. Create and connect client
    client := mcp.NewClient(&mcp.Implementation{
        Name:    "test-client",
        Version: "0.0.1",
    }, nil)

    clientSession, err := client.Connect(ctx, clientTransport, nil)
    require.NoError(t, err)
    defer clientSession.Close()

    // 5. Call the tool via MCP protocol
    result, err := clientSession.CallTool(ctx, &mcp.CallToolParams{
        Name: "read_item",
        Arguments: map[string]any{
            "account":      "dummy_account_does_not_matter",
            "database":     testOperationDBName,
            "container":    testOperationContainerName,
            "itemID":       id,
            "partitionKey": partitionKeyValue,
        },
    })

    // 6. Verify the response
    require.NoError(t, err)
    require.False(t, result.IsError)
    require.NotEmpty(t, result.Content)

    // 7. Parse and validate the JSON response
    textContent, ok := result.Content[0].(*mcp.TextContent)
    require.True(t, ok)

    var response ReadItemToolResult
    err = json.Unmarshal([]byte(textContent.Text), &response)
    require.NoError(t, err)

    assert.NotEmpty(t, response.Item)
}

This test demonstrates several key concepts:

  1. In-Memory Transports: mcp.NewInMemoryTransports() creates a pair of connected transports without requiring actual network communication—perfect for testing
  2. Client-Server Connection: Both server and client connect to their respective transports, establishing a session
  3. Tool Invocation: clientSession.CallTool() sends a properly formatted MCP request
  4. Response Handling: The result is parsed from the MCP protocol format back to our domain types
  5. Full Protocol Verification: This tests the complete round trip: request serialization → tool execution → response serialization → client parsing

Both handler-level and protocol-level tests use the Azure Cosmos DB vNext emulator, not mocks. Handler-level tests provide feedback on business logic, while protocol-level tests ensure MCP compliance and end-to-end functionality.

Wrap Up

With the MCP Go SDK, building MCP servers has become significantly more accessible for Go developers. You don't have to go for Python anymore (sorry Pythonistas, pun intended!).

This MCP server demonstrates how to combine the MCP Go SDK with domain-specific tools — in this case, the Azure Cosmos DB Go SDK. While this server provides useful functionality for interacting with Cosmos DB from AI applications, its primary purpose is educational. As mentioned before, this is a learning tool that shows how to integrate MCP with real-world services, not a replacement for solutions like the Azure MCP Server or the Azure Cosmos DB MCP Toolkit.

The specific patterns we covered (defining tools, implementing handlers, managing authentication, choosing transports, and writing integration tests) apply to any MCP server you might build. The same concepts apply, whether you're exposing APIs, databases, file systems, or custom business logic.

Next Steps

Ready to build your own MCP server? The GitHub repository for this project and the official MCP Go SDK are good places to start.

The MCP ecosystem is growing rapidly, and I am excited for Go developers who now have first-class support for participating in this evolution!
