ohalay
Integrating MCP Tools with AWS Bedrock in an ASP.NET Core Minimal API

In this article, we'll build an ASP.NET Core Minimal API that integrates AWS Bedrock with MCP client capabilities, enabling the dynamic invocation of AI tools through standardized MCP interfaces.

How to work with LLMs

There are several ways to interact with an LLM in .NET:

  1. Direct LLM provider client (API or SDK) - low-level abstraction
  2. Microsoft.Extensions.AI - mid-level abstraction
  3. Semantic Kernel - high-level abstraction

We'll use the mid-level abstraction (Microsoft.Extensions.AI), which balances simplicity and flexibility while keeping the implementation decoupled from specific model providers. To get started, we need to install the required NuGet packages:
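The exact package set depends on your SDK versions; based on the APIs used below, a typical install (package names are my assumption, check the latest versions on NuGet) would be:

```shell
# AWS SDK DI integration and the Bedrock runtime client
dotnet add package AWSSDK.Extensions.NETCore.Setup
dotnet add package AWSSDK.BedrockRuntime
# Bridges IAmazonBedrockRuntime to Microsoft.Extensions.AI's IChatClient (AsIChatClient)
dotnet add package AWSSDK.Extensions.Bedrock.MEAI
dotnet add package Microsoft.Extensions.AI
# MCP client SDK (still in preview)
dotnet add package ModelContextProtocol --prerelease
```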

Then register the IChatClient abstraction:

services.AddDefaultAWSOptions(builder.Configuration.GetAWSOptions());
services.TryAddAWSService<IAmazonBedrockRuntime>();
services.AddSingleton<IChatClient>(sp =>
{
  // Wrap the Bedrock runtime client in an IChatClient and enable automatic tool invocation
  var runtime = sp.GetRequiredService<IAmazonBedrockRuntime>();
  var chatClient = runtime.AsIChatClient().AsBuilder().UseFunctionInvocation().Build();
  return chatClient;
});

Add MCP client

The IChatClient interface supports AI tools that extend its capabilities. We can dynamically load these tools from an MCP server.

var httpClient = new HttpClient();
var transport = new HttpClientTransport(new HttpClientTransportOptions
{
    Endpoint = new Uri("https://my-mcp-server-enpoint.com"),
    Name = "Mcp Client",
}, httpClient);

var mcpClient = await McpClient.CreateAsync(transport);
var mcpTools = await mcpClient.ListToolsAsync();

A possible MCP server configuration looks like this:
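The original configuration snippet did not survive extraction. As an illustration only, a remote MCP server is commonly declared in a client configuration file along these lines (server name and URL are placeholders matching the transport above):

```json
{
  "servers": {
    "my-mcp-server": {
      "type": "http",
      "url": "https://my-mcp-server-enpoint.com"
    }
  }
}
```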

Chat API

Now, let’s combine everything into a single endpoint /api/message that sends user messages to the LLM while using MCP tools.

app.MapPost("/api/message", async (ChatRequest request, IChatClient chatClient) =>
{
  ...
  var response = await chatClient.GetResponseAsync(
    [new ChatMessage(ChatRole.User, request.Message)],
    new ChatOptions
    {
        ModelId = "anthropic.claude-3-5-sonnet-20240620-v1:0",
        Tools = [.. mcpTools],
    }
);

  return Results.Ok(response);
});
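Assuming the API is listening locally on port 5000 (adjust the port and token for your setup), the endpoint can be exercised with a simple POST:

```shell
curl -X POST http://localhost:5000/api/message \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <user-access-token>" \
  -d '{"message": "What tools do you have available?"}'
```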

Finally, to authenticate user requests to the MCP server, we attach a delegating HTTP handler to our transport, forwarding the caller's bearer token on their behalf:

public class OnBehalfOfHttpHandler(IHttpContextAccessor httpContextAccessor) : DelegatingHandler
{
  protected override Task<HttpResponseMessage> SendAsync(HttpRequestMessage request, CancellationToken cancellationToken)
  {
    // Copy the caller's bearer token from the incoming HTTP request onto the outgoing MCP request
    var authorizationHeader = httpContextAccessor.HttpContext?.Request.Headers.Authorization.FirstOrDefault();
    if (authorizationHeader is not null && AuthenticationHeaderValue.TryParse(authorizationHeader, out var headerValue))
    {
      request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", headerValue.Parameter);
    }
    return base.SendAsync(request, cancellationToken);
  }
}
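To plug the handler in, register IHttpContextAccessor and build the transport's HttpClient on top of it; a sketch (the HttpClientTransportOptions name follows the transport type used earlier):

```csharp
// Program.cs — wiring the on-behalf-of handler into the MCP transport
builder.Services.AddHttpContextAccessor();

builder.Services.AddSingleton(sp =>
{
    // Chain our handler in front of the default socket handler
    var handler = new OnBehalfOfHttpHandler(sp.GetRequiredService<IHttpContextAccessor>())
    {
        InnerHandler = new SocketsHttpHandler(),
    };
    var httpClient = new HttpClient(handler);

    return new HttpClientTransport(new HttpClientTransportOptions
    {
        Endpoint = new Uri("https://my-mcp-server-enpoint.com"),
        Name = "Mcp Client",
    }, httpClient);
});
```

With this in place, every request the MCP client sends carries the original user's token, so the MCP server can authorize tool calls per user.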

Solution Limitations

  • Only Bedrock models supported by Microsoft.Extensions.AI can be used.
  • Some models may not support tool invocation.
  • The MCP SDK is still in preview and subject to change.

Conclusion

By combining IChatClient with MCP tools, you create a modular and extensible architecture for AI integration in .NET.
This approach lets you:

  • Keep your code provider-agnostic
  • Dynamically load new AI tools via MCP
  • Securely delegate authentication using "on-behalf-of" tokens

As MCP and Microsoft.Extensions.AI evolve, this pattern positions your application to easily adopt new models, providers, and AI capabilities with minimal refactoring.
