TL;DR:
dotnet add package Kebechet.Api.ToMcp + 3 lines in Program.cs = your ASP.NET controllers become AI-callable MCP tools. No separate server, no OpenAPI spec, compile-time generated. GitHub repo
Three lines of code. That's what it took to let Claude talk to my ASP.NET API:
builder.Services.AddMcpTools(Assembly.GetExecutingAssembly());
app.UseMcpLoopPrevention();
app.MapMcpEndpoint("mcp");
No hand-written tool classes. No separate proxy server. No OpenAPI spec to maintain. The controllers I already had became AI-callable tools automatically.
I built Api.ToMcp - a C# source generator that reads your existing controllers at compile time and generates MCP-compatible tool classes. This is the story of how it went from an idea to a published NuGet package in a single day.
Why not just use OpenAPI?
Before diving in, let's address the elephant in the room. There are already ways to bridge REST APIs to MCP, and most of them work through OpenAPI/Swagger specs.
OpenAPI/Swagger JSON approach - Tools like openapi-mcp-server (887 stars) and mcp-link (605 stars) read your swagger.json and dynamically generate MCP tools at runtime. If your API already produces a Swagger document (and most ASP.NET Core APIs do), you point the tool at the JSON and it creates MCP endpoints on the fly.
API gateway approach - Tyk's api-to-mcp works at the gateway level. It's language-agnostic: it doesn't matter whether your backend is .NET, Node, Python, or Go.
Framework-specific code transforms - spring-rest-to-mcp uses OpenRewrite to transform Spring REST controllers into MCP servers. Similar idea to Api.ToMcp but for the Java ecosystem.
Manual HTTP configuration - http-4-mcp provides a visual interface where you configure HTTP-to-MCP mappings without writing code.
These all work. But they share common trade-offs:
- Runtime overhead - they parse specs and create tools dynamically
- Loose typing - they work with JSON schemas, not your actual C# types
- No compile-time safety - if your API changes, mismatches surface at runtime
- Separate process - they run as a standalone server or proxy between the AI client and your API. That's an extra service to deploy, monitor, and keep running
I wanted something that felt native to .NET - compile-time generated, type-safe, and deployed as part of the same application. One service, one deployment, one process.
The problem: MCP is great, bridging is not
If you haven't heard of Model Context Protocol, it's an open standard that lets AI assistants (like Claude, Cursor, etc.) call tools - essentially functions - in a structured way. Think of it as a universal plugin system for AI.
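On the wire, a tool call is just a JSON-RPC request. Roughly, a client invoking a tool sends something like this (the tool name and argument here are illustrative, not from the package):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "Products_GetById",
    "arguments": { "id": "3fa85f64-5717-4562-b3fc-2c963f66afa6" }
  }
}
```

The server's job is to expose a list of such tools with names, descriptions, and typed input schemas - which is exactly the boilerplate in question.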
The .NET ecosystem already has great MCP support through the official ModelContextProtocol package. You define tool classes, decorate them with attributes, and you're live. But here's the catch - each tool is a separate class that you write manually:
[McpServerToolType]
public static class Products_GetByIdTool
{
    [McpServerTool(Name = "Products_GetById")]
    [Description("Gets a product by its unique identifier.")]
    public static async Task<string> InvokeAsync(
        IMcpHttpInvoker invoker,
        [Description("Parameter: id")] Guid id)
    {
        var route = $"/api/products/{Uri.EscapeDataString(id.ToString())}";
        return await invoker.GetAsync(route);
    }
}
Now multiply that by every endpoint you want to expose. It's boilerplate. Pure, mechanical, error-prone boilerplate.
I was building a .NET MAUI fitness app with a standard ASP.NET Core backend. Dozens of controllers, hundreds of endpoints. The routes were defined. The parameters were typed. The XML docs were written. All the information was right there in the source code - I just needed something to read it and generate the glue.
The "what if" moment
C# source generators can inspect your code at compile time and emit new source files. They see your classes, methods, attributes, parameters - everything Roslyn knows, you know.
So the idea was simple: scan controllers, find HTTP action methods, and for each one, generate an MCP tool class that calls the original endpoint via HTTP internally. The API stays untouched. The MCP layer is purely additive.
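To make that concrete, here's a sketch of the kind of controller the generator scans - my own minimal example, not code from the repo. The route template, the typed parameter, and the XML doc comment are all the generator needs:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.AspNetCore.Mvc;

public record Product(Guid Id, string Name);

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    // In-memory store just to keep this sketch self-contained.
    private static readonly Dictionary<Guid, Product> Store = new();

    /// <summary>Gets a product by its unique identifier.</summary>
    [HttpGet("{id:guid}")]
    public ActionResult<Product> GetById(Guid id) =>
        Store.TryGetValue(id, out var product) ? Ok(product) : NotFound();
}
```

From this, a generator can derive the tool name (Products_GetById), the route (/api/products/{id}), the parameter type (Guid), and the description - without any runtime reflection or OpenAPI document.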
From zero to NuGet in one day
I started on January 16th. The commit history tells the story:
Morning - scaffolded the project structure and implemented the basic generator. Three projects: Abstractions (attributes and config), Generator (the source generator itself), and Runtime (HTTP invoker, middleware, DI extensions).
Midday - hit the first real bugs. Parsing controller routes with constraints like {id:guid} was trickier than expected. Tool registration wasn't wiring up correctly. Fixed both, added Swagger to the demo project so I could verify the API side independently.
Afternoon - added tests, linked everything to a solution, set up GitHub Actions for build and NuGet publish. The first preview landed on NuGet that same evening.
By end of day, you could do this:
dotnet add package Kebechet.Api.ToMcp
Add a generator.json, three lines in Program.cs, and your API was speaking MCP.
How it actually works
The generator reads a generator.json config that controls which endpoints to expose (allowlist via SelectedOnly or blocklist via AllExceptExcluded), scans your controllers for HTTP action methods, and emits MCP tool classes at compile time.
{
  "schemaVersion": 1,
  "mode": "SelectedOnly",
  "include": [
    "ProductsController.GetAll",
    "ProductsController.GetById"
  ],
  "naming": {
    "toolNameFormat": "{Controller}_{Action}",
    "removeControllerSuffix": true
  }
}
Attributes always win over config - [McpExpose] forces inclusion, [McpIgnore] forces exclusion. This gives you granular control over what AI can and can't touch:
[HttpDelete("{id:guid}")]
[McpIgnore] // AI should never delete products
public Task<ActionResult> Delete(Guid id) { ... }
Here's what the generator produces for ProductsController.GetAll:
[McpServerToolType]
public static class ProductsController_GetAllTool
{
    [McpServerTool(Name = "Products_GetAll")]
    [Description("Gets all products with optional category filter.")]
    public static async Task<string> InvokeAsync(
        IMcpHttpInvoker invoker,
        [Description("Parameter: category")] string? category = null)
    {
        var queryParts = new List<string>();
        if (category != null)
            queryParts.Add($"category={Uri.EscapeDataString(category)}");

        var route = "/api/products";
        if (queryParts.Count > 0)
            route += "?" + string.Join("&", queryParts);

        await invoker.BeforeInvokeAsync(McpScope.Read);
        var response = await invoker.GetAsync(route);
        return response;
    }
}
Notice that the XML doc comment /// <summary>Gets all products...</summary> became the [Description]. The AI assistant sees a well-described tool, not a cryptic endpoint.
The loop prevention problem
Here's something I didn't anticipate until it happened: what if the AI calls an MCP tool, which calls the API, which somehow triggers another MCP call? Infinite loop.
The solution is a simple middleware + header combo. Every internal HTTP call from the MCP invoker adds an X-MCP-Internal-Call header. The middleware checks for this on MCP endpoints and returns 400 Bad Request if present:
public async Task InvokeAsync(HttpContext context)
{
    if (context.Request.Path.StartsWithSegments("/mcp") &&
        context.Request.Headers.ContainsKey("X-MCP-Internal-Call"))
    {
        context.Response.StatusCode = 400;
        await context.Response.WriteAsync(
            "MCP endpoints cannot be called internally to prevent loops.");
        return;
    }

    await _next(context);
}
Simple, effective, zero configuration.
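The sending side is equally small. A sketch of how an invoker could tag its internal requests (hypothetical code, mirroring the header name the middleware checks):

```csharp
using System.Net.Http;
using System.Threading.Tasks;

public static class InternalCallSketch
{
    // Hypothetical: tag internal HTTP calls so the loop-prevention
    // middleware can reject any request that routes back to /mcp.
    public static Task<HttpResponseMessage> GetInternalAsync(
        HttpClient client, string route)
    {
        var request = new HttpRequestMessage(HttpMethod.Get, route);
        request.Headers.Add("X-MCP-Internal-Call", "true");
        return client.SendAsync(request);
    }
}
```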
Authentication forwarding
If your API uses JWT authentication, the MCP tools need to forward those credentials. The McpHttpInvoker automatically grabs the Authorization header from the incoming MCP request and attaches it to the internal API call. Your [Authorize] attributes just work.
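A minimal sketch of how that forwarding can work - my approximation, not the package's actual implementation - is a DelegatingHandler that copies the caller's Authorization header onto the outgoing internal request:

```csharp
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

public class AuthForwardingHandler : DelegatingHandler
{
    private readonly IHttpContextAccessor _accessor;

    public AuthForwardingHandler(IHttpContextAccessor accessor) =>
        _accessor = accessor;

    protected override Task<HttpResponseMessage> SendAsync(
        HttpRequestMessage request, CancellationToken cancellationToken)
    {
        // Copy the incoming MCP request's Authorization header, if any,
        // onto the internal API call so [Authorize] sees the same identity.
        var auth = _accessor.HttpContext?.Request.Headers["Authorization"]
            .ToString();
        if (!string.IsNullOrEmpty(auth))
            request.Headers.TryAddWithoutValidation("Authorization", auth);

        return base.SendAsync(request, cancellationToken);
    }
}
```

Because the internal call carries the same bearer token, the API's existing authentication pipeline runs unchanged.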
Scope-based access control
After the initial release, I added a feature I hadn't planned but quickly realized was necessary: not every AI session should have access to every tool.
A read-only analytics dashboard shouldn't be able to call POST /api/products. So I mapped HTTP methods to scopes:
| Scope | HTTP Methods |
|---|---|
| Read | GET, HEAD, OPTIONS |
| Write | POST, PUT, PATCH |
| Delete | DELETE |
You configure a JWT claim mapper, and the invoker validates scopes before each call:
builder.Services.AddMcpTools(Assembly.GetExecutingAssembly(), options =>
{
    options.ClaimName = "permissions";
    options.ClaimToScopeMapper = claimValue =>
    {
        var scope = McpScope.None;
        if (claimValue.Contains("read")) scope |= McpScope.Read;
        if (claimValue.Contains("write")) scope |= McpScope.Write;
        return scope;
    };
});
If the scope doesn't match, the tool throws an UnauthorizedAccessException before any HTTP call is made.
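Conceptually, that pre-call check reduces to a flags comparison. This is my sketch of the idea (assuming McpScope is a [Flags] enum), not the package's exact code:

```csharp
using System;

[Flags]
public enum McpScope { None = 0, Read = 1, Write = 2, Delete = 4 }

public static class ScopeGuard
{
    // Rejects the call before any HTTP request is made if the session's
    // granted scopes don't cover what the tool requires.
    public static void EnsureScope(McpScope granted, McpScope required)
    {
        if ((granted & required) != required)
            throw new UnauthorizedAccessException(
                $"Scope '{required}' is not granted to this session.");
    }
}
```

A read-only token (granted = McpScope.Read) passes for GET-backed tools but fails the check for anything mapped to Write or Delete.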
Lessons from shipping fast
Looking back at the commit log, a few things stand out:
Source generators are powerful but unforgiving. Debugging is painful - you're writing code that writes code, and errors show up as compile-time diagnostics, not runtime exceptions. Snapshot testing (comparing generated output against expected files) saved me.
The "just ship it" approach works. The first version had rough edges. Parameterless method generation was broken (fixed next day). The URL handling had issues in production behind reverse proxies (fixed by a community PR a month later). But having a working package on NuGet meant people could try it and report real problems instead of theoretical ones.
Community feedback matters immediately. Within a month, I got a PR from Saurus119 fixing URL normalization for production environments. I had been testing locally - they were running it behind a load balancer. That's the kind of bug you only find with real users.
What's next
There are still open issues I'm thinking about:
- Should the isError flag in MCP responses reflect HTTP status codes automatically?
- How should cancellation tokens be handled across the MCP-to-HTTP boundary?
- Extracting XML doc comments for richer tool descriptions (issue #1 - one of the first things I filed)
Try it yourself
dotnet add package Kebechet.Api.ToMcp
The GitHub repo has a full demo project you can run. Add the package, create a generator.json, add three lines to Program.cs, and your API speaks MCP.
If you're building something with it, or if you have ideas for improvement, open an issue or submit a PR. The codebase is intentionally small - the entire generator is under 500 lines.
Every ASP.NET API already has everything MCP needs - typed endpoints, route metadata, XML docs. The only question is whether you extract that information yourself, or let the compiler do it for you.
If this was useful, feel free to connect with me on X/Twitter or check out my other .NET open-source work on GitHub.