Microsoft Agent Framework with Ollama (.NET/C#)

Introduction

This week, I published a blog post explaining how to use Microsoft Agent Framework with Foundry Local. Because Foundry Local does not support native tool calling, we must implement the tool-calling loop manually. Specifically, the LLM is first given a list of available tools and asked to decide whether a tool invocation is required. We then parse the model’s response, execute the selected tool, and feed the results back into the LLM for final generation. These extra steps are necessary because Foundry Local lacks built-in tool-calling support, meaning the UseFunctionInvocation capability on the ChatClient in Microsoft.Extensions.AI cannot handle this automatically.
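
To make the contrast concrete, here is roughly what that manual loop looks like (a simplified sketch, assuming an IChatClient named chatClient; TryParseToolCall and InvokeToolAsync stand in for the real parsing and dispatch code, and the tool-catalog format is illustrative):

// 1. Describe the available tools and ask the model whether it wants to call one.
var decision = await chatClient.GetResponseAsync(
[
    new ChatMessage(ChatRole.System, $"You can call one of these tools: {toolCatalogJson}"),
    new ChatMessage(ChatRole.User, userMessage),
]);

// 2. Parse the reply; if the model picked a tool, execute it ourselves.
if (TryParseToolCall(decision.Text, out var toolName, out var toolArgs))
{
    var toolResult = await InvokeToolAsync(toolName, toolArgs);

    // 3. Feed the tool result back to the model for the final answer.
    var final = await chatClient.GetResponseAsync(
    [
        new ChatMessage(ChatRole.User, userMessage),
        new ChatMessage(ChatRole.Tool, toolResult),
    ]);
}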

Just yesterday, I read the excellent post Building Multi-Agent Workflows in .NET with AgentFactory and Handoff by El Bruno. In that article, he highlights using Microsoft Agent Framework with Ollama via the native Ollama provider included in the framework (sample here). This immediately sparked an idea: we can extend my previous sample at
https://github.com/thangchung/agent-engineering-experiment/tree/main/foundry-local-agent-fx
by adding Ollama as an additional provider, allowing us to switch seamlessly between Foundry Local and Ollama depending on the use case.

This post walks through the design, implementation, and code required to make that approach work in practice.

Prerequisites

  • Make sure Ollama is installed:
> ollama --version
ollama version is 0.13.5
  • Ollama is now running on my local machine at http://localhost:11434/
  • Because I will use the Mistral LLM, I need to pull the model first:
> ollama pull mistral
pulling manifest
pulling f5074b1221da: 100% ▕██████████████████████████████████████████████████████████▏ 4.4 GB
pulling 43070e2d4e53: 100% ▕██████████████████████████████████████████████████████████▏  11 KB
pulling 1ff5b64b61b9: 100% ▕██████████████████████████████████████████████████████████▏  799 B
pulling ed11eda7790d: 100% ▕██████████████████████████████████████████████████████████▏   30 B
pulling 1064e17101bd: 100% ▕██████████████████████████████████████████████████████████▏  487 B
verifying sha256 digest
writing manifest
success
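
To double-check that the server is reachable and the model has been pulled, you can query Ollama's local REST API, which lists the installed models:

> curl http://localhost:11434/api/tags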

Set up the code structure

Aspire's AppHost

In the AppHost, we declare the active provider plus the Ollama endpoint and model as parameters, and flow them into the agent service as environment variables:

// Active provider: "FoundryLocal" or "Ollama"
var activeProvider = builder.AddParameter("agent-provider", "Ollama");

// Ollama configuration
var ollamaEndpoint = builder.AddParameter("ollama-endpoint", "http://localhost:11434/");
var ollamaModel = builder.AddParameter("ollama-model", "mistral");

var mcpToolServer = builder.AddProject<Projects.McpToolServer>("mcp-tools")
    .WithExternalHttpEndpoints();

var agentService = builder.AddProject<Projects.AgentService>("agentservice")
    .WithReference(mcpToolServer)
    // Provider selection
    .WithEnvironment("AGENT_PROVIDER", activeProvider)
    // Ollama settings
    .WithEnvironment("OLLAMA_ENDPOINT", ollamaEndpoint)
    .WithEnvironment("OLLAMA_MODEL", ollamaModel)
    // MCP Tools
    .WithEnvironment("MCP_TOOLS", mcpToolServer.GetEndpoint("http"))
    .WithExternalHttpEndpoints()
    .WaitFor(mcpToolServer);

builder.Build().Run();
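
Because the provider is modeled as an Aspire parameter, switching it does not require touching the service code. A minimal sketch, assuming the usual Aspire convention where parameter values can also come from the AppHost's appsettings.json under a Parameters section (the inline value above then acts as the default):

{
  "Parameters": {
    "agent-provider": "FoundryLocal"
  }
}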

Check the full code at https://github.com/thangchung/agent-engineering-experiment/blob/main/foundry-local-agent-fx/AppHost/AppHost.cs

Create Microsoft Agent Framework's Ollama factory

The factory creates an Ollama chat client, layers OpenTelemetry instrumentation and automatic function invocation on top via ChatClientBuilder, and wraps the result in a ChatClientAgent:

public static ChatClientAgent Create(
    string endpoint,
    string model,
    IList<McpClientTool> mcpTools,
    string? instructions = null,
    string? name = null,
    string? description = null,
    ILoggerFactory? loggerFactory = null,
    bool? enableSensitiveData = null)
{
    var ollamaClient = new OllamaApiClient(new Uri(endpoint), model);

    var tools = mcpTools.Cast<AITool>().ToList();

    // Build the chat client pipeline with OpenTelemetry instrumentation
    IChatClient chatClient = new ChatClientBuilder(ollamaClient)
        .UseOpenTelemetry(
            loggerFactory: loggerFactory,
            sourceName: OtelSourceName,
            configure: otel =>
            {
                // Enable sensitive data capture if explicitly set, otherwise respect env var
                // OTEL_INSTRUMENTATION_GENAI_CAPTURE_MESSAGE_CONTENT=true
                if (enableSensitiveData.HasValue)
                {
                    otel.EnableSensitiveData = enableSensitiveData.Value;
                }
            })
        .UseFunctionInvocation(loggerFactory: loggerFactory)
        .Build();

    // Create the agent with the instrumented chat client
    return new ChatClientAgent(
        chatClient: chatClient,
        instructions: instructions ?? "You are a helpful assistant with access to tools.",
        name: name ?? "OllamaAgent",
        description: description,
        tools: tools);
}
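
Using the factory is then straightforward (a quick sketch; mcpTools is the tool list fetched from the MCP server elsewhere in the sample, and the endpoint and model values are just the defaults from the AppHost):

var agent = OllamaAgentFactory.Create(
    endpoint: "http://localhost:11434/",
    model: "mistral",
    mcpTools: mcpTools);

Note that, unlike the Foundry Local path, UseFunctionInvocation takes care of the entire tool-calling loop automatically here, because Ollama supports native tool calling.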

The full source code can be found at https://github.com/thangchung/agent-engineering-experiment/blob/main/foundry-local-agent-fx/AgentService/Providers/OllamaAgentFactory.cs

Create AgentProviderFactory to choose Ollama or Foundry Local

A simple switch expression picks the right factory for the configured provider:

return providerType switch
{
    // Ollama uses official ChatClientAgent with FunctionInvokingChatClient
    AgentProviderType.Ollama => services.CreateOllamaAgent(
        ollamaEndpoint: config.Endpoint,
        model: config.Model,
        mcpToolsUrl: config.McpToolsUrl,
        mcpTools: mcpTools,
        instructions: instructions,
        name: name ?? "OllamaAgent",
        description: description ?? "AI Agent powered by Ollama"),

    AgentProviderType.FoundryLocal => services.CreateFoundryLocalAgent(
        foundryEndpoint: config.Endpoint,
        model: config.Model,
        mcpToolsUrl: config.McpToolsUrl,
        mcpTools: mcpTools,
        instructions: instructions,
        name: name ?? "FoundryLocalAgent",
        description: description ?? "AI Agent powered by Foundry Local"),

    _ => throw new ArgumentException($"Unknown provider type: {providerType}", nameof(providerType))
};
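
The AgentProviderType used above is just a small enum distinguishing the two providers. A sketch consistent with the switch (see the repo for the actual definition):

public enum AgentProviderType
{
    FoundryLocal,
    Ollama
}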

Full source code at https://github.com/thangchung/agent-engineering-experiment/blob/main/foundry-local-agent-fx/AgentService/Providers/AgentProviderFactory.cs

Wire it up

In Program.cs, we wire it all up as follows:

var agent = AgentProviderFactory.Create(
    services: app.Services,
    providerType: providerType,
    config: providerConfig,
    mcpTools: mcpTools,
    instructions: instructions,
    name: $"{providerType}Agent",
    description: $"AI Agent powered by {providerType} with MCP tools");
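
Here, providerType and providerConfig are derived from the environment variables the AppHost set earlier via WithEnvironment. A minimal sketch, with ProviderConfig as an illustrative record (the repo's actual configuration type may differ):

// Read the values pushed down by the AppHost.
var providerType = Enum.Parse<AgentProviderType>(
    Environment.GetEnvironmentVariable("AGENT_PROVIDER") ?? "Ollama");

var providerConfig = new ProviderConfig(
    Endpoint: Environment.GetEnvironmentVariable("OLLAMA_ENDPOINT") ?? "http://localhost:11434/",
    Model: Environment.GetEnvironmentVariable("OLLAMA_MODEL") ?? "mistral",
    McpToolsUrl: Environment.GetEnvironmentVariable("MCP_TOOLS") ?? string.Empty);

// Illustrative config record; the real type lives in the repo.
public record ProviderConfig(string Endpoint, string Model, string McpToolsUrl);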

The /chat endpoint can then use the agent directly:

app.MapPost("/chat", async (ChatRequest request) =>
{
    try
    {
        logger.LogInformation("Processing chat: '{Message}'", request.Message);

        // Get or create thread
        AgentThread agentThread;
        string threadId;

        if (!string.IsNullOrEmpty(request.ThreadId) && threads.TryGetValue(request.ThreadId, out var existingThread))
        {
            agentThread = existingThread;
            threadId = request.ThreadId;
            logger.LogDebug("Using existing thread: {ThreadId}", threadId);
        }
        else
        {
            agentThread = agent.GetNewThread();
            threadId = GetThreadId(agentThread);
            threads[threadId] = agentThread;
            logger.LogDebug("Created new thread: {ThreadId}", threadId);
        }

        // Run the agent
        var messages = new[] { new ChatMessage(ChatRole.User, request.Message) };
        var response = await agent.RunAsync(messages, agentThread);

        // Get the assistant's response
        var responseText = response.Messages.LastOrDefault()?.Text ?? "No response generated.";

        return Results.Ok(new ChatResponse(responseText, threadId));
    }
    catch (Exception ex)
    {
        logger.LogError(ex, "Error processing chat request");
        return Results.Problem(ex.Message);
    }
});
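
You can then exercise the endpoint with any HTTP client; replace <port> with the port Aspire assigns to agentservice, and pass the returned threadId on subsequent requests to continue the same conversation:

> curl -X POST http://localhost:<port>/chat \
    -H "Content-Type: application/json" \
    -d '{"message": "Hello! Which tools can you use?"}'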

Some results

[Screenshot: Scalar UI]

[Screenshot: chat completion tracing]

[Screenshot: GenAI semantic conventions tracing]

Conclusion

This wraps up what I explored using Ollama with Microsoft Agent Framework. As you can see, the integration is quite straightforward and does not require significant effort to get up and running.

Happy New Year to everyone reading this post. I’m looking forward to seeing Microsoft Agent Framework reach GA very soon and to the innovations the community will build on top of it.

Full source code of this post: https://github.com/thangchung/agent-engineering-experiment/tree/main/foundry-local-agent-fx
