Kamil Riyas
Local AI WebAPI with Semantic Kernel and Ollama

Impatient? GitHub

In my last post we saw how to get started with a local SLM using Ollama and Semantic Kernel, where we called the Llama 3.2 model from a console application.

In this write-up we'll see how to integrate Semantic Kernel with an ASP.NET Core Web API.

Please note that this is just a bare-bones demo, not the standard way to use Semantic Kernel with a Web API. I'm planning to showcase that in a future post.

Make sure that you have the following available locally:

  • .NET 8.0 or above
  • a local Ollama instance with an SLM such as llama3.2 already pulled.
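If you want to sanity-check the second prerequisite, you can hit Ollama's REST API on its default port (11434; adjust if you've changed it):

```shell
# Lists the models pulled locally; llama3.2 should appear in the output
curl http://localhost:11434/api/tags

# Pull the model first if it's missing
ollama pull llama3.2
```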

Project Setup

We'll initialize a bare-bones ASP.NET Core Web API application and install the packages below as well.

dotnet new webapi -n sk-webapi -o sk-webapi
cd sk-webapi\
dotnet add package Microsoft.SemanticKernel
dotnet add package Microsoft.SemanticKernel.Connectors.Ollama --prerelease

*At the time of this writing, Semantic Kernel's Ollama connector is still in preview, which is why the `--prerelease` flag is needed. Once the connector reaches general availability, you can drop the flag.

Coding Time

In your Program.cs file:

  • Just like any other Web API app, create the WebApplication builder.
  • Create an HttpClient object pointing at the local Ollama instance's URI.
  • Register the chat completion service on the service collection with AddOllamaChatCompletion("llama3.2", httpClient).
var builder = WebApplication.CreateBuilder(args);

// Add services to the container.
builder.Services.AddControllers();

var httpClient = new HttpClient() { 
        BaseAddress = new Uri("http://localhost:11434")
};

#pragma warning disable SKEXP0070 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.
builder.Services.AddOllamaChatCompletion("llama3.2", httpClient);
#pragma warning restore SKEXP0070 // Type is for evaluation purposes only and is subject to change or removal in future updates. Suppress this diagnostic to proceed.        

var app = builder.Build();

app.MapControllers();

app.Run();

Endpoint Setup

Create a controller and, like any other injected service, consume the chat completion service through IChatCompletionService.

using Microsoft.AspNetCore.Mvc;
using Microsoft.SemanticKernel.ChatCompletion;

[ApiController]
[Route("[controller]")]
public class ChatController : ControllerBase
{
    private readonly IChatCompletionService _chatCompletionService;

    public ChatController(IChatCompletionService chatCompletionService)
    {
        _chatCompletionService = chatCompletionService;
    }

    [HttpGet]
    public async Task<string?> GetChatResponseAsync(string input)
    {
        if (string.IsNullOrWhiteSpace(input))
        {
            return null;
        }

        var chatResult = await _chatCompletionService.GetChatMessageContentsAsync(input);
        return chatResult[0].ToString();
    }
}
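With the app running (`dotnet run`), you can exercise the endpoint with curl. The port below is an assumption; substitute whatever Kestrel prints at startup, and note the /Chat path assumes the controller is routed by its name:

```shell
# Query the local model through the new endpoint (URL-encode the input)
curl "http://localhost:5000/Chat?input=Why%20is%20the%20sky%20blue%3F"
```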

That's it. In my next post I'll be implementing a sample showcasing Function Calling.
