In Part 1, we explored Semantic Kernel's core architecture. Now it's time to dive deep into what makes SK truly powerful: plugins.
Plugins are how you extend your AI's capabilities beyond text generation. They transform your LLM from a conversational interface into an autonomous agent that can query databases, call APIs, manipulate files, and interact with the real world. By the end of this article, you'll understand how to build production-grade plugins using native functions, OpenAPI imports, and the game-changing Model Context Protocol (MCP).
Understanding Plugin Architecture
A plugin in Semantic Kernel is simply a named collection of functions. Each function can be:
- Native: Regular C# code decorated with attributes
- Prompt-based: Templates that invoke the LLM
- External: Imported from OpenAPI specs or MCP servers
The LLM sees all registered functions through their schemas and can decide which ones to call based on the conversation context.
// A plugin is just a class with attributed methods
public class WeatherPlugin
{
    [KernelFunction("get_current_weather")]
    [Description("Gets the current weather for a specified city")]
    public async Task<WeatherData> GetCurrentWeatherAsync(
        [Description("The city name, e.g., 'Seattle, WA'")] string city,
        [Description("Temperature unit: 'celsius' or 'fahrenheit'")] string unit = "fahrenheit")
    {
        // Your implementation here
    }
}
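Once registered, each function's name, description, and parameters are available as metadata, which is the basis of the schema the model receives. A quick way to sanity-check what the model will see (a sketch, assuming a built `kernel` with plugins already registered):

```csharp
foreach (var plugin in kernel.Plugins)
{
    foreach (var function in plugin)
    {
        // Name and description come straight from the attributes
        Console.WriteLine($"{plugin.Name}.{function.Name}: {function.Metadata.Description}");

        foreach (var param in function.Metadata.Parameters)
        {
            var requirement = param.IsRequired ? "required" : "optional";
            Console.WriteLine($"  - {param.Name} ({requirement}): {param.Description}");
        }
    }
}
```

If a function's purpose isn't obvious from this printout alone, the model will struggle with it too.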
Native Functions: The Foundation
Native functions are the most common and flexible plugin type. Let's build a comprehensive example that demonstrates best practices.
Designing Effective Function Signatures
The LLM reads your function signatures to understand what each function does. Good descriptions are essential:
public class OrderManagementPlugin
{
    private readonly IOrderRepository _orderRepository;
    private readonly IPaymentService _paymentService;
    private readonly ILogger<OrderManagementPlugin> _logger;

    public OrderManagementPlugin(
        IOrderRepository orderRepository,
        IPaymentService paymentService,
        ILogger<OrderManagementPlugin> logger)
    {
        _orderRepository = orderRepository;
        _paymentService = paymentService;
        _logger = logger;
    }

    [KernelFunction("get_order_status")]
    [Description("Retrieves the current status and details of an order by its ID. Returns order status, items, shipping info, and estimated delivery date.")]
    [return: Description("Complete order details including status, line items, and tracking information")]
    public async Task<OrderDetails> GetOrderStatusAsync(
        [Description("The unique order identifier (format: ORD-XXXXX)")] string orderId,
        CancellationToken cancellationToken = default)
    {
        _logger.LogInformation("Fetching order {OrderId}", orderId);

        var order = await _orderRepository.GetByIdAsync(orderId, cancellationToken);
        if (order is null)
        {
            throw new OrderNotFoundException(orderId);
        }

        return new OrderDetails
        {
            OrderId = order.Id,
            Status = order.Status.ToString(),
            Items = order.Items.Select(i => new OrderItem(i.Name, i.Quantity, i.Price)).ToList(),
            ShippingAddress = order.ShippingAddress,
            TrackingNumber = order.TrackingNumber,
            EstimatedDelivery = order.EstimatedDeliveryDate
        };
    }

    [KernelFunction("cancel_order")]
    [Description("Cancels an order if it hasn't shipped yet. Initiates refund process automatically.")]
    public async Task<CancellationResult> CancelOrderAsync(
        [Description("The order ID to cancel")] string orderId,
        [Description("Reason for cancellation (e.g., 'changed mind', 'found cheaper', 'ordered wrong item')")] string reason,
        CancellationToken cancellationToken = default)
    {
        _logger.LogInformation("Cancelling order {OrderId}, reason: {Reason}", orderId, reason);

        var order = await _orderRepository.GetByIdAsync(orderId, cancellationToken);
        if (order is null)
            return new CancellationResult(false, "Order not found");

        if (order.Status == OrderStatus.Shipped)
            return new CancellationResult(false, "Cannot cancel - order has already shipped. Please initiate a return instead.");

        if (order.Status == OrderStatus.Delivered)
            return new CancellationResult(false, "Cannot cancel - order was delivered. Please initiate a return instead.");

        await _orderRepository.UpdateStatusAsync(orderId, OrderStatus.Cancelled, cancellationToken);

        // Initiate refund
        var refundResult = await _paymentService.RefundAsync(order.PaymentId, cancellationToken);

        return new CancellationResult(
            true,
            $"Order cancelled successfully. Refund of {refundResult.Amount:C} will be processed within 5-7 business days.");
    }

    [KernelFunction("list_recent_orders")]
    [Description("Lists the customer's recent orders with summary information")]
    public async Task<IReadOnlyList<OrderSummary>> ListRecentOrdersAsync(
        [Description("Customer ID to look up orders for")] string customerId,
        [Description("Maximum number of orders to return (default: 10, max: 50)")] int limit = 10,
        CancellationToken cancellationToken = default)
    {
        limit = Math.Clamp(limit, 1, 50);

        var orders = await _orderRepository.GetRecentByCustomerAsync(customerId, limit, cancellationToken);

        return orders.Select(o => new OrderSummary
        {
            OrderId = o.Id,
            Date = o.CreatedAt,
            Status = o.Status.ToString(),
            Total = o.Total,
            ItemCount = o.Items.Count
        }).ToList();
    }
}
Complex Return Types
Semantic Kernel automatically serializes complex types to JSON for the LLM. Use records for clean, immutable data:
public record OrderDetails
{
    public required string OrderId { get; init; }
    public required string Status { get; init; }
    public required List<OrderItem> Items { get; init; }
    public required string ShippingAddress { get; init; }
    public string? TrackingNumber { get; init; }
    public DateTime? EstimatedDelivery { get; init; }
}

public record OrderItem(string Name, int Quantity, decimal Price);

public record OrderSummary
{
    public required string OrderId { get; init; }
    public required DateTime Date { get; init; }
    public required string Status { get; init; }
    public required decimal Total { get; init; }
    public required int ItemCount { get; init; }
}

public record CancellationResult(bool Success, string Message);
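Because these are plain records, you can see exactly what the model will receive by serializing one yourself. System.Text.Json (SK's default serializer) renders the positional `CancellationResult` record above like this:

```csharp
using System.Text.Json;

// What the model receives when cancel_order returns (a sketch,
// using the CancellationResult record defined above)
var result = new CancellationResult(true, "Refund initiated");
Console.WriteLine(JsonSerializer.Serialize(result));
// {"Success":true,"Message":"Refund initiated"}
```

Property names land in the JSON verbatim, so name them the way you'd want the model to read them.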
Error Handling in Plugins
The LLM needs to understand errors to respond appropriately. Throw descriptive exceptions or return error results:
[KernelFunction("process_refund")]
[Description("Processes a refund for a returned item")]
public async Task<RefundResult> ProcessRefundAsync(
    [Description("Order ID")] string orderId,
    [Description("Item ID to refund")] string itemId,
    [Description("Refund reason")] string reason)
{
    try
    {
        var order = await _orderRepository.GetByIdAsync(orderId);
        if (order is null)
            return new RefundResult(false, "Order not found", null);

        var item = order.Items.FirstOrDefault(i => i.Id == itemId);
        if (item is null)
            return new RefundResult(false, $"Item {itemId} not found in order {orderId}", null);

        if (!item.IsRefundable)
            return new RefundResult(false, "This item is not eligible for refund (final sale)", null);

        var refund = await _paymentService.RefundItemAsync(order.PaymentId, item.Price);
        return new RefundResult(true, "Refund processed successfully", refund.TransactionId);
    }
    catch (PaymentServiceException ex)
    {
        _logger.LogError(ex, "Payment service error during refund");
        return new RefundResult(false, "Payment processing temporarily unavailable. Please try again later.", null);
    }
}
Automatic Function Calling
Once plugins are registered, you can enable automatic function calling—the LLM decides when to use them:
// Register the plugin
kernel.Plugins.AddFromObject(
    new OrderManagementPlugin(orderRepo, paymentService, logger),
    "Orders");

// Enable auto function calling
var settings = new AzureOpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
};

// The LLM will automatically call functions as needed
var response = await kernel.InvokePromptAsync(
    "What's the status of my order ORD-12345, and can you cancel it?",
    new KernelArguments(settings));

Console.WriteLine(response);
// Output: "Your order ORD-12345 is currently 'Processing' with 3 items totaling $127.50...
// I've cancelled the order. Your refund of $127.50 will be processed within 5-7 business days."
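The single-shot `InvokePromptAsync` call works well for one-off requests. For a multi-turn chat, the same settings can be passed to the chat completion service; handing it the kernel is what enables automatic invocation of your plugin functions. A sketch, assuming a chat connector is registered on the kernel:

```csharp
var chatService = kernel.GetRequiredService<IChatCompletionService>();

var history = new ChatHistory("You are a helpful order-support assistant.");
history.AddUserMessage("What's the status of my order ORD-12345?");

// Passing the kernel lets the service auto-invoke registered functions
var reply = await chatService.GetChatMessageContentAsync(history, settings, kernel);
history.Add(reply); // keep the assistant turn for subsequent rounds

Console.WriteLine(reply.Content);
```

The function-call and function-result messages are appended to the history automatically, so later turns can refer back to earlier tool output.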
Controlling Function Selection
You can fine-tune which functions the LLM can access:
// Allow specific functions only
var settings = new AzureOpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(
        functions: new[]
        {
            kernel.Plugins["Orders"]["get_order_status"],
            kernel.Plugins["Orders"]["list_recent_orders"]
            // Note: cancel_order is NOT included - read-only mode
        })
};

// Require a specific function to be called
var settings = new AzureOpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.Required(
        functions: new[] { kernel.Plugins["Orders"]["get_order_status"] })
};

// No function calling - just text generation
var settings = new AzureOpenAIPromptExecutionSettings
{
    FunctionChoiceBehavior = FunctionChoiceBehavior.None()
};
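If you need runtime visibility or guardrails around these calls, such as logging every invocation or vetoing a specific function, SK's auto function invocation filters sit in the middle of the call pipeline. A minimal sketch (the filter class name is my own):

```csharp
public sealed class FunctionCallAuditFilter : IAutoFunctionInvocationFilter
{
    public async Task OnAutoFunctionInvocationAsync(
        AutoFunctionInvocationContext context,
        Func<AutoFunctionInvocationContext, Task> next)
    {
        // Runs before every automatic function call the model makes
        Console.WriteLine(
            $"Model invoking {context.Function.PluginName}.{context.Function.Name}");

        await next(context); // proceed with the actual function call
    }
}

// Register before building the kernel:
builder.Services.AddSingleton<IAutoFunctionInvocationFilter, FunctionCallAuditFilter>();
```

Setting `context.Terminate = true` inside the filter halts further automatic invocations, which is one way to cap runaway tool loops.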
OpenAPI Plugin Import
Have an existing REST API with an OpenAPI spec? Import it directly as a plugin:
// From a URL
var weatherPlugin = await kernel.ImportPluginFromOpenApiAsync(
    "Weather",
    new Uri("https://api.weather.service/openapi.json"),
    new OpenApiFunctionExecutionParameters
    {
        HttpClient = httpClient,
        AuthCallback = async (request, cancellationToken) =>
        {
            var token = await tokenProvider.GetTokenAsync(cancellationToken);
            request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", token);
        },
        EnablePayloadNamespacing = true
    });

// From a local file
var inventoryPlugin = await kernel.ImportPluginFromOpenApiAsync(
    "Inventory",
    "./openapi/inventory-api.yaml",
    new OpenApiFunctionExecutionParameters
    {
        HttpClient = httpClient,
        ServerUrlOverride = new Uri("https://api.internal.company.com"),
        IgnoreNonCompliantErrors = true
    });
Handling OpenAPI Authentication
Different APIs require different auth patterns:
// API Key in header
var parameters = new OpenApiFunctionExecutionParameters
{
    HttpClient = httpClient,
    AuthCallback = async (request, ct) =>
    {
        request.Headers.Add("X-API-Key", Environment.GetEnvironmentVariable("API_KEY"));
    }
};

// OAuth2 Bearer token
var parameters = new OpenApiFunctionExecutionParameters
{
    HttpClient = httpClient,
    AuthCallback = async (request, ct) =>
    {
        var token = await tokenCredential.GetTokenAsync(
            new TokenRequestContext(new[] { "api://my-api/.default" }), ct);
        request.Headers.Authorization = new AuthenticationHeaderValue("Bearer", token.Token);
    }
};

// Basic auth
var parameters = new OpenApiFunctionExecutionParameters
{
    HttpClient = httpClient,
    AuthCallback = async (request, ct) =>
    {
        var credentials = Convert.ToBase64String(Encoding.UTF8.GetBytes("user:password"));
        request.Headers.Authorization = new AuthenticationHeaderValue("Basic", credentials);
    }
};
Model Context Protocol (MCP) Integration
MCP is the 2025 game-changer. It's an open protocol that standardizes how LLMs connect to external tools and data sources. Instead of building custom integrations, you connect to MCP servers that expose capabilities in a standard way.
Understanding MCP
MCP servers expose three types of primitives:
- Tools: Functions the LLM can call (like native plugins)
- Resources: Data the LLM can read (files, databases)
- Prompts: Pre-built prompt templates
The protocol supports multiple transports:
- stdio: The server runs as a subprocess
- SSE (Server-Sent Events): HTTP-based streaming
- HTTP: Traditional request/response
Connecting to MCP Servers
using Microsoft.SemanticKernel.Connectors.MCP;

// Connect to an MCP server via stdio (subprocess)
var mcpClient = await McpClient.CreateAsync(
    new StdioClientTransport(
        command: "npx",
        arguments: new[] { "-y", "@modelcontextprotocol/server-filesystem", "/data" }),
    clientInfo: new ClientInfo { Name = "MyApp", Version = "1.0.0" });

// Import all tools from the MCP server as SK functions
kernel.Plugins.AddFromMcp(mcpClient, pluginName: "FileSystem");

// Now you can use file operations in your prompts
var response = await kernel.InvokePromptAsync(
    "List all markdown files in the docs folder and summarize the first one",
    new KernelArguments(autoFunctionSettings));
Popular MCP Servers
The MCP ecosystem is growing rapidly. Here are some useful servers:
// Filesystem operations
var fsClient = await McpClient.CreateAsync(
    new StdioClientTransport("npx", new[] { "-y", "@modelcontextprotocol/server-filesystem", "/allowed/path" }));

// GitHub integration
var githubClient = await McpClient.CreateAsync(
    new StdioClientTransport("npx", new[] { "-y", "@modelcontextprotocol/server-github" }),
    environment: new Dictionary<string, string> { ["GITHUB_TOKEN"] = token });

// PostgreSQL database
var pgClient = await McpClient.CreateAsync(
    new StdioClientTransport("npx", new[] { "-y", "@modelcontextprotocol/server-postgres" }),
    environment: new Dictionary<string, string> { ["DATABASE_URL"] = connectionString });

// Web browsing
var browserClient = await McpClient.CreateAsync(
    new StdioClientTransport("npx", new[] { "-y", "@modelcontextprotocol/server-puppeteer" }));
Building Your Own MCP Server in C#
You can create MCP servers that expose your business logic to any MCP-compatible client:
using ModelContextProtocol.Server;

var server = new McpServerBuilder()
    .WithServerInfo("InventoryServer", "1.0.0")
    .WithTool(
        name: "check_stock",
        description: "Check current stock level for a product",
        inputSchema: new
        {
            type = "object",
            properties = new
            {
                sku = new { type = "string", description = "Product SKU" }
            },
            required = new[] { "sku" }
        },
        handler: async (args, ct) =>
        {
            var sku = args["sku"]?.ToString();
            var stock = await inventoryService.GetStockAsync(sku!, ct);
            return new ToolResult($"Current stock for {sku}: {stock.Quantity} units");
        })
    .WithTool(
        name: "reserve_stock",
        description: "Reserve stock for an order",
        inputSchema: new
        {
            type = "object",
            properties = new
            {
                sku = new { type = "string", description = "Product SKU" },
                quantity = new { type = "integer", description = "Quantity to reserve" }
            },
            required = new[] { "sku", "quantity" }
        },
        handler: async (args, ct) =>
        {
            var sku = args["sku"]!.ToString();
            var qty = (int)args["quantity"]!;
            var reservation = await inventoryService.ReserveAsync(sku, qty, ct);
            return new ToolResult($"Reserved {qty} units. Reservation ID: {reservation.Id}");
        })
    .Build();

// Run as stdio server
await server.RunAsync(Console.OpenStandardInput(), Console.OpenStandardOutput());
SSE Transport for Web Services
For HTTP-based MCP servers:
// Connect to an SSE-based MCP server
var mcpClient = await McpClient.CreateAsync(
    new SseClientTransport(new Uri("https://mcp.yourservice.com/sse")),
    clientInfo: new ClientInfo { Name = "MyApp", Version = "1.0.0" });

kernel.Plugins.AddFromMcp(mcpClient, "YourService");
Plugin Testing Strategies
Plugins are just classes—test them like any other code:
public class OrderManagementPluginTests
{
    [Fact]
    public async Task GetOrderStatus_ReturnsDetails_WhenOrderExists()
    {
        // Arrange
        var mockRepo = new Mock<IOrderRepository>();
        mockRepo.Setup(r => r.GetByIdAsync("ORD-123", It.IsAny<CancellationToken>()))
            .ReturnsAsync(new Order
            {
                Id = "ORD-123",
                Status = OrderStatus.Processing,
                Items = new List<OrderItem> { new("Widget", 2, 29.99m) }
            });

        var plugin = new OrderManagementPlugin(
            mockRepo.Object,
            Mock.Of<IPaymentService>(),
            Mock.Of<ILogger<OrderManagementPlugin>>());

        // Act
        var result = await plugin.GetOrderStatusAsync("ORD-123");

        // Assert
        Assert.Equal("ORD-123", result.OrderId);
        Assert.Equal("Processing", result.Status);
        Assert.Single(result.Items);
    }

    [Fact]
    public async Task CancelOrder_ReturnsFalse_WhenAlreadyShipped()
    {
        // Arrange
        var mockRepo = new Mock<IOrderRepository>();
        mockRepo.Setup(r => r.GetByIdAsync("ORD-456", It.IsAny<CancellationToken>()))
            .ReturnsAsync(new Order { Id = "ORD-456", Status = OrderStatus.Shipped });

        var plugin = new OrderManagementPlugin(
            mockRepo.Object,
            Mock.Of<IPaymentService>(),
            Mock.Of<ILogger<OrderManagementPlugin>>());

        // Act
        var result = await plugin.CancelOrderAsync("ORD-456", "Changed mind");

        // Assert
        Assert.False(result.Success);
        Assert.Contains("already shipped", result.Message);
    }
}
Integration Testing with the Kernel
Test how plugins behave when invoked through the kernel:
[Fact]
public async Task Kernel_InvokesPlugin_WithCorrectArguments()
{
    // Arrange
    var kernel = Kernel.CreateBuilder().Build();
    var mockPlugin = new Mock<IOrderPlugin>();
    mockPlugin
        .Setup(p => p.GetOrderStatusAsync(It.IsAny<string>(), It.IsAny<CancellationToken>()))
        .ReturnsAsync(new OrderDetails
        {
            OrderId = "ORD-789",
            Status = "Delivered",
            Items = new List<OrderItem>(),
            ShippingAddress = "123 Main St"
        });

    kernel.Plugins.AddFromObject(mockPlugin.Object, "Orders");

    // Act
    var function = kernel.Plugins["Orders"]["GetOrderStatusAsync"];
    var result = await kernel.InvokeAsync(function, new KernelArguments
    {
        ["orderId"] = "ORD-789"
    });

    // Assert
    var details = result.GetValue<OrderDetails>();
    Assert.Equal("Delivered", details!.Status);
    mockPlugin.Verify(p => p.GetOrderStatusAsync("ORD-789", It.IsAny<CancellationToken>()), Times.Once);
}
Plugin Organization Best Practices
As your application grows, organize plugins thoughtfully:
Plugins/
├── Core/
│   ├── TextAnalysisPlugin.cs
│   └── DateTimePlugin.cs
├── Business/
│   ├── OrderManagementPlugin.cs
│   ├── CustomerPlugin.cs
│   └── InventoryPlugin.cs
├── External/
│   ├── WeatherPlugin.cs
│   └── ShippingPlugin.cs
└── Infrastructure/
    ├── EmailPlugin.cs
    └── NotificationPlugin.cs
Register them in a structured way:
public static class PluginRegistration
{
    public static IKernelBuilder AddBusinessPlugins(this IKernelBuilder builder)
    {
        // Constructor dependencies are resolved from the builder's DI container at build time
        builder.Plugins.AddFromType<OrderManagementPlugin>();
        builder.Plugins.AddFromType<CustomerPlugin>();
        builder.Plugins.AddFromType<InventoryPlugin>();
        return builder;
    }

    public static async Task<Kernel> AddMcpPluginsAsync(
        this Kernel kernel,
        McpConfiguration config)
    {
        if (config.EnableFilesystem)
        {
            var fsClient = await McpClient.CreateAsync(
                new StdioClientTransport("npx", config.FilesystemArgs));
            kernel.Plugins.AddFromMcp(fsClient, "Filesystem");
        }

        if (config.EnableGitHub)
        {
            var ghClient = await McpClient.CreateAsync(
                new StdioClientTransport("npx", config.GitHubArgs));
            kernel.Plugins.AddFromMcp(ghClient, "GitHub");
        }

        return kernel;
    }
}
What's Next
In this article, we explored the full spectrum of Semantic Kernel plugins:
- Native Functions: C# methods with rich descriptions that become LLM tools
- OpenAPI Import: Instantly expose existing REST APIs to your AI
- MCP Integration: Connect to the growing ecosystem of standardized AI tools
- Testing: Treat plugins as first-class citizens in your test suite
In Part 3, we'll dive into memory and vector stores—how to give your AI persistent knowledge through embeddings, semantic search, and multiple storage backends.
This is Part 2 of a 5-part series on Semantic Kernel. Next up: Memory and Vector Stores
Top comments (1)
Great deep dive! One thing I'm curious about — when you're building production-grade plugins, how do you handle error cases when the LLM calls the wrong plugin or passes unexpected parameters? Is that something SK manages internally or do we need to build that safety net ourselves?