What is Semantic Kernel?
Semantic Kernel is Microsoft's open-source SDK for building AI agents and integrating LLMs into applications. It's the same framework powering Microsoft 365 Copilot, and it supports C#, Python, and Java.
Why Semantic Kernel Over LangChain?
While LangChain dominates the Python ecosystem, Semantic Kernel offers several distinct advantages:
- Enterprise-grade — battle-tested in Microsoft 365 Copilot at massive scale
- Multi-language — native C#, Python, Java support (not just Python wrappers)
- Plugin architecture — compose AI capabilities like LEGO blocks
- Built-in planning — AI automatically chains plugins to achieve goals
- OpenAI + Azure + local models — swap providers without code changes
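The provider-swap point deserves a concrete illustration. Semantic Kernel keys services by a `service_id`, so application code asks the kernel for `"chat"` rather than naming a vendor. The toy sketch below mimics that registry idea in plain Python — the `OpenAIChat` and `AzureOpenAIChat` classes here are hypothetical stand-ins, not real SK types:

```python
# Toy illustration of Semantic Kernel's service-registry idea:
# application code depends on a service_id, not on a vendor class.
# (These classes are hypothetical stand-ins, not real SK types.)

class OpenAIChat:
    def complete(self, prompt: str) -> str:
        return f"[openai] {prompt}"

class AzureOpenAIChat:
    def complete(self, prompt: str) -> str:
        return f"[azure] {prompt}"

class ToyKernel:
    def __init__(self):
        self._services = {}

    def add_service(self, service_id: str, service):
        self._services[service_id] = service

    def get_service(self, service_id: str):
        return self._services[service_id]

# Swapping providers is one registration line; callers never change.
kernel = ToyKernel()
kernel.add_service("chat", OpenAIChat())  # or AzureOpenAIChat()
reply = kernel.get_service("chat").complete("hello")
print(reply)
```

Real SK works the same way: registering `AzureChatCompletion` instead of `OpenAIChatCompletion` under the same `service_id` leaves the rest of the code untouched.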
Quick Start (Python)
```shell
pip install semantic-kernel
```

```python
import asyncio

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion

kernel = sk.Kernel()
kernel.add_service(OpenAIChatCompletion(
    service_id="chat",
    ai_model_id="gpt-4",
    api_key="your-api-key",
))

# Create a semantic function from a prompt template
prompt = """Summarize the following text in 3 bullet points:
{{$input}}"""

summarize = kernel.add_function(
    function_name="summarize",
    plugin_name="TextPlugin",
    prompt=prompt,
)

# kernel.invoke is async, so run it inside an event loop
async def main():
    result = await kernel.invoke(summarize, input="Your long text here...")
    print(result)

asyncio.run(main())
```
Build a Plugin System
```python
from semantic_kernel.functions import kernel_function

class WebSearchPlugin:
    @kernel_function(description="Search the web for information")
    def search(self, query: str) -> str:
        # Your search implementation
        return f"Results for: {query}"

class MathPlugin:
    @kernel_function(description="Calculate mathematical expressions")
    def calculate(self, expression: str) -> str:
        # simplified — eval() runs arbitrary code, never use it on untrusted input
        return str(eval(expression))

# Register plugins with the kernel under short names
kernel.add_plugin(WebSearchPlugin(), "web")
kernel.add_plugin(MathPlugin(), "math")
```
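The `eval` call is the weak point of that `MathPlugin`: it executes arbitrary Python. If you adapt it for real user input, a restricted evaluator is safer. Here is one minimal sketch using the standard-library `ast` module, allowing only numbers and basic arithmetic operators:

```python
import ast
import operator

# Minimal safe arithmetic evaluator: walks the AST and permits only
# numeric constants and basic operators, unlike eval(), which will
# happily execute arbitrary code.
_OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def safe_calculate(expression: str) -> float:
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in _OPS:
            return _OPS[type(node.op)](_eval(node.operand))
        raise ValueError("disallowed expression")
    return _eval(ast.parse(expression, mode="eval"))

print(safe_calculate("2 + 3 * 4"))  # 14
```

Dropping `safe_calculate` into `MathPlugin.calculate` keeps the plugin's signature identical while closing the injection hole.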
Auto-Planning: Let AI Chain Plugins
```python
from semantic_kernel.planners import SequentialPlanner

planner = SequentialPlanner(kernel)

# AI automatically figures out which plugins to call
plan = await planner.create_plan(
    "Find the current population of Tokyo and calculate what percentage "
    "it is of Japan's total population"
)
result = await plan.invoke(kernel)
print(result)  # AI chains web.search -> math.calculate
```

(Note: newer Semantic Kernel releases steer toward automatic function calling rather than the planner classes, but the goal-to-plan idea is the same.)
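To make the chaining concrete, here is roughly the plan the planner produces for that goal, written out by hand. `fake_search` is a hypothetical stub standing in for `web.search`, and the population figures are placeholders, not verified data:

```python
# Hand-written version of the two-step plan: step 1 calls the search
# plugin, step 2 feeds its outputs to the math plugin.
# fake_search is a hypothetical stub; the numbers are placeholders.

def fake_search(query: str) -> str:
    data = {
        "population of Tokyo": "14000000",
        "population of Japan": "125000000",
    }
    return data[query]

tokyo = int(fake_search("population of Tokyo"))
japan = int(fake_search("population of Japan"))
percentage = round(tokyo / japan * 100, 1)
print(f"Tokyo is {percentage}% of Japan's population")
```

The planner's value is discovering this sequence itself from the plugin descriptions, instead of you wiring the steps together.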
Memory and RAG
```python
from semantic_kernel.connectors.memory.azure_cognitive_search import (
    AzureCognitiveSearchMemoryStore,
)

memory = AzureCognitiveSearchMemoryStore(
    vector_size=1536,
    search_endpoint="https://your-search.search.windows.net",
    admin_key="your-key",
)
# Register the store (plus an embedding service) with the kernel before
# calling kernel.memory — the exact wiring depends on your SK version

# Save memories
await kernel.memory.save_information(
    collection="docs",
    text="Semantic Kernel supports multiple AI providers",
    id="doc1",
)

# Recall relevant memories
results = await kernel.memory.search("docs", "What AI providers are supported?")
```
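Under the hood, `save_information` embeds the text and `search` ranks stored entries by vector similarity. A toy sketch of that retrieval step, with hand-made vectors standing in for real embeddings:

```python
import math

# Toy cosine-similarity recall: a real memory store embeds text with an
# embedding model; the vectors here are hand-made stand-ins.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

store = {
    "doc1": ([0.9, 0.1, 0.0], "Semantic Kernel supports multiple AI providers"),
    "doc2": ([0.0, 0.2, 0.9], "Tokyo is the capital of Japan"),
}

def recall(query_vector, top_k=1):
    ranked = sorted(
        store.items(),
        key=lambda item: cosine(query_vector, item[1][0]),
        reverse=True,
    )
    return [text for _, (_, text) in ranked[:top_k]]

# A query vector close to doc1's embedding recalls doc1 first.
print(recall([1.0, 0.0, 0.0]))
```

Swap in real embeddings and a persistent index and this is the retrieval half of RAG; the other half is stuffing the recalled text into the prompt.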
Semantic Kernel vs LangChain vs CrewAI
| Feature | Semantic Kernel | LangChain | CrewAI |
|---|---|---|---|
| Languages | C#, Python, Java | Python, JS | Python |
| Enterprise backing | Microsoft | LangChain Inc | Community |
| Plugin system | Native | Tools/Chains | Tools |
| Auto-planning | Built-in | LangGraph | Built-in |
| Azure integration | Native | Adapter | Manual |
| Production scale | Microsoft 365 | Various | Early |
Real-World Use Case
An enterprise client needed to build an internal knowledge assistant. They tried LangChain first — it worked for prototyping but broke at scale. Switching to Semantic Kernel with Azure OpenAI gave them:
- native .NET integration with their existing stack
- built-in retry/throttling for Azure
- a planner that automatically composed 12 plugins into complex workflows
Building AI agents for your team? I specialize in production AI systems with verified data sources. Contact spinov001@gmail.com or explore my automation tools on Apify.