Semantic Kernel (SK) is a lightweight, open-source SDK from Microsoft for building AI agents and integrating language models into enterprise applications. Available for C#, Python, and Java, it serves as the AI orchestration layer within the broader Microsoft Agent Framework. The Agent Framework reached general availability (GA) as version 1.0 by the end of Q1 2025.1)
Semantic Kernel is used by Microsoft and Fortune 500 companies for building production-grade AI applications with modular, secure, and observable architectures.2)
The Kernel is the central dependency injection container that manages all services, plugins, and components. It selects optimal AI services, builds prompts from templates, invokes models, and parses responses, all configurable from a single place.
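The container role described above can be illustrated with a small stdlib-only sketch. This is not the actual Semantic Kernel API; all class and method names here are hypothetical, chosen only to show the register-then-invoke pattern:

```python
# Conceptual sketch of a kernel acting as a service/plugin registry.
# NOT the Semantic Kernel API; every name here is illustrative.

class ToyKernel:
    def __init__(self):
        self._services = {}   # service_id -> service object
        self._plugins = {}    # plugin_name -> {function_name: callable}

    def add_service(self, service_id, service):
        self._services[service_id] = service

    def add_plugin(self, plugin, plugin_name):
        # Register every public method of the plugin as an invokable function.
        self._plugins[plugin_name] = {
            name: getattr(plugin, name)
            for name in dir(plugin)
            if not name.startswith("_") and callable(getattr(plugin, name))
        }

    def invoke(self, plugin_name, function_name, **kwargs):
        # Look up the named function and call it with the given arguments.
        return self._plugins[plugin_name][function_name](**kwargs)


class TextPlugin:
    def shout(self, text: str) -> str:
        return text.upper()


kernel = ToyKernel()
kernel.add_plugin(TextPlugin(), plugin_name="Text")
print(kernel.invoke("Text", "shout", text="hello"))  # HELLO
```

The real Kernel does far more (prompt templating, model selection, response parsing), but the single-registry shape is the core idea: application code asks the kernel, never a concrete service, for work.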
Plugins extend LLM capabilities with business logic, productivity workflows, and external function calls. Plugins are modular and reusable across different AI applications.
Planners enable the Kernel to automatically decompose complex goals into sequences of plugin calls. The Agent Framework extends this with event-driven task management and human approval workflows.
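The decomposition step a planner performs can be sketched in plain Python. This is a toy illustration only; a real Semantic Kernel planner prompts the model with the goal and the available function descriptions, and the model produces the plan:

```python
# Toy planner: maps a goal to an ordered list of plugin calls, then
# executes them, threading each result into the next step.
# Illustrative only; real planners have the LLM generate this plan.

def plan(goal):
    # Hard-coded plan for one goal; a real planner would derive this
    # from the goal text and the registered function metadata.
    if goal == "add then double":
        return [("Math", "add", {"a": 15.0, "b": 27.0}),
                ("Math", "multiply_by", {"factor": 2.0})]
    raise ValueError(f"no plan for goal: {goal!r}")


class MathPlugin:
    def add(self, a, b):
        return a + b

    def multiply_by(self, value, factor):
        return value * factor


plugins = {"Math": MathPlugin()}

result = None
for plugin_name, func_name, args in plan("add then double"):
    func = getattr(plugins[plugin_name], func_name)
    if result is not None:
        # Feed the previous step's output into the next call.
        args = {"value": result, **args}
    result = func(**args)

print(result)  # 84.0
```

The Agent Framework's event-driven extensions add checkpoints between such steps, which is where human approval workflows can be inserted.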
Connectors integrate with various AI models (OpenAI, Azure OpenAI, DeepSeek) and external services. Expanded connector support in 2025 includes the OpenAI Realtime Audio API.
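The connector idea (one chat interface, many swappable model back ends) can be sketched with a `typing.Protocol`. This is illustrative and not the real SK connector classes:

```python
# Conceptual connector sketch: application code depends on a common
# interface, so the model provider can be swapped without touching it.
# Illustrative only; not the actual Semantic Kernel connector classes.
from typing import Protocol


class ChatCompletionConnector(Protocol):
    def complete(self, prompt: str) -> str: ...


class EchoConnector:
    """Stand-in for a real connector (e.g. OpenAI or Azure OpenAI)."""
    def complete(self, prompt: str) -> str:
        return f"echo: {prompt}"


def run_chat(connector: ChatCompletionConnector, prompt: str) -> str:
    # Only the interface is used here, never a concrete provider.
    return connector.complete(prompt)


print(run_chat(EchoConnector(), "hi"))  # echo: hi
```

Swapping `EchoConnector` for a real provider changes one constructor call, which is the property the connector layer exists to guarantee.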
The Kernel acts as a central orchestrator within a layered design.
This agent-centric design extends RAG workflows to multi-agent systems using unified declarative formats.
Semantic Kernel provides full 1.0+ support across three languages: C#, Python, and Java.
```python
import asyncio

import semantic_kernel as sk
from semantic_kernel.connectors.ai.open_ai import OpenAIChatCompletion
from semantic_kernel.functions import kernel_function


# Define a plugin with kernel functions
class MathPlugin:
    @kernel_function(description="Add two numbers")
    def add(self, a: float, b: float) -> float:
        return a + b

    @kernel_function(description="Multiply two numbers")
    def multiply(self, a: float, b: float) -> float:
        return a * b


async def main():
    # Create the kernel
    kernel = sk.Kernel()

    # Add an AI service
    kernel.add_service(
        OpenAIChatCompletion(service_id="chat", ai_model_id="gpt-4o")
    )

    # Import the plugin
    kernel.add_plugin(MathPlugin(), plugin_name="Math")

    # Invoke a function
    result = await kernel.invoke(
        kernel.plugins["Math"]["add"], a=15.0, b=27.0
    )
    print(result)  # 42.0


asyncio.run(main())
```
Semantic Kernel and AutoGen/AG2 converge through three integration paths defined in the 2025 roadmap.
This unified approach supports deployment through Azure AI Foundry Agent Service, with paths from local prototyping to cloud-scale production.