
microsoft/semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps


Top Related Projects

  • openai/openai-cookbook: Examples and guides for using the OpenAI API
  • langchain-ai/langchain: 🦜🔗 Build context-aware reasoning applications
  • microsoft/promptflow: Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
  • huggingface/transformers: 🤗 State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
  • deepset-ai/haystack: 🔍 AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.

Quick Overview

Semantic Kernel is an open-source SDK developed by Microsoft that integrates Large Language Models (LLMs) into applications. It provides a lightweight framework for orchestrating AI plugins, combining semantic and native functions, and enabling developers to create AI-powered experiences.

Pros

  • Seamless integration of LLMs into applications
  • Supports multiple AI services and models (e.g., OpenAI, Azure OpenAI)
  • Extensible plugin architecture for custom AI capabilities
  • Cross-platform compatibility (C#, Python, Java)

Cons

  • Relatively new project, still in active development
  • Limited documentation and examples compared to more established frameworks
  • Potential learning curve for developers new to AI integration
  • Dependency on external AI services may introduce latency or cost concerns

Code Examples

  1. Creating a kernel and running a semantic function:
using Microsoft.SemanticKernel;

var kernel = Kernel.Builder.Build();
kernel.Config.AddOpenAITextCompletionService("text-davinci-003", "YOUR_API_KEY");

// Wrap the question in a pass-through semantic function so the kernel sends it to the model
var ask = kernel.CreateSemanticFunction("{{$input}}");
var result = await kernel.RunAsync("What's the capital of France?", ask);
Console.WriteLine(result);
  2. Using a pre-defined skill:
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Skills.Core;

var kernel = Kernel.Builder.Build();
kernel.Config.AddOpenAITextCompletionService("text-davinci-003", "YOUR_API_KEY");

var timeSkill = kernel.ImportSkill(new TimeSkill());
var result = await kernel.RunAsync("What time is it?", timeSkill["Now"]);
Console.WriteLine(result);
  3. Creating a custom semantic function:
using Microsoft.SemanticKernel;

var kernel = Kernel.Builder.Build();
kernel.Config.AddOpenAITextCompletionService("text-davinci-003", "YOUR_API_KEY");

string skPrompt = @"
Generate a short poem about {{$input}}.
Be creative and use metaphors.
";

var poetryFunction = kernel.CreateSemanticFunction(skPrompt);
var result = await kernel.RunAsync("artificial intelligence", poetryFunction);
Console.WriteLine(result);
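
  4. Chaining functions into a pipeline (a sketch, not taken from the upstream docs; it assumes the same pre-1.0 SDK as the examples above, where RunAsync accepts a sequence of functions):

using Microsoft.SemanticKernel;

var kernel = Kernel.Builder.Build();
kernel.Config.AddOpenAITextCompletionService("text-davinci-003", "YOUR_API_KEY");

// The output of each function becomes the {{$input}} of the next one in the pipeline.
var summarize = kernel.CreateSemanticFunction("Summarize the following text in one sentence: {{$input}}");
var translate = kernel.CreateSemanticFunction("Translate the following text to French: {{$input}}");

var result = await kernel.RunAsync(
    "Semantic Kernel is an SDK that integrates LLMs with conventional programming languages.",
    summarize,
    translate);
Console.WriteLine(result);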

Getting Started

  1. Install the NuGet package:

    dotnet add package Microsoft.SemanticKernel
    
  2. Create a new kernel and configure an AI service:

    using Microsoft.SemanticKernel;
    
    var kernel = Kernel.Builder.Build();
    kernel.Config.AddOpenAITextCompletionService("text-davinci-003", "YOUR_API_KEY");
    
  3. Run a semantic function:

    var joke = kernel.CreateSemanticFunction("{{$input}}");
    var result = await kernel.RunAsync("Tell me a joke about programming.", joke);
    Console.WriteLine(result);
    

Competitor Comparisons

openai/openai-cookbook: Examples and guides for using the OpenAI API

Pros of openai-cookbook

  • Extensive collection of practical examples and tutorials for using OpenAI's APIs
  • Covers a wide range of use cases and applications, from basic to advanced
  • Regularly updated with new examples and best practices

Cons of openai-cookbook

  • Focused solely on OpenAI's offerings, limiting its scope compared to Semantic Kernel
  • Less emphasis on integrating AI capabilities into larger applications or frameworks
  • Lacks the structured approach to building AI-powered applications that Semantic Kernel provides

Code Comparison

openai-cookbook:

import openai

# Legacy Completions API call; fill the placeholder with the text to translate.
text = "Hello, world!"
response = openai.Completion.create(
    engine="text-davinci-002",
    prompt=f"Translate the following English text to French: '{text}'",
    max_tokens=60,
)
print(response["choices"][0]["text"])

Semantic Kernel:

var kernel = Kernel.Builder.Build();
kernel.Config.AddOpenAITextCompletionService("davinci", "your-api-key");

var translator = kernel.CreateSemanticFunction("Translate the following English text to French: {{$input}}");
var result = await translator.InvokeAsync("Hello, world!");

langchain-ai/langchain: 🦜🔗 Build context-aware reasoning applications

Pros of LangChain

  • More extensive documentation and examples
  • Larger community and ecosystem of integrations
  • Supports multiple programming languages (Python, JavaScript)

Cons of LangChain

  • Steeper learning curve due to more complex architecture
  • Less focus on enterprise-grade features and security

Code Comparison

LangChain:

from langchain import OpenAI, LLMChain, PromptTemplate

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(input_variables=["product"], template="What is a good name for a company that makes {product}?")
chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run("colorful socks"))

Semantic Kernel:

using Microsoft.SemanticKernel;

var kernel = Kernel.Builder.Build();
kernel.Config.AddOpenAITextCompletionService("davinci", "YOUR_API_KEY");

var namingFunction = kernel.CreateSemanticFunction("What is a good name for a company that makes {{$input}}?");
var result = await kernel.RunAsync("colorful socks", namingFunction);

Console.WriteLine(result);

Both repositories aim to simplify working with large language models, but they have different approaches. LangChain offers more flexibility and a wider range of integrations, while Semantic Kernel focuses on providing a more structured, enterprise-ready framework. The choice between them depends on specific project requirements and developer preferences.

microsoft/promptflow: Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.

Pros of Promptflow

  • More focused on workflow management and orchestration for AI tasks
  • Provides a visual interface for designing and managing prompt flows
  • Better suited for non-technical users and rapid prototyping

Cons of Promptflow

  • Less flexible for complex programming tasks
  • More limited in terms of language support and integration options
  • Newer project with a smaller community and fewer resources

Code Comparison

Semantic Kernel:

var kernel = Kernel.Builder.Build();
kernel.Config.AddOpenAITextCompletionService("text-davinci-003", "YOUR_API_KEY");

var ask = kernel.CreateSemanticFunction("{{$input}}");
var result = await kernel.RunAsync("What is the capital of France?", ask);
Console.WriteLine(result);

Promptflow:

from promptflow import PFClient

client = PFClient()
result = client.test(flow="my_flow", inputs={"question": "What is the capital of France?"})
print(result)

Summary

Semantic Kernel is a more comprehensive SDK for AI integration, offering greater flexibility and programming capabilities. Promptflow, on the other hand, excels in visual workflow design and is more accessible to non-developers. The choice between the two depends on the specific needs of the project and the technical expertise of the team.

huggingface/transformers: 🤗 State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.

Pros of Transformers

  • Extensive model support: Offers a wide range of pre-trained models and architectures
  • Active community: Large user base and frequent updates
  • Comprehensive documentation: Detailed guides and examples for various use cases

Cons of Transformers

  • Steeper learning curve: Requires more in-depth knowledge of NLP concepts
  • Higher resource requirements: Models can be computationally intensive
  • Less focus on integration: Primarily designed for research and model development

Code Comparison

Transformers:

from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("I love this product!")[0]
print(f"Label: {result['label']}, Score: {result['score']:.4f}")

Semantic Kernel:

using Microsoft.SemanticKernel;

var kernel = Kernel.Builder.Build();
var sentiment = kernel.ImportSemanticSkillFromDirectory("skills", "SentimentAnalysisSkill");
var result = await kernel.RunAsync("I love this product!", sentiment["Analyze"]);
Console.WriteLine($"Sentiment: {result}");

Summary

Transformers is a powerful library for NLP tasks with a vast array of models, while Semantic Kernel focuses on integrating AI capabilities into applications. Transformers offers more flexibility for research and custom model development, whereas Semantic Kernel provides a more streamlined approach for incorporating AI into existing software systems.


deepset-ai/haystack: 🔍 AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.

Pros of Haystack

  • More focused on question answering and information retrieval tasks
  • Offers a wider range of pre-built components for NLP pipelines
  • Provides better support for document-level processing and indexing

Cons of Haystack

  • Less integrated with other AI services and platforms
  • May have a steeper learning curve for beginners
  • Limited support for general-purpose AI development compared to Semantic Kernel

Code Comparison

Haystack example:

from haystack import Pipeline
from haystack.nodes import TfidfRetriever, FARMReader

pipeline = Pipeline()
pipeline.add_node(component=TfidfRetriever(document_store=document_store), name="Retriever", inputs=["Query"])
pipeline.add_node(component=FARMReader(model_name_or_path="deepset/roberta-base-squad2"), name="Reader", inputs=["Retriever"])

Semantic Kernel example:

var kernel = Kernel.Builder.Build();
kernel.ImportSkill(new TextSkill());
var recommendationSkill = kernel.ImportSemanticSkillFromDirectory("skills", "RecommendationSkill");
var result = await kernel.RunAsync("What's a good movie to watch?", recommendationSkill["GetRecommendation"]);


README

Semantic Kernel

Status

  • Python: Python package
  • .NET: NuGet package, dotnet Docker, dotnet Windows
  • Java: Java CICD Builds, Maven Central

Overview


Semantic Kernel is an SDK that integrates Large Language Models (LLMs) like OpenAI, Azure OpenAI, and Hugging Face with conventional programming languages like C#, Python, and Java. Semantic Kernel achieves this by allowing you to define plugins that can be chained together in just a few lines of code.
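
As a rough illustration of what those few lines can look like with the current .NET package (the model id and key below are placeholders, not taken from this README):

using Microsoft.SemanticKernel;

// Build a kernel with one chat-completion service (placeholder model id and key).
var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o-mini", "YOUR_API_KEY")
    .Build();

// Invoke a prompt directly; {{$topic}} is filled from the arguments.
var result = await kernel.InvokePromptAsync(
    "Write a one-sentence summary of {{$topic}}.",
    new KernelArguments { ["topic"] = "Semantic Kernel" });

Console.WriteLine(result.GetValue<string>());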

What makes Semantic Kernel special, however, is its ability to automatically orchestrate plugins with AI. With Semantic Kernel planners, you can ask an LLM to generate a plan that achieves a user's unique goal. Afterwards, Semantic Kernel will execute the plan for the user.
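
A sketch of that flow, assuming the prerelease Handlebars planner package (planner APIs are experimental, so exact type names and suppression ids may differ between versions):

#pragma warning disable SKEXP0060 // suppression id for the experimental planner APIs (may vary by version)
using Microsoft.SemanticKernel;
using Microsoft.SemanticKernel.Planning.Handlebars;

var kernel = Kernel.CreateBuilder()
    .AddOpenAIChatCompletion("gpt-4o-mini", "YOUR_API_KEY")
    .Build();
// Plugins registered on the kernel (e.g. via ImportPluginFromType) become steps the planner can use.

// Ask the LLM to generate a plan for the goal, then let Semantic Kernel execute it.
var planner = new HandlebarsPlanner();
var plan = await planner.CreatePlanAsync(kernel, "Summarize today's weather and translate the summary to French.");
var result = await plan.InvokeAsync(kernel);
Console.WriteLine(result);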

It provides:

  • abstractions for AI services (such as chat, text to images, audio to text, etc.) and memory stores
  • implementations of those abstractions for services from OpenAI, Azure OpenAI, Hugging Face, local models, and more, and for a multitude of vector databases, such as those from Chroma, Qdrant, Milvus, and Azure
  • a common representation for plugins, which can then be orchestrated automatically by AI
  • the ability to create such plugins from a multitude of sources, including from OpenAPI specifications, prompts, and arbitrary code written in the target language (see the sketch after this list)
  • extensible support for prompt management and rendering, including built-in handling of common formats like Handlebars and Liquid
  • and a wealth of functionality layered on top of these abstractions, such as filters for responsible AI, dependency injection integration, and more.
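
To make the plugin bullets above concrete, here is a minimal sketch of a plugin written as ordinary C# and invoked through the kernel; the plugin, its function, and the inputs are invented for illustration:

using System.ComponentModel;
using Microsoft.SemanticKernel;

// No AI service is needed just to call a native function.
var kernel = Kernel.CreateBuilder().Build();

// Register the class below as a plugin and call one of its functions directly.
var weather = kernel.ImportPluginFromType<WeatherPlugin>();
var forecast = await kernel.InvokeAsync(
    weather["GetForecast"],
    new KernelArguments { ["city"] = "Seattle" });
Console.WriteLine(forecast.GetValue<string>());

// An ordinary class becomes a plugin; [KernelFunction] marks the callable methods.
public class WeatherPlugin
{
    [KernelFunction, Description("Returns a canned weather report for a city.")]
    public string GetForecast(string city) => $"It is sunny in {city} today.";
}

The same class could also be exposed to planners or to function calling, which is how the automatic orchestration described earlier consumes plugins.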

Semantic Kernel is used by enterprises because of its flexibility, modularity, and observability. It is backed by security-enhancing capabilities such as telemetry support, hooks, and filters, so you can feel confident you're delivering responsible AI solutions at scale. Semantic Kernel was designed to be future-proof, easily connecting your code to the latest AI models as the technology evolves. When new models are released, you can simply swap them in without rewriting your entire codebase.
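
For example, a sketch of such a swap, where only the connector registration changes (service names, deployments, and keys are placeholders):

using Microsoft.SemanticKernel;

var builder = Kernel.CreateBuilder();

// Swapping providers or models only changes this registration line;
// the prompts, plugins, and orchestration code stay the same.
builder.AddOpenAIChatCompletion("gpt-4o-mini", "YOUR_OPENAI_KEY");
// builder.AddAzureOpenAIChatCompletion("my-deployment", "https://my-resource.openai.azure.com", "YOUR_AZURE_KEY");

var kernel = builder.Build();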

Please star the repo to show your support for this project!


Getting started with Semantic Kernel

The Semantic Kernel SDK is available in C#, Python, and Java. To get started, choose your preferred language below. See the Feature Matrix for a breakdown of feature parity between our currently supported languages.


The quickest way to get started with the basics is to get an API key from either OpenAI or Azure OpenAI and to run one of the C#, Python, or Java console applications/scripts below.

For C#:

  1. Go to the Quick start page here and follow the steps to dive in.
  2. After installing the SDK, we advise you to follow the steps and code detailed there to write your first console app.

For Python:

  1. Go to the Quick start page here and follow the steps to dive in.
  2. You'll need to ensure that you toggle to Python in the Choose a programming language table at the top of the page.

For Java:

The Java code is in the semantic-kernel-java repository. See semantic-kernel-java build for instructions on how to build and run the Java code.

Please file Java Semantic Kernel specific issues in the semantic-kernel-java repository.

Learning how to use Semantic Kernel

The fastest way to learn how to use Semantic Kernel is with our C# and Python Jupyter notebooks. These notebooks demonstrate how to use Semantic Kernel with code snippets that you can run with the push of a button.

Once you've finished the getting started notebooks, you can then check out the main walkthroughs on our Learn site. Each sample comes with a completed C# and Python project that you can run locally.

  1. 📖 Getting Started
  2. 🔌 Detailed Samples
  3. 💡 Concepts

Finally, refer to our API references for more details on the C# and Python APIs.

Visual Studio Code extension: design semantic functions with ease

The Semantic Kernel extension for Visual Studio Code makes it easy to design and test semantic functions. The extension provides an interface for designing semantic functions and lets you test them against your existing models and data with the push of a button.

Join the community

We welcome your contributions and suggestions to the SK community! One of the easiest ways to participate is to engage in discussions in the GitHub repository. Bug reports and fixes are welcome!

For new features, components, or extensions, please open an issue and discuss it with us before sending a PR. This helps avoid rejection if we are taking the core in a different direction, and it also lets us consider the impact on the larger ecosystem.

To learn more and get started:

Contributor Wall of Fame

semantic-kernel contributors

Code of Conduct

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

License

Copyright (c) Microsoft Corporation. All rights reserved.

Licensed under the MIT license.
