TypeChat
TypeChat is a library that makes it easy to build natural language interfaces using types.
Top Related Projects
- chatgpt-retrieval-plugin: The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
- Semantic Kernel: Integrate cutting-edge LLM technology quickly and easily into your apps.
- LangChain: 🦜🔗 Build context-aware reasoning applications.
- PromptFlow: Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
- Guidance: A guidance language for controlling large language models.
Quick Overview
TypeChat is a library that simplifies the process of building natural language interfaces using large language models (LLMs). It provides a strongly-typed approach to creating AI-powered chatbots and assistants, leveraging TypeScript's type system to ensure type safety and improve developer experience.
Pros
- Strong type safety through TypeScript integration
- Simplified development of natural language interfaces
- Seamless integration with popular LLMs like GPT-3.5 and GPT-4
- Improved reliability and maintainability of AI-powered applications
Cons
- Requires familiarity with TypeScript
- Limited to TypeScript/JavaScript ecosystems
- Potential learning curve for developers new to LLMs or type-driven development
- Dependency on external LLM services
Code Examples
- Defining a schema for a shopping cart (in its own file, e.g. cartSchema.ts, so its source text can be sent to the model):

export type CartItem = {
  name: string;
  quantity: number;
  price: number;
};

export type ShoppingCart = {
  items: CartItem[];
  total: number;
};

- Creating a language model and JSON translator:

import fs from "fs";
import path from "path";
import { createLanguageModel, createJsonTranslator } from "typechat";
import { ShoppingCart } from "./cartSchema";

// createLanguageModel reads OPENAI_API_KEY and OPENAI_MODEL (or the Azure OpenAI equivalents) from the environment
const model = createLanguageModel(process.env);
const schema = fs.readFileSync(path.join(__dirname, "cartSchema.ts"), "utf8");
const translator = createJsonTranslator<ShoppingCart>(model, schema, "ShoppingCart");

- Processing user input:

const userInput = "Add 2 apples and 1 banana to my cart";
const result = await translator.translate(userInput);
if (result.success) {
  console.log("Parsed shopping cart:", result.data);
} else {
  console.error("Error:", result.message);
}
Getting Started
- Install TypeChat:

npm install typechat

- Define your schema (e.g. in schema.ts):

export type YourSchema = {
  // Define your schema here
};

- Create a language model and JSON translator:

import fs from "fs";
import { createLanguageModel, createJsonTranslator } from "typechat";
import { YourSchema } from "./schema";

const model = createLanguageModel(process.env);
const schema = fs.readFileSync("schema.ts", "utf8");
const translator = createJsonTranslator<YourSchema>(model, schema, "YourSchema");

- Process user input:

const result = await translator.translate(userInput);
if (result.success) { /* use the validated result.data */ } else { /* inspect result.message */ }
Competitor Comparisons
The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
Pros of chatgpt-retrieval-plugin
- Focuses on document retrieval and integration with ChatGPT
- Provides a complete solution for indexing and querying documents
- Supports multiple vector database options out-of-the-box
Cons of chatgpt-retrieval-plugin
- Limited to retrieval tasks and may not be as flexible for other use cases
- Requires more setup and configuration compared to TypeChat
- Potentially higher computational resources needed for document indexing
Code Comparison
TypeChat:
import { createLanguageModel, createJsonTranslator } from "typechat";

type Order = { items: string[]; total: number };
const schema = `
type Order = {
  items: string[];
  total: number;
};
`;
const model = createLanguageModel(process.env);
const translator = createJsonTranslator<Order>(model, schema, "Order");
const result = await translator.translate("I want to order a pizza and a soda");
chatgpt-retrieval-plugin:
from datastore.factory import get_datastore
from models.models import Query

# inside an async context
datastore = await get_datastore()
results = await datastore.query([Query(query="What are the company's policies?")])
Integrate cutting-edge LLM technology quickly and easily into your apps
Pros of Semantic Kernel
- More comprehensive framework for AI orchestration and integration
- Supports multiple programming languages (C#, Python, Java)
- Offers a wider range of AI-powered functionalities beyond natural language processing
Cons of Semantic Kernel
- Steeper learning curve due to its broader scope and complexity
- May be overkill for projects focused solely on natural language interactions
- Requires more setup and configuration compared to TypeChat's simplicity
Code Comparison
TypeChat:
import { createLanguageModel, createJsonTranslator } from "typechat";

type Order = { items: string[]; total: number };
const schema = `
type Order = {
  items: string[];
  total: number;
};
`;
const model = createLanguageModel(process.env);
const translator = createJsonTranslator<Order>(model, schema, "Order");
const result = await translator.translate("I want to order a pizza and a soda");
Semantic Kernel:
using Microsoft.SemanticKernel;
var kernel = Kernel.Builder.Build();
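// OrderSkill and its "ProcessOrder" function are placeholders for a user-defined skill, not part of Semantic Kernel itself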
var orderSkill = kernel.ImportSkill(new OrderSkill());
var result = await kernel.RunAsync("I want to order a pizza and a soda", orderSkill["ProcessOrder"]);
Both repositories aim to simplify AI integration, but TypeChat focuses on type-safe natural language processing, while Semantic Kernel provides a broader framework for AI orchestration across various domains and languages.
🦜🔗 Build context-aware reasoning applications
Pros of LangChain
- More comprehensive framework with a wider range of tools and integrations
- Larger community and ecosystem, with more resources and third-party extensions
- Supports multiple programming languages (Python and JavaScript)
Cons of LangChain
- Steeper learning curve due to its extensive feature set
- Can be overkill for simpler projects that don't require its full capabilities
- Less focus on type safety compared to TypeChat
Code Comparison
TypeChat:
import { createLanguageModel, createJsonTranslator } from "typechat";
const model = createLanguageModel(process.env);
const schema = `
interface CalendarEntry {
date: string;
title: string;
description?: string;
}`;
const translator = createJsonTranslator<CalendarEntry>(model, schema);
LangChain:
from langchain import PromptTemplate, LLMChain
from langchain.llms import OpenAI
template = "Create a calendar entry for: {input}"
prompt = PromptTemplate(template=template, input_variables=["input"])
llm = OpenAI()
chain = LLMChain(llm=llm, prompt=prompt)
Both repositories aim to simplify working with language models, but they take different approaches. TypeChat focuses on type-safe interactions and schema validation, while LangChain provides a more extensive toolkit for building complex AI applications.
Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
Pros of PromptFlow
- Offers a more comprehensive end-to-end solution for building AI applications
- Provides a visual interface for designing and managing prompt flows
- Supports integration with various AI models and services beyond just language models
Cons of PromptFlow
- May have a steeper learning curve due to its broader scope and features
- Potentially more resource-intensive to set up and run compared to TypeChat
- Less focused on type-driven development and natural language to code generation
Code Comparison
TypeChat:
import { createLanguageModel, createJsonTranslator } from "typechat";

type Order = { items: string[]; total: number };
const schema = `
type Order = {
  items: string[];
  total: number;
};
`;
const model = createLanguageModel(process.env);
const translator = createJsonTranslator<Order>(model, schema, "Order");
const result = await translator.translate("I want to order a pizza and a soda");
PromptFlow:
from promptflow import PFClient

pf = PFClient()
result = pf.test(
    flow="./order_flow",
    inputs={"user_input": "I want to order a pizza and a soda"},
)
Summary
While TypeChat focuses on type-driven development for natural language processing, PromptFlow offers a more comprehensive solution for building AI applications with visual tools and broader integration capabilities. TypeChat may be simpler to use for specific language-to-code tasks, while PromptFlow provides more flexibility for complex AI workflows.
A guidance language for controlling large language models.
Pros of Guidance
- More flexible and language-agnostic, supporting multiple LLMs and programming languages
- Offers a wider range of control structures and advanced prompting techniques
- Provides a more comprehensive toolkit for complex AI-driven applications
Cons of Guidance
- Steeper learning curve due to its more extensive feature set
- Less focused on natural language processing and type inference
- May require more setup and configuration for basic use cases
Code Comparison
TypeChat:
const schema = `
type Appointment = {
date: string;
time: string;
description: string;
}`;
const response = await typechat.translate(schema, userInput);
Guidance:
import guidance

# guidance 0.0.x-style program; configure the backing model first
guidance.llm = guidance.llms.OpenAI("text-davinci-003")

appointment = guidance('''
date: {{select "date" options=dates}}
time: {{select "time" options=times}}
description: {{gen "description"}}
''')

result = appointment(dates=["2023-07-01", "2023-07-02", "2023-07-03"],
                     times=["09:00", "10:00", "11:00"])
TypeChat focuses on type-based schema validation and natural language processing, while Guidance offers more granular control over the generation process with a wider range of prompting techniques and control structures. TypeChat is more specialized for TypeScript/JavaScript ecosystems, whereas Guidance supports multiple programming languages and LLMs, making it more versatile for diverse AI applications.
README
TypeChat
TypeChat is a library that makes it easy to build natural language interfaces using types.
Building natural language interfaces has traditionally been difficult. These apps often relied on complex decision trees to determine intent and collect the required inputs to take action. Large language models (LLMs) have made this easier by enabling us to take natural language input from a user and match it to intent. This has introduced its own challenges, including the need to constrain the model's reply for safety, to structure responses from the model for further processing, and to ensure that the reply from the model is valid. Prompt engineering aims to solve these problems, but comes with a steep learning curve and increased fragility as the prompt increases in size.
TypeChat replaces prompt engineering with schema engineering.
Simply define types that represent the intents supported in your natural language application. That could be as simple as an interface for categorizing sentiment or more complex examples like types for a shopping cart or music application. For example, to add additional intents to a schema, a developer can add additional types into a discriminated union. To make schemas hierarchical, a developer can use a "meta-schema" to choose one or more sub-schemas based on user input.
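For example, a minimal sketch of a multi-intent schema (the type and property names here are illustrative, not part of TypeChat itself) might look like this:

// The top-level type the model is asked to return
export type CartActions = {
  actions: CartAction[];
};

// Add new intents by adding new members to this discriminated union
export type CartAction = AddItemAction | RemoveItemAction | UnknownAction;

export type AddItemAction = {
  actionType: "add"; // discriminant
  itemName: string;
  quantity: number;
};

export type RemoveItemAction = {
  actionType: "remove";
  itemName: string;
};

// Catch-all for input that matches no known intent
export type UnknownAction = {
  actionType: "unknown";
  text: string; // the original user request
};

The actionType discriminant is what lets the validator tell intents apart and gives the model a well-defined place to put requests it cannot match.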
After defining your types, TypeChat takes care of the rest by:
- Constructing a prompt to the LLM using types.
- Validating that the LLM response conforms to the schema. If validation fails, repairing the non-conforming output through further language model interaction (see the sketch below).
- Summarizing succinctly (without use of an LLM) the instance and confirming that it aligns with user intent.
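A minimal sketch of that flow, assuming the schema above is saved as cartActionsSchema.ts and the usual OPENAI_API_KEY / OPENAI_MODEL environment variables are set:

import fs from "fs";
import path from "path";
import { createLanguageModel, createJsonTranslator } from "typechat";
import { CartActions } from "./cartActionsSchema";

// Pick an OpenAI or Azure OpenAI model based on environment variables
const model = createLanguageModel(process.env);
// The schema source text is what gets embedded in the prompt
const schema = fs.readFileSync(path.join(__dirname, "cartActionsSchema.ts"), "utf8");
const translator = createJsonTranslator<CartActions>(model, schema, "CartActions");
translator.attemptRepair = true; // retry with a repair prompt when validation fails

const response = await translator.translate("remove the bananas and add three apples");
if (response.success) {
  console.log(JSON.stringify(response.data, undefined, 2)); // validated CartActions instance
} else {
  console.error(response.message);
}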
Types are all you need!
Getting Started
Install TypeChat for TypeScript/JavaScript:
npm install typechat
You can also work with TypeChat from source for TypeScript/JavaScript, Python, and .NET.
To see TypeChat in action, we recommend exploring the TypeChat example projects. You can try them on your local machine or in a GitHub Codespace.
To learn more about TypeChat, visit the documentation which includes more information on TypeChat and how to get started.
Contributing
This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.
When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.
This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.
Trademarks
This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.