Top Related Projects
- langchain — 🦜🔗 Build context-aware reasoning applications
- Streamlit — A faster way to build and share data apps
- Gradio — Build and share delightful machine learning apps, all in Python
- Semantic Kernel — Integrate cutting-edge LLM technology quickly and easily into your apps
- text-generation-webui — A Gradio web UI for Large Language Models
Quick Overview
Chainlit is an open-source Python package for building conversational AI interfaces. It allows developers to create chatbots and AI assistants with a focus on user-friendly interactions and seamless integration with popular language models and frameworks.
Pros
- Easy-to-use API for building conversational interfaces
- Supports integration with various language models and AI frameworks
- Provides real-time streaming of AI responses
- Offers customizable UI components for enhanced user experience
Cons
- Limited documentation for advanced use cases
- Relatively new project, which may lead to potential instability or breaking changes
- Smaller community compared to more established chatbot frameworks
- May require additional setup for complex integrations
Code Examples
- Creating a simple chatbot:

```python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    # Echo the user's message back to the chat
    await cl.Message(content=f"You said: {message.content}").send()
```

Save this as `app.py` and launch it with `chainlit run app.py` (Chainlit apps are started from the CLI rather than from within the script).
- Integrating with OpenAI's GPT models (using the openai v1 client):

```python
import chainlit as cl
from openai import AsyncOpenAI

client = AsyncOpenAI()  # reads OPENAI_API_KEY from the environment

@cl.on_message
async def main(message: cl.Message):
    response = await client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": message.content}],
        max_tokens=50,
    )
    await cl.Message(content=response.choices[0].message.content).send()
```
- Running code when a chat session starts:

```python
import chainlit as cl

@cl.on_chat_start
async def start():
    # Greet the user as soon as a new session opens
    await cl.Message(content="Welcome to my custom chatbot, powered by Chainlit!").send()

@cl.on_message
async def main(message: cl.Message):
    await cl.Message(content=f"You said: {message.content}").send()
```
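The real-time streaming listed among the pros follows a simple pattern: consume tokens as they arrive and forward each one to the UI (in Chainlit, via `await msg.stream_token(token)`). Below is a dependency-free sketch of that flow; the `fake_llm_tokens` generator is a hypothetical stand-in for a real streaming LLM API.

```python
import asyncio

async def fake_llm_tokens(prompt):
    # Hypothetical stand-in for a streaming LLM API: yields tokens one by one.
    for token in ["Hello", ", ", "world", "!"]:
        await asyncio.sleep(0)  # simulate waiting on the network
        yield token

async def stream_reply(prompt, on_token):
    # Same shape as Chainlit's streaming loop, where each token would be
    # pushed to the UI with `await msg.stream_token(token)`.
    parts = []
    async for token in fake_llm_tokens(prompt):
        parts.append(token)
        on_token(token)
    return "".join(parts)

received = []
result = asyncio.run(stream_reply("hi", received.append))
```

Swapping `on_token` for a Chainlit message's `stream_token` gives incremental rendering in the chat UI instead of a single final reply.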
Getting Started
To get started with Chainlit, follow these steps:
1. Install Chainlit:

   ```bash
   pip install chainlit
   ```

2. Create a new Python file (e.g., `app.py`) and add the following code:

   ```python
   import chainlit as cl

   @cl.on_message
   async def main(message: cl.Message):
       await cl.Message(content=f"Echo: {message.content}").send()
   ```

3. Run the application:

   ```bash
   chainlit run app.py
   ```

4. Open your browser and navigate to `http://localhost:8000` to interact with your chatbot.
Competitor Comparisons
langchain — 🦜🔗 Build context-aware reasoning applications
Pros of langchain
- More comprehensive framework for building LLM applications
- Larger community and ecosystem with extensive documentation
- Supports a wider range of LLMs and integrations
Cons of langchain
- Steeper learning curve due to its extensive features
- Can be overkill for simpler chatbot or LLM projects
- Less focused on UI/UX aspects of chatbot development
Code Comparison
Chainlit example:

```python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    await cl.Message(content=f"You said: {message.content}").send()
```

langchain example (legacy pre-0.1 top-level imports):

```python
from langchain import OpenAI, LLMChain, PromptTemplate

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)
```
Summary
Chainlit focuses on building chatbot interfaces quickly with a streamlined API, while langchain offers a more comprehensive framework for complex LLM applications. Chainlit excels in rapid prototyping and UI development, whereas langchain provides greater flexibility and integration options for advanced use cases.
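The LLMChain pattern shown above is, at its core, template filling followed by a model call. A dependency-free sketch of that idea, with a hypothetical `fake_llm` standing in for a real model:

```python
def make_chain(template, llm):
    # A minimal "chain": fill the prompt template, then call the model.
    def run(**variables):
        prompt = template.format(**variables)
        return llm(prompt)
    return run

def fake_llm(prompt):
    # Hypothetical deterministic model, used only for illustration.
    return f"[model output for: {prompt}]"

chain = make_chain(
    "What is a good name for a company that makes {product}?", fake_llm
)
result = chain(product="colorful socks")
```

langchain layers memory, retrieval, and tool use on top of this core loop, which is where its extra complexity (and power) comes from.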
Streamlit — A faster way to build and share data apps.
Pros of Streamlit
- More mature and established project with a larger community and ecosystem
- Broader application scope, suitable for various data science and ML projects
- Extensive documentation and tutorials available
Cons of Streamlit
- Less specialized for LLM and chatbot applications
- Can be slower for complex applications due to its stateless nature
- Requires more boilerplate code for advanced functionality
Code Comparison
Streamlit example:

```python
import streamlit as st

st.title("Hello World")
name = st.text_input("Enter your name")
st.write(f"Hello, {name}!")
```

Chainlit example:

```python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    await cl.Message(content=f"Hello, {message.content}!").send()
```
Summary
Streamlit is a versatile tool for creating data apps, while Chainlit focuses on building LLM-powered chatbots. Streamlit offers broader applicability and a larger ecosystem, but Chainlit provides a more streamlined experience for chat-based AI applications. The code comparison shows that Chainlit requires less boilerplate for simple chatbot functionality, while Streamlit offers more flexibility for general-purpose data apps.
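The "stateless" point above refers to Streamlit's execution model: the whole script re-runs on every interaction, so chat history must be kept in a persistent store such as `st.session_state`. A dependency-free simulation of that rerun model (the `run_script` function is an illustrative stand-in, with a plain dict playing the role of `st.session_state`):

```python
def run_script(session_state, user_input=None):
    # Streamlit re-executes the entire script on every interaction, so any
    # chat history must live in a persistent store like st.session_state.
    history = session_state.setdefault("messages", [])
    if user_input is not None:
        history.append(("user", user_input))
        history.append(("assistant", f"Hello, {user_input}!"))
    return history

state = {}  # plays the role of st.session_state
run_script(state, "Ada")    # first interaction: full "rerun"
run_script(state, "Grace")  # second interaction: another full rerun
```

Chainlit's event-driven handlers avoid this bookkeeping for chat apps, which is much of why its examples are shorter.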
Gradio — Build and share delightful machine learning apps, all in Python
Pros of Gradio
- More mature and widely adopted project with a larger community
- Supports a broader range of input/output types and modalities
- Easier integration with popular machine learning frameworks
Cons of Gradio
- Less specialized for conversational AI and chat interfaces
- Requires more code to create complex, multi-step interactions
- Limited built-in support for streaming responses
Code Comparison
Gradio example:

```python
import gradio as gr

def greet(name):
    return f"Hello, {name}!"

demo = gr.Interface(fn=greet, inputs="text", outputs="text")
demo.launch()
```

Chainlit example:

```python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    await cl.Message(content=f"Hello, {message.content}!").send()
```
Both Chainlit and Gradio are Python libraries for building interactive interfaces for machine learning models. Gradio offers a more general-purpose solution with support for various input and output types, while Chainlit specializes in conversational AI and chat interfaces. Gradio has a larger community and more integrations, but Chainlit provides a more streamlined experience for building chat-based applications with features like built-in streaming support and session management.
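Gradio's core abstraction, as the example above shows, is pairing a plain function with declared input/output types. A toy sketch of that mapping without the library (the `make_interface` helper is purely illustrative, not Gradio's implementation):

```python
def make_interface(fn, inputs, outputs):
    # Toy stand-in for gr.Interface: binds a plain function to declared
    # input/output types and returns something callable like the demo.
    def call(*args):
        if len(args) != len(inputs):
            raise TypeError("wrong number of inputs")
        return fn(*args)
    return call

def greet(name):
    return f"Hello, {name}!"

demo = make_interface(greet, inputs=["text"], outputs=["text"])
out = demo("World")
```

The real `gr.Interface` additionally generates the web UI for those types, which is what makes it attractive for quick model demos.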
Semantic Kernel — Integrate cutting-edge LLM technology quickly and easily into your apps
Pros of Semantic Kernel
- More comprehensive framework for AI orchestration and integration
- Stronger support for multiple programming languages (C#, Python, Java)
- Backed by Microsoft, potentially offering better long-term support and resources
Cons of Semantic Kernel
- Steeper learning curve due to its more complex architecture
- Less focused on chat-based applications compared to Chainlit
- May be overkill for simpler AI projects or prototypes
Code Comparison
Chainlit example:

```python
import chainlit as cl

@cl.on_message
async def main(message: cl.Message):
    await cl.Message(content=f"You said: {message.content}").send()
```

Semantic Kernel example (illustrative sketch; the exact API varies between semantic-kernel versions):

```python
import semantic_kernel as sk

kernel = sk.Kernel()
# Register a text completion service (older semantic-kernel Python API)
kernel.add_text_completion_service("dv", "text-davinci-003")
context = kernel.create_new_context()
prompt = "What is a good name for a sock company?"
result = kernel.run_async(prompt, input_context=context)
```
Summary
Chainlit focuses on building chat-based AI applications with a simpler, more streamlined approach. Semantic Kernel offers a more comprehensive framework for AI integration across various applications and languages. Choose Chainlit for quick chat app prototypes, and Semantic Kernel for more complex, multi-faceted AI projects.
text-generation-webui — A Gradio web UI for Large Language Models
Pros of text-generation-webui
- More comprehensive and feature-rich UI for text generation tasks
- Supports a wider range of language models and fine-tuning options
- Offers advanced features like character creation and chat modes
Cons of text-generation-webui
- Steeper learning curve due to its complexity
- Requires more system resources to run effectively
- Less focused on building chatbot applications specifically
Code Comparison
text-generation-webui:

```python
def generate_reply(
    question, state, stopping_strings=None, is_chat=False, for_ui=False
):
    # Complex generation logic
    # ...
```

Chainlit:

```python
@cl.on_message
async def main(message: cl.Message):
    # Simple chatbot response logic (get_response is an app-defined helper)
    response = await get_response(message.content)
    await cl.Message(content=response).send()
```
The code snippets highlight the difference in complexity and focus between the two projects. text-generation-webui offers more advanced generation options, while Chainlit provides a simpler interface for building chatbots.
Welcome to Chainlit by Literal AI 👋

Build production-ready Conversational AI applications in minutes, not weeks ⚡️

Chainlit is an open-source async Python framework which allows developers to build scalable Conversational AI or agentic applications.

- ✅ ChatGPT-like application
- ✅ Embedded Chatbot & Software Copilot
- ✅ Slack & Discord
- ✅ Custom frontend (build your own agentic experience)
- ✅ API Endpoint
Full documentation is available here. You can ask Chainlit related questions to Chainlit Help, an app built using Chainlit!
[!NOTE]
Contact us here for Enterprise Support. Check out Literal AI, our product to monitor and evaluate LLM applications! It works with any Python or TypeScript application and integrates seamlessly with Chainlit by adding a `LITERAL_API_KEY` to your project.
Installation
Open a terminal and run:
```bash
$ pip install chainlit
$ chainlit hello
```

If this opens the hello app in your browser, you're all set!
🚀 Quickstart
🐍 Pure Python
Create a new file `demo.py` with the following code:

```python
import chainlit as cl

@cl.step(type="tool")
async def tool():
    # Fake tool
    await cl.sleep(2)
    return "Response from the tool!"

@cl.on_message  # this function will be called every time a user inputs a message in the UI
async def main(message: cl.Message):
    """
    This function is called every time a user inputs a message in the UI.
    It sends back an intermediate response from the tool, followed by the final answer.

    Args:
        message: The user's message.

    Returns:
        None.
    """
    final_answer = await cl.Message(content="").send()

    # Call the tool
    final_answer.content = await tool()

    await final_answer.update()
```

Now run it!

```bash
$ chainlit run demo.py -w
```
Key Features and Integrations
Full documentation is available here. Key features:
- Multi Modal chats
- Chain of Thought visualisation
- Data persistence + human feedback
- Debug Mode
- Authentication
Chainlit is compatible with all Python programs and libraries, and it also ships with integrations for popular LLM libraries and frameworks.
More Examples - Cookbook
You can find various examples of Chainlit apps here that leverage tools and services such as OpenAI, Anthropic, LangChain, LlamaIndex, ChromaDB, Pinecone and more.
Tell us what you would like to see added in Chainlit using the GitHub issues or on Discord.
Contributing
As an open-source initiative in a rapidly evolving domain, we welcome contributions, be it through the addition of new features or the improvement of documentation.
For detailed information on how to contribute, see here.
License
Chainlit is open-source and licensed under the Apache 2.0 license.