Top Related Projects
- chat-ui: Open source codebase powering the HuggingChat app
- chatbot-ui: Come join the best place on the internet to learn AI skills. Use code "chatbotui" for an extra 20% off.
- azurechat: 🤖 💼 Azure Chat Solution Accelerator powered by Azure Open AI Service
- semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
- langchain: 🦜🔗 Build context-aware reasoning applications
- FastChat: An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
Quick Overview
Deep Chat is a fully customizable AI chat component for websites. It provides an easy way to integrate various AI models into web applications, offering a flexible and feature-rich chat interface that can be styled to match any design.
Pros
- Highly customizable with extensive styling options
- Supports multiple AI services and models (OpenAI, HuggingFace, Cohere, etc.)
- Easy to integrate into existing web projects
- Offers both direct API connections and custom API support
Cons
- Requires API keys for most AI services, which may incur costs
- Limited documentation for advanced customization scenarios
- May have a learning curve for developers unfamiliar with AI integrations
- Dependent on third-party AI services for core functionality
Code Examples
- Basic integration with OpenAI:
<script src="https://cdn.jsdelivr.net/npm/deep-chat@1.3.0/dist/deepChat.min.js"></script>
<deep-chat directConnection='{"openAI":{"key":"your-openai-api-key"}}'></deep-chat>
- Customizing the chat interface:
<deep-chat
  style="border-radius: 10px"
  textInput='{"placeholder":{"text":"Ask me anything..."}}'
  introMessage='{"text":"Hello! How can I assist you today?"}'
></deep-chat>
- Using a custom API endpoint:
<deep-chat
  request='{"url":"https://your-custom-api.com/chat", "method":"POST", "headers":{"Authorization":"Bearer your-token"}}'
></deep-chat>
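On the server side, the custom endpoint has to speak Deep Chat's default formats. As a rough sketch (assuming, per the Connect documentation referenced in the README below, that the component POSTs a JSON body of the form {messages: [{role, text}, ...]} and accepts {text: "..."} as the reply), a minimal Express handler could look like this:
import express from 'express';

const app = express();
app.use(express.json());

// Echo the last user message back in the response shape Deep Chat is assumed to accept
app.post('/chat', (req, res) => {
  const lastMessage = req.body.messages[req.body.messages.length - 1];
  res.json({text: `You said: ${lastMessage.text}`});
});

app.listen(3000);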
Getting Started
To get started with Deep Chat, follow these steps:
- Include the Deep Chat script in your HTML file:
<script src="https://cdn.jsdelivr.net/npm/deep-chat@1.3.0/dist/deepChat.min.js"></script>
- Add the Deep Chat component to your HTML:
<deep-chat directConnection='{"openAI":{"key":"your-api-key"}}'></deep-chat>
- Replace your-api-key with your actual API key (the example uses the openAI direct connection; other supported services are configured the same way).
- Customize the component as needed using attributes or JavaScript.
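Put together, a minimal standalone page might look like the following sketch (the directConnection attribute follows the pattern shown in the README further below; exposing a key in the browser like this is only appropriate for local prototyping):
<!DOCTYPE html>
<html>
  <head>
    <!-- Load the Deep Chat web component from the CDN -->
    <script src="https://cdn.jsdelivr.net/npm/deep-chat@1.3.0/dist/deepChat.min.js"></script>
  </head>
  <body>
    <!-- directConnection exposes the key to the browser, so keep it to local/demo use -->
    <deep-chat
      style="border-radius: 10px"
      introMessage='{"text":"Hello! How can I assist you today?"}'
      directConnection='{"openAI":{"key":"your-api-key"}}'
    ></deep-chat>
  </body>
</html>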
For more advanced usage and configuration options, refer to the official documentation on the GitHub repository.
Competitor Comparisons
chat-ui: Open source codebase powering the HuggingChat app
Pros of chat-ui
- More comprehensive and feature-rich UI for chat applications
- Better integration with Hugging Face's ecosystem and models
- Supports multiple chat sessions and conversation management
Cons of chat-ui
- Potentially more complex to set up and customize
- May have a steeper learning curve for beginners
- Less focused on lightweight, embeddable chat components
Code Comparison
deep-chat:
<DeepChat
  style={{borderRadius: '10px'}}
  directConnection={{openAI: {key: process.env.REACT_APP_OPENAI_API_KEY}}}
/>
chat-ui:
import { ChatInterface } from "@huggingface/chat-ui";

<ChatInterface
  conversationId={conversationId}
  model={selectedModel}
  streamResponse={streamResponse}
  onSubmit={handleSubmit}
/>
deep-chat focuses on simplicity and ease of integration, while chat-ui offers more advanced features and customization options. deep-chat is better suited for quick implementations and embedding chat functionality, whereas chat-ui is more appropriate for building full-fledged chat applications with advanced features and Hugging Face ecosystem integration.
chatbot-ui: Come join the best place on the internet to learn AI skills. Use code "chatbotui" for an extra 20% off.
Pros of chatbot-ui
- More comprehensive UI with features like conversation history and settings
- Built with Next.js, offering server-side rendering and better performance
- Supports multiple chat models and providers out of the box
Cons of chatbot-ui
- More complex setup and configuration required
- Heavier codebase with more dependencies
- Less flexible for integration into existing projects
Code Comparison
chatbot-ui:
const Chat = ({ messages, loading, onSend }) => {
  return (
    <div className="flex flex-col h-full">
      <ChatMessages messages={messages} />
      <ChatInput onSend={onSend} disabled={loading} />
    </div>
  );
};
deep-chat:
<DeepChat
  style={{width: "700px", height: "500px"}}
  directConnection={{openAI: {key: 'YOUR_API_KEY'}}}
/>
deep-chat offers a more straightforward implementation with a single component, while chatbot-ui provides a more structured approach with separate components for messages and input. chatbot-ui's code is more customizable but requires more setup, whereas deep-chat's implementation is simpler and more plug-and-play.
AzureChat: 🤖 💼 Azure Chat Solution Accelerator powered by Azure Open AI Service
Pros of AzureChat
- Built specifically for Azure, offering deep integration with Azure services
- Includes a full-stack solution with both frontend and backend components
- Provides enterprise-grade security features and compliance
Cons of AzureChat
- Less flexible for use with non-Azure platforms or services
- More complex setup and configuration process
- Requires Azure subscription and associated costs
Code Comparison
Deep-chat:
<DeepChat
  style={{borderRadius: '10px'}}
  initialMessages={[{role: 'user', text: 'Hello'}]}
  request={{url: 'https://api.example.com/chat'}}
/>
AzureChat:
import { OpenAIClient, AzureKeyCredential } from "@azure/openai";

const client = new OpenAIClient(endpoint, new AzureKeyCredential(apiKey));
const result = await client.getChatCompletions(deploymentId, messages);
console.log(result.choices[0].message);
Summary
Deep-chat is a lightweight, flexible chat component that can be easily integrated into various projects and services. AzureChat, on the other hand, is a comprehensive solution tailored for Azure environments, offering robust features and enterprise-level security. While Deep-chat provides simplicity and ease of use, AzureChat excels in Azure-specific integrations and scalability for large-scale applications.
semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
Pros of semantic-kernel
- More comprehensive framework for building AI applications
- Stronger integration with Azure and other Microsoft services
- Larger community and corporate backing for long-term support
Cons of semantic-kernel
- Steeper learning curve due to more complex architecture
- Potentially overkill for simpler chatbot implementations
- Less focused on UI components compared to deep-chat
Code Comparison
semantic-kernel:
var kernel = Kernel.Builder.Build();
var openAIFunction = kernel.CreateSemanticFunction(
    "Generate a summary of the following text: {{$input}}",
    maxTokens: 100
);
var result = await kernel.RunAsync("Long text here...", openAIFunction);
deep-chat:
<DeepChat
  style={{width: "700px", height: "500px"}}
  directConnection={{openAI: {key: 'YOUR_API_KEY'}}}
/>
Summary
semantic-kernel is a more robust framework for building AI-powered applications, offering deeper integration with Microsoft services and a larger ecosystem. However, it may be more complex for simple chatbot implementations. deep-chat, on the other hand, focuses on providing an easy-to-use UI component for chat interfaces, making it more suitable for quick implementations but potentially less flexible for complex AI applications.
langchain: 🦜🔗 Build context-aware reasoning applications
Pros of langchain
- More comprehensive framework for building LLM applications
- Extensive documentation and community support
- Wider range of integrations with various AI models and tools
Cons of langchain
- Steeper learning curve due to its complexity
- May be overkill for simple chatbot implementations
- Requires more setup and configuration
Code comparison
deep-chat:
<deep-chat directConnection='{"openAI":{"key":"YOUR_API_KEY"}}'></deep-chat>
langchain:
from langchain.llms import OpenAI
from langchain.prompts import PromptTemplate

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
Summary
deep-chat is a simpler, more focused solution for implementing chat interfaces, while langchain offers a more comprehensive framework for building complex LLM applications. deep-chat is easier to get started with, especially for web-based chat implementations, but langchain provides more flexibility and integration options for advanced use cases.
FastChat: An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.
Pros of FastChat
- More comprehensive and feature-rich, offering a complete ecosystem for training and serving large language models
- Supports multiple model architectures and provides tools for model evaluation and fine-tuning
- Has a larger community and more frequent updates, potentially leading to better long-term support
Cons of FastChat
- Higher complexity and steeper learning curve, which may be overwhelming for simpler chatbot implementations
- Requires more computational resources and setup time compared to Deep Chat's lightweight approach
- Less focused on providing a ready-to-use chat interface, as it's more of a framework for model development
Code Comparison
FastChat (server setup):
# Launch the controller, a model worker and the OpenAI-compatible API server
python3 -m fastchat.serve.controller
python3 -m fastchat.serve.model_worker --model-path lmsys/vicuna-7b-v1.5
python3 -m fastchat.serve.openai_api_server --host localhost --port 8000
Deep Chat (component usage):
<deep-chat
  style="border-radius: 10px"
  introMessage='{"text":"How can I help you today?"}'
  textInput='{"placeholder":{"text":"Send a message"}}'
></deep-chat>
README
:warning: Note from developer (05/10/2024): :warning:
:airplane: I have recently relocated to another country and am currently focusing on settling into my new home and sorting out various logistics. As a result, progress on new features for Deep Chat has slowed down, and I apologize for the delays in responding to issues.
:heart: Deep Chat has been one of my most rewarding projects, and I'm eager to get back to my usual development pace as soon as possible. Thank you for your understanding and patience during this time.
Deep Chat is a fully customizable AI chat component that can be injected into your website with minimal to no effort. Whether you want to create a chatbot that leverages popular APIs such as ChatGPT or connect to your own custom service, this component can do it all! Explore deepchat.dev to view all of the available features, how to use them, examples and more!
:rocket: Main Features
- Connect to any API
- Avatars
- Names
- Send/Receive files
- Capture photos via webcam
- Record audio via microphone
- Speech To Text for message input
- Text To Speech to hear message responses
- Support for Markdown and custom elements to help structure text and render code
- Introduction panel and dynamic modals to help describe functionality for your users
- Connect to popular AI APIs such as OpenAI, HuggingFace and Cohere directly from the browser
- Support for all major UI frameworks/libraries
- Host a model on the browser
- Everything is customizable!
:tada: :tada: 2.0 is now available :tada: :tada:
Announcing Deep Chat 2.0! We have redesigned and improved Deep Chat based on all of your generous feedback. It is now much easier to implement into any website and configure to provide the best possible chat experience for your users. Check out the release notes for more information.
:computer: Getting started
npm install deep-chat
If using React, install the following instead:
npm install deep-chat-react
Simply add the following to your markup:
<deep-chat></deep-chat>
The exact syntax for the above will vary depending on the framework of your choice (see here).
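With the React package, for example, the component is imported and rendered as follows (a minimal sketch; the service url is a placeholder for your own backend):
import { DeepChat } from 'deep-chat-react';

// Minimal React usage; point the request url at your own chat service
export default function Chat() {
  return (
    <DeepChat
      style={{borderRadius: '10px'}}
      request={{url: 'https://your-service.com/chat'}}
    />
  );
}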
:zap: Connect
Connecting to a service is simple: all you need to do is define its API details using the request property:
<deep-chat request='{"url":"https://service.com/chat"}'/>
The service will need to be able to handle request and response formats used in Deep Chat. Please read the Connect section in documentation and check out the server template examples.
Alternatively, if you want to connect without changing the target service, use the interceptor properties to augment the transferred objects, or the handler function to control the request code.
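As an illustration, the interceptors can be assigned once the element is available (a sketch assuming that requestInterceptor receives and returns the outgoing request details and that responseInterceptor receives and returns the incoming response, as outlined in the Connect documentation; the header name and response field below are hypothetical):
// Grab the rendered <deep-chat> element
const chatElement = document.querySelector('deep-chat');

// Augment the outgoing request before it is sent to the service
chatElement.requestInterceptor = (requestDetails) => {
  requestDetails.headers = {...requestDetails.headers, 'x-tenant-id': 'demo'}; // hypothetical header
  return requestDetails;
};

// Reshape the service response into the {text: "..."} format Deep Chat renders
chatElement.responseInterceptor = (response) => {
  return {text: response.answer}; // assumes the service returns {answer: "..."}
};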
:electric_plug: Direct connection
Connect to popular AI APIs directly from the browser via the directConnection property:
<deep-chat directConnection='{"openAI":true}'/>
<deep-chat directConnection='{"openAI":{"key": "optional-key-here"}}'/>
Please note that this approach should be used for local/prototyping/demo purposes ONLY, as it exposes the API key to the browser. When ready to go live, please switch to the request property described above along with a proxy service.
Currently supported direct API connections: OpenAI, HuggingFace, Cohere, Stability AI, Azure, AssemblyAI
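When moving to production, the proxy mentioned above can be as small as a single route that receives Deep Chat's messages, forwards them to the provider and keeps the key on the server. Below is a sketch assuming an Express server, the OpenAI chat completions endpoint and Deep Chat's assumed default formats ({messages: [{role, text}]} in, {text: "..."} out); verify the shapes against the Connect documentation:
import express from 'express';

const app = express();
app.use(express.json());

app.post('/chat', async (req, res) => {
  // Map Deep Chat's assumed message format to OpenAI's (role 'ai' is assumed to map to 'assistant')
  const messages = req.body.messages.map((m) => ({
    role: m.role === 'ai' ? 'assistant' : m.role,
    content: m.text,
  }));
  const openAIRes = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`, // key stays on the server
    },
    body: JSON.stringify({model: 'gpt-4o-mini', messages}), // example model name
  });
  const result = await openAIRes.json();
  res.json({text: result.choices[0].message.content});
});

app.listen(8080);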
:robot: Web model
No servers and no connections: host an LLM entirely in your browser.
Simply add the deep-chat-web-llm module and define the webModel property:
<deep-chat webModel="true" />
:camera: :microphone: Camera and Microphone
Use Deep Chat to capture photos with your webcam and record audio with the microphone. You can enable this using the camera and microphone properties:
<deep-chat camera="true" microphone="true" ...other properties />
:microphone: :sound: Speech
https://github.com/OvidijusParsiunas/deep-chat/assets/18709577/e103a42e-b3a7-4449-b9db-73fed6d7876e
Input text with your voice using Speech To Text capabilities and have the responses read out to you with Text To Speech. You can enable this functionality via the speechToText and textToSpeech properties.
<deep-chat speechToText="true" textToSpeech="true" ...other properties />
:beginner: Examples
Check out live codepen examples for your UI framework/library of choice:
React | Vue 2 | Vue 3 | Svelte | SvelteKit | Angular | Solid | Next | Nuxt | VanillaJS
Setting up your own server has never been easier with the following server templates. From creating your own service to establishing proxies for other APIs such as OpenAI, everything has been documented with clear examples to get you up and running in seconds:
Express | Nest | Flask | Spring | Go | SvelteKit | Next
All examples are ready to be deployed on a hosting platform such as Vercel.
:tv: Tutorials
Demo videos are available on YouTube.
:joystick: Playground
Create, configure and use Deep Chat components without writing any code in the official Playground!
:tada: Update - components can now be stretched to full screen dimensions using the new Expanded View.
:star2: Sponsors
Thank you to our generous sponsors!
matthiasamberg dorra techpeace aquarius-wing
:heart: Contributions
Open source is built by the community for the community. All contributions to this project are welcome!
Additionally, if you have any suggestions for enhancements, ideas on how to take the project further or have discovered a bug, do not hesitate to create a new issue ticket and we will look into it as soon as possible!