AI-Yash / st-chat

Streamlit component for a chatbot UI


Top Related Projects

  • chatbot-ui: AI chat for any model.
  • BotBuilder-Samples: task-focused Bot Framework SDK samples in C#, JavaScript/TypeScript, and Python.
  • botpress: the open-source hub to build & deploy GPT/LLM Agents ⚡️
  • transformers: 🤗 state-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Quick Overview

st-chat is a Streamlit component that provides a chat interface for building conversational AI applications. It offers a simple way to create chat-based user interfaces within Streamlit apps, making it easier to develop and deploy chatbots, conversational agents, and other interactive AI systems.

Pros

  • Easy integration with Streamlit applications
  • Customizable chat interface with various styling options
  • Supports both user and AI messages
  • Includes features like message avatars and timestamps

Cons

  • Limited documentation and examples
  • Dependency on Streamlit framework
  • May require additional backend logic for complex chatbot functionality
  • Not suitable for standalone chat applications outside of Streamlit
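On the backend-logic point: st-chat only renders chat bubbles, so response generation is code you supply yourself. A minimal sketch of such a responder as plain Python (the function name and rules are illustrative, not part of st-chat):

```python
import re

def generate_reply(user_text, history):
    """Toy rule-based responder; a real app would call an LLM or API here.

    `history` is a list of {"content": str, "is_user": bool} dicts,
    the same shape the examples below build for st-chat's message() calls.
    """
    # Word-level matching avoids false substring hits like "hi" in "something"
    words = set(re.findall(r"[a-z']+", user_text.lower()))
    if words & {"price", "cost", "pricing"}:
        return "You can find the full details on our pricing page."
    if words & {"hello", "hi", "hey"}:
        return "Hello! How can I assist you today?"
    # Fall back to echoing so the conversation always gets a reply
    return f"You said: {user_text!r}. Could you tell me more?"

# Inside a Streamlit app this would be wired up roughly as:
#   reply = generate_reply(user_input, st.session_state.messages)
#   message(reply, is_user=False)
```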

Code Examples

  1. Basic chat interface setup:
import streamlit as st
from streamlit_chat import message

st.title("Simple Chat Interface")

message("Hello! How can I assist you today?", is_user=False)
message("I have a question about your services.", is_user=True)
  2. Adding user input and displaying chat history:
import streamlit as st
from streamlit_chat import message

st.title("Interactive Chat")

# Initialize chat history
if "messages" not in st.session_state:
    st.session_state.messages = []

# Display chat history
for i, msg in enumerate(st.session_state.messages):
    message(msg["content"], is_user=msg["is_user"], key=f"msg_{i}")

# User input; a Send button prevents re-appending the same text on every rerun
user_input = st.text_input("Your message:")
if st.button("Send") and user_input:
    st.session_state.messages.append({"content": user_input, "is_user": True})
    # Add AI response logic here
    ai_response = "Thank you for your message. How can I help you further?"
    st.session_state.messages.append({"content": ai_response, "is_user": False})
    st.rerun()  # st.experimental_rerun() on older Streamlit versions
  3. Customizing message appearance:
import streamlit as st
from streamlit_chat import message

st.title("Styled Chat Messages")

message("Welcome to our service!", is_user=False, avatar_style="bottts", seed="Aneka")
message("Thanks! I'm excited to try it out.", is_user=True, avatar_style="human")
message("Great! Let me know if you have any questions.", is_user=False, avatar_style="bottts", seed="Aneka")

Getting Started

To use st-chat in your Streamlit application:

  1. Install the package:

    pip install streamlit-chat
    
  2. Import and use in your Streamlit app:

    import streamlit as st
    from streamlit_chat import message
    
    st.title("My Chat Application")
    message("Hello, how can I help you?", is_user=False)
    user_input = st.text_input("Your message:")
    if user_input:
        message(user_input, is_user=True)
        # Add your chatbot logic here
    
  3. Run your Streamlit app:

    streamlit run your_app.py
    

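The quick-start above (and code example 2 earlier) keeps the conversation in st.session_state. The same bookkeeping can be sketched as a plain class (a hypothetical helper, not part of st-chat) so the history logic can be unit-tested without running Streamlit:

```python
class ChatHistory:
    """Plain-Python mirror of the st.session_state.messages pattern
    used in the examples above (hypothetical helper, not part of st-chat)."""

    def __init__(self):
        self.messages = []  # list of {"content": str, "is_user": bool}

    def add_user(self, content):
        self.messages.append({"content": content, "is_user": True})

    def add_bot(self, content):
        self.messages.append({"content": content, "is_user": False})

    def render(self, message_fn):
        # In an app, pass streamlit_chat.message as message_fn; each call
        # gets a unique key so Streamlit does not report duplicate widgets.
        for i, msg in enumerate(self.messages):
            message_fn(msg["content"], is_user=msg["is_user"], key=f"msg_{i}")

history = ChatHistory()
history.add_user("I have a question about your services.")
history.add_bot("Sure, what would you like to know?")
```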
Competitor Comparisons

AI chat for any model.

Pros of chatbot-ui

  • More feature-rich UI with conversation history, settings, and API key management
  • Supports multiple chat models and providers (OpenAI, Anthropic, etc.)
  • Built with Next.js, offering better performance and SEO capabilities

Cons of chatbot-ui

  • More complex setup and configuration required
  • Heavier resource usage due to its comprehensive feature set
  • Steeper learning curve for developers new to Next.js or React

Code Comparison

st-chat (Python):

import openai  # pre-1.0 OpenAI SDK, matching the API call below
import streamlit as st
from streamlit_chat import message

def chat(prompt, history):
    message(prompt, is_user=True)
    response = openai.ChatCompletion.create(
        model=st.session_state["openai_model"],
        messages=[
            {"role": "system", "content": "You are a helpful assistant."},
            *history,
            {"role": "user", "content": prompt},
        ],
    )
    # streamlit_chat's message() renders a complete bubble, so the reply
    # appears once the API call finishes (no token-by-token streaming)
    full_response = response.choices[0].message["content"]
    message(full_response)
    return full_response

chatbot-ui (TypeScript):

export const ChatHandler = async (
  message: Message,
  chatMessages: Message[],
  prompt: string,
  model: OpenAIModel
) => {
  let apiKey = process.env.OPENAI_API_KEY;

  if (!apiKey) {
    return {
      error:
        "Missing API key. Please set the OPENAI_API_KEY environment variable.",
    };
  }

  try {
    const response = await fetch("https://api.openai.com/v1/chat/completions", {
      method: "POST",
      headers: {
        "Content-Type": "application/json",
        Authorization: `Bearer ${apiKey}`,
      },
      body: JSON.stringify({
        model: model.id,
        messages: [
          {
            role: "system",
            content: prompt,
          },
          ...chatMessages,
          {
            role: "user",
            content: message.content,
          },
        ],
      }),
    });

    const data = await response.json();

    return { data };
  } catch (error) {
    console.error("Error:", error);
    return { error: "An error occurred while fetching the data." };
  }
};

Welcome to the Bot Framework samples repository. Here you will find task-focused samples in C#, JavaScript/TypeScript, and Python to help you get started with the Bot Framework SDK!

Pros of BotBuilder-Samples

  • Comprehensive collection of bot-building examples across multiple programming languages and frameworks
  • Extensive documentation and integration with Microsoft Azure services
  • Supports complex conversational flows and multi-turn dialogs

Cons of BotBuilder-Samples

  • Steeper learning curve due to its extensive feature set
  • Primarily focused on Microsoft's ecosystem, which may limit flexibility for some developers
  • Requires more setup and configuration compared to simpler chatbot solutions

Code Comparison

st-chat:

import streamlit as st
from streamlit_chat import message

st.title("Simple Streamlit Chat")
message("Hello, how can I help you?")

BotBuilder-Samples (JavaScript):

const { ActivityHandler, MessageFactory } = require('botbuilder');

class EchoBot extends ActivityHandler {
    constructor() {
        super();
        this.onMessage(async (context, next) => {
            await context.sendActivity(MessageFactory.text(`You said: ${context.activity.text}`));
            await next();
        });
    }
}

The st-chat example demonstrates a simple chat interface using Streamlit, while the BotBuilder-Samples code shows a more complex bot structure with message handling and activity management.

The open-source hub to build & deploy GPT/LLM Agents ⚡️

Pros of botpress

  • More comprehensive chatbot development platform with visual flow builder
  • Supports multiple channels (web, Slack, Facebook Messenger, etc.)
  • Extensive documentation and community support

Cons of botpress

  • Steeper learning curve due to more complex features
  • Requires more setup and configuration
  • Heavier resource usage for full functionality

Code Comparison

st-chat (Python):

import streamlit as st
from streamlit_chat import message

def chat():
    st.title("Chatbot")
    # Chat implementation

botpress (JavaScript):

const botpress = require('botpress')

botpress({
  botfile: '<path to botfile.js>',
}).start()

st-chat is a lightweight Streamlit-based chat interface, while botpress is a full-featured chatbot development platform. st-chat is easier to set up and integrate into existing Streamlit applications, making it ideal for quick prototypes or simple chat interfaces. botpress offers more advanced features like natural language understanding, multi-channel support, and a visual flow builder, but requires more setup and resources. Choose st-chat for simple Streamlit-based chat applications, and botpress for more complex, production-ready chatbot solutions across multiple platforms.

🤗 Transformers: State-of-the-art Machine Learning for PyTorch, TensorFlow, and JAX.

Pros of transformers

  • Comprehensive library with support for numerous state-of-the-art models
  • Extensive documentation and community support
  • Regularly updated with new models and features

Cons of transformers

  • Steeper learning curve due to its extensive functionality
  • Larger library size, which may impact project load times
  • May be overkill for simple chatbot applications

Code Comparison

transformers:

from transformers import pipeline

classifier = pipeline("sentiment-analysis")
result = classifier("I love this product!")[0]
print(f"Label: {result['label']}, Score: {result['score']:.4f}")

st-chat:

import streamlit as st
from streamlit_chat import message

message("How can I help you today?", is_user=False)
user_input = st.text_input("Your message: ", key="user_input")

Summary

transformers is a powerful and versatile library for working with various NLP models, offering extensive functionality and support. However, it may be more complex and resource-intensive than necessary for simple chatbot applications. st-chat, on the other hand, focuses specifically on creating chat interfaces using Streamlit, making it more straightforward for basic chatbot implementations but potentially limiting for more advanced NLP tasks.
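For the middle ground this summary describes, one option is to let a transformers pipeline produce the reply and st-chat render it. A sketch under those assumptions (the classifier argument mimics pipeline("sentiment-analysis"); the keyword fallback is invented so the helper also runs without transformers installed):

```python
def sentiment_reply(text, classifier=None):
    """Turn a sentiment-analysis result into a chat reply.

    `classifier` should behave like transformers' pipeline("sentiment-analysis"):
    a callable returning [{"label": ..., "score": ...}]. The keyword fallback
    below is a stand-in so this sketch runs without transformers installed.
    """
    if classifier is not None:
        result = classifier(text)[0]
    else:
        positive = any(w in text.lower() for w in ("love", "great", "thanks", "good"))
        result = {"label": "POSITIVE" if positive else "NEGATIVE", "score": 1.0}
    return f"That sounds {result['label'].lower()} (score {result['score']:.2f})."

# In a Streamlit app, st-chat would render the reply:
#   from transformers import pipeline
#   classifier = pipeline("sentiment-analysis")
#   message(sentiment_reply(user_input, classifier), is_user=False)
```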

README

st-chat

A Streamlit component for a chatbot UI, with an example app

Authors: @yashppawar & @YashVardhan-AI

Installation

Install streamlit-chat with pip

pip install streamlit-chat 

Usage: import the message function from streamlit_chat

import streamlit as st
from streamlit_chat import message

message("My message") 
message("Hello bot!", is_user=True)  # aligns the message to the right

Screenshot

chatbot-og

Another example, showing HTML in chat messages and a button to clear the chat

import streamlit as st
from streamlit_chat import message
from streamlit.components.v1 import html

def on_input_change():
    user_input = st.session_state.user_input
    st.session_state.past.append(user_input)
    st.session_state.generated.append("The messages from Bot\nWith new line")

def on_btn_click():
    del st.session_state.past[:]
    del st.session_state.generated[:]

audio_path = "https://docs.google.com/uc?export=open&id=16QSvoLWNxeqco_Wb2JvzaReSAw5ow6Cl"
img_path = "https://www.groundzeroweb.com/wp-content/uploads/2017/05/Funny-Cat-Memes-11.jpg"
youtube_embed = '''
<iframe width="400" height="215" src="https://www.youtube.com/embed/LMQ5Gauy17k" title="YouTube video player" frameborder="0" allow="accelerometer; encrypted-media;"></iframe>
'''

markdown = """
### HTML in markdown is ~quite~ **unsafe**
<blockquote>
  However, if you are in a trusted environment (you trust the markdown), you can use the allow_html prop to enable support for HTML.
</blockquote>

* Lists
* [ ] todo
* [x] done

Math:

Lift($L$) can be determined by Lift Coefficient ($C_L$) like the following
equation.

$$
L = \\frac{1}{2} \\rho v^2 S C_L
$$

~~~py
import streamlit as st

st.write("Python code block")
~~~

~~~js
console.log("Here is some JavaScript code")
~~~

"""

table_markdown = '''
A Table:

| Feature     | Support              |
| ----------: | :------------------- |
| CommonMark  | 100%                 |
| GFM         | 100% w/ `remark-gfm` |
'''

st.session_state.setdefault(
    'past', 
    ['plain text with line break',
     'play the song "Dancing Vegetables"', 
     'show me image of cat', 
     'and video of it',
     'show me some markdown sample',
     'table in markdown']
)
st.session_state.setdefault(
    'generated', 
    [{'type': 'normal', 'data': 'Line 1 \n Line 2 \n Line 3'},
     {'type': 'normal', 'data': f'<audio controls src="{audio_path}"></audio>'}, 
     {'type': 'normal', 'data': f'<img width="100%" height="200" src="{img_path}"/>'}, 
     {'type': 'normal', 'data': f'{youtube_embed}'},
     {'type': 'normal', 'data': f'{markdown}'},
     {'type': 'table', 'data': f'{table_markdown}'}]
)

st.title("Chat placeholder")

chat_placeholder = st.empty()

with chat_placeholder.container():    
    for i in range(len(st.session_state['generated'])):                
        message(st.session_state['past'][i], is_user=True, key=f"{i}_user")
        message(
            st.session_state['generated'][i]['data'], 
            key=f"{i}", 
            allow_html=True,
            is_table=(st.session_state['generated'][i]['type'] == 'table')
        )
    
    st.button("Clear message", on_click=on_btn_click)

with st.container():
    st.text_input("User Input:", on_change=on_input_change, key="user_input")

Screenshot

chatbot-markdown-sp