
microsoft / azurechat

🤖 💼 Azure Chat Solution Accelerator powered by Azure Open AI Service


Top Related Projects

  • openai/chatgpt-retrieval-plugin: The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.
  • langchain-ai/langchain: 🦜🔗 Build context-aware reasoning applications
  • microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
  • microsoft/promptflow: Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.
  • lm-sys/FastChat: An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.

Quick Overview

Azure Chat is an open-source enterprise-grade application that enables organizations to create their own ChatGPT-like experiences using their private data. It leverages Azure OpenAI and Cognitive Search to provide a secure, customizable chatbot solution that can be deployed on Azure.

Pros

  • Customizable and secure chatbot solution for enterprises
  • Utilizes Azure OpenAI and Cognitive Search for powerful AI capabilities
  • Supports multiple data sources and file types for knowledge base creation
  • Provides a user-friendly interface for both end-users and administrators

Cons

  • Requires Azure subscription and associated costs
  • Limited to Azure ecosystem, which may not be suitable for all organizations
  • Potential complexity in setup and configuration for non-technical users
  • Dependency on Azure services may lead to vendor lock-in

Getting Started

To get started with Azure Chat:

  1. Clone the repository:

    git clone https://github.com/microsoft/azurechat.git
    
  2. Set up Azure resources:

    • Azure OpenAI service
    • Azure Cognitive Search
    • Azure App Service
    • Azure Storage Account
  3. Configure environment variables:

    • Copy .env.example to .env.local
    • Fill in the required Azure service details
  4. Install dependencies and run the application:

    npm install
    npm run dev
    
  5. Access the application at http://localhost:3000

For detailed instructions, refer to the project's README and documentation.
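Step 3 above (copying .env.example to .env.local and filling in service details) is the step most likely to go wrong silently. A small pre-flight check can surface missing values before you start the dev server. Note that the variable names below are illustrative placeholders, not the accelerator's actual list; consult .env.example for the real names:

```python
import os

# Illustrative variable names only -- see .env.example for the real list.
REQUIRED_VARS = [
    "AZURE_OPENAI_API_KEY",
    "AZURE_OPENAI_API_INSTANCE_NAME",
    "AZURE_OPENAI_API_DEPLOYMENT_NAME",
]

def missing_vars(env=os.environ):
    """Return the required variables that are absent or empty in the given mapping."""
    return [name for name in REQUIRED_VARS if not env.get(name)]
```

Running a check like this before npm run dev turns a cryptic runtime failure into an explicit list of unset variables.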

Competitor Comparisons

openai/chatgpt-retrieval-plugin: The ChatGPT Retrieval Plugin lets you easily find personal or work documents by asking questions in natural language.

Pros of chatgpt-retrieval-plugin

  • More flexible and customizable for various data sources and retrieval methods
  • Supports multiple vector database options (Pinecone, Weaviate, Zilliz, etc.)
  • Can be integrated with existing ChatGPT applications

Cons of chatgpt-retrieval-plugin

  • Requires more setup and configuration
  • Less user-friendly for non-technical users
  • Limited built-in features compared to AzureChat's comprehensive solution

Code Comparison

chatgpt-retrieval-plugin:

import os

from datastore.providers.pinecone_datastore import PineconeDatastore

# Configure the Pinecone-backed datastore from environment variables.
datastore = PineconeDatastore(
    api_key=os.getenv("PINECONE_API_KEY"),
    environment=os.getenv("PINECONE_ENVIRONMENT"),
    index_name=os.getenv("PINECONE_INDEX"),
)

AzureChat:

var chatCompletion = await openAIClient.GetChatCompletionsAsync(
    deploymentOrModelName: "gpt-35-turbo",
    new ChatCompletionsOptions()
    {
        Messages = { systemMessage, userMessage },
    });

The code snippets highlight the different approaches:

  • chatgpt-retrieval-plugin focuses on configuring the datastore provider
  • AzureChat emphasizes simplicity in making API calls to Azure OpenAI
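Whatever the surrounding plumbing, both approaches ultimately hand a list of role-tagged messages to a chat-completion endpoint. A minimal sketch of assembling that payload (the function name and shape here are illustrative, not taken from either codebase):

```python
def build_messages(system_prompt, history, user_input):
    """Assemble the role-tagged message list a chat-completion API expects."""
    messages = [{"role": "system", "content": system_prompt}]
    messages.extend(history)  # prior {"role": ..., "content": ...} turns
    messages.append({"role": "user", "content": user_input})
    return messages
```

For example, build_messages("You are a helpful assistant.", [], "Hello") yields a two-message payload ready for either client shown above.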

langchain-ai/langchain: 🦜🔗 Build context-aware reasoning applications

Pros of langchain

  • More versatile and flexible, supporting multiple LLMs and integrations
  • Extensive documentation and active community support
  • Provides a higher level of abstraction for complex AI workflows

Cons of langchain

  • Steeper learning curve due to its broader scope
  • May be overkill for simple chatbot applications
  • Requires more setup and configuration compared to AzureChat

Code Comparison

AzureChat (TypeScript):

const chat = new AzureChatAPI(apiKey, endpoint);
const response = await chat.sendMessage("Hello, how are you?");
console.log(response.content);

langchain (Python):

from langchain import OpenAI, LLMChain, PromptTemplate

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(
    input_variables=["product"],
    template="What is a good name for a company that makes {product}?",
)
chain = LLMChain(llm=llm, prompt=prompt)

print(chain.run("colorful socks"))

AzureChat is more focused on Azure-specific implementations and provides a simpler interface for chatbot applications. langchain offers a broader range of capabilities and integrations, making it suitable for more complex AI workflows across various platforms and use cases.

microsoft/semantic-kernel: Integrate cutting-edge LLM technology quickly and easily into your apps

Pros of semantic-kernel

  • More versatile and can be used for a wider range of AI-powered applications
  • Offers a robust plugin system for extending functionality
  • Provides deeper integration with various AI models and services

Cons of semantic-kernel

  • Steeper learning curve due to its more complex architecture
  • Requires more setup and configuration compared to AzureChat
  • May be overkill for simple chatbot applications

Code Comparison

semantic-kernel:

using Microsoft.SemanticKernel;

var kernel = Kernel.Builder.Build();
var result = await kernel.RunAsync("Hello, world!");
Console.WriteLine(result);

AzureChat:

import { AzureChatBot } from '@azure/bot-service';

const bot = new AzureChatBot(config);
bot.onMessage(async (context) => {
  await context.sendActivity('Hello, world!');
});

The code snippets demonstrate that semantic-kernel offers a more flexible approach for building AI-powered applications, while AzureChat provides a simpler, more focused solution for creating chatbots. semantic-kernel's code shows its kernel-based architecture, allowing for more complex operations, whereas AzureChat's code is specifically tailored for bot interactions.

microsoft/promptflow: Build high-quality LLM apps - from prototyping, testing to production deployment and monitoring.

Pros of PromptFlow

  • More versatile, supporting various AI/ML workflows beyond just chat applications
  • Offers a visual interface for designing and managing complex prompt flows
  • Provides integration with Azure AI services and supports multiple LLM providers

Cons of PromptFlow

  • Steeper learning curve due to its broader scope and more complex features
  • May be overkill for simple chat applications or prototypes
  • Requires more setup and configuration compared to AzureChat

Code Comparison

AzureChat:

const chat = new AzureChatService(config);
const response = await chat.sendMessage(userInput);

PromptFlow:

from promptflow import PFClient

client = PFClient()
flow = client.flows.create_or_update(source="./flow")
result = client.test(flow=flow, inputs={"text": "Hello"})

Summary

PromptFlow is a more comprehensive tool for building AI-powered applications, offering greater flexibility and integration options. However, it may be more complex to set up and use compared to AzureChat, which is specifically designed for chat applications. AzureChat provides a simpler, more focused solution for building chat interfaces, while PromptFlow caters to a wider range of AI/ML use cases and workflows.


lm-sys/FastChat: An open platform for training, serving, and evaluating large language models. Release repo for Vicuna and Chatbot Arena.

Pros of FastChat

  • More versatile, supporting multiple LLM models (ChatGPT, Vicuna, Alpaca, etc.)
  • Offers a web UI, CLI, and API for interacting with models
  • Provides tools for model evaluation and fine-tuning

Cons of FastChat

  • Requires more setup and configuration compared to AzureChat
  • Less integrated with Azure services and ecosystem
  • May require more technical expertise to deploy and manage

Code Comparison

FastChat example (model loading):

from fastchat.model import load_model

model, tokenizer = load_model("vicuna-7b")

AzureChat example (chat completion):

from azure.ai.ml import MLClient
from azure.ai.ml.entities import AmlCompute

ml_client = MLClient.from_config()
chat_deployment = ml_client.online_endpoints.get("chat-endpoint")

Summary

FastChat offers more flexibility and supports multiple LLM models, making it suitable for research and experimentation. AzureChat, on the other hand, provides a more streamlined experience within the Azure ecosystem, making it easier to deploy and integrate with other Azure services. The choice between the two depends on specific project requirements and the desired level of customization.


README

Unleash the Power of Azure Open AI

  1. Introduction
  2. Solution Overview
  3. Deploy to Azure
  4. Run from your local machine
  5. Deploy to Azure with GitHub Actions
  6. Add identity provider
  7. Chatting with your file
  8. Persona
  9. Extensions
  10. Environment variables
  11. Migration considerations

Introduction

Azure Chat Solution Accelerator powered by Azure Open AI Service

Azure Chat Solution Accelerator powered by Azure Open AI Service is a solution accelerator that allows organisations to deploy a private chat tenant in their Azure subscription, with a familiar user experience and the added capability of chatting over your own data and files.

Benefits are:

  1. Private: Deployed in your Azure tenancy, so the deployment and its data stay isolated within your tenant.

  2. Controlled: Network traffic can be fully isolated to your network, and enterprise-grade authentication and security features are built in.

  3. Value: Deliver added business value with your own internal data sources (plug and play) or integrate with your internal services (e.g., ServiceNow, etc).
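The "chat over your data" benefit follows the usual retrieve-then-prompt pattern: passages relevant to the question are looked up first (in the accelerator's case via an Azure search index) and prepended to the prompt as grounding context. A toy sketch of that pattern, substituting naive keyword overlap for a real search service:

```python
def top_chunks(question, chunks, k=2):
    """Rank text chunks by naive keyword overlap with the question."""
    q_words = set(question.lower().split())
    return sorted(
        chunks,
        key=lambda c: len(q_words & set(c.lower().split())),
        reverse=True,
    )[:k]

def build_prompt(question, chunks, k=2):
    """Prepend the best-matching chunks as grounding context for the model."""
    context = "\n".join(top_chunks(question, chunks, k))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"
```

A production system swaps the overlap score for a vector or hybrid search query, but the prompt-assembly step is essentially this.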

Deploy to Azure

You can provision Azure resources for the solution accelerator using either the Azure Developer CLI or the Deploy to Azure button below. Regardless of the method you choose, you will still need to set up an identity provider and specify an admin user.

Deployment Options

You can deploy the application using one of the following options:

1. Azure Developer CLI

[!IMPORTANT] This section will create Azure resources and deploy the solution from your local environment using the Azure Developer CLI. Note that you do not need to clone this repo to complete these steps.

  1. Download the Azure Developer CLI
  2. If you have not cloned this repo, run azd init -t microsoft/azurechat. If you have cloned it, run azd init from the repo root directory.
  3. Run azd up to provision and deploy the application
azd init -t microsoft/azurechat
azd up

# to see verbose logs, run with the debug flag
azd up --debug

2. Azure Portal Deployment

[!WARNING] This button will only create Azure resources. You will still need to build and deploy the application by following the Deploy to Azure with GitHub Actions section.

Click on the Deploy to Azure button to deploy the Azure resources for the application.

Deploy to Azure

[!IMPORTANT] The application is protected by an identity provider; follow the steps in the Add an identity provider section to add authentication to your app.


Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.opensource.microsoft.com.

When you submit a pull request, a CLA bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., status check, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

Trademarks

This project may contain trademarks or logos for projects, products, or services. Authorized use of Microsoft trademarks or logos is subject to and must follow Microsoft's Trademark & Brand Guidelines. Use of Microsoft trademarks or logos in modified versions of this project must not cause confusion or imply Microsoft sponsorship. Any use of third-party trademarks or logos is subject to those third parties' policies.