casibase
⚡️AI Cloud OS: Open-source enterprise-level AI knowledge base and MCP (Model Context Protocol) / A2A (agent-to-agent) management platform with admin UI, user management, and Single Sign-On⚡️. Supports ChatGPT, Claude, Llama, Ollama, HuggingFace, etc. Chat bot demo: https://ai.casibase.com, admin UI demo: https://ai-admin.casibase.com
Top Related Projects
Integrate cutting-edge LLM technology quickly and easily into your apps
🦜🔗 Build context-aware reasoning applications
the AI-native open-source embedding database
LlamaIndex is the leading framework for building LLM-powered agents over your data.
Quick Overview
Casibase is an open-source AI knowledge database and chatbot platform. It combines the power of language models with a customizable knowledge base, allowing users to create AI assistants tailored to specific domains or use cases. Casibase supports multiple language models and offers features like semantic search and multi-turn conversations.
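Multi-turn conversation, mentioned above, just means the client keeps the full message history and submits it with every request so the model can see earlier turns. A minimal framework-free sketch of that state handling (the `Conversation` class and `stub_model` are illustrative, not Casibase's API):

```python
# Minimal sketch of multi-turn chat state: the client accumulates the
# message history and passes it to the model (here a stub) each turn.

def stub_model(messages):
    # Placeholder "model" that just reports how many user turns it has seen;
    # real code would send `messages` to an LLM backend.
    user_turns = [m for m in messages if m["role"] == "user"]
    return f"reply #{len(user_turns)}"

class Conversation:
    def __init__(self):
        self.messages = []

    def ask(self, text):
        self.messages.append({"role": "user", "content": text})
        answer = stub_model(self.messages)
        self.messages.append({"role": "assistant", "content": answer})
        return answer

conv = Conversation()
print(conv.ask("Hello"))      # reply #1
print(conv.ask("Follow-up"))  # reply #2 -- earlier turns carried over
```

The key design point is that the server stays stateless per request; all conversational context travels in the `messages` list.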
Pros
- Customizable knowledge base for domain-specific AI assistants
- Support for multiple language models, including OpenAI, Azure, and Hugging Face
- User-friendly web interface for managing the knowledge base and interacting with chatbots
- Open-source and self-hostable, providing greater control and privacy
Cons
- Requires technical knowledge to set up and configure
- Limited documentation compared to some commercial alternatives
- May require significant resources for hosting and running language models locally
- Still in active development, which may lead to frequent changes and potential instability
Getting Started
To get started with Casibase, follow these steps:
1. Clone the repository: `git clone https://github.com/casibase/casibase.git`
2. Navigate to the project directory: `cd casibase`
3. Install dependencies: `go mod tidy`
4. Build the project: `go build`
5. Run Casibase: `./casibase`
6. Access the web interface at http://localhost:14000 and follow the setup wizard to configure your instance.
For more detailed instructions and configuration options, refer to the project's README and documentation on GitHub.
Competitor Comparisons
Semantic Kernel: Integrate cutting-edge LLM technology quickly and easily into your apps
Pros of Semantic Kernel
- More extensive documentation and examples
- Larger community and support from Microsoft
- Broader language support (C#, Python, Java)
Cons of Semantic Kernel
- More complex architecture and setup
- Steeper learning curve for beginners
- Less focus on low-code/no-code solutions
Code Comparison
Semantic Kernel (C#):
var kernel = Kernel.Builder.Build();
var promptTemplate = "{{$input}}";
var function = kernel.CreateSemanticFunction(promptTemplate);
var result = await kernel.RunAsync("Hello, world!", function);
Casibase (JavaScript):
const casibase = new Casibase();
const result = await casibase.chat({
messages: [{ role: "user", content: "Hello, world!" }],
});
Summary
Semantic Kernel offers a more comprehensive framework with extensive documentation and multi-language support, backed by Microsoft. However, it may be more complex for beginners. Casibase provides a simpler, more straightforward approach, particularly suitable for JavaScript developers and those seeking a low-code solution. The code comparison demonstrates the difference in complexity and setup between the two projects.
LangChain: 🦜🔗 Build context-aware reasoning applications
Pros of LangChain
- More extensive and mature ecosystem with a wider range of integrations
- Larger community and better documentation, making it easier for developers to get started and find support
- Offers more flexibility and customization options for building complex AI applications
Cons of LangChain
- Steeper learning curve due to its extensive features and abstractions
- Can be overkill for simpler projects, potentially adding unnecessary complexity
- Requires more setup and configuration compared to Casibase's out-of-the-box approach
Code Comparison
LangChain example:
from langchain import OpenAI, LLMChain, PromptTemplate
llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(input_variables=["product"], template="What is a good name for a company that makes {product}?")
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("colorful socks"))
Casibase example:
const casibase = require('casibase');
const client = new casibase.Client('YOUR_API_KEY');
const response = await client.chat('What is a good name for a company that makes colorful socks?');
console.log(response);
The code comparison shows that LangChain offers more granular control over the AI model and prompt structure, while Casibase provides a simpler, more straightforward API for quick implementation.
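The "granular control" in the LangChain snippet comes largely from separating the prompt template from the model call. Stripped of the framework, the pattern is just string substitution plus a pluggable model function; a hedged sketch of that shape (not either library's API, and `fake_llm` is a placeholder):

```python
# Framework-free sketch of the PromptTemplate -> chain pattern:
# a template with named variables, and a swappable model function.

def make_chain(template, llm):
    def run(**variables):
        prompt = template.format(**variables)
        return llm(prompt)
    return run

def fake_llm(prompt):
    # Placeholder model: real code would call an LLM API here.
    return f"LLM saw: {prompt}"

chain = make_chain(
    "What is a good name for a company that makes {product}?", fake_llm
)
print(chain(product="colorful socks"))
```

Frameworks like LangChain layer validation, memory, and model integrations on top of this core, which is where both their power and their learning curve come from.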
Chroma: the AI-native open-source embedding database
Pros of Chroma
- More mature project with higher GitHub stars and contributors
- Extensive documentation and examples for various use cases
- Strong focus on vector database functionality and embeddings
Cons of Chroma
- Limited to Python ecosystem, less versatile for multi-language projects
- Requires more setup and configuration compared to Casibase's all-in-one approach
- Less emphasis on UI and visualization tools
Code Comparison
Chroma example:
import chromadb
client = chromadb.Client()
collection = client.create_collection("my_collection")
collection.add(
documents=["This is a document", "This is another document"],
metadatas=[{"source": "my_source"}, {"source": "my_source"}],
ids=["id1", "id2"]
)
Casibase example:
import "github.com/casibase/casibase/object"
obj := &object.Object{
Owner: "admin",
Name: "My Object",
CreatedTime: util.GetCurrentTime(),
Data: "This is a document",
}
object.AddObject(obj)
Both repositories offer solutions for managing and querying data, but Chroma focuses more on vector databases and embeddings, while Casibase provides a broader set of features including authentication, visualization, and multi-language support. Chroma may be better suited for specialized machine learning projects, while Casibase offers a more comprehensive solution for general-purpose data management and analysis.
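What an embedding database like Chroma does at query time can be sketched without any library: store one vector per document and return the nearest by cosine similarity. This is an illustration only; real systems use learned embeddings and approximate-nearest-neighbour indexes, and the vectors below are made up:

```python
import math

# Toy nearest-neighbour search over document vectors: the core query
# operation of an embedding database, minus learned embeddings and
# approximate-index structures.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

docs = {
    "id1": ([1.0, 0.0, 0.2], "This is a document"),
    "id2": ([0.1, 1.0, 0.0], "This is another document"),
}

def query(vector):
    # Return the id and text of the document whose vector is most similar.
    best_id = max(docs, key=lambda i: cosine(vector, docs[i][0]))
    return best_id, docs[best_id][1]

print(query([0.9, 0.1, 0.1]))  # nearest to id1
```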
LlamaIndex is the leading framework for building LLM-powered agents over your data.
Pros of LlamaIndex
- More extensive documentation and examples, making it easier for developers to get started
- Broader range of integrations with popular LLM models and data sources
- Active community with frequent updates and contributions
Cons of LlamaIndex
- Steeper learning curve due to more complex architecture and features
- Potentially higher resource requirements for large-scale applications
- Less focus on UI components, requiring more custom development for user interfaces
Code Comparison
Casibase (Python):
from casibase import Casibase
cb = Casibase()
cb.add_document("doc1.txt", "This is a sample document.")
results = cb.search("sample")
LlamaIndex (Python):
from llama_index import GPTSimpleVectorIndex, Document
documents = [Document("This is a sample document.")]
index = GPTSimpleVectorIndex.from_documents(documents)
response = index.query("sample")
Both libraries provide simple ways to index and search documents, but LlamaIndex offers more advanced querying capabilities and integration with various LLM models.
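Both snippets above follow the same add-documents-then-query shape. A dependency-free sketch of that shape (a naive keyword index for illustration, not either project's implementation):

```python
# Naive keyword index illustrating the add-document / search flow shared
# by both libraries. Real indexes rank by embeddings; this one just
# matches lowercase substrings.

class KeywordIndex:
    def __init__(self):
        self.docs = {}

    def add_document(self, doc_id, text):
        self.docs[doc_id] = text

    def search(self, term):
        term = term.lower()
        return [d for d, text in self.docs.items() if term in text.lower()]

idx = KeywordIndex()
idx.add_document("doc1.txt", "This is a sample document.")
idx.add_document("doc2.txt", "Unrelated notes.")
print(idx.search("sample"))  # ['doc1.txt']
```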
README
Casibase
Open-source AI LangChain-like RAG (Retrieval-Augmented Generation) knowledge database with web UI and Enterprise SSO, supports OpenAI, Azure, LLaMA, Google Gemini, HuggingFace, Claude, Grok, etc.
Online Demo
Read-only site (any modification operation will fail)
- Chat bot: https://ai.casibase.com
- Admin UI: https://ai-admin.casibase.com
Writable site (original data is restored every 5 minutes)
- Chat bot: https://demo.casibase.com
- Admin UI: https://demo-admin.casibase.com
Documentation
Architecture
Casibase consists of two parts:
Name | Description | Language |
---|---|---|
Frontend | User interface for Casibase | JavaScript + React |
Backend | Server-side logic and API for Casibase | Golang + Beego + Python + Flask + MySQL |
Supported Models
Language Model
Model | Sub Type | Link |
---|---|---|
OpenAI | dall-e-3, gpt-3.5-turbo-0125, gpt-3.5-turbo, gpt-3.5-turbo-1106, gpt-3.5-turbo-instruct, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo-16k, gpt-4-0125-preview, gpt-4-1106-preview, gpt-4-turbo-preview, gpt-4-vision-preview, gpt-4-1106-vision-preview, gpt-4, gpt-4-0613, gpt-4-32k, gpt-4-32k-0613, gpt-4o, gpt-4o-2024-05-13, gpt-4o-mini, gpt-4o-mini-2024-07-18 | OpenAI |
Claude | claude-2.0, claude-2.1, claude-instant-1.2, claude-3-sonnet-20240229, claude-3-opus-20240229, claude-3-haiku-20240307 | Claude |
Local | custom-model | Local Computer |
DeepSeek | deepseek-chat, deepseek-reasoner | DeepSeek |
Azure | dall-e-3, gpt-3.5-turbo-0125, gpt-3.5-turbo, gpt-3.5-turbo-1106, gpt-3.5-turbo-instruct, gpt-3.5-turbo-16k-0613, gpt-3.5-turbo-16k, gpt-4-0125-preview, gpt-4-1106-preview, gpt-4-turbo-preview, gpt-4-vision-preview, gpt-4-1106-vision-preview, gpt-4, gpt-4-0613, gpt-4-32k, gpt-4-32k-0613, gpt-4o, gpt-4o-2024-05-13, gpt-4o-mini, gpt-4o-mini-2024-07-18 | Azure |
Amazon Bedrock | claude, claude-instant, command, command-light, embed-english, embed-multilingual, jurassic-2-mid, jurassic-2-ultra, llama-2-chat-13b, llama-2-chat-70b, titan-text-lite, titan-text-express, titan-embeddings, titan-multimodal-embeddings | Amazon Bedrock |
Qwen | qwen-long, qwen-turbo, qwen-plus, qwen-max, qwen-max-longcontext | Qwen |
Gemini | gemini-pro, gemini-pro-vision | Gemini |
Hugging Face | meta-llama/Llama-2-7b, tiiuae/falcon-180B, bigscience/bloom, gpt2, baichuan-inc/Baichuan2-13B-Chat, THUDM/chatglm2-6b | Hugging Face |
Cohere | command-light, command | Cohere |
iFlytek | spark-v1.5, spark-v2.0 | iFlytek |
ChatGLM | glm-3-turbo, glm-4, glm-4V | ChatGLM |
MiniMax | abab5-chat | MiniMax |
Ernie | ERNIE-Bot, ERNIE-Bot-turbo, BLOOMZ-7B, Llama-2 | Ernie |
Moonshot | moonshot-v1-8k, moonshot-v1-32k, moonshot-v1-128k | Moonshot |
Baichuan | Baichuan2-Turbo, Baichuan3-Turbo, Baichuan4 | Baichuan |
Doubao | Doubao-lite-4k, Doubao-lite-32k, Doubao-lite-128k, Doubao-pro-4k, Doubao-pro-32k, Doubao-pro-128k | Doubao |
StepFun | step-1-8k, step-1-32k, step-1-128k, step-1-256k, step-1-flash, step-2-16k | StepFun |
Hunyuan | hunyuan-lite, hunyuan-standard, hunyuan-standard-256K, hunyuan-pro, hunyuan-code, hunyuan-role, hunyuan-turbo | Hunyuan |
Mistral | mistral-large-latest, pixtral-large-latest, mistral-small-latest, codestral-latest, ministral-8b-latest, ministral-3b-latest, pixtral-12b, mistral-nemo, open-mistral-7b, open-mixtral-8x7b, open-mixtral-8x22b | Mistral |
OpenRouter | google/palm-2-codechat-bison, google/palm-2-chat-bison, openai/gpt-3.5-turbo, openai/gpt-3.5-turbo-16k, openai/gpt-4, openai/gpt-4-32k, anthropic/claude-2, anthropic/claude-instant-v1, meta-llama/llama-2-13b-chat, meta-llama/llama-2-70b-chat, palm-2-codechat-bison, palm-2-chat-bison, gpt-3.5-turbo, gpt-3.5-turbo-16k, gpt-4, gpt-4-32k, claude-2, claude-instant-v1, llama-2-13b-chat, llama-2-70b-chat | OpenRouter |
Embedding Model
Model | Sub Type | Link |
---|---|---|
OpenAI | text-embedding-ada-002, text-embedding-3-small, text-embedding-3-large | OpenAI |
Local | custom-embedding | Local Computer |
Azure | text-embedding-ada-002, text-embedding-3-small, text-embedding-3-large | Azure |
Hugging Face | sentence-transformers/all-MiniLM-L6-v2 | Hugging Face |
Qwen | text-embedding-v1, text-embedding-v2, text-embedding-v3 | Qwen |
Cohere | embed-english-v2.0, embed-english-light-v2.0, embed-multilingual-v2.0, embed-english-v3.0 | Cohere |
Ernie | default | Ernie |
MiniMax | embo-01 | MiniMax |
Hunyuan | hunyuan-embedding | Hunyuan |
Jina | jina-embeddings-v2-base-zh, jina-embeddings-v2-base-en, jina-embeddings-v2-base-de, jina-embeddings-v2-base-code | Jina |
Documentation
Install
https://casibase.org/docs/basic/server-installation
How to contact?
Discord: https://discord.gg/5rPsrAzK7S
Contribute
If you have any questions about Casibase, you can open an issue, or start a Pull Request directly (though we recommend opening an issue first to discuss with the community).
License