
FlowiseAI/Flowise

Drag & drop UI to build your customized LLM flow

29,475 stars · 15,254 forks

Top Related Projects

langchain-ai/langchain (92,073 ⭐)

🦜🔗 Build context-aware reasoning applications

langflow-ai/langflow (27,191 ⭐)

Langflow is a low-code app builder for RAG and multi-agent AI applications. It's Python-based and agnostic to any model, API, or database.

jina-ai/langchain-serve

⚡ Langchain apps in production using Jina & FastAPI

microsoft/semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps

deepset-ai/haystack (16,603 ⭐)

🔍 AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.

Quick Overview

FlowiseAI/Flowise is an open-source visual tool for building and prototyping LLM (Large Language Model) flows. Users create custom AI agents and workflows by dragging and dropping components, which makes it easier to experiment with and deploy AI-powered applications without extensive coding knowledge.

Pros

  • User-friendly visual interface for creating AI workflows
  • Supports integration with various LLMs and tools
  • Extensible architecture allowing for custom component creation
  • Active community and regular updates

Cons

  • May have a learning curve for users new to LLMs and AI concepts
  • Less flexible than code-first AI frameworks for advanced use cases
  • Potential performance limitations for complex workflows
  • Dependency on external services and APIs

Getting Started

To get started with Flowise, follow these steps:

# Clone the repository
git clone https://github.com/FlowiseAI/Flowise.git

# Navigate to the project directory
cd Flowise

# Install dependencies (the repo is a pnpm monorepo)
npm i -g pnpm
pnpm install

# Build the project
pnpm build

# Start Flowise
pnpm start

After starting Flowise, open your browser and navigate to http://localhost:3000 to access the UI and begin building your AI workflows.

Competitor Comparisons

langchain-ai/langchain (92,073 ⭐)

🦜🔗 Build context-aware reasoning applications

Pros of Langchain

  • More extensive and flexible framework for building AI applications
  • Larger community and ecosystem with more integrations and resources
  • Supports multiple programming languages (Python, JavaScript, etc.)

Cons of Langchain

  • Steeper learning curve due to its complexity and extensive features
  • Requires more coding knowledge and effort to set up and use effectively
  • Less user-friendly for non-technical users or quick prototyping

Code Comparison

Langchain (Python):

from langchain import OpenAI, LLMChain, PromptTemplate  # legacy pre-0.1 imports

llm = OpenAI(temperature=0.9)
prompt = PromptTemplate(input_variables=["product"], template="What is a good name for a company that makes {product}?")
chain = LLMChain(llm=llm, prompt=prompt)
print(chain.run("colorful socks"))  # fills {product} and calls the model

Flowise (JavaScript, via the LangChain.js primitives its nodes wrap):

import { ChatOpenAI } from "langchain/chat_models/openai";
import { PromptTemplate } from "langchain/prompts";
import { LLMChain } from "langchain/chains";

const chatModel = new ChatOpenAI({ temperature: 0.9 });
const prompt = PromptTemplate.fromTemplate("What is a good name for a company that makes {product}?");
const chain = new LLMChain({ llm: chatModel, prompt });

Flowise offers a more visual, drag-and-drop interface for creating AI workflows, making it more accessible to non-developers. However, it may be less flexible for complex use cases compared to Langchain's code-based approach. Langchain provides more control and customization options but requires more programming expertise.

langflow-ai/langflow (27,191 ⭐)

Langflow is a low-code app builder for RAG and multi-agent AI applications. It's Python-based and agnostic to any model, API, or database.

Pros of Langflow

  • More extensive documentation and examples
  • Supports a wider range of LLM providers out-of-the-box
  • Offers a more intuitive drag-and-drop interface for building flows

Cons of Langflow

  • Less active community and fewer contributors
  • Limited customization options for advanced users
  • Slower release cycle and updates compared to Flowise

Code Comparison

Langflow example (a sketch of running a flow exported from its visual editor; exact import paths vary by version, and "flow.json" is a placeholder):

from langflow.load import run_flow_from_json

# run a flow that was built in, and exported from, the Langflow UI
result = run_flow_from_json(flow="flow.json", input_value="Hello!")

Flowise example (flows are assembled in the UI rather than in code, then called over the REST Prediction API; the chatflow ID below is a placeholder):

// POST a question to a chatflow created in the Flowise UI
const response = await fetch("http://localhost:3000/api/v1/prediction/your-chatflow-id", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ question: "Hello!" }),
});
console.log(await response.json());

Both Langflow and Flowise aim to simplify the process of building language model workflows. While Langflow offers a more user-friendly interface and broader LLM support, Flowise benefits from a more active community and faster development cycle. The choice between the two depends on specific project requirements and user preferences.

jina-ai/langchain-serve

⚡ Langchain apps in production using Jina & FastAPI

Pros of langchain-serve

  • Built on Jina AI's robust ecosystem, offering scalability and distributed processing capabilities
  • Seamless integration with LangChain, providing access to a wide range of language models and tools
  • Supports asynchronous processing, enabling efficient handling of multiple requests

Cons of langchain-serve

  • Steeper learning curve due to its integration with Jina AI's ecosystem
  • Less user-friendly interface compared to Flowise's drag-and-drop UI
  • May require more setup and configuration for simple use cases

Code Comparison

langchain-serve (a sketch of its documented @serving pattern from the lcserve package; the function is deployed with the lc-serve CLI):

from langchain.llms import OpenAI
from lcserve import serving

@serving
def ask(question: str) -> str:
    # any LangChain logic can run inside the decorated function
    return OpenAI(temperature=0)(question)

Flowise (deployment is configuration rather than code; the server exposes every flow over REST):

# install and start the Flowise server
npm install -g flowise
npx flowise start
# each chatflow is then callable at POST /api/v1/prediction/<chatflow-id>

Both repositories aim to simplify the deployment of LangChain applications, but they take different approaches. Flowise focuses on providing a user-friendly, visual interface for building and deploying LangChain flows, making it more accessible to non-developers. On the other hand, langchain-serve leverages Jina AI's powerful ecosystem, offering more advanced features and scalability options for complex use cases.

microsoft/semantic-kernel

Integrate cutting-edge LLM technology quickly and easily into your apps

Pros of Semantic Kernel

  • More comprehensive framework for AI orchestration and integration
  • Stronger support for enterprise-level applications
  • Extensive documentation and community support

Cons of Semantic Kernel

  • Steeper learning curve for beginners
  • Less visual interface for workflow creation
  • Primarily focused on C# and .NET ecosystems

Code Comparison

Semantic Kernel (C#, using the pre-1.0 API current when this comparison was written):

var kernel = Kernel.Builder.Build();
var function = kernel.CreateSemanticFunction("Generate a story about {{$input}}");
var result = await kernel.RunAsync("a brave knight", function);

Flowise (JavaScript):

const response = await fetch('/api/v1/prediction/[chatflowid]', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' }, // the endpoint expects a JSON body
  body: JSON.stringify({ question: 'Generate a story about a brave knight' })
});
const result = await response.json();

Summary

Semantic Kernel offers a more robust framework for AI integration, particularly suited for enterprise applications and .NET developers. It provides extensive documentation and community support but has a steeper learning curve.

Flowise, on the other hand, offers a more user-friendly visual interface for creating AI workflows, making it more accessible to beginners. However, it may have limitations for complex enterprise-level applications compared to Semantic Kernel.

The code comparison shows that Semantic Kernel uses a more structured approach with its kernel and function creation, while Flowise relies on API calls for predictions.

deepset-ai/haystack (16,603 ⭐)

🔍 AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.

Pros of Haystack

  • More comprehensive NLP framework with support for various tasks beyond QA
  • Extensive documentation and tutorials for easier adoption
  • Larger community and ecosystem with more integrations

Cons of Haystack

  • Steeper learning curve due to its broader scope
  • Requires more setup and configuration for basic use cases
  • Heavier resource requirements for deployment

Code Comparison

Haystack (1.x-style extractive QA; concrete components substituted for the abstract Retriever and Reader in the original snippet):

from haystack import Pipeline
from haystack.document_stores import InMemoryDocumentStore
from haystack.nodes import BM25Retriever, FARMReader

# query pipeline: retrieve candidate documents, then extract an answer span
document_store = InMemoryDocumentStore(use_bm25=True)
retriever = BM25Retriever(document_store=document_store)
reader = FARMReader(model_name_or_path="deepset/roberta-base-squad2")

pipeline = Pipeline()
pipeline.add_node(component=retriever, name="Retriever", inputs=["Query"])
pipeline.add_node(component=reader, name="Reader", inputs=["Retriever"])

Flowise (via the LangChain.js primitives its conversation nodes wrap):

import { ChatOpenAI } from "langchain/chat_models/openai";
import { ConversationChain } from "langchain/chains";

const chatModel = new ChatOpenAI({ temperature: 0 });
const chain = new ConversationChain({ llm: chatModel });
const response = await chain.call({ input: "Hello! How are you?" });


README

Flowise - Build LLM Apps Easily


English | 中文 | 日本語 | 한국어

Drag & drop UI to build your customized LLM flow

⚡ Quick Start

Download and Install NodeJS >= 18.15.0

  1. Install Flowise

    npm install -g flowise
    
  2. Start Flowise

    npx flowise start
    

    With username & password

    npx flowise start --FLOWISE_USERNAME=user --FLOWISE_PASSWORD=1234
    
  3. Open http://localhost:3000
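Once a chatflow exists, one documented way to use it outside the Flowise UI is the embed snippet from the companion flowise-embed package. A minimal sketch is below; the chatflow ID is a placeholder you copy from the UI:

    <script type="module">
      // load the embeddable chat widget and point it at your Flowise instance
      import Chatbot from "https://cdn.jsdelivr.net/npm/flowise-embed/dist/web.js"
      Chatbot.init({
          chatflowid: "your-chatflow-id",       // placeholder: copy from the Flowise UI
          apiHost: "http://localhost:3000",
      })
    </script>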

🐳 Docker

Docker Compose

  1. Go to the docker folder at the root of the project
  2. Copy the .env.example file, paste it into the same location, and rename it to .env
  3. docker compose up -d
  4. Open http://localhost:3000
  5. You can bring the containers down by docker compose stop
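The same steps as shell commands, assuming you run them from the repository root:

    cd docker
    cp .env.example .env    # keep the copy next to the example
    docker compose up -d    # start Flowise in the background
    docker compose stop     # bring the containers down when done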

Docker Image

  1. Build the image locally:

    docker build --no-cache -t flowise .
    
  2. Run the image:

    docker run -d --name flowise -p 3000:3000 flowise
    
  3. Stop the container:

    docker stop flowise
    

Γ°ΒŸΒ‘Β¨Γ’Β€ΒΓ°ΒŸΒ’Β» Developers

Flowise has 4 different modules in a single mono repository.

  • server: Node backend to serve API logic
  • ui: React frontend
  • components: Third-party node integrations
  • api-documentation: Auto-generated swagger-ui API docs from express

Prerequisite

  • Install PNPM
    npm i -g pnpm
    

Setup

  1. Clone the repository

    git clone https://github.com/FlowiseAI/Flowise.git
    
  2. Go into repository folder

    cd Flowise
    
  3. Install all dependencies of all modules:

    pnpm install
    
  4. Build all the code:

    pnpm build
    
    If the build exits with code 134 (JavaScript heap out of memory), increase the Node.js heap size and run the script again:

    export NODE_OPTIONS="--max-old-space-size=4096"
    pnpm build
    
  5. Start the app:

    pnpm start
    

    You can now access the app on http://localhost:3000

  6. For development build:

    • Create a .env file in packages/ui and specify VITE_PORT (refer to .env.example)

    • Create a .env file in packages/server and specify PORT (refer to .env.example)

    • Run

      pnpm dev
      

    Any code changes will reload the app automatically on http://localhost:8080
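    For example (a sketch; the ports are examples consistent with the defaults this README mentions):

    # packages/ui/.env
    VITE_PORT=8080

    # packages/server/.env
    PORT=3000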

🔒 Authentication

To enable app level authentication, add FLOWISE_USERNAME and FLOWISE_PASSWORD to the .env file in packages/server:

FLOWISE_USERNAME=user
FLOWISE_PASSWORD=1234

🌱 Env Variables

Flowise supports various environment variables to configure your instance. You can specify the following variables in the .env file inside the packages/server folder. Read more
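As a minimal example, a packages/server/.env sketch using only variables that appear elsewhere in this README (other options are version-dependent; see the linked docs):

# packages/server/.env
PORT=3000
FLOWISE_USERNAME=user
FLOWISE_PASSWORD=1234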

📖 Documentation

Flowise Docs

🌐 Self Host

Deploy Flowise self-hosted in your existing infrastructure; we support various deployments

Ҙï¸ Flowise Cloud

Get Started with Flowise Cloud

🙋 Support

Feel free to ask any questions, report problems, and request new features in Discussions

🙌 Contributing

Thanks go to these awesome contributors

See the contributing guide. Reach out to us on Discord if you have any questions or issues.

📄 License

Source code in this repository is made available under the Apache License Version 2.0.
