kestra
:zap: Workflow Automation Platform. Orchestrate and schedule code in any language, run anywhere, with 600+ plugins. An alternative to Airflow, n8n, Rundeck, VMware vRA, Zapier, and more.
Top Related Projects
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Prefect - A workflow orchestration framework for building resilient data pipelines in Python
Dagster - An orchestration platform for the development, production, and observation of data assets
Great Expectations - Always know what to expect from your data
Mage - 🧙 Build, run, and manage data pipelines for integrating and transforming data
Conductor - A microservices orchestration engine
Quick Overview
Kestra is an open-source, cloud-native orchestration and scheduling platform for data-driven workflows. It allows users to build, run, schedule, and monitor complex data pipelines and ETL processes with a user-friendly interface and powerful features.
Pros
- Highly scalable and distributed architecture
- Supports multiple programming languages (Java, Python, Node.js, etc.)
- Provides a user-friendly web interface for workflow management
- Offers built-in error handling and retry mechanisms
Cons
- Relatively new project, which may lead to potential stability issues
- Limited ecosystem compared to more established workflow tools
- Steeper learning curve for users unfamiliar with declarative workflow definitions
- Documentation could be more comprehensive for advanced use cases
Code Examples
- Define a simple workflow:

```yaml
id: simple_workflow
namespace: dev
tasks:
  - id: hello
    type: io.kestra.core.tasks.scripts.Bash
    script: echo "Hello, Kestra!"
  - id: print_date
    type: io.kestra.core.tasks.scripts.Bash
    script: date
```
- Create a Python task:

```yaml
- id: python_task
  type: io.kestra.core.tasks.scripts.Python
  script: |
    import pandas as pd

    data = {'Name': ['Alice', 'Bob', 'Charlie'],
            'Age': [25, 30, 35]}
    df = pd.DataFrame(data)
    print(df)
```
- Use a flow trigger:

```yaml
triggers:
  - id: schedule
    type: io.kestra.core.models.triggers.types.Schedule
    cron: "0 0 * * *"
```
Getting Started
- Install Kestra using Docker:

```bash
docker run -d --name kestra -p 8080:8080 kestra/kestra:latest server standalone
```

- Create a workflow file (e.g., `workflow.yml`) with your tasks and triggers.
- Submit the workflow to Kestra:

```bash
curl -X POST -H "Content-Type: application/yaml" --data-binary @workflow.yml http://localhost:8080/api/v1/flows
```

- Access the Kestra UI at `http://localhost:8080` to manage and monitor your workflows.
Competitor Comparisons
Apache Airflow - A platform to programmatically author, schedule, and monitor workflows
Pros of Airflow
- Mature ecosystem with extensive plugin support and integrations
- Large and active community, providing robust support and resources
- Flexible scheduling options with powerful DAG-based workflow definitions
Cons of Airflow
- Steep learning curve, especially for complex workflows
- Resource-intensive, requiring significant infrastructure for large-scale deployments
- Potential performance issues with high-concurrency workloads
Code Comparison
Airflow DAG definition:
```python
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
from datetime import datetime


def print_hello():
    return 'Hello World'


dag = DAG('hello_world', description='Simple tutorial DAG',
          schedule_interval='0 12 * * *',
          start_date=datetime(2017, 3, 20), catchup=False)

hello_operator = PythonOperator(task_id='hello_task', python_callable=print_hello, dag=dag)
```
Kestra flow definition:
```yaml
id: hello_world
namespace: examples
tasks:
  - id: hello
    type: io.kestra.core.tasks.scripts.Shell
    commands:
      - echo "Hello World"
triggers:
  - id: schedule
    type: io.kestra.core.models.triggers.types.Schedule
    cron: "0 12 * * *"
```
The code comparison shows that Kestra uses a YAML-based configuration for defining workflows, while Airflow uses Python code. Kestra's approach may be more accessible for users less familiar with Python, but Airflow's Python-based DAGs offer more flexibility and programmatic control.
Prefect is a workflow orchestration framework for building resilient data pipelines in Python.
Pros of Prefect
- More mature project with a larger community and ecosystem
- Offers both open-source and cloud-hosted options
- Extensive documentation and tutorials available
Cons of Prefect
- Steeper learning curve for beginners
- More complex setup and configuration process
- Heavier resource usage for small-scale workflows
Code Comparison
Prefect task definition:
```python
from prefect import task


@task
def my_task():
    return "Hello, Prefect!"
```
Kestra task definition:
```yaml
tasks:
  - id: my_task
    type: io.kestra.core.tasks.scripts.Python
    script: |
      print("Hello, Kestra!")
```
Both Kestra and Prefect are workflow orchestration tools, but they differ in their approach and implementation. Prefect uses a Python-centric approach with decorators, while Kestra relies on YAML configuration for defining tasks and workflows. Prefect offers more flexibility and power for complex workflows, but Kestra's simplicity may be advantageous for straightforward use cases. The choice between them depends on specific project requirements and team expertise.
An orchestration platform for the development, production, and observation of data assets.
Pros of Dagster
- More extensive documentation and tutorials
- Stronger integration with Python ecosystem and data science tools
- Larger community and more frequent updates
Cons of Dagster
- Steeper learning curve for beginners
- Less intuitive UI for non-technical users
- More complex setup and configuration process
Code Comparison
Kestra task definition:
```yaml
tasks:
  - id: hello
    type: io.kestra.core.tasks.scripts.Bash
    commands:
      - echo "Hello World!"
```
Dagster op definition:
```python
from dagster import job, op


@op
def hello_world():
    print("Hello World!")


@job
def hello_job():
    hello_world()
```
Both Kestra and Dagster are workflow orchestration tools, but they differ in their approach and target audience. Kestra focuses on simplicity and ease of use, with a more visual interface and YAML-based configuration. Dagster, on the other hand, is more Python-centric and offers deeper integration with data engineering and machine learning workflows.
Kestra's strength lies in its straightforward setup and intuitive UI, making it accessible to a wider range of users. Dagster excels in complex data pipelines and offers more advanced features for experienced data engineers and scientists.
The code comparison shows that Kestra uses YAML for task definitions, while Dagster employs Python decorators and functions. This reflects their different approaches to workflow design and execution.
Always know what to expect from your data.
Pros of Great Expectations
- Focused on data quality and validation, providing a comprehensive framework for data testing
- Extensive documentation and community support
- Integrates well with various data platforms and tools
Cons of Great Expectations
- Steeper learning curve due to its specialized focus on data quality
- Less flexibility for general-purpose workflow orchestration
- May require additional tools for complete data pipeline management
Code Comparison
Great Expectations:
```python
import great_expectations as ge

df = ge.read_csv("my_data.csv")
df.expect_column_values_to_be_between("age", min_value=0, max_value=120)
```
Kestra:
```yaml
id: data_validation
namespace: my_project
tasks:
  - id: validate_data
    type: io.kestra.plugin.scripts.python.Script
    script: |
      import pandas as pd

      df = pd.read_csv("my_data.csv")
      assert df["age"].between(0, 120).all()
```
While Great Expectations provides a dedicated framework for data validation, Kestra offers a more general-purpose workflow orchestration platform that can incorporate various tasks, including data validation, within a single pipeline.
🧙 Build, run, and manage data pipelines for integrating and transforming data.
Pros of Mage
- More user-friendly interface with a visual pipeline builder
- Stronger focus on machine learning and data science workflows
- Built-in data visualization and exploration tools
Cons of Mage
- Less mature project with fewer integrations compared to Kestra
- Limited support for complex workflow orchestration scenarios
- Smaller community and ecosystem
Code Comparison
Mage pipeline definition:
```python
import pandas as pd


@data_loader
def load_data():
    return pd.read_csv('data.csv')


@transformer
def transform_data(df):
    return df.dropna()


@data_exporter
def export_data(df):
    df.to_csv('cleaned_data.csv')
```
Kestra flow definition:
```yaml
id: data_processing
namespace: example
tasks:
  - id: load_data
    type: io.kestra.plugin.jdbc.Query
    url: jdbc:postgresql://localhost:5432/database
    username: user
    password: pass
    sql: SELECT * FROM table
```
Summary
Mage is more focused on data science and ML workflows with a user-friendly interface, while Kestra offers more robust workflow orchestration capabilities and integrations. The choice between them depends on specific project requirements and team expertise.
Conductor is a microservices orchestration engine.
Pros of Conductor
- More mature project with a larger community and extensive production usage
- Supports multiple languages for task implementations (Java, Python, Go)
- Offers a rich set of features including dynamic workflows and event-based triggers
Cons of Conductor
- More complex setup and configuration compared to Kestra
- Steeper learning curve due to its extensive feature set
- Less focus on data-oriented workflows compared to Kestra
Code Comparison
Conductor workflow definition (JSON):
```json
{
  "name": "example_workflow",
  "tasks": [
    {
      "name": "task_1",
      "taskReferenceName": "task_1",
      "type": "SIMPLE"
    }
  ]
}
```
Kestra workflow definition (YAML):
```yaml
id: example_workflow
tasks:
  - id: task_1
    type: io.kestra.core.tasks.scripts.Shell
    commands:
      - echo "Hello, World!"
```
Both Conductor and Kestra offer workflow orchestration capabilities, but they have different focuses and strengths. Conductor is more suited for complex, microservices-based architectures, while Kestra excels in data-oriented workflows and offers a more streamlined experience for getting started. The choice between the two depends on specific project requirements, team expertise, and the existing technology stack.
README
Event-Driven Declarative Orchestration Platform
Watch the 4-minute video to learn how to get started with Kestra.
🚀 What is Kestra?
Kestra is an open-source, event-driven orchestration platform that makes both scheduled and event-driven workflows easy. By bringing Infrastructure as Code best practices to data, process, and microservice orchestration, you can build reliable workflows directly from the UI in just a few lines of YAML.
Key Features:
- Everything as Code and from the UI: keep workflows as code with a Git Version Control integration, even when building them from the UI.
- Event-Driven & Scheduled Workflows: automate both scheduled and real-time event-driven workflows via a simple `trigger` definition.
- Declarative YAML Interface: define workflows using a simple configuration in the built-in code editor.
- Rich Plugin Ecosystem: hundreds of plugins built in to extract data from any database, cloud storage, or API, and run scripts in any language.
- Intuitive UI & Code Editor: build and visualize workflows directly from the UI with syntax highlighting, auto-completion and real-time syntax validation.
- Scalable: designed to handle millions of workflows, with high availability and fault tolerance.
- Version Control Friendly: write your workflows from the built-in code Editor and push them to your preferred Git branch directly from Kestra, enabling best practices with CI/CD pipelines and version control systems.
- Structure & Resilience: tame chaos and bring resilience to your workflows with namespaces, labels, subflows, retries, timeouts, error handling, inputs, outputs that generate artifacts in the UI, variables, conditional branching, advanced scheduling, event triggers, backfills, dynamic tasks, sequential and parallel tasks, and the ability to skip tasks or triggers when needed by setting the `disabled` flag to `true`.
🧑‍💻 The YAML definition gets automatically adjusted any time you make changes to a workflow from the UI or via an API call. Therefore, the orchestration logic is always managed declaratively in code, even if you modify your workflows in other ways (UI, CI/CD, Terraform, API calls).
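The retry and error-handling features listed above can be sketched directly in flow YAML. The following is an illustrative sketch only: the `retry` shape follows common Kestra usage, but the HTTP task type and property names are assumptions that may vary between Kestra versions, so check the plugin documentation before relying on them.

```yaml
id: resilient_flow
namespace: dev

tasks:
  - id: flaky_call
    # HTTP Request task type is an assumption; verify against the plugin docs
    type: io.kestra.plugin.core.http.Request
    uri: https://example.com/api
    retry:
      type: constant     # retry policy: fixed wait between attempts
      interval: PT30S    # ISO-8601 duration: 30 seconds
      maxAttempt: 3      # give up after three tries

errors:                  # run these tasks if any task above ultimately fails
  - id: alert
    type: io.kestra.plugin.core.log.Log
    message: "flaky_call failed after all retries"
```

The `errors` branch gives a single place to centralize alerting, rather than attaching failure handling to every task.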
🚀 Quick Start
Try the Live Demo
Try Kestra with our Live Demo. No installation required!
Get Started Locally in 5 Minutes
Launch Kestra in Docker
Make sure that Docker is running. Then, start Kestra in a single command:
```bash
docker run --pull=always --rm -it -p 8080:8080 --user=root \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /tmp:/tmp kestra/kestra:latest server local
```

If you're on Windows and use PowerShell:

```powershell
docker run --pull=always --rm -it -p 8080:8080 --user=root `
  -v "/var/run/docker.sock:/var/run/docker.sock" `
  -v "C:/Temp:/tmp" kestra/kestra:latest server local
```

If you're on Windows and use Command Prompt (CMD):

```cmd
docker run --pull=always --rm -it -p 8080:8080 --user=root ^
  -v "/var/run/docker.sock:/var/run/docker.sock" ^
  -v "C:/Temp:/tmp" kestra/kestra:latest server local
```

If you're on Windows and use WSL (a Linux environment on Windows):

```bash
docker run --pull=always --rm -it -p 8080:8080 --user=root \
  -v "/var/run/docker.sock:/var/run/docker.sock" \
  -v "C:/Temp:/tmp" kestra/kestra:latest server local
```
Check our Installation Guide for other deployment options (Docker Compose, Podman, Kubernetes, AWS, GCP, Azure, and more).
Access the Kestra UI at http://localhost:8080 and start building your first flow!
Your First Hello World Flow
Create a new flow with the following content:
```yaml
id: hello_world
namespace: dev
tasks:
  - id: say_hello
    type: io.kestra.plugin.core.log.Log
    message: "Hello, World!"
```
Run the flow and see the output in the UI!
🧩 Plugin Ecosystem
Kestra's functionality is extended through a rich ecosystem of plugins that empower you to run tasks anywhere and code in any language, including Python, Node.js, R, Go, Shell, and more. Here's how Kestra plugins enhance your workflows:
- Run Anywhere:
  - Local or Remote Execution: Execute tasks on your local machine, remote servers via SSH, or scale out to serverless containers using Task Runners.
  - Docker and Kubernetes Support: Seamlessly run Docker containers within your workflows or launch Kubernetes jobs to handle compute-intensive workloads.
- Code in Any Language:
  - Scripting Support: Write scripts in your preferred programming language. Kestra supports Python, Node.js, R, Go, Shell, and others, allowing you to integrate existing codebases and deployment patterns.
  - Flexible Automation: Execute shell commands, run SQL queries against various databases, and make HTTP requests to interact with APIs.
- Event-Driven and Real-Time Processing:
  - Real-Time Triggers: React to events from external systems in real time, such as file arrivals, new messages in message buses (Kafka, Redis, Pulsar, AMQP, MQTT, NATS, AWS SQS, Google Pub/Sub, Azure Event Hubs), and more.
  - Custom Events: Define custom events to trigger flows based on specific conditions or external signals, enabling highly responsive workflows.
- Cloud Integrations:
  - AWS, Google Cloud, Azure: Integrate with a variety of cloud services to interact with storage solutions, messaging systems, compute resources, and more.
  - Big Data Processing: Run big data processing tasks using tools like Apache Spark or interact with analytics platforms like Google BigQuery.
- Monitoring and Notifications:
  - Stay Informed: Send messages to Slack channels, email notifications, or trigger alerts in PagerDuty to keep your team updated on workflow statuses.
Kestra's plugin ecosystem is continually expanding, allowing you to tailor the platform to your specific needs. Whether you're orchestrating complex data pipelines, automating scripts across multiple environments, or integrating with cloud services, there's likely a plugin to assist. And if not, you can always build your own plugins to extend Kestra's capabilities.
🧑‍💻 Note: This is just a glimpse of what Kestra plugins can do. Explore the full list on our Plugins Page.
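As one concrete illustration of the scripting plugins described above, a flow can run a Python script inside a Docker container via a task runner. Treat the task runner type and `containerImage` property in this sketch as assumptions based on recent plugin naming; consult the plugin documentation for your Kestra version:

```yaml
id: python_in_docker
namespace: dev

tasks:
  - id: transform
    type: io.kestra.plugin.scripts.python.Script
    taskRunner:
      # Docker task runner type is an assumption; verify against the docs
      type: io.kestra.plugin.scripts.runner.docker.Docker
    containerImage: python:3.11-slim   # any image with a Python interpreter
    script: |
      print("running inside a container")
```

Swapping the task runner (e.g., to a Kubernetes or cloud-batch runner) changes where the script executes without changing the script itself.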
📚 Key Concepts
- Flows: the core unit in Kestra, representing a workflow composed of tasks.
- Tasks: individual units of work, such as running a script, moving data, or calling an API.
- Namespaces: logical grouping of flows for organization and isolation.
- Triggers: schedules or events that initiate the execution of flows.
- Inputs & Variables: parameters and dynamic data passed into flows and tasks.
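A minimal sketch tying these concepts together in one flow (the input and trigger property names follow recent Kestra releases but should be treated as assumptions):

```yaml
id: concepts_demo        # a flow is the core unit
namespace: dev           # namespaces group and isolate flows

inputs:                  # parameters supplied at execution time
  - id: name
    type: STRING
    defaults: World

variables:               # static values reusable across tasks
  greeting: Hello

tasks:                   # individual units of work
  - id: say
    type: io.kestra.plugin.core.log.Log
    message: "{{ vars.greeting }}, {{ inputs.name }}!"

triggers:                # start the flow on a schedule
  - id: daily
    type: io.kestra.plugin.core.trigger.Schedule
    cron: "0 9 * * *"
```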
🎨 Build Workflows Visually
Kestra provides an intuitive UI that allows you to interactively build and visualize your workflows:
- Drag-and-Drop Interface: add and rearrange tasks from the Topology Editor.
- Real-Time Validation: instant feedback on your workflow's syntax and structure to catch errors early.
- Auto-Completion: smart suggestions as you type to write flow code quickly and without syntax errors.
- Live Topology View: see your workflow as a Directed Acyclic Graph (DAG) that updates in real-time.
🔧 Extensible and Developer-Friendly
Plugin Development
Create custom plugins to extend Kestra's capabilities. Check out our Plugin Developer Guide to get started.
Infrastructure as Code
- Version Control: store your flows in Git repositories.
- CI/CD Integration: automate deployment of flows using CI/CD pipelines.
- Terraform Provider: manage Kestra resources with the official Terraform provider.
🌍 Join the Community
Stay connected and get support:
- Slack: Join our Slack community to ask questions and share ideas.
- LinkedIn: Follow us on LinkedIn; alongside Slack and GitHub, this is our main channel for updates and product announcements.
- YouTube: Subscribe to our YouTube channel for educational video content. We publish new videos every week!
- X: Follow us on X if you're still active there.
🤝 Contributing
We welcome contributions of all kinds!
- Report Issues: Found a bug or have a feature request? Open an issue on GitHub.
- Contribute Code: Check out our Contributor Guide for initial guidelines, and explore our good first issues for beginner-friendly tasks to tackle first.
- Develop Plugins: Build and share plugins using our Plugin Developer Guide.
- Contribute to our Docs: Contribute edits or updates to keep our documentation top-notch.
📄 License
Kestra is licensed under the Apache 2.0 License © Kestra Technologies.
⭐️ Stay Updated
Give our repository a star to stay informed about the latest features and updates!
Thank you for considering Kestra for your workflow orchestration needs. We can't wait to see what you'll build!