
A lightweight and powerful workflow engine for enterprise & small teams
## Quick Start

Install and run your first workflow in under 2 minutes.
### Install

Choose one of the following install methods:

```bash
# npm
npm install -g dagu
```

```bash
# Install script
curl -L https://raw.githubusercontent.com/dagu-org/dagu/main/scripts/installer.sh | bash
```

```bash
# Docker
docker pull ghcr.io/dagu-org/dagu:latest
```

```bash
# Homebrew
brew install dagu-org/brew/dagu
```
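To confirm the binary is on your PATH, you can print the version. This is a quick sketch; the `version` subcommand is assumed here, and any Dagu CLI invocation that prints usage works just as well:

```bash
# Verify the installation by printing the installed Dagu version
dagu version
```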
### 1. Create a workflow

```bash
# Binary install: DAGs live under ~/.config/dagu/dags
mkdir -p ~/.config/dagu/dags && cat > ~/.config/dagu/dags/hello.yaml << 'EOF'
steps:
  - name: hello
    command: echo "Hello from Dagu!"
  - name: world
    command: echo "Running step 2"
EOF
```

```bash
# Docker: DAGs live under ~/.dagu/dags (mounted into the container below)
mkdir -p ~/.dagu/dags && cat > ~/.dagu/dags/hello.yaml << 'EOF'
steps:
  - name: hello
    command: echo "Hello from Dagu!"
  - name: world
    command: echo "Running step 2"
EOF
```
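Step ordering can also be declared explicitly with a `depends` field on a step. Here is a minimal sketch of the same workflow where `world` waits for `hello`:

```yaml
steps:
  - name: hello
    command: echo "Hello from Dagu!"
  - name: world
    command: echo "Running step 2"
    depends:
      - hello   # world starts only after hello succeeds
```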
### 2. Run it

```bash
# Binary install
dagu start hello
```

```bash
# Docker
docker run --rm \
  -v ~/.dagu:/var/lib/dagu \
  ghcr.io/dagu-org/dagu:latest \
  dagu start hello
```
Output:

```
┌─ DAG: hello ─────────────────────────────────────────────────────┐
│ Status: Success ✓ | Started: 23:34:57 | Elapsed: 471ms │
└──────────────────────────────────────────────────────────────────┘
Progress: ████████████████████████████████████████ 100% (2/2 steps)
```
> Note: The output may vary if you are running in Docker.
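Workflows can also take runtime parameters. The `hello` DAG above does not declare any, so treat the following as a sketch: it assumes a DAG with a line such as `params: NAME=World`, and a Dagu version that accepts named parameters after `--`:

```bash
# Override the DAG's default parameters for this run
dagu start hello -- NAME=Dagu
```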
### 3. Check the status

```bash
# Binary install
dagu status hello
```

```bash
# Docker
docker run --rm \
  -v ~/.dagu:/var/lib/dagu \
  ghcr.io/dagu-org/dagu:latest \
  dagu status hello
```
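If a run is taking too long, the CLI also has a `stop` subcommand; this is a sketch based on the command set in recent releases, so check `dagu --help` for the exact commands available in your version:

```bash
# Stop a running instance of the hello DAG
dagu stop hello
```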
### 4. Start the UI

```bash
# Binary install
dagu start-all
```

```bash
# Docker
docker run -d \
  -p 8080:8080 \
  -v ~/.dagu:/var/lib/dagu \
  ghcr.io/dagu-org/dagu:latest \
  dagu start-all
```
Visit http://localhost:8080 to see your workflows.
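The UI listens on port 8080 by default. If that port is taken, the server can be pointed elsewhere through Dagu's config file; the sketch below assumes the config lives at `~/.config/dagu/config.yaml` and that the `host` and `port` keys are supported by your version:

```yaml
# ~/.config/dagu/config.yaml (assumed location; adjust for your install)
host: 127.0.0.1   # address the UI binds to
port: 9000        # use a free port instead of 8080
```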
## Why Dagu?

- **Zero Dependencies**: Single binary. No database, no message broker. Deploy anywhere in seconds.
- **Language Agnostic**: Execute any command. Your existing scripts work without modification.
- **Production Ready**: Battle-tested error handling, retries, logging, and monitoring.
- **Hierarchical Workflows**: Compose workflows from smaller workflows. Build modular, reusable components.
### Example: Data Pipeline with Nested Workflows

```yaml
name: etl-pipeline
schedule: "0 2 * * *" # 2 AM daily
steps:
  - name: extract
    command: python extract.py --date=${DATE}
    output: RAW_DATA

  - name: transform
    run: transform-data
    parallel:
      items: [customers, orders, products]
    params: "TYPE=${ITEM} INPUT=${RAW_DATA}"

  - name: load
    command: python load.py
    retryPolicy:
      limit: 3
      intervalSec: 2
      backoff: true # Exponential backoff

---
name: transform-data
params: [TYPE, INPUT]
steps:
  - name: process
    command: python transform.py --type=${TYPE} --input=${INPUT}
```
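The pipeline runs on its 2 AM schedule, but it can also be triggered by hand. Here is a sketch of a manual run that supplies the `DATE` value the `extract` step references; it assumes `DATE` is declared as a parameter of `etl-pipeline`, which the snippet above leaves implicit:

```bash
# Kick off the pipeline once, outside the schedule, with an explicit date
dagu start etl-pipeline -- DATE=2024-01-01
```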