# Writing Workflows

## Workflow Structure
```yaml
description: "Process daily data"
schedule: "0 2 * * *" # Optional: cron schedule
maxActiveRuns: 1      # Optional: concurrency limit
params:               # Runtime parameters
  - DATE: "`date +%Y-%m-%d`"
env:                  # Environment variables
  - DATA_DIR: /tmp/data
steps:                # Workflow steps
  - echo "Processing for date ${DATE}"
```
## Base Configuration
Share common settings across all DAGs using base configuration:
```yaml
# ~/.config/dagu/base.yaml
env:
  - LOG_LEVEL: info
  - AWS_REGION: us-east-1
smtp:
  host: smtp.company.com
  port: "587"
  username: ${SMTP_USER}
  password: ${SMTP_PASS}
errorMail:
  from: [email protected]
  to: [email protected]
  attachLogs: true
histRetentionDays: 30 # Dagu deletes workflow history and logs older than this
maxActiveRuns: 5
```
DAGs automatically inherit these settings:
```yaml
# my-workflow.yaml
# Inherits all base settings.
# Can override specific values:
env:
  - LOG_LEVEL: debug  # Override
  - CUSTOM_VAR: value # Addition
steps:
  - echo "Processing"
```
Configuration precedence, from lowest to highest: System defaults → Base config → DAG config. Values from later sources override earlier ones.
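A sketch of the environment that results from the merge above, given the base.yaml shown earlier:

```yaml
# Effective env for my-workflow.yaml
env:
  - LOG_LEVEL: debug      # DAG config overrides base
  - AWS_REGION: us-east-1 # inherited from base
  - CUSTOM_VAR: value     # added by DAG config
```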
## Guide Sections
- Basics - Steps, commands, dependencies
- Container - Run workflows in Docker containers
- Control Flow - Parallel execution, conditions, loops
- Data & Variables - Parameters, outputs, data passing
- Error Handling - Retries, failures, notifications
- Patterns - Composition, optimization, best practices
## Complete Example
```yaml
schedule: "0 2 * * *"
params:
  - DATE: "`date +%Y-%m-%d`"
env:
  - DATA_DIR: /tmp/data/${DATE}
steps:
  - command: aws s3 cp s3://bucket/${DATE}.csv ${DATA_DIR}/
    retryPolicy:
      limit: 3
      intervalSec: 60
  - command: python validate.py ${DATA_DIR}/${DATE}.csv
    continueOn:
      failure: false
  - parallel: [users, orders, products]
    command: python process.py --type=${ITEM} --date=${DATE}
    output: RESULT_${ITEM}
  - python report.py --date=${DATE}
handlerOn:
  failure:
    command: echo "Notifying failure for ${DATE}"
```
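`handlerOn` is not limited to `failure`; Dagu also supports `success`, `cancel`, and `exit` handlers, with `exit` running after every run regardless of outcome. A minimal cleanup sketch for the pipeline above (the path mirrors the `DATA_DIR` used earlier):

```yaml
handlerOn:
  success:
    command: echo "Pipeline for ${DATE} succeeded"
  exit:
    command: rm -rf /tmp/data/${DATE} # always runs, success or failure
```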
## Common Patterns

### Sequential Pipeline
```yaml
steps:
  - echo "Extracting data"
  - echo "Transforming data"
  - echo "Loading data"
```
### Parallel Processing
```yaml
steps:
  - parallel: [file1, file2, file3]
    run: process-file
    params: "FILE=${ITEM}"

---
# A child workflow that processes a single file.
# It can live in the same file, separated by `---`, or in a separate file.
name: process-file
steps:
  - echo "Processing ${FILE}"
```