# Executors Reference

## Overview

Executors extend Dagu's capabilities beyond simple shell commands. Available executors:

- Shell (default) - Execute shell commands
- Docker - Run commands in Docker containers
- SSH - Execute commands on remote hosts
- HTTP - Make HTTP requests
- Mail - Send emails
- JQ - Process JSON data

> **TIP:** For detailed documentation on each executor, see the guide linked at the start of each section below.
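A step selects its executor either with a bare string (when no options are needed) or with a `type`/`config` map; both forms appear throughout this reference:

```yaml
steps:
  - name: shorthand-form
    executor: jq            # bare string: executor with default options
    script: |
      {"ok": true}

  - name: full-form
    executor:
      type: docker          # explicit type plus executor-specific config
      config:
        image: alpine:latest
        autoRemove: true
    command: echo "hello"
```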
## Shell Executor (Default)

> **INFO:** For detailed Shell executor documentation, see the Shell Executor Guide.

The default executor runs commands in the system shell.

```yaml
steps:
  - name: simple-command
    command: echo "Hello World"

  - name: with-specific-shell
    command: echo $BASH_VERSION
    shell: bash # Use a specific shell
```
### Shell Selection

```yaml
steps:
  - name: default-shell
    command: echo "Uses $SHELL or /bin/sh"

  - name: bash-specific
    shell: bash
    command: echo "Uses bash features"

  - name: custom-shell
    shell: /usr/bin/zsh
    command: echo "Uses zsh"
```
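Steps can also carry a multi-line `script` block (the same field used in the SSH and JQ examples below), which runs under the selected shell. A minimal sketch:

```yaml
steps:
  - name: multi-line-script
    shell: bash
    script: |
      set -euo pipefail          # fail fast inside the script
      echo "Running as $(whoami)"
      for f in /tmp/*.log; do
        echo "found $f"
      done
```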
## Docker Executor

> **INFO:** For detailed Docker executor documentation, see the Docker Executor Guide.

Run commands in Docker containers for isolation and reproducibility.

### Create and Run Container

```yaml
steps:
  - name: run-in-container
    executor:
      type: docker
      config:
        image: alpine:latest
        autoRemove: true
    command: echo "Hello from container"
```
### Image Pull Options

```yaml
steps:
  - name: pull-always
    executor:
      type: docker
      config:
        image: myapp:latest
        pull: always # Always pull from registry
        autoRemove: true
    command: ./app

  - name: pull-if-missing
    executor:
      type: docker
      config:
        image: myapp:latest
        pull: missing # Default - pull only if not local
        autoRemove: true
    command: ./app

  - name: never-pull
    executor:
      type: docker
      config:
        image: local-image:dev
        pull: never # Use local image only
        autoRemove: true
    command: ./test
```
### Volume Mounts

```yaml
steps:
  - name: with-volumes
    executor:
      type: docker
      config:
        image: python:3.11
        autoRemove: true
        host:
          binds:
            - /host/data:/container/data:ro     # Read-only
            - /host/output:/container/output:rw # Read-write
            - ./config:/app/config              # Relative path
    command: python process.py /container/data
```
### Environment Variables

```yaml
env:
  - API_KEY: secret123

steps:
  - name: with-env
    executor:
      type: docker
      config:
        image: node:18
        autoRemove: true
        container:
          env:
            - NODE_ENV=production
            - API_KEY=${API_KEY} # Pass from DAG env
            - DB_HOST=postgres
    command: npm start
```
### Network Configuration

```yaml
steps:
  - name: custom-network
    executor:
      type: docker
      config:
        image: alpine
        autoRemove: true
        network:
          EndpointsConfig:
            my-network:
              Aliases:
                - my-service
                - my-alias
    command: ping other-service
```
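For the aliases to resolve, the other step's container must join the same user-defined network (Docker does not resolve aliases on the default bridge network). A minimal companion sketch, assuming `my-network` was created beforehand (e.g. with `docker network create my-network`):

```yaml
steps:
  - name: reach-by-alias
    executor:
      type: docker
      config:
        image: alpine
        autoRemove: true
        network:
          EndpointsConfig:
            my-network: {}   # join the same network; no extra aliases needed
    command: ping -c 1 my-service
```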
### Platform Selection

```yaml
steps:
  - name: specific-platform
    executor:
      type: docker
      config:
        image: myapp:latest
        platform: linux/amd64 # Force platform
        autoRemove: true
    command: ./app
```
### Working Directory

```yaml
steps:
  - name: custom-workdir
    executor:
      type: docker
      config:
        image: python:3.11
        autoRemove: true
        container:
          workingDir: /app
          env:
            - PYTHONPATH=/app
        host:
          binds:
            - ./src:/app
    command: python main.py
```
### Execute in Existing Container

Run a command inside a container that is already running (the equivalent of `docker exec`):

```yaml
steps:
  - name: exec-in-running
    executor:
      type: docker
      config:
        containerName: my-app-container
        exec:
          user: root
          workingDir: /app
          env:
            - DEBUG=true
    command: ./debug.sh
```
### Complete Docker Example

```yaml
steps:
  - name: complex-docker
    executor:
      type: docker
      config:
        image: postgres:15
        containerName: test-db
        pull: missing
        platform: linux/amd64
        autoRemove: false
        container:
          env:
            - POSTGRES_USER=test
            - POSTGRES_PASSWORD=test
            - POSTGRES_DB=testdb
          exposedPorts:
            5432/tcp: {}
        host:
          binds:
            - postgres-data:/var/lib/postgresql/data
          portBindings:
            5432/tcp:
              - hostIP: "127.0.0.1"
                hostPort: "5432"
        network:
          EndpointsConfig:
            bridge:
              Aliases:
                - postgres-test
    command: postgres
```
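Because the container publishes port 5432 on 127.0.0.1 and sets `autoRemove: false`, a later step can reach the database from the host. A hedged follow-up sketch (assumes `pg_isready` is installed on the host; the step name is illustrative):

```yaml
steps:
  - name: wait-for-db
    command: pg_isready -h 127.0.0.1 -p 5432 -U test
    depends: complex-docker
    retryPolicy:
      limit: 10      # retry until postgres accepts connections
      intervalSec: 3
```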
## SSH Executor

> **INFO:** For detailed SSH executor documentation, see the SSH Executor Guide.

Execute commands on remote hosts over SSH.

### Basic SSH

```yaml
steps:
  - name: remote-command
    executor:
      type: ssh
      config:
        user: deploy
        host: server.example.com
        port: 22
        key: /home/user/.ssh/id_rsa
    command: ls -la /var/www
```
### With Environment

```yaml
steps:
  - name: remote-with-env
    executor:
      type: ssh
      config:
        user: deploy
        host: 192.168.1.100
        key: ~/.ssh/deploy_key
    command: |
      export APP_ENV=production
      cd /opt/app
      ./deploy.sh
```
### Multiple Commands

```yaml
steps:
  - name: remote-script
    executor:
      type: ssh
      config:
        user: admin
        host: backup.server.com
        key: ${SSH_KEY_PATH}
    script: |
      #!/bin/bash
      set -e
      echo "Starting backup..."
      tar -czf /backup/app-$(date +%Y%m%d).tar.gz /var/www
      echo "Cleaning old backups..."
      find /backup -name "app-*.tar.gz" -mtime +7 -delete
      echo "Backup complete"
```
## HTTP Executor

> **INFO:** For detailed HTTP executor documentation, see the HTTP Executor Guide.

Make HTTP requests to APIs and web services.

### GET Request

```yaml
steps:
  - name: simple-get
    executor:
      type: http
      config:
        silent: true # Output body only
    command: GET https://api.example.com/status
```
### POST with Body

```yaml
steps:
  - name: post-json
    executor:
      type: http
      config:
        headers:
          Content-Type: application/json
          Authorization: Bearer ${API_TOKEN}
        body: |
          {
            "name": "test",
            "value": 123
          }
        timeout: 30
    command: POST https://api.example.com/data
```
### Query Parameters

```yaml
steps:
  - name: search-api
    executor:
      type: http
      config:
        query:
          q: "dagu workflow"
          limit: "10"
          offset: "0"
        silent: true
    command: GET https://api.example.com/search
```
### Form Data

```yaml
steps:
  - name: form-submit
    executor:
      type: http
      config:
        headers:
          Content-Type: application/x-www-form-urlencoded
        body: "username=user&password=pass&remember=true"
    command: POST https://example.com/login
```
### Complete HTTP Example

```yaml
steps:
  - name: api-workflow
    executor:
      type: http
      config:
        headers:
          Accept: application/json
          X-API-Key: ${API_KEY}
        timeout: 60
        silent: true # Capture the body only, so the output is valid JSON
    command: GET https://api.example.com/data
    output: API_RESPONSE

  - name: process-response
    command: echo "${API_RESPONSE}" | jq '.data[]'
    depends: api-workflow
```
## Mail Executor

> **INFO:** For detailed Mail executor documentation, see the Mail Executor Guide.

Send emails for notifications and alerts.

### Basic Email

```yaml
smtp:
  host: smtp.gmail.com
  port: "587"
  username: [email protected]
  password: ${SMTP_PASSWORD}

steps:
  - name: send-notification
    executor:
      type: mail
      config:
        to: [email protected]
        from: [email protected]
        subject: "Workflow Completed"
        message: "The data processing workflow has completed successfully."
```
### With Attachments

```yaml
steps:
  - name: send-report
    executor:
      type: mail
      config:
        to: [email protected]
        from: [email protected]
        subject: "Daily Report - ${TODAY}"
        message: |
          Please find attached the daily report.
          Generated at: ${TIMESTAMP}
        attachments:
          - /tmp/daily-report.pdf
          - /tmp/summary.csv
```
### Multiple Recipients

```yaml
steps:
  - name: alert-team
    executor:
      type: mail
      config:
        to:
          - [email protected]
          - [email protected]
          - [email protected]
        from: [email protected]
        subject: "[ALERT] Process Failed"
        message: |
          The critical process has failed.
          Error: ${ERROR_MESSAGE}
          Time: ${TIMESTAMP}
```
### HTML Email

```yaml
steps:
  - name: send-html
    executor:
      type: mail
      config:
        to: [email protected]
        from: [email protected]
        subject: "Weekly Stats"
        contentType: text/html
        message: |
          <html>
            <body>
              <h2>Weekly Statistics</h2>
              <p>Users: <strong>${USER_COUNT}</strong></p>
              <p>Revenue: <strong>${REVENUE}</strong></p>
            </body>
          </html>
```
## JQ Executor

> **INFO:** For detailed JQ executor documentation, see the JQ Executor Guide.

Process and transform JSON data using jq syntax.

### Format JSON

```yaml
steps:
  - name: pretty-print
    executor: jq
    script: |
      {"name":"test","values":[1,2,3],"nested":{"key":"value"}}
```

Output:

```json
{
  "name": "test",
  "values": [1, 2, 3],
  "nested": {
    "key": "value"
  }
}
```
### Query JSON

```yaml
steps:
  - name: extract-value
    executor: jq
    command: '.data.users[] | select(.active == true) | .email'
    script: |
      {
        "data": {
          "users": [
            {"id": 1, "email": "[email protected]", "active": true},
            {"id": 2, "email": "[email protected]", "active": false},
            {"id": 3, "email": "[email protected]", "active": true}
          ]
        }
      }
```

Output:

```json
"[email protected]"
"[email protected]"
```
### Transform JSON

```yaml
steps:
  - name: transform-data
    executor: jq
    command: '{id: .id, name: .name, total: (.items | map(.price) | add)}'
    script: |
      {
        "id": "order-123",
        "name": "Test Order",
        "items": [
          {"name": "Item 1", "price": 10.99},
          {"name": "Item 2", "price": 25.50},
          {"name": "Item 3", "price": 5.00}
        ]
      }
```

Output:

```json
{
  "id": "order-123",
  "name": "Test Order",
  "total": 41.49
}
```
### Complex Processing

```yaml
steps:
  - name: analyze-logs
    executor: jq
    command: |
      group_by(.level) |
      map({
        level: .[0].level,
        count: length,
        messages: map(.message)
      })
    script: |
      [
        {"level": "ERROR", "message": "Connection failed"},
        {"level": "INFO", "message": "Process started"},
        {"level": "ERROR", "message": "Timeout occurred"},
        {"level": "INFO", "message": "Process completed"}
      ]
```
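For the sample input, this produces one summary object per level:

```json
[
  {
    "level": "ERROR",
    "count": 2,
    "messages": ["Connection failed", "Timeout occurred"]
  },
  {
    "level": "INFO",
    "count": 2,
    "messages": ["Process started", "Process completed"]
  }
]
```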
## DAG Executor

> **INFO:** The DAG executor allows running other workflows as steps. This is documented in the Nested Workflows section.

Execute other workflows as steps, enabling workflow composition.

### Execute External DAG

```yaml
steps:
  - name: run-etl
    executor: dag
    command: workflows/etl-pipeline.yaml
    params: "DATE=${TODAY} ENV=production"
```
### Execute Local DAG

```yaml
name: main-workflow
steps:
  - name: prepare-data
    executor: dag
    command: data-prep
    params: "SOURCE=/data/raw"

---

name: data-prep
params:
  - SOURCE: /tmp
steps:
  - name: validate
    command: validate.sh ${SOURCE}
  - name: clean
    command: clean.py ${SOURCE}
```
### Capture DAG Output

```yaml
steps:
  - name: analyze
    executor: dag
    command: analyzer.yaml
    params: "FILE=${INPUT_FILE}"
    output: ANALYSIS

  - name: use-results
    command: |
      echo "Status: ${ANALYSIS.outputs.status}"
      echo "Count: ${ANALYSIS.outputs.record_count}"
    depends: analyze
```
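For `${ANALYSIS.outputs.status}` and `${ANALYSIS.outputs.record_count}` to resolve, the child DAG must produce those output variables itself. A hypothetical sketch of `analyzer.yaml` (the `analyze.sh` script and both variable names are illustrative, not taken from the Dagu docs):

```yaml
name: analyzer
params:
  - FILE: /dev/null
steps:
  - name: analyze-file
    command: analyze.sh ${FILE}   # hypothetical: prints a status word
    output: status
  - name: count-records
    command: wc -l < ${FILE}
    output: record_count
```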
### Error Handling

```yaml
steps:
  - name: may-fail
    executor: dag
    command: risky-process.yaml
    continueOn:
      failure: true
    retryPolicy:
      limit: 3
      intervalSec: 300
```
### Dynamic DAG Selection

```yaml
steps:
  - name: choose-workflow
    command: |
      if [ "${ENVIRONMENT}" = "prod" ]; then
        echo "production-workflow.yaml"
      else
        echo "staging-workflow.yaml"
      fi
    output: WORKFLOW_FILE

  - name: run-selected
    executor: dag
    command: ${WORKFLOW_FILE}
    params: "ENV=${ENVIRONMENT}"
    depends: choose-workflow
```
## See Also

- Shell Executor - Shell command execution details
- Docker Executor - Container execution guide
- SSH Executor - Remote execution guide
- HTTP Executor - API interaction guide
- Mail Executor - Email notification guide
- JQ Executor - JSON processing guide
- Writing Workflows - Using executors in workflows
- Examples - Real-world executor usage