
Features

Deep dive into all Dagu capabilities.

Overview

Dagu provides a comprehensive set of features for building and managing workflows. This section covers everything you need to know about each feature in detail.

Feature Categories

🖥️ Interfaces

How to interact with Dagu:

  • Web UI - Monitor, start, and debug workflows from the browser
  • CLI - Start, stop, and inspect workflows from the terminal
  • REST API - Control Dagu programmatically from other systems

🔧 Executors

Different ways to run your commands:

  • Shell - Run any command or script (default)
  • Docker - Execute in containers for isolation
  • SSH - Run commands on remote servers
  • HTTP - Make REST API calls and webhook requests
  • Mail - Send email notifications and reports
  • JQ - Process and transform JSON data
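
A step selects where it runs through its executor field. A minimal sketch using the Docker executor (the image and autoRemove config keys follow the Docker executor docs; verify the exact schema against your Dagu version):

yaml
steps:
  - name: build
    executor:
      type: docker
      config:
        image: alpine:3     # container image the step runs in
        autoRemove: true    # remove the container when the step finishes
    command: echo "running in a container"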

⏰ Scheduling

Control when workflows run:

  • Cron expressions with timezone support
  • Multiple schedules per workflow
  • Start/stop/restart patterns
  • Skip redundant executions
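
A sketch combining these options (the CRON_TZ timezone prefix and the skipIfSuccessful flag are taken from Dagu's scheduling docs; verify against your version):

yaml
# Run every weekday at 09:00 Tokyo time
schedule: "CRON_TZ=Asia/Tokyo 0 9 * * 1-5"
skipIfSuccessful: true   # don't re-run if the last run already succeeded
steps:
  - name: daily-job
    command: ./run.sh

For start/stop patterns, schedule also accepts a mapping form with start: and stop: cron lines, which launches and halts a long-running DAG on a timetable.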

🚀 Execution Control

Advanced execution patterns:

  • Parallel execution with concurrency limits
  • Conditional execution (preconditions)
  • Continue on failure patterns
  • Retry and repeat policies
  • Output size management
  • Signal handling
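
A sketch of several of these controls on one step (retryPolicy, continueOn, and preconditions mirror Dagu's step schema; treat exact keys as version-dependent):

yaml
steps:
  - name: fetch
    command: ./fetch.sh
    preconditions:
      - condition: "`date +%u`"   # only run on Mondays
        expected: "1"
    retryPolicy:
      limit: 3          # retry up to 3 times on failure
      intervalSec: 30   # wait 30 seconds between attempts
    continueOn:
      failure: true     # downstream steps still run if this fails
  - name: report
    command: ./report.sh
    depends: fetch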

📊 Data Flow

Managing data in workflows:

  • Parameters and runtime values
  • Output variables between steps
  • Environment variable management
  • JSON path references
  • Template rendering
  • Special system variables
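
A sketch of data flowing between steps (params, env, and output are standard Dagu fields; the ${RESULT} reference style follows its variable syntax):

yaml
params: "DATE=2024-01-01"    # named parameter, overridable at start time
env:
  - DATA_DIR: /tmp/data      # environment variable visible to every step
steps:
  - name: extract
    command: echo "rows=100"
    output: RESULT           # capture stdout into a variable
  - name: load
    command: echo "loaded ${RESULT} for ${DATE}"
    depends: extract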

📋 Queue System

Workflow orchestration at scale:

  • Built-in queue management
  • Per-DAG queue assignment
  • Priority-based execution
  • Manual queue operations
  • Concurrency control
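
A sketch of per-DAG queue assignment (queue and maxActiveRuns are top-level fields in recent Dagu releases; names may differ across versions):

yaml
queue: "batch"      # route runs of this DAG to the "batch" queue
maxActiveRuns: 1    # at most one active run of this DAG at a time
steps:
  - name: heavy-job
    command: ./process.sh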

📧 Notifications

Stay informed about workflow status:

  • Email alerts on success/failure
  • Custom notification handlers
  • Log attachments
  • Flexible SMTP configuration
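
A sketch of mail configuration (the smtp, mailOn, and errorMail blocks follow Dagu's notification docs; attachLogs includes step logs in the message):

yaml
smtp:
  host: smtp.example.com
  port: "587"
  username: alerts@example.com
  password: ${SMTP_PASSWORD}   # expanded from the environment
mailOn:
  failure: true                # email when the DAG fails
errorMail:
  from: alerts@example.com
  to: oncall@example.com
  prefix: "[DAGU ERROR]"
  attachLogs: true             # attach step logs to the email
steps:
  - name: job
    command: ./job.sh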

Feature Highlights

Zero Dependencies

Unlike other workflow engines, Dagu requires:

  • ❌ No database
  • ❌ No message broker
  • ❌ No runtime dependencies
  • ✅ Just a single binary

Language Agnostic

Run anything that works on your system:

yaml
steps:
  - name: python
    command: python script.py
  - name: node
    command: npm run task
  - name: go
    command: go run main.go
  - name: bash
    command: ./script.sh

Hierarchical Workflows

Build complex systems from simple parts:

yaml
steps:
  - name: data-pipeline
    run: etl.yaml
    params: "DATE=today"
    
  - name: ml-training
    run: train.yaml
    depends: data-pipeline
    
  - name: deployment
    run: deploy.yaml
    depends: ml-training

Production Ready

Built for reliability:

  • Process Management: Proper signal handling and graceful shutdown
  • Error Recovery: Configurable retry policies and failure handlers
  • Logging: Comprehensive logs with stdout/stderr separation
  • Monitoring: Built-in metrics and health checks
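
For instance, failure and cleanup handlers can be declared per DAG with handlerOn (a sketch; the failure and exit hooks come from Dagu's lifecycle handler docs):

yaml
handlerOn:
  failure:
    command: ./notify-oncall.sh   # runs only when the DAG fails
  exit:
    command: ./cleanup.sh         # always runs when the DAG finishes
steps:
  - name: main
    command: ./main.sh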

Common Use Cases

Data Engineering

  • ETL pipelines with dependency management
  • Parallel data processing
  • Scheduled batch jobs
  • Data quality checks

DevOps Automation

  • CI/CD pipelines
  • Infrastructure provisioning
  • Backup and restore workflows
  • System maintenance tasks

Business Process Automation

  • Report generation
  • Data synchronization
  • Customer onboarding
  • Invoice processing

Performance Characteristics

Scalability

  • Handle thousands of concurrent workflows
  • Efficient file-based storage
  • Minimal memory footprint
  • Fast startup times

Limitations

  • Single-machine execution (no distributed mode)
  • 1MB default output limit per step
  • 1000 item limit for parallel execution
  • File system dependent

Getting Started with Features

  1. Start with the basics: Learn about Interfaces to interact with Dagu
  2. Choose your executor: Pick the right Executor for your tasks
  3. Add scheduling: Set up automatic execution
  4. Handle errors: Implement proper retry and error handling
  5. Scale up: Use queues for complex orchestration

Feature Comparison

Feature                 Dagu   Airflow   GitHub Actions   Cron
Local Development       ✅     ❌        ❌               ✅
Web UI                  ✅     ✅        ✅               ❌
Dependencies            ✅     ✅        ✅               ❌
Retries                 ✅     ✅        ❌               ❌
Parallel Execution      ✅     ✅        ✅               ❌
No External Services    ✅     ❌        ❌               ✅
Language Agnostic       ✅     ❌        ✅               ✅
