Schedule Jobs
Learn how to automate tasks using Dokploy's Schedule Jobs feature
Schedule Jobs in Dokploy allows you to create and manage automated tasks that run on a specified schedule using cron expressions. Each job execution creates a log entry where you can monitor the output and execution status.
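If you are not familiar with cron expressions, the five fields of a standard expression are minute, hour, day of month, month, and day of week. A few generic examples (plain cron syntax, nothing Dokploy-specific):

```
# minute hour day-of-month month day-of-week
*/15 * * * *   # every 15 minutes
0 * * * *      # at the start of every hour
0 3 * * *      # every day at 03:00
0 2 * * 0      # every Sunday at 02:00
```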
Job Types
Dokploy supports four types of scheduled jobs:
- Application Jobs: Run commands inside specific application containers
- Compose Jobs: Execute commands in Docker Compose services
- Server Jobs: Run scripts on remote servers (executed on the host)
- Dokploy Server Jobs: Execute tasks at the container level within the Dokploy container. These jobs can interact with commands inside the Dokploy container (e.g., `docker ps`, `docker image prune`), but they are not executed directly on the host system
Container-based Jobs (Application and Compose)
For application and compose jobs, you can run single commands that will be executed inside the target container. Dokploy internally uses `docker exec` to run these commands:

```bash
docker exec -it <container_id> <command>
```

Example
Assuming you have an nginx container and you want to check the nginx version inside it:
- Create a new schedule job
- Set the command to: `nginx -v`
- Configure your desired schedule using cron syntax
- Save and monitor the execution logs
The target container must be running for the job to execute successfully.
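If you want to verify a command manually before scheduling it (see Best Practices below), you can run the equivalent `docker exec` yourself on the server. This is only a sketch; the `--filter "name=nginx"` lookup assumes your container name contains "nginx":

```bash
# Look up the running nginx container and check its version,
# mirroring what the scheduled job will execute inside the container.
container_id=$(docker ps --filter "name=nginx" --format "{{.ID}}")
docker exec -it "$container_id" nginx -v
```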
For Docker Compose jobs, do not change the `COMPOSE_PROJECT_NAME` environment variable, since it is used to identify the project.
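As background (this is plain Docker Compose behavior, not Dokploy's internal implementation), the project name is what links a Compose service to its running containers, which you can see through the labels Compose sets:

```bash
# List the containers that belong to a given Compose project.
# Replace <project_name> with the value of COMPOSE_PROJECT_NAME.
docker ps --filter "label=com.docker.compose.project=<project_name>" \
  --format "table {{.Names}}\t{{.Status}}"
```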
Server-based Jobs (Server and Dokploy Server)
Server Jobs
For remote servers, you can write bash scripts to perform various tasks. These scripts are executed directly on the host system and can use any command or tool available on the target server.
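For example, a server job could be a plain bash script that reports disk usage and removes old rotated logs. This is only a sketch; the path and retention period are placeholders to adapt to your server:

```bash
#!/bin/bash
# Report disk usage for the root filesystem
df -h /

# Delete rotated logs older than 14 days (example path and retention)
find /var/log -name "*.gz" -type f -mtime +14 -delete
```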
Dokploy Server Jobs
Dokploy Server Jobs are executed at the container level within the Dokploy container. This means:
- Commands run inside the Dokploy container environment
- You can interact with Docker commands (e.g., `docker ps`, `docker image prune`, `docker system prune`)
- Scripts have access to the Docker socket and can manage containers and images
- Jobs do not execute directly on the host system, but within the containerized Dokploy environment
Example: You can create a scheduled job to clean up unused Docker images:
```bash
#!/bin/bash
docker image prune -af
```

This command will run inside the Dokploy container and can interact with Docker to clean up images.
Make sure any required dependencies are installed on the target server before using them in your scripts.
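One way to fail fast when a dependency is missing is to check for it at the top of the script. A minimal sketch, assuming the script later needs the `aws` CLI:

```bash
#!/bin/bash
# Abort early if a required tool is not available on this server
if ! command -v aws >/dev/null 2>&1; then
  echo "aws CLI is not installed" >&2
  exit 1
fi
```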
Example 1: Automatic Docker Cleanup
This script cleans up unused Docker resources (stopped containers, unused networks, dangling images, and build cache). You could schedule it to run every 15 minutes using the cron expression `*/15 * * * *`:
```bash
#!/bin/bash
docker system prune --force
```

Example 2: Custom Database Backup
You can create scripts to backup databases that aren't natively supported by Dokploy. Here's an example structure for a custom backup script:
```bash
#!/bin/bash
# Backup script for a custom database
backup_date=$(date +%Y%m%d_%H%M%S)
backup_file="database_${backup_date}.backup"

# Find the container name
container_name=$(docker ps --filter "name=clickhouse" --format "{{.Names}}")

# Add your backup commands here (avoid -t, since scheduled runs have no TTY attached)
docker exec "$container_name" clickhouse-client --query "BACKUP DATABASE mydb TO '/backups/$backup_file'"

# Upload to S3 (if needed)
# aws s3 cp /backups/$backup_file s3://your-bucket/backups/
```

Best Practices
- Always test your commands or scripts manually before scheduling them
- Use appropriate error handling in your scripts (see the sketch below)
- Consider the impact of scheduled jobs on system resources
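As a sketch of basic error handling for the scripts above, you can make a job stop at the first failure and log where it stopped, so the failure is easy to spot in the execution logs (the cleanup command is just an example):

```bash
#!/bin/bash
# Exit on errors, unset variables, and failures inside pipelines
set -euo pipefail

# Print a timestamped message so output lines up with the job logs
log() {
  echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*"
}

# Report the failing line number before the script exits
trap 'log "Job failed at line $LINENO"' ERR

log "Starting cleanup"
docker system prune --force
log "Cleanup finished"
```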