Bash script for daily backups from PostgreSQL Docker containers
How I back up PostgreSQL databases from Docker containers.
I use Docker with Docker Compose for my own apps. A few of those apps use PostgreSQL, and a couple of years ago I was looking for an easy, automated way to back up the databases from those containers.
I found a few solutions, but they were all too complicated, inefficient, or harder to restore, or they relied on the host machine having PostgreSQL installed (which I didn't want, because my apps run different PostgreSQL versions). So I wrote a script myself!
I've been using this bash script to back up the databases from the containers. It's a simple script that runs every day at a specific time (via a simple cron job) and backs up the databases to a specific directory (via pg_dump), without any passwords in the script. It keeps 14 days of backups, compresses them, and cleans up old backups. Oh, and it supports multiple containers!
Here it is:
backup.sh
#!/bin/bash
#
# Backup Postgresql databases into daily files for multiple containers.
#
# Base backup settings
BASE_BACKUP_DIR=/home/www/backups
DAYS_TO_KEEP=14
FILE_SUFFIX=_backup.sql
# App container names
declare -A CONTAINERS=(
  ["amazing-app"]="/home/www/apps/amazing-app"
  ["another-app"]="/home/www/apps/another-app"
)
# Common PostgreSQL settings
POSTGRES_USER=postgres
POSTGRES_CONTAINER_USER=postgres
# Function to backup a single container
backup_container() {
  local container_name=$1
  local container_dir=$2
  local backup_dir="${BASE_BACKUP_DIR}/${container_name}"
  local file
  file=$(date +"%Y%m%d%H%M")${FILE_SUFFIX}
  local output_file="${backup_dir}/${file}"
  local postgres_container="${container_name}-postgresql-1"
  local postgres_db="${container_name}"

  # Create backup directory if it doesn't exist
  mkdir -p "${backup_dir}"

  # Do the database backup (dump); bail out without pruning if the dump fails,
  # so a broken container doesn't leave an empty file or eat the old backups
  if ! cd "${container_dir}" || ! docker exec -u "${POSTGRES_CONTAINER_USER}" "${postgres_container}" pg_dump --dbname="${postgres_db}" --username="${POSTGRES_USER}" > "${output_file}"; then
    echo "Backup of ${container_name} failed" >&2
    rm -f "${output_file}"
    return 1
  fi

  # gzip the database dump file
  gzip "${output_file}"

  # Show the result
  echo "${output_file}.gz was created:"
  ls -l "${output_file}.gz"

  # Prune old backups for this container
  find "${backup_dir}" -maxdepth 1 -mtime +"${DAYS_TO_KEEP}" -name "*${FILE_SUFFIX}.gz" -exec rm -f '{}' ';'
}
for container_name in "${!CONTAINERS[@]}"; do
echo "Backing up ${container_name}..."
backup_container "${container_name}" "${CONTAINERS[$container_name]}"
done
echo "Done backing up databases."
## Restore (pg_dump's default output is plain SQL, so restore with psql, not pg_restore):
# (create the empty database first if it doesn't exist)
# gunzip -c /home/www/backups/${container_name}/<timestamp>${FILE_SUFFIX}.gz | \
#   docker exec -u ${POSTGRES_CONTAINER_USER} -i ${postgres_container} psql --dbname="${postgres_db}" --username="${POSTGRES_USER}"
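The pruning line deserves a quick note: `find -mtime +14` matches files modified more than 14 full 24-hour periods ago, so backups actually survive about 15 days. Here's a small, self-contained demo of that retention logic (the file names and temporary directory are just for illustration; `touch -d` is GNU coreutils):

```shell
#!/bin/bash
# Demo of the pruning logic: files older than DAYS_TO_KEEP full days (by mtime)
# match -mtime +N and get deleted; newer ones survive.
DAYS_TO_KEEP=14
FILE_SUFFIX=_backup.sql
backup_dir=$(mktemp -d)

touch -d "20 days ago" "${backup_dir}/old${FILE_SUFFIX}.gz"  # should be pruned
touch -d "2 days ago"  "${backup_dir}/new${FILE_SUFFIX}.gz"  # should survive

# Same command as in the script above
find "${backup_dir}" -maxdepth 1 -mtime +"${DAYS_TO_KEEP}" -name "*${FILE_SUFFIX}.gz" -exec rm -f '{}' ';'

ls "${backup_dir}"  # only the 2-day-old file remains
rm -rf "${backup_dir}"
```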
Obviously there are things you could improve, like adding support for multiple databases per container, multiple PostgreSQL container names (per app), and multiple PostgreSQL users. I don't need that, so it would be unnecessary complexity for me.
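If you do need the multi-database variant, here's a minimal sketch of how I'd approach it — a function that loops over several database names for one container. Everything here is hypothetical (the container and database names are made up, and DRY_RUN=1 just prints the docker commands so you can sanity-check before wiring it in):

```shell
#!/bin/bash
# Hypothetical extension: back up several databases from one container.
# With DRY_RUN=1 the docker commands are printed instead of executed.
DRY_RUN=1
POSTGRES_USER=postgres
POSTGRES_CONTAINER_USER=postgres

backup_databases() {
  local postgres_container=$1
  shift
  local db
  for db in "$@"; do
    local cmd="docker exec -u ${POSTGRES_CONTAINER_USER} ${postgres_container} pg_dump --dbname=${db} --username=${POSTGRES_USER}"
    if [ "${DRY_RUN}" = "1" ]; then
      echo "${cmd} > ${db}_$(date +%Y%m%d%H%M)_backup.sql"
    else
      ${cmd} > "${db}_$(date +%Y%m%d%H%M)_backup.sql"
    fi
  done
}

# Example: two databases in one (made-up) container
backup_databases "amazing-app-postgresql-1" "amazing-app" "amazing-app-analytics"
```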
I run this with a simple cronjob:
# Every day at 3:07am
7 3 * * * /home/www/scripts/backup.sh > /home/www/logs/backup.log 2>&1
This keeps a nice log of the last daily backup in /home/www/logs/backup.log.
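Since cron fails silently, I'd also suggest a simple freshness probe — my own addition, not part of the script above: it returns non-zero if no backup newer than 24 hours exists in a directory, so you can run it from a second cron job or a monitoring check.

```shell
#!/bin/bash
# Freshness check: succeeds only if the directory contains a backup
# modified within the last 24 hours.
backup_is_fresh() {
  local backup_dir=$1
  find "${backup_dir}" -maxdepth 1 -name "*_backup.sql.gz" -mtime -1 2>/dev/null | grep -q .
}

# Example, using a path from the script above:
if backup_is_fresh /home/www/backups/amazing-app; then
  echo "fresh backup found"
else
  echo "no backup in the last 24h"
fi
```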
I hope this helps you.
Thank you so much for being here. I really appreciate you!