


Automatically back up and upload to S3

You can read this article in Spanish here


Because who doesn't need a database backup? ;)


For my personal project, FacturApp, I use PostgreSQL because it's awesome and also because it's free. From time to time I need to restore the prod db to my local instance to have real data and make tests more reliable.

Because I love automation, I decided to set up a daily backup just in case, so whenever I need the current data I can simply download the file from S3.
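Restoring is just the reverse: download the dump from S3 and feed it to pg_restore. A minimal sketch, assuming the same bucket, user, and db names used in the script further down; the timestamp in the filename is just an example:

# Download a specific backup from the bucket (timestamp is an example)
aws s3 cp s3://db-backup-bucket/facturapp_20240101000000.backup .
# Restore the custom-format dump into the local database,
# dropping existing objects first
pg_restore -U theuser -d thedb --clean --if-exists facturapp_20240101000000.backup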



The TL;DR would be:

  1. Find the container's ID
  2. Save the current time in a variable called now
  3. Save the db backup, using the now variable in the filename
  4. Upload the backup to S3
  5. Delete local files older than 30 days

The Code

#!/bin/bash
# Find the running Postgres container's ID
docker_id="$(docker ps -aqf 'name=postgres')"
echo "${docker_id}"
# Timestamp used in the backup filename
now=$(date +%Y%m%d%H%M%S)
# Dump the database in custom format (-Fc) into the local backups directory
docker exec "${docker_id}" /usr/local/bin/pg_dump -U theuser -Fc thedb > /root/db_daily/facturapp_${now}.backup
# Upload the file to S3 with server-side encryption
/usr/local/bin/aws s3 cp /root/db_daily/facturapp_${now}.backup s3://db-backup-bucket/facturapp_${now}.backup --sse AES256
# Delete local backups older than 30 days
find /root/db_daily -type f -mtime +30 -exec rm {} \;
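Before scheduling it, it's worth running the script once by hand and checking that the file actually lands in the bucket. A quick check, assuming the script is saved as /root/dailybackup and uses the db-backup-bucket name from above:

# Run the backup script manually
/root/dailybackup
# List the most recent objects in the bucket
/usr/local/bin/aws s3 ls s3://db-backup-bucket/ | tail -n 5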

Automate it!

The final step is to create a cron job by running crontab -e and adding something like
0 0 * * * /root/dailybackup >> /root/logs/backup.log
so the script runs every day at midnight.
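For cron to run it, the script at /root/dailybackup (the path from the cron line above, assuming that's where you saved it) also needs to be executable:

chmod +x /root/dailybackup
# Double-check the new entry was saved
crontab -l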

And that's how I save a backup to an S3 bucket every day at midnight :)

The >> /root/logs/backup.log part is just there in case you want to keep a log of each run.
