Production-Ready MySQL Backup Script (with Compression and AWS Upload)

🗄️ Automate MySQL Backups to AWS S3 with a Simple Bash Script

Backups are one of those things you don’t think about… until it’s too late.

I wanted a simple solution for MySQL backups, with no dependencies beyond Bash, mysqldump, zip, and the AWS CLI, that I could run via cron, store safely on AWS S3, and clean up automatically.

So I wrote a Bash script that does exactly that — checks versions, loads environment variables, dumps the DB, compresses it, uploads to S3, and cleans up afterward.

Here’s the full script, plus a breakdown of common pitfalls and how to fix them. 👇


💾 Script

#!/bin/bash
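# Fail fast: exit on errors (-e) and unset variables (-u), propagate pipeline
# failures (-o pipefail), and split words only on newlines and tabs.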
set -euo pipefail
IFS=$'\n\t'

DIRNAME=$(dirname "$0")
TIMESTAMP=$(date +"%Y-%m-%d_%H-%M-%S")

echo "Starting database backup for $TIMESTAMP";

check_versions() {
    aws --version
    mysqldump --version
}

load_env() {
    echo "Loading .env file";

    ENV_FILE="$DIRNAME/../.env"
    [[ -f "$ENV_FILE" ]] || { printf "%s: file not found\n" "$ENV_FILE" >&2; exit 1; }
    source "$ENV_FILE"

    echo "Database name is: $DB_DATABASE"
}

check_variables() {
    if [[ ! -v AWS_DB_BACKUP_BUCKET ]]; then
        echo "Error: AWS_DB_BACKUP_BUCKET environment variable is not set!"
        exit 1;
    fi
}

create_dump() {
    echo "Creating database dump...";
    OUTPUT_FILE="$PWD/$TIMESTAMP-$DB_DATABASE.sql"

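    # --column-statistics=0 avoids "Unknown table 'COLUMN_STATISTICS'" errors when a
    # newer mysqldump talks to an older server; --no-tablespaces skips tablespace DDL
    # (so no PROCESS privilege is needed); --set-gtid-purged=OFF keeps GTID statements
    # out of the dump so it restores cleanly on another server.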
    mysqldump "$DB_DATABASE" \
        --column-statistics=0 \
        --no-tablespaces \
        --set-gtid-purged=OFF \
        --host="$DB_HOST" \
        --user="$DB_USERNAME" \
        --password="$DB_PASSWORD" > "$OUTPUT_FILE"

    echo "$OUTPUT_FILE file created."
    du -sh "$OUTPUT_FILE"
}

zip_output_file() {
    echo "Compressing target file...";

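    # -j stores just the file name (no directory path); -T tests the archive after creation.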
    zip -j -T "$OUTPUT_FILE.zip" "$OUTPUT_FILE"
    du -sh "$OUTPUT_FILE.zip"
}

upload_to_s3() {
    echo "Uploading zip file to S3 bucket...";

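    # Group backups into per-year folders inside the bucket.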
    CURRENT_YEAR=$(date +"%Y")
    S3_DESTINATION_PATH="s3://$AWS_DB_BACKUP_BUCKET/$CURRENT_YEAR/"
    aws s3 cp "$OUTPUT_FILE.zip" "$S3_DESTINATION_PATH"
}

cleanup() {
    echo "Removing temp files from local disk";

    rm -f "$OUTPUT_FILE.zip"
    rm -f "$OUTPUT_FILE"
}

check_versions
load_env
check_variables
create_dump
zip_output_file
upload_to_s3
cleanup

echo "Database backup script finished!"

⚙️ Common Pitfalls / Troubleshooting

Here are a few common issues you might encounter and how to fix them:


🧾 Missing .env file

The script will exit with “file not found.”

✅ Make sure your .env file exists and contains:

DB_HOST=localhost
DB_DATABASE=my_database
DB_USERNAME=root
DB_PASSWORD=secret
AWS_DB_BACKUP_BUCKET=my-s3-bucket

🧰 mysqldump: command not found

✅ Install the MySQL client package (it provides mysqldump) and make sure it's on your PATH.
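
A minimal sketch for a Debian/Ubuntu host (package names differ by distribution and MySQL version, so adjust for your system):

sudo apt-get update
sudo apt-get install -y mysql-client
mysqldump --version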

☁️ aws: command not found

✅ Install the AWS CLI and configure credentials that can write to your backup bucket.
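
A sketch using the official AWS CLI v2 installer on Linux x86_64 (check the AWS docs for other platforms), then configure credentials:

curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
aws configure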

That’s it! You now have a simple, reliable, and automated MySQL backup solution using nothing more than Bash and AWS CLI.
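
To make it fully hands-off, schedule the script with cron. Here's a sketch assuming the script is saved as scripts/db-backup.sh inside your project (one directory below the .env file, which is where the script looks for it); adjust the path, schedule, and log location for your setup:

# Run the backup every day at 02:00 and append all output to a log file
0 2 * * * /var/www/my-project/scripts/db-backup.sh >> /var/log/db-backup.log 2>&1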

If you found this helpful, give it a ❤️

Happy automating! 🚀
