
Alan Varghese

Automate Your Backups Like a Pro: A Robust Bash Script for DevOps Enthusiasts

In the world of DevOps, there's a golden rule: If it’s not backed up, it doesn’t exist.

While there are many enterprise-grade backup solutions available, sometimes you need something lightweight, highly customizable, and easy to integrate into your existing workflows. That's where a well-crafted Bash script comes in.

In this post, I'll walk you through a Robust File Backup Script I built that handles compression, remote transfers, retention policies, and detailed logging—all while following core DevOps principles.


Key Features

Our backup script isn't just a simple cp command. It's designed to be production-ready with features like:

  1. Smart Compression: Uses tar and gzip to minimize storage space.
  2. Configuration Decoupling: All settings live in a separate backup.conf file (Infrastructure as Code lite).
  3. Dry Run Mode: A -d flag to see exactly what would happen without actually doing it.
  4. Automatic Retention: Keeps your disk clean by deleting local backups older than N days.
  5. Secure Remote Transfer: Optionally sends your archives to a remote server via scp.
  6. Comprehensive Logging: Every action, warning, and error is timestamped and logged for auditing.
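As a rough sketch of the compression feature, the archive step might look like the snippet below. The variable names (`SOURCE_PATHS`, `BACKUP_DIR`) mirror the `backup.conf` example later in the post, and the demo paths under `/tmp` are mine so the sketch runs standalone:

```shell
#!/usr/bin/env bash
# Sketch: create a timestamped gzip-compressed archive of the source paths.
# Demo paths under /tmp are placeholders; real sources come from backup.conf.
set -euo pipefail

SOURCE_PATHS=("/tmp/demo-src")
BACKUP_DIR="/tmp/demo-backups"
mkdir -p "$BACKUP_DIR" "${SOURCE_PATHS[@]}"

# backup_YYYY-MM-DD_HHMMSS.tar.gz matches the backup_*.tar.gz retention pattern
BACKUP_NAME="backup_$(date +%F_%H%M%S).tar.gz"

# -c create, -z gzip-compress, -f write to this file
tar -czf "$BACKUP_DIR/$BACKUP_NAME" "${SOURCE_PATHS[@]}"
echo "Created $BACKUP_DIR/$BACKUP_NAME"
```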

🛠 The Technical Breakdown

1. Separation of Concerns

We keep our logic (backup.sh) separate from our settings (backup.conf). This makes the script portable across different environments (Dev, Stage, Prod) without modification.

backup.conf example:

SOURCE_PATHS=(
    "/var/www/html"
    "/etc/nginx/conf.d"
)
BACKUP_DIR="./backups"
RETENTION_DAYS=7
ENABLE_REMOTE="true"
REMOTE_HOST="backup-server.local"
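The main script can then load this file with a fail-fast guard. This is my own sketch of that step (the demo config written to `/tmp` is only there so the snippet runs as-is):

```shell
#!/usr/bin/env bash
# Demo setup: write a minimal config so this sketch is runnable standalone
cat > /tmp/demo-backup.conf <<'EOF'
SOURCE_PATHS=("/etc/hosts")
BACKUP_DIR="/tmp/demo-backups"
RETENTION_DAYS=7
EOF

CONFIG_FILE="${CONFIG_FILE:-/tmp/demo-backup.conf}"

# Fail fast if the config is missing rather than running with empty settings
if [ ! -f "$CONFIG_FILE" ]; then
    echo "ERROR: config file '$CONFIG_FILE' not found" >&2
    exit 1
fi

# shellcheck source=/dev/null
source "$CONFIG_FILE"
echo "Loaded ${#SOURCE_PATHS[@]} source path(s); backups go to $BACKUP_DIR"
```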

2. Flexible Argument Parsing

Using getopts, the script feels like a professional CLI tool. You can specify custom config files or trigger a dry run easily.

while getopts ":c:dh" opt; do
    case ${opt} in
        c ) CONFIG_FILE=$OPTARG ;;
        d ) DRY_RUN=true ;;
        h ) usage ;;
        : ) echo "Option -$OPTARG requires an argument" >&2; exit 1 ;;
        \? ) echo "Invalid option: -$OPTARG" >&2; exit 1 ;;
    esac
done
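The `usage` function invoked by `-h` could be as simple as a heredoc; the exact help text below is my own wording:

```shell
#!/usr/bin/env bash
# Sketch of the usage() helper referenced by the -h flag
usage() {
    cat <<'EOF'
Usage: backup.sh [-c config_file] [-d] [-h]
  -c FILE   Use an alternate config file (default: backup.conf)
  -d        Dry run: show what would happen without doing it
  -h        Show this help and exit
EOF
}

usage
```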

3. The Power of find for Retention

Managing disk space is crucial. We use find with the -mtime flag to identify and remove old archives automatically.

find "$BACKUP_DIR" -type f -name "backup_*.tar.gz" -mtime +"$RETENTION_DAYS" -exec rm {} \;
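Retention pairs naturally with the dry-run flag: print what would be deleted, and only `rm` when `DRY_RUN` is false. A sketch of that combination (the demo file and `touch -d` backdating, which needs GNU coreutils, are mine):

```shell
#!/usr/bin/env bash
# Sketch: dry-run-aware retention. Creates a backdated demo file so it
# runs standalone; real runs operate on actual archives in BACKUP_DIR.
BACKUP_DIR="/tmp/demo-retention"
RETENTION_DAYS=7
DRY_RUN=true

mkdir -p "$BACKUP_DIR"
touch "$BACKUP_DIR/backup_old.tar.gz"
# Backdate the file 10 days so it falls outside the retention window
touch -d '10 days ago' "$BACKUP_DIR/backup_old.tar.gz"

if [ "$DRY_RUN" = true ]; then
    echo "[DRY RUN] Would delete:"
    find "$BACKUP_DIR" -type f -name "backup_*.tar.gz" -mtime +"$RETENTION_DAYS" -print
else
    find "$BACKUP_DIR" -type f -name "backup_*.tar.gz" -mtime +"$RETENTION_DAYS" -exec rm -f {} \;
fi
```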

4. Secure Remote Offloading

A backup on the same disk isn't a true backup. Our script supports secure transfer to a remote host:

scp "$BACKUP_DIR/$BACKUP_NAME" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_PATH"
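Because `scp` failures inside a cron job are easy to miss, it's worth checking the exit status and logging the result. In this sketch the transfer is swapped for a local `cp` purely so it runs without a remote host; the `transfer` helper and log wording are my own, not the script's:

```shell
#!/usr/bin/env bash
# Sketch: wrap the remote transfer in an exit-status check.
REMOTE_USER="backup"
REMOTE_HOST="backup-server.local"
REMOTE_PATH="/srv/backups"

transfer() {
    # Real version: scp "$1" "$REMOTE_USER@$REMOTE_HOST:$REMOTE_PATH"
    cp "$1" /tmp/demo-remote/   # local stand-in so the sketch is runnable
}

mkdir -p /tmp/demo-remote
echo demo > /tmp/demo-archive.tar.gz

if transfer /tmp/demo-archive.tar.gz; then
    echo "INFO: remote transfer succeeded"
else
    echo "ERROR: remote transfer failed" >&2
    exit 1
fi
```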

📖 Lessons Learned & DevOps Principles

Building this project reinforced several key concepts:

  • Automation over Manual Work: Human error is the #1 cause of data loss. Automating the backup removes the "I forgot" factor.
  • Idempotency & Resilience: The script checks if directories exist and if source paths are valid before starting.
  • Visibility: "In DevOps, if it wasn't logged, it didn't happen." Detailed logs are essential for debugging scheduled cron jobs.
  • Safety First: The Dry Run mode is a lifesaver when testing new configurations on a production server.
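The timestamped logging mentioned above can be a small helper function; the timestamp format and log-file path below are my choices, not necessarily the script's:

```shell
#!/usr/bin/env bash
# Sketch: timestamped logger writing to stdout and a log file simultaneously
LOG_FILE="/tmp/demo-backup.log"

log() {
    # Emits lines like "2024-05-01 02:00:13 [INFO] Backup started"
    local level="$1"; shift
    echo "$(date '+%F %T') [$level] $*" | tee -a "$LOG_FILE"
}

log INFO  "Backup started"
log ERROR "Example error entry"
```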

How to Use It

  1. Clone the Repo: git clone https://github.com/alanvarghese-dev/Bash_Scripting
  2. Configure: Edit backup.conf with your paths.
  3. Make it Executable: chmod +x backup.sh
  4. Test it: ./backup.sh -d (Dry run)
  5. Run it: ./backup.sh
  6. Schedule it: Add it to your crontab to run nightly!
0 2 * * * /path/to/backup.sh >> /path/to/logs/cron.log 2>&1

Future Enhancements

  • Adding AWS S3 support using the AWS CLI.
  • Implementing Slack/Email notifications on failure.
  • Adding Checksum verification to ensure data integrity after transfer.
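For the checksum idea, a `sha256sum` round-trip would do the job. A minimal sketch, assuming GNU coreutils and using a local `cp` as a stand-in for the real transfer:

```shell
#!/usr/bin/env bash
# Sketch: verify integrity by comparing checksums before and after transfer
set -euo pipefail
echo demo-data > /tmp/demo-src.tar.gz
cp /tmp/demo-src.tar.gz /tmp/demo-dst.tar.gz   # stand-in for the scp transfer

src_sum=$(sha256sum /tmp/demo-src.tar.gz | awk '{print $1}')
dst_sum=$(sha256sum /tmp/demo-dst.tar.gz | awk '{print $1}')

if [ "$src_sum" = "$dst_sum" ]; then
    echo "Checksums match: transfer verified"
else
    echo "Checksum mismatch!" >&2
    exit 1
fi
```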

What does your backup strategy look like? Do you prefer simple scripts or complex tools? Let’s discuss in the comments! 👇

https://github.com/alanvarghese-dev/Bash_Scripting

https://www.linkedin.com/in/alanvarghese-dev
