Alan Varghese

Automating Your Backups with a Simple Bash Script

Learn how to create a robust, automated file backup system using Bash, complete with compression, exclusions, and retention policies.

As developers, we know that data loss is a nightmare. Whether it's a hardware failure or an accidental rm -rf, having a reliable backup system is essential. While there are many complex tools available, sometimes a simple, customized Bash script is all you need to get the job done efficiently.

In this post, I'll walk you through a lightweight Bash script I built to automate file backups. It handles compression, excludes unnecessary files, and even manages disk space by deleting old archives.

The Problem

Manual backups are tedious and prone to human error. You might forget to run them, or you might end up filling your disk with redundant copies. I wanted a solution that:

  1. Is easy to trigger via CLI or cron.
  2. Compresses data to save space.
  3. Automatically cleans up old backups.
  4. Ignores "noise" like .git folders and temporary files.
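
Requirement #1 usually comes down to a crontab entry. As a sketch (the 2 a.m. schedule and the paths are made up for illustration):

```
# m h dom mon dow  command   (edit with: crontab -e)
0 2 * * * /home/alan/bin/backup.sh /home/alan/my_project /home/alan/backups
```

Cron runs with a minimal environment, so absolute paths are the safe choice here.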

The Solution: backup.sh

Here is the breakdown of the script's core functionality.

1. Robust Input Validation

Before doing anything, the script ensures that the user provided the necessary source and destination paths, and verifies that the source actually exists.

if [ $# -ne 2 ]; then
    echo "Usage: $0 <source_directory> <destination_directory>"
    exit 1
fi

SOURCE_DIR="$1"
DESTINATION_DIR="$2"

if [ ! -d "$SOURCE_DIR" ]; then
    echo "Error: Source directory '$SOURCE_DIR' does not exist."
    exit 1
fi
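
If you want to see these guard clauses in action without touching a real project, here's a sketch that writes the snippet above to a temp file and pokes it with bad arguments (all paths are illustrative):

```shell
set -u

# Save just the validation logic to a throwaway script
SCRIPT=$(mktemp)
cat > "$SCRIPT" <<'EOF'
if [ $# -ne 2 ]; then
    echo "Usage: $0 <source_directory> <destination_directory>"
    exit 1
fi

SOURCE_DIR="$1"
DESTINATION_DIR="$2"

if [ ! -d "$SOURCE_DIR" ]; then
    echo "Error: Source directory '$SOURCE_DIR' does not exist."
    exit 1
fi
echo "validation passed"
EOF

sh "$SCRIPT"                      # no args      -> usage message
sh "$SCRIPT" /no/such/dir /tmp    # bad source   -> error message
sh "$SCRIPT" /tmp /tmp            # both valid   -> validation passed
```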

2. Smart Directory Management

If your destination folder doesn't exist yet, the script handles it gracefully using mkdir -p.

if [ ! -d "$DESTINATION_DIR" ]; then
    mkdir -p "$DESTINATION_DIR"
fi

3. Efficient Compression with Exclusions

One of the most useful parts of this script is how it filters what gets archived. We don't want to back up heavy .git history or transient log files, so the script hands tar a set of --exclude patterns and only grabs what matters.

# Skip VCS metadata, log files, and temp files while archiving
tar -czvf "$DESTINATION_DIR/backup.tar.gz" \
    --exclude='.git' \
    --exclude='*log*' \
    --exclude='*temp*' \
    -C "$SOURCE_DIR" .
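
A quick way to sanity-check the exclusion flags is to rehearse them on a throwaway tree. A minimal sketch, assuming GNU tar (all paths are temporary and the file names are made up for illustration):

```shell
set -eu

# Build a disposable source tree with one "real" file and some noise
SRC=$(mktemp -d)
DST=$(mktemp -d)
mkdir -p "$SRC/.git" "$SRC/src"
echo 'int main(void){return 0;}' > "$SRC/src/main.c"
echo 'ref: refs/heads/main'      > "$SRC/.git/HEAD"
echo 'noise'                     > "$SRC/debug.log"

# Same exclusion flags as the script
tar -czf "$DST/backup.tar.gz" \
    --exclude='.git' \
    --exclude='*log*' \
    --exclude='*temp*' \
    -C "$SRC" .

# List what actually made it into the archive
LISTING=$(tar -tzf "$DST/backup.tar.gz")
echo "$LISTING"

rm -rf "$SRC" "$DST"
```

Only src/main.c should show up in the listing; .git/ and debug.log never enter the tarball.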

4. Automatic Retention Policy

To prevent your backup drive from filling up, the script automatically purges backup archives older than 30 days.

# Only target our own *.tar.gz archives, so unrelated files in the destination survive
find "$DESTINATION_DIR" -name '*.tar.gz' -type f -mtime +30 -delete
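
Before trusting -delete in production, it's worth rehearsing the retention rule in a sandbox. A rough sketch, using touch -t to backdate a dummy archive (temporary paths, made-up file names):

```shell
set -eu

DST=$(mktemp -d)
touch "$DST/new_backup.tar.gz"
touch "$DST/old_backup.tar.gz"

# Backdate one archive far beyond the 30-day window (fixed date, illustrative)
touch -t 202001010000 "$DST/old_backup.tar.gz"

# Same retention rule as the script: only the stale archive should go
find "$DST" -name '*.tar.gz' -type f -mtime +30 -delete

ls "$DST"
```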

How to Use It

  1. Clone the repo (or copy the script).
  2. Make it executable:
   chmod +x backup.sh
  3. Run a backup:
   ./backup.sh ~/my_project ~/backups/my_project_backups

Conclusion

This script is a great starting point for anyone looking to automate their workflow. It's modular, easy to read, and relies on standard Unix tools that are available on almost every system.

Next Steps: You could easily extend this by adding:

  • Timestamped filenames (e.g., backup_2023-10-27.tar.gz).
  • Email notifications on failure.
  • Remote syncing to S3 or a VPS using rsync.
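
The first of those extensions is nearly a one-liner; a sketch using date (the %F format, i.e. YYYY-MM-DD, is one option):

```shell
# Give each archive a date-stamped name, e.g. backup_2023-10-27.tar.gz
BACKUP_FILE="backup_$(date +%F).tar.gz"
echo "$BACKUP_FILE"
```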

What do you use for your personal backups? Let me know in the comments!

https://github.com/alanvarghese-dev/Bash_Scripting

https://www.linkedin.com/in/alanvarghese-dev

