DEV Community

Christiana Otoboh
Automating Backups to S3 with Bash, Crontab & AWS CLI as a Beginner

As part of a learning assignment, I was tasked with creating an automated backup system.
The goal was to:

  • Backup a specific local directory on my Linux machine.
  • Compress the backup as a .tar.gz file.
  • Upload it to an Amazon S3 bucket.
  • Log each step with timestamps.
  • Automate the process with a cron job.

While this sounds straightforward for an experienced DevOps engineer, I had zero knowledge of Linux scripting or AWS CLI when I started. What followed was a rollercoaster of trial, error, and growth. This post documents my journey, what I learned, the final working solution, and how I overcame some tough beginner mistakes.

My Learning Roadmap

1. Learning Bash Scripting

I started with learning how to write a basic bash script. I needed it to:

  • Define a source and destination path.
  • Use tar to compress the files.
  • Log each step with a timestamp.
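Those three pieces fit together in just a few lines. Here is a minimal sketch of that first script, using temporary placeholder directories instead of my real paths:

```shell
#!/bin/bash
# Minimal sketch: define paths, compress with tar, log with a timestamp.
# SOURCE and DEST are throwaway placeholder directories for illustration.
SOURCE=$(mktemp -d)
DEST=$(mktemp -d)
echo "hello" > "$SOURCE/file.txt"

NOW=$(date +"%Y-%m-%d_%H-%M-%S")
BACKUP="$DEST/backup_$NOW.tar.gz"

# Compress the source directory into a .tar.gz archive
tar -czf "$BACKUP" -C "$SOURCE" .

# Log the step with a timestamp
echo "[$NOW] Created $BACKUP" >> "$DEST/backup_log.txt"
```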

2. Understanding S3 Uploads

Initially, I installed the AWS CLI without configuring it. Later, I stumbled upon s3fs (a way to mount an S3 bucket as a local file system), but I got stuck for hours trying to make it work.

Thankfully, I reached out to a senior developer who advised me to use aws s3 cp instead, which was simpler and more appropriate for my use case.
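In hindsight, the contrast is stark: s3fs needs FUSE, a credentials file, and a mount point, while a single CLI call does the whole upload. Assuming the CLI is already configured, it looks like this (bucket name and path are the examples from my script):

```shell
# One-off upload to S3: no mounting, no FUSE, no extra daemon
aws s3 cp /home/christiana/backupfolder/backup.tar.gz s3://demosample-backupbucket/Backup/

# Verify the object arrived
aws s3 ls s3://demosample-backupbucket/Backup/
```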

3. Configuring AWS CLI

After some dependency issues related to Python (which AWS CLI uses), I was able to:

  • Create an IAM user with programmatic access.
  • Configure my AWS CLI with the IAM credentials (aws configure).
  • Successfully test uploads using aws s3 cp.
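For completeness, this is roughly what that setup looks like. The prompt values below are placeholders, not real credentials:

```shell
# Store IAM credentials for the CLI
# (written to ~/.aws/credentials and ~/.aws/config)
aws configure
# AWS Access Key ID [None]:     <your IAM access key>
# AWS Secret Access Key [None]: <your IAM secret key>
# Default region name [None]:   us-east-1
# Default output format [None]: json

# Quick sanity check that the credentials actually work
aws sts get-caller-identity
```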

4. Automating with Crontab

Finally, I set up a cron job to run my script every day at 7:00 AM. I used crontab -e and added:
0 7 * * * /home/christiana/backup.sh
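If cron syntax is new to you, the five fields before the command are minute, hour, day-of-month, month, and day-of-week, where `*` means "any". Annotated, my entry reads:

```shell
# minute  hour  day-of-month  month  day-of-week  command
#   0      7         *          *         *       -> 07:00 every day
0 7 * * * /home/christiana/backup.sh
```

Redirecting the script's output in the crontab entry (e.g. appending `>> /home/christiana/cron.log 2>&1`) is a common optional addition so cron doesn't try to mail you the output.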

The Final Working Script

Here’s the working version of my backup script:

#!/bin/bash

SOURCE="/home/christiana/documents"
DEST="/home/christiana/backupfolder"
S3_BUCKET="s3://demosample-backupbucket/Backup/"

NOW=$(date +"%Y-%m-%d_%H-%M-%S")
BACKUP_NAME="backup_$NOW.tar.gz"
BACKUP_PATH="$DEST/$BACKUP_NAME"
LOG="$DEST/backup_log.txt"

echo "[$NOW] Starting backup..." >> "$LOG"

# Create compressed backup (tar's output goes into the log
# so cron doesn't mail it on every run)
tar -czf "$BACKUP_PATH" "$SOURCE" >> "$LOG" 2>&1
if [ $? -eq 0 ]; then
  echo "[$NOW] Backup created: $BACKUP_NAME" >> "$LOG"
else
  echo "[$NOW] Backup FAILED" >> "$LOG"
  exit 1
fi

# Upload to S3 (S3_BUCKET already ends in "/", so no extra slash is added)
aws s3 cp "$BACKUP_PATH" "$S3_BUCKET" >> "$LOG" 2>&1
if [ $? -eq 0 ]; then
  echo "[$NOW] Upload to S3 Successful" >> "$LOG"
else
  echo "[$NOW] Upload to S3 FAILED" >> "$LOG"
  exit 1
fi

echo "[$NOW] Backup completed" >> "$LOG"

Troubleshooting and Lessons Learned

Here are some common problems I faced and how I fixed them:

1. AWS CLI Installed but Not Working
Problem: I installed the AWS CLI but forgot to configure it.

Solution: Ran aws configure and entered my IAM credentials.

2. Python Dependency Errors
Problem: AWS CLI wasn't working due to missing Python dependencies.

Solution: I installed the required version of Python and ensured it was available in my PATH.

3. Using s3fs Instead of aws s3 cp
Problem: I wasted hours trying to mount S3 as a drive with s3fs.

Solution: I learned it’s better to use aws s3 cp for one-off uploads. It's simpler, faster, and has fewer moving parts.
4. Crontab Didn't Run My Script
Problem: Cron job wasn’t running the script.

Fixes:

  • Ensured the script had execute permission (chmod +x backup.sh).
  • Used absolute paths in the script (cron doesn't know your environment).
  • Added full paths for tar, aws, etc., or sourced my environment manually.
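The environment fix can be sketched as a couple of lines at the top of the script. Cron jobs run with a minimal environment (often little more than /usr/bin:/bin), so pinning PATH explicitly makes bare command names resolve predictably:

```shell
#!/bin/bash
# Cron doesn't inherit your login shell's environment,
# so set PATH explicitly rather than relying on it:
export PATH=/usr/local/bin:/usr/bin:/bin

# With PATH pinned, plain names like "tar" and "aws" resolve the same
# way under cron as they do in an interactive shell:
TAR_BIN=$(command -v tar)
echo "Using tar at: $TAR_BIN"
```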

Final Thoughts

This small project taught me a lot:

  • How to write bash scripts and use tar/gzip for compression.
  • The power and simplicity of AWS CLI.
  • The importance of reaching out when you're stuck.
  • How to schedule recurring tasks in Linux using cron.

I started as a complete beginner and ended with a working automation system that runs every morning at 7 AM, backing up my documents to the cloud.

If you’re starting your DevOps or scripting journey, don't be afraid to struggle: document everything, ask questions, and keep going. You'll figure it out, just like I did.

✅ Tools Used:

  1. Bash
  2. AWS CLI
  3. IAM User with S3 permissions
  4. tar and gzip
  5. crontab
