# Automated Database Backups: Protecting Your Production Data

Most developers don't think about backups until they lose data. Set them up now, before you need them.
## The 3-2-1 Rule
- 3 copies of your data
- 2 different storage types
- 1 off-site copy
For most SaaS apps: primary DB + automated snapshots + S3 backups = covered.
## PostgreSQL Backup Script
```bash
#!/bin/bash
# backup.sh — dump the database, upload to S3, prune old local copies
set -euo pipefail

DB_NAME="myapp_production"
BACKUP_DIR="/var/backups/postgres"
S3_BUCKET="s3://my-backups/postgres"
TIMESTAMP=$(date +%Y%m%d_%H%M%S)
FILENAME="${DB_NAME}_${TIMESTAMP}.dump"

mkdir -p "$BACKUP_DIR"

# Create backup. --format=custom already compresses (--compress=9),
# so don't pipe through gzip — pg_restore can't read a gzipped custom dump.
pg_dump \
  --no-password \
  --format=custom \
  --compress=9 \
  --file="${BACKUP_DIR}/${FILENAME}" \
  "$DATABASE_URL"

# Upload to S3
aws s3 cp "${BACKUP_DIR}/${FILENAME}" "${S3_BUCKET}/${FILENAME}"

# Delete local copies older than 7 days
find "$BACKUP_DIR" -name '*.dump' -mtime +7 -delete

echo "Backup complete: ${FILENAME}"
```
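Because the script passes `--no-password`, `pg_dump` will fail rather than prompt if it can't authenticate non-interactively. If your `DATABASE_URL` doesn't embed a password, one option is a `~/.pgpass` file; a minimal sketch (the host, user, and password below are placeholders, not values from this setup):

```shell
# ~/.pgpass lines use host:port:database:user:password;
# libpq ignores the file unless its permissions are 600.
echo 'db.example.com:5432:myapp_production:backup_user:changeme' >> ~/.pgpass
chmod 600 ~/.pgpass
```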
## Automating With Cron
```cron
# crontab -e
# Daily backup at 2 AM
0 2 * * * /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1

# Or, for high-write databases, back up hourly instead:
0 * * * * /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1
```
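On an hourly schedule, a slow dump can still be running when the next one starts. One way to guard against overlapping runs is a lock, sketched here with `flock` (part of util-linux on most distros; the lock path is an assumption):

```cron
# Skip this run if a previous backup still holds the lock
0 * * * * flock -n /var/lock/backup.lock /usr/local/bin/backup.sh >> /var/log/backup.log 2>&1
```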
## Point-in-Time Recovery With WAL
```ini
# postgresql.conf — enable WAL archiving (changing these requires a restart)
wal_level = replica
archive_mode = on
archive_command = 'aws s3 cp %p s3://my-backups/wal/%f'
```
With WAL archiving, you can restore to any point in time — not just backup snapshots.
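Recovery means replaying the archived WAL segments on top of a base backup (e.g. one taken with `pg_basebackup`). A sketch of the recovery settings on PostgreSQL 12+, assuming the same bucket layout as the `archive_command` above; the target time is just an example:

```ini
# postgresql.conf on the server being restored; also create an empty
# recovery.signal file in the data directory to enter recovery mode
restore_command = 'aws s3 cp s3://my-backups/wal/%f %p'
recovery_target_time = '2024-01-15 01:55:00'
```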
## Testing Restores (Critical!)
```bash
# Restore into a scratch database monthly to verify backups work.
# pg_restore reads custom-format dumps (.dump) directly — no gunzip step.
createdb test_restore
pg_restore \
  --no-password \
  --clean \
  --if-exists \
  --dbname=test_restore \
  backup_20240115_020000.dump
echo 'Restore test complete'
```
An untested backup is not a backup. Add a monthly restore test to your calendar.
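The reminder doesn't have to live on a calendar; the drill itself can be scheduled. A sketch, assuming you wrap the restore commands above in a hypothetical `verify_restore.sh`:

```cron
# Monthly restore drill at 3 AM on the 1st
0 3 1 * * /usr/local/bin/verify_restore.sh >> /var/log/restore_test.log 2>&1
```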
## Managed Alternatives
If you're on Supabase, Railway, or Neon: automated daily backups are included. On RDS: enable automated backups and multi-AZ. On a VPS: the script above plus S3 is your stack.
Database backup, monitoring, and disaster recovery patterns are documented in the AI SaaS Starter Kit.