Since I started using WSL, most of my dev projects have migrated there.
And while the code is mostly stored on GitHub, there's still the question of backing up the rest of the files (dotfiles, auxiliary scripts, data, etc.). Basically, I want to run daily incremental backups of my home dir, preferably encrypted.
On Windows I have a nice open-source tool called Duplicati, which makes daily incremental encrypted backups to a NAS. So I was hoping to use it for WSL, too.
Fail: The Windows way
As a first shot, I tried accessing the WSL files from Duplicati running on Windows and added this path to my backup: \\wsl$\Ubuntu\home
The performance was horrendous, just as I expected. ❌
(Maybe I should have excluded all those node_modules after all... 🤔)
The next experiment was grabbing the whole WSL file system image. It's located at %LOCALAPPDATA%\Packages\CanonicalGroupLimited.UbuntuonWindows_79rhkp1fndgsc\LocalState\ext4.vhdx
However, at some point WSL wouldn't start; I guess the file got locked by the backup process. Bad idea! Having one big file isn't good for incremental backups anyway, so I dropped this plan. ❌
The "official" method with wsl.exe βexport
also doesn't suit me, since it requires shutting down WSL and produces another big file. β
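For reference, the export route looks roughly like this (a sketch; the distro name Ubuntu and the target path are just examples):

# run from PowerShell or cmd; stops all running distros
wsl.exe --shutdown
# dump the whole distro into a single big tar file
wsl.exe --export Ubuntu D:\backup\ubuntu.tar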
Well, it looks like I'll have to do everything from inside WSL! And it has plenty of tools to choose from.
Success: The Linux way
The first candidate is rsync:
rsync -av -e ssh /home rsync@192.168.1.200:/backup/wsl-files/
(On the NAS, I've added the user 'rsync' with proper privileges).
This does the job pretty quickly and can be automated. To avoid the interactive password prompt, I'd need an SSH key or a password file (both require some effort to set up). 😕
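The key-based option would look roughly like this (a sketch, assuming the NAS accepts public-key SSH logins for the rsync user; the key file name is arbitrary):

# generate a dedicated key pair without a passphrase
ssh-keygen -t ed25519 -f ~/.ssh/nas_rsync -N ''
# install the public key on the NAS
ssh-copy-id -i ~/.ssh/nas_rsync.pub rsync@192.168.1.200
# rsync now runs without a prompt
rsync -av -e 'ssh -i ~/.ssh/nas_rsync' /home rsync@192.168.1.200:/backup/wsl-files/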
However, I wanted the files in the destination folder to be encrypted, so I decided to use duplicity. As a quick test, I ran this:
export PASSPHRASE=MySuperPassphrase
duplicity /home sftp://duplicity:MySuperPassword@192.168.1.200/home/wsl-backup
The PASSPHRASE is used to encrypt the files.
By default, duplicity creates a full backup on the first run and incremental backups afterwards. 🎯
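Getting the files back is just as easy; a quick sketch (the restore target /tmp/restored is arbitrary):

export PASSPHRASE=MySuperPassphrase
# list what's currently in the backup
duplicity list-current-files sftp://duplicity:MySuperPassword@192.168.1.200/home/wsl-backup
# restore the latest snapshot into a local dir
duplicity restore sftp://duplicity:MySuperPassword@192.168.1.200/home/wsl-backup /tmp/restored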
There are more advanced script examples in the Ubuntu help. Following them, I ended up with something like this:
#!/bin/bash
BASEDIR=$(dirname "$0")
# Export vars from .env (no spaces or = in values). Required:
# NAS_HOST=
# NAS_USER=
# NAS_PASSWORD=
# PASSPHRASE=
export $(egrep -v '^#' "$BASEDIR/.env" | xargs)
# incremental backup; start a fresh full backup if the last full one is older than a month
duplicity --full-if-older-than 1M /home/alex/ sftp://${NAS_USER}:${NAS_PASSWORD}@${NAS_HOST}/home/wsl-backup
# prune backup chains older than 3 months
duplicity remove-older-than 3M --force sftp://${NAS_USER}:${NAS_PASSWORD}@${NAS_HOST}/home/wsl-backup
The variables used in the script are kept in the .env file located in the same dir as the script.
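For completeness, the .env might look like this (the values just mirror the earlier examples):

NAS_HOST=192.168.1.200
NAS_USER=duplicity
NAS_PASSWORD=MySuperPassword
PASSPHRASE=MySuperPassphrase

Since it holds secrets in plain text, it's worth locking it down:

chmod 600 .env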
Now I just need to schedule my backup script like this, right? (crontab -e to edit the current user's cron table)
0 21 * * * /home/alex/nas-backup/backup.sh >>/home/alex/nas-backup/duplicity.log 2>&1
Not so fast! That's WSL for you: somebody has to start the cron service. 👷‍♂️
There are different ways to do it; I was lazy and added this "autostart" to my .zshrc file:
service cron status > /dev/null 2>&1 || sudo service cron start
This means the backup won't run if I'm not using WSL, but that's only logical.
The Windows Task Scheduler can be used as an alternative, as described here.
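A rough sketch of that alternative (the task name and trigger are arbitrary; run from an elevated prompt):

schtasks /create /tn "WSL cron" /tr "wsl.exe -u root service cron start" /sc onlogon

This starts cron in the default distro as root at every Windows logon.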
One last step. Since I'm starting the cron service as a superuser, I have to enter the password every time. To avoid this, I created a rule in /etc/sudoers.d/service, disabling the password request for this case:
alex ALL=NOPASSWD:/usr/sbin/service cron start
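A broken sudoers file can lock you out of sudo entirely, so it's worth checking the syntax:

sudo visudo -cf /etc/sudoers.d/service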
And here's a bonus script to run once in a while and verify the files:
#!/bin/bash
BASEDIR=$(dirname "$0")
export $(egrep -v '^#' "$BASEDIR/.env" | xargs)
# compare the backup against the local files and report any differences
duplicity verify -vInfo sftp://${NAS_USER}:${NAS_PASSWORD}@${NAS_HOST}/home/wsl-backup /home/alex/
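It can be scheduled just like the backup; for example, a monthly cron entry (verify.sh is simply what I'd name the script above):

0 22 1 * * /home/alex/nas-backup/verify.sh >>/home/alex/nas-backup/verify.log 2>&1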
Now we're all set! ✅
But wait, what about the databases and Docker containers? Well, maybe next time...