
Kevin Naidoo
How to "hot reload" remote files on a Linux Server

Sometimes, while testing, you just need to copy files to a remote server without running your full build and deploy every time.

I have a Laravel app whose deploy runs through several build steps: unit tests, a Vite build of the React code, and so forth. The deployment can take 2-3 minutes, and sometimes I'm just impatient.

Since I'm deploying to a test server, it's fine if something breaks. You could probably use a similar script for production, but there are better tools for that, such as Ansible, Jenkins, or a CI tool.

The script - hotreload.sh

#!/bin/bash
# Stop script on failure
set -e

# $(pwd) - substitutes the full path of the
# - directory this script runs from. You can also
# - use any absolute path: /home/kevin/somewhere/

while inotifywait -r -e modify,create,delete "$(pwd)/app/"; do

# -a does many things like recursive copy, keep perms, etc...
# -v for verbose.
# -e use ssh on a different port and not 22.
# use rsync -h - to get more details on these flags.

rsync -av -e 'ssh -p 2022' "$(pwd)/app/"  \
devuser@testcluster-01:/var/www/webapp/app/

# Optional - fix owner since the remote user is not the same.
ssh -p 2022 devuser@testcluster-01 \
'sudo chown -R www-data:www-data /var/www/webapp/app'
done

You might need to install "inotify-tools"

sudo apt-get -y install inotify-tools

Now just run the script:

chmod +x hotreload.sh
./hotreload.sh

This script listens for any file changes in the specified directory path; when a change happens, it simply copies the updated files to your remote server using rsync.

Speeding things up

Rsync on its own transfers files sequentially over a single connection; however, Linux has another powerful utility called "xargs" that can parallelize it.

Xargs can take the output of any command, such as "ls" or "find", and pass it as arguments to other Linux commands, such as "rsync".

Furthermore, it's capable of spawning multiple child processes, which is perfect for our use case. We'll list the entries in the monitored directory and use "xargs" to run 4 "rsync" processes in parallel to work through the full list of files and folders.
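You can see those xargs flags in isolation with a plain echo standing in for rsync (the directory names below are just sample input):

```shell
# -n1 : pass one item per command invocation
# -P4 : run up to 4 invocations in parallel
# -I% : substitute the item wherever % appears in the command
printf 'app\nconfig\nroutes\npublic\n' | xargs -n1 -P4 -I% echo "syncing %"
```

Note that with -P4 the four workers race each other, so the output order is not guaranteed to match the input order.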

Example:

#!/bin/bash
# Stop script on failure
set -e

while inotifywait -r -e modify,create,delete "$(pwd)/app/"; do

# xargs - takes the output from ls and passes it to rsync.
# -n1 - pass one path per rsync invocation.
# -P4 - run up to 4 rsyncs in parallel.
# -I% - a placeholder that holds the current
# - file/directory passed from ls through xargs to rsync.
# Note: ls -d with a glob prints full paths; plain ls would print
# - bare names that rsync could not resolve from this directory.

ls -d "$(pwd)/app/"* | xargs -n1 -P4 -I% \
rsync -av -e 'ssh -p 2022' \
--progress % devuser@testcluster-01:/var/www/webapp/app/

done

Conclusion

Although this article is about syncing files from your local machine to a remote server, I'm sure you'll agree that "rsync" and "xargs" are powerful utilities with a wide variety of use cases.

One of the most common ways I use these utilities together is for taking backups of MySQL servers.

You can sync hundreds of gigabytes of data in minutes, depending on your connection and drive speeds; sometimes this is simply a much quicker option than using "mysqldump" or even xtrabackup from the Percona toolkit.
