
Cosme Escobedo

Originally published at cosme.dev

How I added zero downtime deployment to my website

While I was building laravelremote.com (a Laravel remote job aggregator), I was worried that each time I pushed changes to the production server, it might take the website down for a couple of seconds. That probably isn't a big deal, but I still wanted to avoid it.

Laravel offers a first-party paid product to avoid this, Envoyer, and it's only $10 a month. But laravelremote.com doesn't generate any revenue right now, and I'm the type of person who likes to do things in-house to learn how they work. I also like the freedom that provides.

So with that in mind, I figured out a way to do this for free, and I wanted to share this in case someone else finds it useful.

The theory

I searched around the internet to see how Envoyer works under the hood. I don't remember exactly where I read this, but it basically works like this:

We have 3 directories that relate to our project:

  • releases/
  • current/
  • storage/

A new directory is created inside releases/ each time we deploy a change to the website. We then run some "set up" commands (e.g. composer install, caching the routes, views, etc.) and add a symbolic link from the release's storage directory to the "permanent" storage/ directory, since we don't want to lose sessions and the like on each deployment. After all of this, we point the current/ symbolic link at the new release we just created.

This article is not a tutorial on how to deploy a Laravel app, so I'm not going to explain my NGINX configuration. The only thing you need to know is that my NGINX server is configured to read my website from the /var/www/laravelremote.com/ directory. This directory has another symbolic link that points to the current/ directory.
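That web root link only needs to be set up once. One way to wire it up could look like this (just a sketch; the home directory path is an assumption, adjust it to your user):

sudo ln -s -n -f -T /home/username/laravelremote/current /var/www/laravelremote.com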

Put more simply: every new release is deployed and then linked to the current directory, which makes the server serve the updated code.
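To make this concrete, here is roughly what the layout looks like after a couple of deployments (the timestamps are made-up examples):

~/laravelremote/
├── .env                # production environment variables
├── storage/            # permanent storage, shared by every release
├── releases/
│   ├── 1650000000/     # an older release
│   └── 1650086400/     # the newest release; its storage/ links to the shared one
└── current -> releases/1650086400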

The practice

Let's put that into practice. First, I created a base directory that will contain everything we need for the project. I like to put this directory under the home folder and give it the name of the app:

mkdir laravelremote/

Since we are going to be working in this directory, we should cd into it:

cd laravelremote

Now we need to create the storage directory and the other subdirectories that Laravel uses.

mkdir storage/
mkdir -p storage/cache/data/
mkdir storage/sessions/
mkdir storage/views/

These are the default Laravel directories that exist within the storage directory.
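By the way, if your shell supports brace expansion (bash and zsh do), the same structure can be created in a single command:

mkdir -p storage/{cache/data,sessions,views}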

After that, we need to create the releases folder.

mkdir releases/

Also, we should create a global .env file that we'll copy into every release:

touch .env

Add your production environment variables to this file.
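For illustration, it might contain entries like these (the values are placeholders, not real credentials):

APP_ENV=production
APP_DEBUG=false
APP_KEY=base64:xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx=
APP_URL=https://laravelremote.com

DB_CONNECTION=mysql
DB_DATABASE=laravelremote
DB_USERNAME=deploy
DB_PASSWORD=change-me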

The current directory will be created once we deploy the project for the first time, so there is no need to create it right now.

Deploying

We could create a new directory in the releases/ directory, pull the master branch from GitHub, run every other needed command, and link everything at the end to the current directory. But that seems like a lot of steps for a simple deployment, so I created a bash script to do it instead:

#!/bin/sh

# Abort the deployment if any command fails, so we never
# publish a half-built release.
set -e

# Name each release after the current UNIX timestamp.
UNIX_TIME=$(date +%s)
DEPLOYMENT_DIRECTORY=$UNIX_TIME

mkdir -p "$DEPLOYMENT_DIRECTORY"
git clone --depth 1 --branch master git@github.com:username/reponame.git "$DEPLOYMENT_DIRECTORY"

echo "removing storage"
rm -Rf "$DEPLOYMENT_DIRECTORY/storage"

cd "$DEPLOYMENT_DIRECTORY"

# Copy the production environment file into the release.
cp ~/laravelremote/.env .env

echo "link master storage"
# Point the release's storage at the shared, permanent storage directory.
ln -s -n -f -T ~/laravelremote/storage ~/laravelremote/releases/"$DEPLOYMENT_DIRECTORY"/storage

composer install --no-dev
npm install
npm run prod

echo "view:cache command"
php artisan view:cache

echo "route:cache command"
php artisan route:cache

echo "config:cache command"
php artisan config:cache

echo "migrations"
php artisan migrate --force

cd ..

# Re-point the current symlink at the new release.
# This is the moment the new version goes live.
ln -s -n -f -T ~/laravelremote/releases/"$DEPLOYMENT_DIRECTORY" ~/laravelremote/current

echo "Done!"


This script lives under the releases/ folder. It creates a new directory for the release, named with the current UNIX timestamp, then clones the repo from GitHub into that directory. After that, I remove the storage folder from the release, copy in the .env file that contains the production keys (you might want to create a symbolic link instead), and link the "permanent" storage directory into the new release.

After that, I run some necessary commands for the project to work.

  • Install the composer dependencies
  • Install the NPM dependencies and build the assets (I don't commit my built assets; other people might do it differently)
  • Cache the views, routes, and config for optimization
  • Run the database migrations

Then finally, we point the current symlink at this release, which publishes the changes.
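A quick way to sanity-check which release is live (assuming the layout above) is to ask where the symlink points:

readlink ~/laravelremote/current
# prints something like /home/username/laravelremote/releases/1650086400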

And that's it. Whenever I push something to the master branch, I ssh into the server, run this script, and my changes are deployed with zero downtime.
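You could even skip the interactive session. Assuming the script is saved as deploy.sh inside releases/ (the user and host here are placeholders), something like this should work:

ssh user@example.com 'cd ~/laravelremote/releases && ./deploy.sh'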

Possible improvements

I still have some things that I don't like about this approach.
One of them is that I would like to make it deploy the changes automatically when I push them to GitHub, without having to ssh into the server.
Another one is that the old versions are not removed automatically. I currently remove them manually, but I would like to automate this too. And since we are talking about that, I noticed that Envoyer has a way to roll back the changes if something goes wrong, so you can link one of the previous versions if you need to. That would probably need a more complicated setup on my part, but it would be interesting to do.
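For the cleanup and rollback parts, a rough sketch along these lines could work (untested, and it assumes the timestamp-named releases from above):

# Keep the five most recent releases and delete the rest.
cd ~/laravelremote/releases
ls -1d [0-9]* | sort -rn | tail -n +6 | xargs -r rm -rf

# Rolling back is just re-pointing current at an older release:
ln -s -n -f -T ~/laravelremote/releases/<older-timestamp> ~/laravelremote/current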

Regardless, I'm pretty happy with how this turned out, even though I still want to improve it. If you would like to hear about those future improvements, follow me on Twitter; I'll probably post any developments there.

Latest comments (4)

Templar++

A more mature and scalable solution, especially if you have a load balancer and multiple front-end servers, would be to spin up new instances and, once deployment is complete, incrementally route more traffic to them while reducing traffic to the old nodes.

Mateusz Cholewka

Have you maybe tried Deployer? deployer.org/
It can handle zero downtime, and you can specify how many versions should be kept, among other things. The documentation has an example of how to use it in GitHub Actions or GitLab CI 🙂.

Cosme Escobedo

Other people have recommended it as well, so I might give it a try, thank you!

Bruno

Either that, or maybe use CircleCI or any other CI/CD platform along with Docker to make it even simpler? With the proper setup, you can manage multiple envs from a central place and have them updated every time you push code to your repo.