Minaro

Automate React App deployment using GitLab CI/CD, Ansible, Docker

A custom workflow to automate your deployment using GitLab and Ansible.

If you want to see your changes online every time you push some code, you need a custom workflow, and you are in the right place.
Follow this short tutorial; I have the perfect recipe for you!

We will use the most suitable tools for our project. Let me introduce our little friends:

  • GitLab to host our project and run pipelines
  • Ansible to deploy to the remote server
  • Docker to containerize our app and ship it easily and quickly
  • ReactJS as the frontend framework for the demo (it could be anything else, but you’ll need to adapt the container)

1. Create a new project with create-react-app and host it on GitLab

We will create a new React app with CRA:

yarn create react-app <project_name> --template typescript
cd <project_name>
git remote add origin <gitlab_project_url>
git add .
git commit -m "Initial commit"
git push -u origin master

2. Create CI job with GitLab CI

At the root of your project, we will create a new file, .gitlab-ci.yml, to enable GitLab CI.

stages:
  - build
cache:
  paths:
    - build
# TEMPLATES
.build: &build
  image: node:lts-alpine3.13
  stage: build
  before_script:
    - yarn install --production --frozen-lockfile
  script:
    - yarn build
    - echo Build successful !
  artifacts:
    expire_in: 30 mins
    paths:
      - ./build
.cache: &global_cache
  key: $CI_REGISTRY_IMAGE
  paths:
    - node_modules/
    - public/
  policy: pull-push
# STAGES
build:
  <<: *build
  cache:
    <<: *global_cache
  only:
    - tags

Currently this file contains only one stage, dedicated to the build. We install only the project's dependencies, not the devDependencies.
Then we build and cache the build folder for later jobs.
We also cache the node_modules folder so we don't have to download all the node_modules again on the next run.

3. Containerize and release our app with Docker on the GitLab container registry

We have created a production build for our React app, but we still need a container to ship it.
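The release stage shown below runs docker build at the root of the repository, so it needs a Dockerfile there, which the post does not show. Here is a minimal sketch, assuming you serve the static build folder with nginx; adapt the base image and paths to your setup.

# Dockerfile (sketch): serve the CRA production build with nginx
FROM nginx:stable-alpine
# the build job produces ./build and hands it to this stage as an artifact
COPY build /usr/share/nginx/html
EXPOSE 80
CMD ["nginx", "-g", "daemon off;"]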

stages:
  - build
  - release
cache:
  paths:
    - build
variables:
  DOCKER_HOST: tcp://docker:2375
  DOCKER_DRIVER: overlay2
  DOCKER_TLS_CERTDIR: ""
# TEMPLATES
.build: &build
  image: node:lts-alpine3.13
  stage: build
  before_script:
    - yarn install --production --frozen-lockfile
  script:
    - yarn build
    - echo Build successful !
  artifacts:
    expire_in: 30 mins
    paths:
      - ./build
.release: &release
  image: docker:stable
  stage: release
  services:
    - docker:dind
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - docker build -t <image_name> --cache-from <image_name> .
    - docker push <image_name>
    - echo Successfully pushed to registry !
.cache: &global_cache
  key: $CI_REGISTRY_IMAGE
  paths:
    - node_modules/
    - public/
  policy: pull-push
# STAGES
build:
  <<: *build
  cache:
    <<: *global_cache
  only:
    - tags
release:
  <<: *release
  only:
    - tags

We added the release stage, which builds the container. We use the cache again, but this time for the image layers, so the less the container changes, the faster it builds. We log in to the project registry thanks to predefined variables and simply push the freshly built image to the registry.
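For reference, <image_name> can simply be GitLab's predefined $CI_REGISTRY_IMAGE variable, which points to the project's registry path. A sketch of the release script under that assumption; note that, depending on the Docker version, --cache-from only reuses layers if the previous image was pulled first, which is why this sketch adds a pull that the original script does not have:

docker pull $CI_REGISTRY_IMAGE:latest || true   # warm the layer cache; allowed to fail on the first run
docker build -t $CI_REGISTRY_IMAGE:latest --cache-from $CI_REGISTRY_IMAGE:latest .
docker push $CI_REGISTRY_IMAGE:latest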

4. Deploy and update our app in production

Now we have prepared everything for the deployment, but our app is stored in a container registry and still not running in production. We will be a bit hands-on here.

4.1 Create a SSH key

First thing, we need to be able to connect to our deployment server through SSH. So let's create an SSH key!

ssh-keygen -t ed25519 -f ~/.ssh/id_ansible -N ''
ssh-copy-id -i ~/.ssh/id_ansible <user>@<ip>

The last command will add your public key to the server.
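You can check that key-based login works before going further:

ssh -i ~/.ssh/id_ansible <user>@<ip>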

cat ~/.ssh/id_ansible

This command reveals your SSH private key; copy it, we will use it in the next step. But do not share it with anyone!

4.2 Add an SSH key to GitLab

In order to connect to your server and deploy the container, we need to make this private key available to the runner.
To upload the key to GitLab, go to Settings > CI/CD > Variables and click on Add variable. In Key, write ANSIBLE_DEPLOY_KEY and in Value paste the private key. If you have not created protected branches or tags, uncheck Protect variable, click Add variable and you're done.
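If you prefer the command line, the same variable can be created through the GitLab API. A sketch, assuming a personal access token with api scope and your numeric project id:

curl --request POST \
  --header "PRIVATE-TOKEN: <your_access_token>" \
  --form "key=ANSIBLE_DEPLOY_KEY" \
  --form "value=$(cat ~/.ssh/id_ansible)" \
  "https://gitlab.com/api/v4/projects/<project_id>/variables"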

4.3 Create a docker-compose file

On your server, under /home/<user>/prod, create a docker-compose.yml in order to run your container.

version: "3.7"
services:
  <project_name>:
    image: "registry.gitlab.com/<user>/<project_name>:latest"
    container_name: <project_name>
    restart: unless-stopped
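Depending on your Dockerfile, you will probably also want to publish a port so the app is reachable from outside. With the nginx sketch from step 3, that would mean adding this under the service definition (an assumption, adapt to your setup):

    ports:
      - "80:80"   # host:container; the nginx sketch listens on port 80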

4.4 Create an Ansible Playbook

We will use Ansible to execute remote commands on the server during the pipeline. This setup does not aim to explain Ansible; it's a simple setup.
At the root of your project, create the ansible/ directory.

  • in ansible/hosts, copy/paste the first file below and replace <user> and <ip>
  • in ansible/roles/deployment/tasks/main.yml, copy/paste the content of the second file
  • in ansible/playbook.yml, copy/paste the third file (the resulting layout is sketched after this list)
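The resulting layout should look roughly like this (the vars/ directory will be generated by the pipeline in step 4.5):

ansible/
├── hosts
├── playbook.yml
└── roles/
    └── deployment/
        └── tasks/
            └── main.yml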
[environment:children]
prod
[prod]
<ip> ansible_user=<user> ansible_ssh_private_key_file=~/.ssh/id_ansible
---
- name: Log into gitlab registry
  command: docker login {{ gitlab_registry }} -u {{ gitlab_user }} -p {{ gitlab_password }}
  no_log: true
- name: Pull docker images
  command: docker-compose pull {{ docker_service }}
  args:
    chdir: "{{ docker_compose_path }}"
- name: Log out of any docker registry
  command: docker logout
  no_log: true
- name: Deploy services
  command: docker-compose up -d {{ docker_service }}
  args:
    chdir: "{{ docker_compose_path }}"
- name: Remove old image
  command: docker image prune -a -f
- name: "Preparing your environment... "
hosts: environment
roles:
- role: deployment
vars:
docker_compose_path: /home/<user>/prod # remote target path of the project (where task will execute)
ansible_python_interpreter: /usr/bin/python3 # remote target python path
vars_files:
- vars/credentials.yml
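Note that the playbook loads vars/credentials.yml, which does not exist yet: the pipeline will generate it in the next step. If you want to test the playbook from your own machine first, you can create ansible/vars/credentials.yml by hand (a sketch; use a deploy token or a personal access token, and never commit this file):

---
gitlab_registry: registry.gitlab.com
gitlab_user: <gitlab_username_or_deploy_token_user>
gitlab_password: <personal_access_token_or_deploy_token>
docker_service: <project_name>

Then run it with the same command the pipeline will use: ansible-playbook ./ansible/playbook.yml -i ./ansible/hosts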

4.5 Deploying with Ansible playbook

Last step: we add the deploy stage, which connects to the server, pulls from the registry and restarts only the container we pulled.

stages:
  - build
  - release
  - deploy
cache:
  paths:
    - build
variables:
  DOCKER_HOST: tcp://docker:2375
  DOCKER_DRIVER: overlay2
  DOCKER_TLS_CERTDIR: ""
  ANSIBLE_HOST_KEY_CHECKING: "false"
# TEMPLATES
.build: &build
  image: node:lts-alpine3.13
  stage: build
  before_script:
    - yarn install --production --frozen-lockfile
  script:
    - yarn build
  artifacts:
    expire_in: 30 mins
    paths:
      - ./build
.release: &release
  image: docker:stable
  stage: release
  services:
    - docker:dind
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - docker build -t <image_name> --cache-from <image_name> .
    - docker push <image_name>
    - echo Successfully pushed to registry !
.deploy: &deploy
  image: williamyeh/ansible:ubuntu18.04
  stage: deploy
  before_script:
    - eval $(ssh-agent -s)
    - ssh-add <(echo "$ANSIBLE_DEPLOY_KEY")
  script:
    - mkdir ansible/vars
    - 'echo -e "---\ngitlab_password: $CI_REGISTRY_PASSWORD \ngitlab_user: $CI_REGISTRY_USER \ngitlab_registry: $CI_REGISTRY \ndocker_service: $CI_PROJECT_NAME" > ./ansible/vars/credentials.yml'
    - ansible-playbook ./ansible/playbook.yml -i ./ansible/hosts
    - echo Successfully deployed !
.cache: &global_cache
  key: $CI_REGISTRY_IMAGE
  paths:
    - node_modules/
    - public/
  policy: pull-push
# STAGES
build:
  <<: *build
  cache:
    <<: *global_cache
  only:
    - tags
release:
  <<: *release
  only:
    - tags
deploy:
  <<: *deploy
  only:
    - tags

Thoughts

I’m aware that this setup does not cover every use case and does not follow every best practice in terms of security, like importing an SSH private key into GitLab, or handle errors if the deployment fails.
But I wanted to keep it as a simple, basic setup that you can adapt to your needs. If you have any advice, feel free to write a comment; this is my first try, so I will definitely improve it.

Thank you for reading!

