In modern software development, automating every aspect of the workflow has become a necessity for ensuring quality, security, and speed. GitLab CI/CD (Continuous Integration and Continuous Deployment) provides a powerful platform for automating these tasks. In this article, we’ll walk through setting up a complete GitLab CI/CD pipeline that includes code scanning for vulnerabilities using SonarQube, testing with Jest, Cypress, and Locust, creating Docker images for both the backend and frontend, and finally, deploying those images.
Introduction to GitLab CI/CD
GitLab CI/CD enables developers to automate the build, test, and deployment stages of the software development lifecycle. With the `.gitlab-ci.yml` configuration file, you define a series of jobs that a GitLab Runner executes whenever code is pushed to the repository.
This article assumes you already have a basic GitLab repository set up. We will progressively build a CI/CD pipeline that ensures the code is secure, tested, and ready to deploy to production, all without manual intervention.
Setting Up the GitLab CI/CD Pipeline
To get started, we’ll define our pipeline in a `.gitlab-ci.yml` file. This file will contain jobs for code scanning, testing, building Docker images, and deployment.
First, here’s a high-level view of our pipeline:
```yaml
stages:
  - scan
  - test
  - build
  - deploy
```
Each job will be assigned to one of these stages. This ensures that tasks are performed in the correct order, and failures in one stage will prevent subsequent stages from running.
Key Prerequisites
- GitLab Runner must be installed and registered with your GitLab instance.
- Docker should be available on the GitLab Runner for building images.
- SonarQube should be configured for code scanning.
Now, let’s dive into each stage.
Step 1: Code Scanning with SonarQube
Code security is critical, and catching vulnerabilities early saves significant time and effort later. SonarQube is a popular tool for code quality and security analysis.
First, ensure that your project is configured to work with SonarQube. This may require adding a `sonar-project.properties` file to your project root.
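A minimal `sonar-project.properties` might look like this (the project key, name, and source directory are illustrative placeholders to adapt to your own project):

```properties
# sonar-project.properties -- illustrative values
sonar.projectKey=my-project-key
sonar.projectName=My Project
sonar.sources=src
sonar.exclusions=**/node_modules/**,**/coverage/**
```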
Here’s an example of a job for running SonarQube in your CI/CD pipeline:
```yaml
sonarqube_scan:
  stage: scan
  image: sonarsource/sonar-scanner-cli:latest
  script:
    - sonar-scanner -Dsonar.projectKey=my-project-key -Dsonar.host.url=$SONAR_HOST_URL -Dsonar.login=$SONAR_TOKEN
  only:
    - merge_requests
    - master
```
- **Stage**: We assign this job to the `scan` stage.
- **Image**: We use the official SonarQube scanner Docker image.
- **Script**: The script runs the SonarQube scanner, reading the host URL and token from CI/CD variables so credentials stay out of the repository.
By limiting the job to run on merge requests and the master branch, we avoid unnecessary scans for every push to feature branches.
Step 2: Testing the Code with Jest, Cypress, and Locust
Testing is a critical step to ensure that the code behaves as expected under different scenarios. Here’s how to incorporate different testing frameworks into your pipeline:
Unit Testing with Jest
Jest is a popular testing framework for JavaScript, especially in React projects.
```yaml
unit_tests:
  stage: test
  image: node:14
  script:
    - npm install
    - npm run test
  artifacts:
    when: always
    paths:
      - coverage/
```
- **Image**: We use the official Node.js Docker image to run our tests.
- **Script**: We install dependencies and run the tests. The coverage report is saved as an artifact so it can be reviewed later.
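On larger projects, installing dependencies from scratch in every run gets slow. GitLab's `cache` keyword can persist `node_modules/` between pipeline runs; here is a sketch, where keying the cache on `package-lock.json` is an assumption about your project layout:

```yaml
unit_tests:
  stage: test
  image: node:14
  cache:
    key:
      files:
        - package-lock.json
    paths:
      - node_modules/
  script:
    - npm ci
    - npm run test
  artifacts:
    when: always
    paths:
      - coverage/
```

`npm ci` is used here instead of `npm install` because it installs strictly from the lockfile, which is faster and more reproducible in CI.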
End-to-End Testing with Cypress
Cypress is an end-to-end testing framework for web applications. It simulates real user interactions with your application in a browser.
```yaml
e2e_tests:
  stage: test
  image: cypress/base:10
  script:
    - npm install
    - npm run cypress:run
```
- **Image**: We use Cypress’s base image, which ships with the browsers and system dependencies Cypress needs.
- **Script**: We install dependencies and run the Cypress test suite.
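When an end-to-end test fails, the screenshots and videos Cypress records are often the fastest way to diagnose the problem. They can be saved as artifacts on failure; the paths below assume Cypress's default output directories:

```yaml
e2e_tests:
  stage: test
  image: cypress/base:10
  script:
    - npm install
    - npm run cypress:run
  artifacts:
    when: on_failure
    paths:
      - cypress/screenshots/
      - cypress/videos/
```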
Load Testing with Locust
Locust is a tool for load testing web applications by simulating user traffic.
```yaml
load_tests:
  stage: test
  image: locustio/locust
  script:
    - locust -f locustfile.py --headless -u 100 -r 10 --run-time 1m --host http://myapp.com
```
- **Image**: The official Locust Docker image is used.
- **Script**: Locust runs in headless mode, spawning 10 users per second up to 100 concurrent users, and targets your deployed application for one minute.
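The job above expects a `locustfile.py` at the repository root. A minimal sketch might look like this; the endpoints and wait times are placeholders for your application's actual routes:

```python
# locustfile.py -- minimal load-test sketch; endpoints are illustrative
from locust import HttpUser, task, between

class WebsiteUser(HttpUser):
    # Each simulated user waits 1-3 seconds between tasks
    wait_time = between(1, 3)

    @task(3)
    def view_homepage(self):
        self.client.get("/")

    @task(1)
    def view_api_status(self):
        self.client.get("/api/status")
```

The `@task(n)` weights mean the homepage is requested three times as often as the status endpoint, which lets you roughly mirror real traffic patterns.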
Step 3: Building Docker Images for the Backend and Frontend
After passing all the scans and tests, it’s time to package the application into Docker images. We’ll create separate Docker images for the backend and frontend.
Building the Backend Docker Image
```yaml
build_backend:
  stage: build
  image: docker:20
  services:
    - docker:dind
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - docker build -t $CI_REGISTRY_IMAGE/backend:$CI_COMMIT_SHA -f backend/Dockerfile .
    - docker push $CI_REGISTRY_IMAGE/backend:$CI_COMMIT_SHA
```
- **Image**: We use Docker’s official image.
- **Services**: We enable Docker-in-Docker (dind) so Docker commands can run inside the job.
- **Script**: The job logs in to the GitLab container registry, then builds and pushes the image tagged with the current commit SHA. Note that an unprefixed name like `my-backend` cannot be pushed; the tag must include the registry path, which the predefined `$CI_REGISTRY_IMAGE` variable provides.
Building the Frontend Docker Image
```yaml
build_frontend:
  stage: build
  image: docker:20
  services:
    - docker:dind
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY
  script:
    - docker build -t $CI_REGISTRY_IMAGE/frontend:$CI_COMMIT_SHA -f frontend/Dockerfile .
    - docker push $CI_REGISTRY_IMAGE/frontend:$CI_COMMIT_SHA
```
The frontend image is built the same way as the backend image, with its own Dockerfile, and is likewise tagged with the commit SHA.
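Since the two build jobs differ only in the image name and Dockerfile path, the shared configuration can be factored into a hidden job and reused with `extends`. This sketch assumes images are pushed to GitLab's built-in container registry via the predefined `CI_REGISTRY*` variables:

```yaml
.docker_build:
  stage: build
  image: docker:20
  services:
    - docker:dind
  before_script:
    - docker login -u $CI_REGISTRY_USER -p $CI_REGISTRY_PASSWORD $CI_REGISTRY

build_backend:
  extends: .docker_build
  script:
    - docker build -t $CI_REGISTRY_IMAGE/backend:$CI_COMMIT_SHA -f backend/Dockerfile .
    - docker push $CI_REGISTRY_IMAGE/backend:$CI_COMMIT_SHA

build_frontend:
  extends: .docker_build
  script:
    - docker build -t $CI_REGISTRY_IMAGE/frontend:$CI_COMMIT_SHA -f frontend/Dockerfile .
    - docker push $CI_REGISTRY_IMAGE/frontend:$CI_COMMIT_SHA
```

Keeping the login and service configuration in one place means future changes (for example, switching registries) only need to be made once.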
Step 4: Deploying Docker Images
Once the images are built and pushed to the Docker registry, the next step is to deploy them. This can be done using several deployment strategies like Kubernetes, Docker Swarm, or even simple Docker Compose.
Here’s an example using Docker Compose to deploy both frontend and backend:
```yaml
deploy:
  stage: deploy
  image: docker:20
  services:
    - docker:dind
  script:
    - docker-compose -f docker-compose.prod.yml up -d
  environment:
    name: production
    url: http://myapp.com
  only:
    - master
```
- **Script**: Docker Compose brings up the containers in detached mode.
- **Environment**: The `environment` keyword ties the job to GitLab’s `production` environment, so deployments are tracked against the given URL, while the `only: master` rule ensures the job runs only for the master branch.

Note that in a real setup the deploy job would typically connect to the target host’s Docker daemon (for example over SSH or a configured `DOCKER_HOST`) rather than the job’s own short-lived dind daemon, which disappears when the job ends.
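The `docker-compose.prod.yml` referenced above could look roughly like this; the service names, ports, and image tags are illustrative and must match what the build stage actually pushed:

```yaml
# docker-compose.prod.yml -- illustrative; image tags must match the build stage
version: "3.8"
services:
  backend:
    image: my-backend:${CI_COMMIT_SHA}
    restart: always
    ports:
      - "8000:8000"
  frontend:
    image: my-frontend:${CI_COMMIT_SHA}
    restart: always
    ports:
      - "80:80"
    depends_on:
      - backend
```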
Conclusion
By setting up a robust GitLab CI/CD pipeline, you can automate code scanning, testing, and deployment, ensuring that your application is secure, reliable, and always ready for production. With tools like SonarQube, Jest, Cypress, Locust, and Docker, your workflow becomes more efficient and less error-prone.
This setup is just the beginning. As your project evolves, you can continue to refine and expand your pipeline to fit your team’s needs. Continuous Integration and Continuous Deployment aren’t just about automation — they’re about fostering a culture of quality and efficiency in software development.