CI and CD stand for Continuous Integration and Continuous Delivery. CI refers to a modern software development practice of incrementally and frequently integrating code changes into a shared source code repository. Automated jobs build and test the application to ensure the code changes being merged are reliable. The CD process is then responsible for quickly delivering the changes into the specified environments.
In my last post you learned how to deploy a Django app to an AWS Lambda function using the Serverless Framework. This post uses the same project, available on GitHub. Now we’ll create a pipeline that runs the unit tests, then builds and deploys the app automatically whenever a merge request is merged into the `staging` branch of the project. So the first step is to create the `staging` branch. Run the commands below in a terminal window in the root folder of the project:
```bash
git checkout -b staging
git push origin staging
```
The first command creates and switches to the new branch; the second pushes it to the server. `staging` will be our main branch, and all feature branches must be created from it. Create the branch we’ll work on; I’ll call it `create-cicd-pipeline`:
```bash
git checkout -b create-cicd-pipeline
```
The `.gitlab-ci.yml` file
To use GitLab CI/CD, we start by creating a `.gitlab-ci.yml` file at the root of our project. This file will contain the configuration for our CI/CD pipelines. Create the file and add the code below:
```yaml
stages:
  - test

"Server Tests":
  image: python:3.9-slim
  stage: test
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "staging"
  before_script:
    - pip install --upgrade pip
    - pip install -r requirements/dev.txt
  script:
    - pytest -v
```
The code above creates a pipeline with one job named “Server Tests”. The job belongs to the “test” stage and runs in the `python:3.9-slim` Docker image.
`stages`
Stages can contain groups of jobs. The order of the items in `stages` defines the execution order of the jobs. Important to know about stages (see the sketch after this list):
- Jobs in the same stage run in parallel
- Jobs in the next stage run after the jobs from the previous stage complete successfully
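Here’s a minimal sketch of that ordering; the job names and commands are invented purely for illustration:

```yaml
stages:
  - build
  - test

"Build App":              # hypothetical job
  stage: build
  script:
    - echo "building"

"Unit Tests":             # starts only after the build stage succeeds
  stage: test
  script:
    - echo "running tests"

"Lint":                   # same stage as "Unit Tests", so they run in parallel
  stage: test
  script:
    - echo "linting"
```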
`before_script`, `script`
These sections define the commands that will run during job execution. If any of those script commands return a non-zero exit code (indicating an error or failure), the job fails and subsequent commands do not execute.
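A minimal sketch of that behavior (the commands are placeholders):

```yaml
script:
  - echo "this runs"
  - exit 1              # non-zero exit code: the job fails here
  - echo "never runs"   # skipped, because the previous command failed
```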
`rules`
The `rules` clause determines whether a job is included in (run) or excluded from (not run) a pipeline. Rules are evaluated when the pipeline is created; when a match is found, the job is included or excluded depending on the configuration.
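For example, here’s a hypothetical `rules` block that does both; rules are evaluated top to bottom and the first match wins:

```yaml
rules:
  - if: $CI_COMMIT_REF_NAME == "main"
    when: never                         # exclude: never run this job on main
  - if: $CI_PIPELINE_SOURCE == "push"   # include: run the job on any other push
```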
The `rules:if` clause specifies when to add a job to a pipeline. If the `if` statement is true, the job is added. In our `if` statement we’re using two predefined CI/CD variables.
Predefined CI/CD variables
GitLab CI/CD makes a set of predefined variables available in every GitLab CI/CD pipeline. Those variables can be used in pipeline configuration and job scripts. To determine the execution of this job, we are using the following variables in the `rules:if` clause:
`CI_PIPELINE_SOURCE`
How the pipeline was triggered. This variable can take on a few different values, such as `push`, `web`, and `api`, among others. When a merge request is created or updated, its value is `merge_request_event`.
`CI_MERGE_REQUEST_TARGET_BRANCH_NAME`
The target branch name of the merge request.
Now that you understand the `rules` clause and the variables we’re using in the `if` statement, you can guess when this job is going to run, right?
The “Server Tests” job will be added to the pipeline every time a merge request targeting “staging” is created or updated (for example, when you push more commits to the MR’s source branch).
There’s one last thing missing before we can create our first MR and run the pipeline — The database!
The `services` clause
A service is an additional container that your pipeline can create. The service is available to your job’s main container: the two containers can reach each other and communicate while the job is running. You specify the service’s image using the `services` keyword.
The most common use case for this clause is to run a database container. It’s easier and faster to use an existing image and run it as an additional container than to install PostgreSQL, for example, each time the pipeline is run.
Let’s create our PostgreSQL service and pass custom environment variables to the containers so our application can connect to the database:
```yaml
stages:
  - test

"Server Tests":
  image: python:3.9-slim
  stage: test
  services:
    - postgres:15.4-alpine
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "staging"
  before_script:
    - pip install --upgrade pip
    - pip install -r requirements/dev.txt
  script:
    - pytest -v
  variables:
    PGDATA: /pgdata
    POSTGRES_USER: secret
    POSTGRES_PASSWORD: secret
    POSTGRES_DB: django_serverless
    DB_HOST: postgres
    DB_NAME: django_serverless
    DB_USER: secret
    DB_PASSWORD: secret
    DB_PORT: 5432
```
The custom environment variables are defined in the `variables` clause. The variables starting with `POSTGRES_` are used by the PostgreSQL container to set up the database. The ones starting with `DB_` are used by the Django application to connect to it. The app’s connection configuration is located in the `myapi/settings.py` file; look for the `DATABASES` constant.
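As a sketch (assuming the common pattern of reading the `DB_*` variables from the environment; the project’s actual file may differ slightly), that setting would look something like this:

```python
# myapi/settings.py (sketch, not necessarily the project's exact file)
import os

DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": os.environ.get("DB_NAME"),
        "USER": os.environ.get("DB_USER"),
        "PASSWORD": os.environ.get("DB_PASSWORD"),
        "HOST": os.environ.get("DB_HOST"),
        "PORT": os.environ.get("DB_PORT", "5432"),
    }
}
```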
Commit and push the changes to GitLab so we can create our first MR.
Create a Merge Request on GitLab
On your project’s home page, open the left-side menu, click “Code”, then select “Merge requests”. Choose the source and target branches as shown in the image below, and click “Compare branches and continue”.
On the next page, you can just click “Create merge request”.
The pipeline is automatically triggered after the MR is created. You can click on the pipeline ID to check the stages and jobs:
You can also click on the jobs to see the detailed execution logs.
The Deploy Job
Create the “Deploy Staging” job after the “Server Tests” one in the `.gitlab-ci.yml`:
"Deploy Staging":
image: node:16-bullseye
stage: deploy
environment: staging
rules:
- if: $CI_COMMIT_REF_NAME == "staging" && $CI_PIPELINE_SOURCE == "push"
before_script:
- apt-get update && apt-get install -y python3-pip
- npm install -g serverless
- npm install
- touch .env
- echo "STATIC_FILES_BUCKET_NAME=$STATIC_FILES_BUCKET_NAME">>.env
- echo "AWS_REGION_NAME=$AWS_REGION_NAME">>.env
- echo "DB_NAME=$DB_NAME">>.env
- echo "DB_USER=$DB_USER">>.env
- echo "DB_PASSWORD=$DB_PASSWORD">>.env
- echo "DB_HOST=$DB_HOST">>.env
- echo "DB_PORT=$DB_PORT">>.env
script:
- sls deploy --verbose
- sls wsgi manage --command "collectstatic --noinput"
In this job we’re using the `node:16-bullseye` Docker image. Bullseye is the Debian release codename: the image is based on Debian 11. I chose this version because it already comes with Python 3.9 pre-installed, which is the same Python version as the Lambda function runtime defined in the `serverless.yml` file of our project.
This job will run only when new code is pushed to the `staging` branch. Since it uses `stage: deploy`, remember to add `deploy` to the top-level `stages` list (it’s included in the final version of the file below).
In the `before_script` section, we’re installing the dependencies (pip, the Serverless Framework, and plugins) and creating the `.env` file that our application needs. But where do variables such as `$STATIC_FILES_BUCKET_NAME` and `$DB_NAME` come from?
GitLab CI/CD variables
There are a few ways to define custom environment variables for pipeline jobs. In the “Server Tests” job you learned how to set environment variables for a specific job by using the `variables` keyword.
This method is not a good choice for variables that hold sensitive values, like `$DB_PASSWORD`. Such variables should be stored in the settings UI, not in the `.gitlab-ci.yml` file.
To define CI/CD variables in the UI:
- Go to the project’s Settings > CI/CD and expand the Variables section.
- Select Add variable and fill in the details:
  - Key: Must be one line, with no spaces, using only letters, numbers, or `_`.
  - Value: No limitations.
  - Type: `Variable` (default) or `File`.
  - Environment scope: Optional. `All`, or specific environments.
  - Protect variable: Optional. If selected, the variable is only available in pipelines that run on protected branches or protected tags.
  - Mask variable: Optional. If selected, the variable’s value is masked in job logs.
These are the variables I added:
Besides the variables I referenced in the `before_script` clause of the deploy job, I’ve included `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY`. The Serverless Framework picks up these standard AWS environment variables automatically, ensuring that the machine running the jobs has access to my AWS account.
Don’t forget to mask the variables with the most sensitive data, such as the database password and the AWS secret access key.
Environments
Environments define where code gets deployed. On GitLab you can create multiple environments that align with your workflow, such as integration, staging, beta, and production, among others. Custom variables scoped to a specific environment, like “staging”, are not accessible from pipelines running against a different environment, such as “production”.
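In `.gitlab-ci.yml`, a job’s `environment` can be set as a plain name (as we did with `environment: staging` in the deploy job) or in an expanded form that also records a URL; the URL below is hypothetical:

```yaml
"Deploy Staging":
  environment:
    name: staging
    url: https://staging.example.com  # hypothetical URL, shown on the environment's page
```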
To create an environment for your project in the UI:
- Select Operate > Environments.
- Select Create an environment.
- Type a name and an optional URL for the environment.
- Click the “Save” button.
Deploying to Staging
Before our first deploy, we’ll add one more `if` clause to the `rules` section of the “Server Tests” job:
```yaml
- if: $CI_COMMIT_REF_NAME == "staging" && $CI_PIPELINE_SOURCE == "push"
```
It’s the same `if` we have in the “Deploy Staging” job. This will make the tests run again before the deploy job is executed.
The final version of the `.gitlab-ci.yml` file. Note the new top-level `variables` and `cache` sections: pip stores downloaded packages in `PIP_CACHE_DIR`, and GitLab caches that directory between pipeline runs (keyed per branch) to speed up dependency installation:
```yaml
stages:
  - test
  - deploy

variables:
  PIP_CACHE_DIR: "$CI_PROJECT_DIR/pip-cache"

cache:
  key: pip-cache-$CI_COMMIT_REF_SLUG
  paths:
    - $PIP_CACHE_DIR

"Server Tests":
  image: python:3.9-slim
  stage: test
  services:
    - postgres:15.4-alpine
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == "staging"
    - if: $CI_COMMIT_REF_NAME == "staging" && $CI_PIPELINE_SOURCE == "push"
  before_script:
    - pip install --upgrade pip
    - pip install -r requirements/dev.txt
  script:
    - pytest -v
  variables:
    PGDATA: /pgdata
    POSTGRES_USER: secret
    POSTGRES_PASSWORD: secret
    POSTGRES_DB: django_serverless
    DB_HOST: postgres
    DB_NAME: django_serverless
    DB_USER: secret
    DB_PASSWORD: secret
    DB_PORT: 5432

"Deploy Staging":
  image: node:16-bullseye
  stage: deploy
  environment: staging
  rules:
    - if: $CI_COMMIT_REF_NAME == "staging" && $CI_PIPELINE_SOURCE == "push"
  before_script:
    - apt-get update && apt-get install -y python3-pip
    - npm install -g serverless
    - npm install
    - touch .env
    - echo "STATIC_FILES_BUCKET_NAME=$STATIC_FILES_BUCKET_NAME">>.env
    - echo "AWS_REGION_NAME=$AWS_REGION_NAME">>.env
    - echo "DB_NAME=$DB_NAME">>.env
    - echo "DB_USER=$DB_USER">>.env
    - echo "DB_PASSWORD=$DB_PASSWORD">>.env
    - echo "DB_HOST=$DB_HOST">>.env
    - echo "DB_PORT=$DB_PORT">>.env
  script:
    - sls deploy --verbose
    - sls wsgi manage --command "collectstatic --noinput"
```
Commit and push the updates to GitLab. The “Server Tests” job will run again; after it finishes successfully, click the “Merge” button. Merging creates a push to `staging`, which triggers a new pipeline that runs the tests once more and then executes the “Deploy Staging” job.
Conclusion
A CI/CD pipeline brings many advantages to an application. Automating tests and deployments speeds up the development cycle, enabling quicker delivery of features and updates to your users while increasing consistency and reliability.
See you next time!