I’ve been digging into GitHub Actions since I received access to the beta. GitHub Actions has been generally available since November 13, 2019, so let me walk you through it. 🎉
French version right here.
What is it?
GitHub Actions is a solution for continuous integration and continuous deployment (CI/CD). It’s a feature built into GitHub, displayed as a tab in your repository. Workflows are triggered by Git events such as a pushed commit. To run a sequence of steps on every push, we define a set of commands tailored to the needs of the project. Once set up, the service runs these processes automatically and continuously, which is ideal for collaborative work on a Git repository. Note that GitHub Actions is based on Docker, so it’s recommended to have some knowledge of it.
Write your action
An action contains several elements needed to perform a process. First comes the on key, which corresponds to the GitHub event that triggers the workflow: push, pull request, and so on. During the beta, however, only the push and pull request triggers were available.
on: [push, pull_request]
It is of course possible to define the scope of your action, so that it performs specific tasks on certain branches of your repository but not on others. For example, this may include automatic deployment to a server from the staging branch but not from the dev branch.
on:
  push:
    branches:
      - staging
Then we have the jobs. In practice, a job is a series of steps performed in order. It has two important elements: the environment it runs in and the steps to be carried out by the service.
The runs-on parameter defines the environment in which the action is performed:
runs-on: ubuntu-latest
Several virtual environments are available for running your action, which is useful for matching the specific requirements of each project. From Ubuntu 19.10 to Windows 2016, you have the choice 😉
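For instance, to target another hosted runner, you simply change the label (these labels reflect what GitHub offered in late 2019; check GitHub’s documentation for the current list):

runs-on: windows-2016   # or macos-latest, ubuntu-18.04, ...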
It should be noted that each job is performed in a new instance of the virtual environment.
So if you want to perform a series of linked actions, you may want to consider doing them in the same job. But it is possible to create dependencies between jobs with the needs parameter:
jobs:
  job1:
    # ...
  job2:
    needs: job1
In this case, job2 will only run when job1 has been completed.
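As a minimal runnable sketch (the job names build and deploy, and the echoed messages, are just examples):

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - run: echo "Building"
  deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - run: echo "Deploying now that build has succeeded"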
Then come all the steps to perform during the process. For continuous integration, the aim is to run the tests automatically. However, a number of elements must be set up before the tests can run!
This set is defined by the steps keyword. Each step requires at least one of the following elements:
the uses keyword, to invoke an existing action, which can be:
- a public action
- an action in the same repository
- an action published on Docker Hub or a public Docker registry
or
the run keyword, to launch a command in bash (knowing that there are several ways to execute a command if necessary; see the sketch just below)
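A minimal sketch showing both kinds of step side by side (the echoed text is just a placeholder):

steps:
  - uses: actions/checkout@v1
  - run: |
      echo "Multi-line commands"
      echo "work too"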
Finally, we attach the name parameter to each step so we know what it represents. An example always being more meaningful, here is a job to test a Django project:
jobs:
  tests:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v1
      - name: Set up Python 3.5.7
        uses: actions/setup-python@v1
        with:
          python-version: 3.5.7
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run tests
        run: python manage.py test
Here the tests job consists of:
- checking out the repository
- installing Python 3.5.7 on the Ubuntu environment
- installing the dependencies
- running the tests
For tests in real conditions, a database is needed, hence the implementation of services.
Add a service to your action
How do we do it? It’s very simple: the syntax is similar to defining a service in a docker-compose.yml, but with far fewer parameters.
We use the services keyword with:
- the definition of the Docker image to use
- the list of ports to expose for the service
- a list of environment variables
- a list of volumes
Not all of these elements are required, but they are good to know.
In concrete terms, it looks like this for a Postgres service:
services:
  postgres:
    image: postgres
    env:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: postgres
    ports:
      - 5432/tcp
    options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5
We need the name of the service, the Docker image to launch, and the port to expose. Environment values and Docker options can also be passed in, as shown in the example above. Sensitive values can be defined as secrets.
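For instance, assuming a POSTGRES_PASSWORD secret has been created in the repository settings (the secret name here is my own choice), it would be referenced like this:

env:
  POSTGRES_PASSWORD: ${{ secrets.POSTGRES_PASSWORD }}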
In addition, it is sometimes necessary to wait for a service to report its status before establishing a connection. For this, an element is defined that checks the health of the service to know when it is operational (also called a health check). I took up the example written by Chris Patterson and Mike Coutermarsh for PostgreSQL.
If you don’t have a health check available, you can pause the process execution with the sleep command; yes, it’s not great, I grant you…
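A rough sketch of that fallback (the 10-second delay is an arbitrary guess, not a recommendation):

- name: Wait for the service to start
  run: sleep 10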
All this is very good, but we still have to link the service to our application. We do it through the port; the host will be localhost, since in our example the service is exposed directly on the Ubuntu machine. Had the job run inside a container, the host would have been the name of the service.
In the earlier example (5432/tcp), port 5432 of the container is mapped to a random free port on the machine, which is retrieved as follows:
${{ job.services.postgres.ports[5432] }}
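For example, the mapped port can be handed to the test step through an environment variable (the variable name POSTGRES_PORT is my own assumption; the application would read it from the environment):

- name: Run tests
  run: python manage.py test
  env:
    POSTGRES_PORT: ${{ job.services.postgres.ports[5432] }}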
Nevertheless, it can be assigned in a fixed way:
ports:
  - 5432:5432
Here, port 5432 of the service is bound to port 5432 of the machine defined in the action.
Putting it all together, the workflow looks like this:
name: Test Workflow
on: push
jobs:
  tests:
    runs-on: ubuntu-latest
    services:
      postgres:
        image: postgres
        env:
          POSTGRES_USER: postgres
          POSTGRES_PASSWORD: postgres
          POSTGRES_DB: postgres
        ports:
          - 5432:5432
        options: --health-cmd pg_isready --health-interval 10s --health-timeout 5s --health-retries 5
    steps:
      - uses: actions/checkout@v1
      - name: Set up Python 3.5.7
        uses: actions/setup-python@v1
        with:
          python-version: 3.5.7
      - name: Install dependencies
        run: pip install -r requirements.txt
      - name: Run tests
        run: python manage.py test
In our example, we still need to override the service values in the Django settings.py for the connection to the PostgreSQL database.
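Here is a minimal sketch of the matching DATABASES entry in settings.py, assuming the fixed 5432:5432 mapping above (the values mirror the service definition; adapt the names to your project):

# settings.py: connection matching the postgres service from the workflow
DATABASES = {
    "default": {
        "ENGINE": "django.db.backends.postgresql",
        "NAME": "postgres",
        "USER": "postgres",
        "PASSWORD": "postgres",
        "HOST": "localhost",  # the service is exposed directly on the runner machine
        "PORT": "5432",       # the port mapped in the services section
    }
}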
Then the trick is done: just push some code and hope it works. 😉
If you enjoyed this article, please help out with a ❤️ or a share. It will help others in the world!