Last week, I set up some automated tests for Silkie, my static site generator (SSG). Instead of running the tests manually on each Pull Request (PR), this week I configured GitHub Actions to automate that Continuous Integration (CI) workflow. I also helped my friend Luke add a test case to his SSG.
## Configure GitHub Actions
For the time being, I have several specifications for my CI pipeline:
- Test Silkie only on Python 3.9 (hopefully I can support more versions in future releases; there's a sketch of what that could look like right after the workflow)
- Install dependencies from `requirements.txt` (if it exists)
- Run the Flake8 linter
- Run tests and generate code coverage
Here's what it actually looks like:
```yaml
# This is a basic workflow to help you get started with Actions
name: CI

# Controls when the workflow will run
on:
  # Triggers the workflow on push or pull request events but only for the main branch
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

  # Allows you to run this workflow manually from the Actions tab
  workflow_dispatch:

# A workflow run is made up of one or more jobs that can run sequentially or in parallel
jobs:
  # This workflow contains a single job called "build"
  build:
    # The type of runner that the job will run on
    runs-on: ubuntu-latest
    strategy:
      matrix:
        python-version: ["3.9"]

    # Steps represent a sequence of tasks that will be executed as part of the job
    steps:
      # Checks-out your repository under $GITHUB_WORKSPACE, so your job can access it
      - uses: actions/checkout@v2

      - name: Set up Python ${{ matrix.python-version }}
        uses: actions/setup-python@v2
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          if [ -f requirements.txt ]; then pip install -r requirements.txt; fi

      - name: Run flake8 linter
        run: |
          pip install flake8
          # stop the build if there are Python syntax errors or undefined names
          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics
          # exit-zero treats all errors as warnings. The GitHub editor is 127 chars wide
          flake8 . --count --exit-zero --max-complexity=10 --max-line-length=127 --statistics

      - name: Test & Code coverage with pytest
        run: |
          pip install pytest
          pip install pytest-cov
          pytest --junitxml=junit/test-results.xml --cov=silkie --cov-report=xml --cov-report=html
```
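Since the job already uses a matrix strategy, testing more Python versions later should mostly be a one-line change. Here's a sketch of what that part could look like; Silkie is only verified on 3.9 today, so the extra versions are purely hypothetical:

```yaml
strategy:
  matrix:
    # hypothetical future matrix; only 3.9 is actually supported right now
    python-version: ["3.8", "3.9", "3.10"]
```

Each entry in the matrix spawns its own job, so the lint and test steps would run once per version without any other changes to the workflow.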
## Testing Luke's SSG
Luke used Jest as his testing framework. In my opinion, it isn't all that different from pytest, so I had no trouble adding a new test case.
Luke had also set up a simple but efficient CI workflow for his project:
```yaml
# This workflow will do a clean install of node dependencies, cache/restore them, build the source code and run tests across different versions of node
# For more information see: https://help.github.com/actions/language-and-framework-guides/using-nodejs-with-github-actions

name: Node.js CI

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest

    strategy:
      matrix:
        node-version: [14.x, 16.x]
        # See supported Node.js release schedule at https://nodejs.org/en/about/releases/

    steps:
      - uses: actions/checkout@v2
      - name: Use Node.js ${{ matrix.node-version }}
        uses: actions/setup-node@v2
        with:
          node-version: ${{ matrix.node-version }}
          cache: 'npm'
      - run: npm ci
      - run: npm test
```
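One idea I might borrow from Luke's workflow is dependency caching. Assuming a release of `actions/setup-python` that supports the `cache` input (newer v2 releases do), the pip equivalent of his `cache: 'npm'` line would look roughly like this sketch:

```yaml
- name: Set up Python ${{ matrix.python-version }}
  uses: actions/setup-python@v2
  with:
    python-version: ${{ matrix.python-version }}
    cache: 'pip'  # caches pip's download cache between runs (assumes this input is available in the version used)
```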
Luckily, my newly added test case didn't fail the CI workflow, so my PR was eventually merged.
## Final thoughts
I think it's pretty easy to get a CI pipeline up and running with GitHub Actions, and its impact on productivity and code quality only becomes more significant as the project grows.