
Vincent A. Cicirello


How to Test a GitHub Action with GitHub Actions

I maintain several GitHub Actions, all of which are implemented in Python as container actions. This post explains how to test a GitHub Action using a GitHub Actions workflow, including using the workflow as a required check on Pull Requests. Although some of this post is specific to testing an action that is implemented in Python, much of the post is more generally applicable to testing actions regardless of implementation language.

Table of Contents: The rest of this post is organized as follows:

  • Steps to Test a GitHub Action
  • Complete Example Workflow
  • Real Examples
  • Where You Can Find Me

Steps to Test a GitHub Action

In this section, I'll walk you through a GitHub Actions workflow for testing a GitHub Action.

Preliminaries

First, within the .github/workflows directory of the repository, create a YAML file for the workflow. I usually name this workflow build.yml. Start by giving the workflow a name, and configuring the events that will trigger it to run. In this example, the workflow will run on both pushes and pull requests for the branch main. The snippet below also sets up a job that will run on an Ubuntu runner.



name: build

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:

  build:

    runs-on: ubuntu-latest



Checkout

Next, check out the repository with the actions/checkout action.



    steps:
    - uses: actions/checkout@v3



Run Unit Tests

Our first set of tests consists of the unit tests. Although I implement actions as container actions, I run the unit tests outside of the Docker container. Since this is a Python example, we need two steps here. First, the actions/setup-python action is used to set up Python. In this example, Python 3.10 is used. The second step below uses Python's unittest module to execute the unit tests. This example assumes that the unit tests are in tests/tests.py. If any of the unit tests fail, Python exits with a non-zero exit code, which in turn fails the workflow. In this way, we can use the workflow as a required PR check to ensure that all of the unit tests pass before merging a PR.



    - name: Setup Python
      uses: actions/setup-python@v4
      with:
        python-version: '3.10'

    - name: Run Python unit tests
      run: python3 -u -m unittest tests/tests.py


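To make this concrete, here is a minimal sketch of what a tests/tests.py might look like. It is purely illustrative: the slugify function is a hypothetical stand-in for whatever small units of your action's logic you test, and is not from any of my actions.



# tests/tests.py: a minimal, hypothetical unittest module.
# The function under test is a placeholder for the action's own logic;
# in a real action you would import it from the action's source instead.
import unittest

def slugify(title):
    # Placeholder: normalize a page title into a URL-friendly slug.
    return title.strip().lower().replace(" ", "-")

class TestSlugify(unittest.TestCase):

    def test_basic(self):
        self.assertEqual("hello-world", slugify("Hello World"))

    def test_strips_surrounding_whitespace(self):
        self.assertEqual("hello", slugify("  hello  "))

if __name__ == "__main__":
    unittest.main()



Running python3 -u -m unittest tests/tests.py locally executes the same tests as the workflow step above, which keeps the local and CI feedback loops identical.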

Build the Docker Container

Next, since this is a container action, we want to ensure that the Docker image builds successfully.



    - name: Verify that the Docker image for the action builds
      run: docker build . --file Dockerfile



Integration Test

Earlier, we ran the unit tests outside of any container. Now, we want to test the full integration of our action. For this, we use the action itself. Ordinarily, you run an action with uses: username/repository@version. However, we don't want to do that here. If we want to use this workflow as a PR check, we need to run a version of the action that incorporates any changes from the PR we are checking. We can do this by specifying uses: ./, which directs the GitHub Actions framework to look for the action at the root of the repository we checked out earlier with the actions/checkout step.

In this example, I'm running the action we are testing twice with different inputs. You can have as many of these steps as needed. Keep in mind that each test here will be slower than each of your unit tests. After all, each of these is fully running the action, rather than simply testing one small unit; and additionally, the GitHub Actions framework must also build your Docker container.



    - name: Integration test 1
      uses: ./
      with:
        input-one: something
        input-two: true

    - name: Integration test 2
      uses: ./
      with:
        input-one: something else
        input-two: false



Validate the Integration Test Results

We now need a way to detect whether the results of the above integration tests are correct. The various actions that I maintain either produce files (e.g., jacoco-badge-generator produces coverage badges, and generate-sitemap produces an XML sitemap) or edit existing files (e.g., javadoc-cleanup inserts canonical links and a few other things into the head of javadoc pages). In cases like these, I use Python's unittest module to validate the results: I define test cases in tests/integration.py that verify that the files produced by the action are correct. If any of those tests fail, Python exits with a non-zero exit code, which causes the workflow to fail.



    - name: Verify integration test results
      run: python3 -u -m unittest tests/integration.py


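As a hedged illustration of what such a validation module could look like, here is a sketch of a hypothetical tests/integration.py. The file name sitemap.xml and the expected contents are placeholders; the real checks depend entirely on what files your action produces or edits.



# tests/integration.py: a hypothetical example of validating the files
# produced by the integration test steps. The file name and expected
# contents are placeholders, not the checks used by any particular action.
import os
import unittest

class TestIntegration(unittest.TestCase):

    def test_output_file_was_created(self):
        # The integration test steps should have produced this file.
        self.assertTrue(os.path.isfile("sitemap.xml"))

    def test_output_file_contents(self):
        with open("sitemap.xml") as f:
            contents = f.read()
        # Spot-check that the generated file contains the expected structure.
        self.assertIn("<urlset", contents)
        self.assertIn("<loc>", contents)

if __name__ == "__main__":
    unittest.main()



Because this step runs after the integration test steps in the same job, the files those steps produced are still present in the runner's workspace for these assertions to inspect.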

Complete Example Workflow

Here's the complete example.



name: build

on:
  push:
    branches: [ main ]
  pull_request:
    branches: [ main ]

jobs:

  build:

    runs-on: ubuntu-latest

    steps:
    - uses: actions/checkout@v3

    - name: Setup Python
      uses: actions/setup-python@v4
      with:
        python-version: '3.10'

    - name: Run Python unit tests
      run: python3 -u -m unittest tests/tests.py

    - name: Verify that the Docker image for the action builds
      run: docker build . --file Dockerfile

    - name: Integration test 1
      uses: ./
      with:
        input-one: something
        input-two: true

    - name: Integration test 2
      uses: ./
      with:
        input-one: something else
        input-two: false

    - name: Verify integration test results
      run: python3 -u -m unittest tests/integration.py



Real Examples

I use this approach in several of the actions that I maintain. You can find an overview of them on my website:

Vincent Cicirello - Open source GitHub Actions for workflow automation

Features information on several open source GitHub Actions for workflow automation that we have developed to automate parts of the CI/CD pipeline, and other repetitive tasks. The GitHub Actions featured include jacoco-badge-generator, generate-sitemap, user-statistician, and javadoc-cleanup.

actions.cicirello.org

If you'd like to see real examples of my approach to testing a GitHub Action within GitHub Actions, take a look at the following three.

Testing the generate-sitemap Action

This first real example is generate-sitemap. Files relevant to the example are as follows:

  • Workflow: build.yml
  • Directory of test data and code: tests
  • Unit tests: tests/tests.py
  • Code to validate integration tests: tests/integration.py

cicirello/generate-sitemap - Generate XML sitemaps for static websites in GitHub Actions

Check out all of our GitHub Actions: https://actions.cicirello.org/

The generate-sitemap GitHub action generates a sitemap for a website hosted on GitHub Pages, and has the following features:

  • Support for both xml and txt sitemaps (you choose using one of the action's inputs).
  • When generating an xml sitemap, it uses the last commit date of each file to generate the <lastmod> tag in the sitemap entry. If the file was created during that workflow run, but not yet committed, then it instead uses the current date (however, we recommend if possible committing newly created files first).
  • Supports URLs for html and pdf files in the sitemap, and has inputs to control the included file types (defaults include both html and pdf files in the sitemap).
  • Now also supports including URLs for a user specified list of additional file extensions in the sitemap.

Testing the jacoco-badge-generator Action

This next example is the jacoco-badge-generator. Files relevant to the example are as follows:

  • Workflow: build.yml
  • Directory of test data and code: tests
  • Unit tests: tests/tests.py
  • Code to validate integration tests: tests/integration.py

cicirello/jacoco-badge-generator - Coverage badges, and pull request coverage checks, from JaCoCo reports in GitHub Actions

Check out all of our GitHub Actions: https://actions.cicirello.org/

The jacoco-badge-generator can be used in one of two ways: as a GitHub Action or as a command-line utility (e.g., as part of a local build script). The jacoco-badge-generator parses a jacoco.csv from a JaCoCo coverage report, computes coverage percentages from JaCoCo's Instructions and Branches counters, and generates badges for one or both of these (user configurable) to provide an easy-to-read visual summary of the code coverage of your test cases. The default behavior directly generates the badges internally with no external calls, but the action also provides an option to instead generate Shields JSON endpoints. It supports both the basic case of a single jacoco.csv and multi-module projects, in which case the action can produce coverage badges from the combination of…

Testing the javadoc-cleanup Action

This last example is javadoc-cleanup. Files relevant to the example are as follows:

  • Workflow: build.yml
  • Directory of test data and code: tests
  • Unit tests: tests/tests.py
  • Code to validate integration tests: tests/integration.py

cicirello/javadoc-cleanup - Create mobile-friendly documentation sites by post-processing javadocs in GitHub Actions

Check out all of our GitHub Actions: https://actions.cicirello.org/

The javadoc-cleanup GitHub action is a utility to tidy up javadocs prior to deployment to an API documentation website, assumed hosted on GitHub Pages. It performs the following functions:

  • Improves mobile browsing experience: It inserts <meta name="viewport" content="width=device-width, initial-scale=1"> within the <head> of each html file that was generated by javadoc, if not already present. Beginning with Java 16, javadoc properly defines the viewport, whereas prior to Java 16, it does not.
  • Strips out any timestamps inserted by javadoc: The timestamps cause a variety of version control issues for documentation sites maintained in git repositories. Javadoc has an option -notimestamp to direct javadoc not to insert timestamps (which we recommend that you also use). However, at the present time there appears to be a bug (in OpenJDK 11's javadoc, and possibly other versions)…

Where You Can Find Me

Follow me here on DEV:

Follow me on GitHub:

cicirello/cicirello - My GitHub Profile

Sites where you can find me or my work:
  • Web and social media: Personal Website, LinkedIn, DEV Profile, Stack Overflow profile, StackExchange profile
  • Software development: GitHub, Maven Central, PyPI, Docker Hub
  • Publications: Google Scholar, ORCID, DBLP, ACM Digital Library, IEEE Xplore, ResearchGate, arXiv
  • Bibliometrics for my research publications: My bibliometrics
  • Detailed GitHub activity: My GitHub Activity

If you want to generate the equivalent to the above for your own GitHub profile, check out the cicirello/user-statistician GitHub Action.




Or visit my website:

Vincent A. Cicirello - Professor of Computer Science

Vincent A. Cicirello - Professor of Computer Science at Stockton University - is a researcher in artificial intelligence, evolutionary computation, swarm intelligence, and computational intelligence, with a Ph.D. in Robotics from Carnegie Mellon University. He is an ACM Senior Member, IEEE Senior Member, AAAI Life Member, EAI Distinguished Member, and SIAM Member.

cicirello.org

Top comments (7)

Cardinal

Possible, but it isn't very convenient: you can't run your action locally, debug it, or mock dependencies such as API calls. The feedback loop is quite long.

Take a look at github-action-ts-run-api; it allows you to write normal tests with your favourite test framework and debug the action.

Vincent A. Cicirello

Your approach looks interesting and potentially useful. I don't think I would use it though. At least not in its current form. I don't want to write TypeScript to test something that isn't implemented in TypeScript. That is the main thing I don't like about github-action-ts-run-api.

With my approach, you can run your unit tests locally in your chosen test framework. The step of my workflow for unit testing is identical to what is done locally. I implement Actions in Python as Docker actions. For unit testing I just use Python's unittest module and those tests can run outside of actions and outside of Docker with a simple command line statement.

In some cases, I can run (some) integration tests locally as well, depending on whether they rely on being within the Actions framework. For example, 97% of the functionality of jacoco-badge-generator works outside of Actions, and outside of Docker, as a CLI tool; and the other 3% is easily faked without running in Actions. For some other actions I maintain, the parts that depend on being in Actions are not so easy to fake, so your approach might be useful for those.

Now what I do like about your approach is that it simulates the Actions environment to run and test the Action itself locally. I just don't want to write TypeScript to do so. I'd like it better if you could specify those tests using YAML (e.g., specifying the test cases with the exact syntax used to run the action in a workflow). So if you are looking for ideas for how to improve github-action-ts-run-api, maybe work on a way to specify tests in YAML equivalent to how it would be run in Actions. For example, since it is TypeScript, maybe provide a CLI tool implemented in TS that takes a YAML file as input specifying test cases.

Cardinal

Thanks for the answer :)

I am surprised about the actions written in Python; most of the Actions I have seen use JS, because that's the way proposed in the GitHub Actions examples and doesn't require fetching a Docker image during the run. Maybe you are right about your use case :) This library is especially useful for JS actions because it allows you to debug the action without any additional configuration.

About the idea of YAML with test cases: running the whole workflow is outside the library's scope; you can take a look at github.com/nektos/act for that.

Vincent A. Cicirello

There are two main ways of implementing Actions. JS is probably a bit more common, and the other is as a Docker container action. The benefit of the Docker approach is that you can use whatever languages and tools you want.

Gabor Szabo

Nice.

Why do you restrict the workflow to the main branch? I tend to think that it is better to run the CI on every branch, but I know the default CI configs GitHub offers all have this configuration to only run on main.

Vincent A. Cicirello

I guess you could run it on all branches. The only potential problem I can see with that is if you are in a private repo where you have a limit on minutes of Actions time. But it is unlimited in public repos, so that isn't an issue.

Poetry Of Code

Very helpful!