Alex Hyett

Originally published at alexhyett.com

CI/CD Pipeline Using GitHub Actions: Automate Software Delivery

Today we are going to look at CI/CD pipelines and how we can set them up for free using GitHub Actions.

You might not be aware of what CI/CD is and why we use it, so let's cover that first.


Continuous Integration

CI stands for Continuous Integration: the process of automatically building your code and running all of your unit and integration tests. This doesn't need to happen on every single commit, but generally all the tests are run when a pull request is created.

If the tests don't pass, the code won't be merged into the develop or master branch.

You should be running the unit tests on your own computer anyway before you raise a pull request, but having them run automatically on a server is a good quality control to have in place. It also prevents the scenario where the tests only pass on your machine and on nobody else's.
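To make this concrete, a bare-bones workflow that builds and tests a .NET project whenever a pull request is opened looks something like the sketch below. We will build up a full version later in the article.

name: CI

on:
  pull_request:
    branches: [main]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Run tests
        run: dotnet test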

In the past, I have used tools such as Jenkins and TeamCity to run CI pipelines, but more and more teams are moving towards GitHub Actions. If your code is already on GitHub, it is a no-brainer and saves you additional server and licensing costs.

Continuous Delivery / Deployment

CD can stand for either Continuous Delivery or Continuous Deployment. The two are very similar, but there is a subtle difference.

Continuous Delivery is about building and packaging up your application ready for production, but it stops short of actually deploying it.

Continuous Deployment, on the other hand, will actually deploy your application to the target environment.

Most companies don't like having applications deployed automatically to production. It is typical to have continuous deployment for DEV and QA environments, but to keep a bit more manual control when it comes to staging and production.

I have always used Octopus Deploy in the past to gatekeep deployments to particular environments, but there are ways to do this using GitHub Actions as well.
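One way to do this with GitHub Actions is to use environments. If you configure an environment with required reviewers in your repository settings, any job that targets that environment will pause until someone approves it. Here is a rough sketch, where deploy.sh is just a placeholder for whatever deployment step you actually use:

jobs:
  deploy-production:
    runs-on: ubuntu-latest
    # Pauses for manual approval if the "production" environment
    # has required reviewers configured in the repository settings
    environment: production
    steps:
      - uses: actions/checkout@v3
      - name: Deploy
        run: ./deploy.sh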

Example Project

So to test out this functionality, I have created a very simple .NET Core API that writes to a MySQL database. The project has both unit and integration tests, which we will run using GitHub Actions, as well as showing off the test results.

For the continuous delivery part of this project, we are going to configure it to build a Docker image for our API and push it to AWS ECR. I am going to leave out any approval workflows for now, but I will cover them in a future article.

The full code for this project is available to my paid newsletter subscribers. My newsletter is otherwise free at the moment, but if you want to support my channel then that is the best way to do it.

API

So I have put together a very simple .NET Core API for a library application; think of something similar to Goodreads.

I have endpoints for adding and viewing authors and books, with a bit of paging on the GET requests.

GET /authors?page=1&pageSize=10
GET /authors/1
POST /authors
GET /books?page=1&pageSize=10
GET /books/1
POST /books

These all then get saved to a MySQL database. I haven't included any fancy search capabilities, as this is really just to show off what we can do with GitHub Actions.

Tests

For the tests, I have written a few unit tests with xUnit that simply call the code and mock out all of the dependencies.

For example:

[Fact]
public async Task Given_new_author_should_return_author()
{
  // Arrange
  var newAuthor = new NewAuthor("Joe", "Bloggs");
  var mockAuthorDb = new AuthorDb
  {
    AuthorId = 1,
    FirstName = newAuthor.FirstName,
    LastName = newAuthor.LastName,
    DateCreated = DateTime.UtcNow,
    DateModified = DateTime.UtcNow
  };

  _libraryRespository.Setup(x => x.AddAuthorAsync(It.IsAny<NewAuthorDb>())).ReturnsAsync(mockAuthorDb.AuthorId);
  _libraryRespository.Setup(x => x.GetAuthorAsync(1)).ReturnsAsync(mockAuthorDb);

  // Act
  var result = await _sut.AddAuthorAsync(newAuthor);

  // Assert
  result.IsT0.ShouldBeTrue();
  result.AsT0.Value.FirstName.ShouldBe(mockAuthorDb.FirstName);
  result.AsT0.Value.LastName.ShouldBe(mockAuthorDb.LastName);
}

I have also created some integration tests that use an SDK I have put together using Refit. These call the API and make sure that authors and books can be added and viewed correctly.

For example:

[Fact]
public async Task Given_valid_author_should_create_author()
{
  // Arrange
  var request = new AuthorRequest
  {
    FirstName = $"{Guid.NewGuid()}",
    LastName = $"{Guid.NewGuid()}"
  };

  // Act
  var author = await _authorApi.CreateAuthorAsync(request);

  // Assert
  author.AuthorId.ShouldNotBe(0);
  author.FirstName.ShouldBe(request.FirstName);
  author.LastName.ShouldBe(request.LastName);
  author.DateCreated.ShouldBeInRange(DateTime.UtcNow.AddSeconds(-5), DateTime.UtcNow.AddSeconds(5));
  author.DateModified.ShouldBeInRange(DateTime.UtcNow.AddSeconds(-5), DateTime.UtcNow.AddSeconds(5));
}

For the unit tests we can just run the code, but for the integration tests we need to have the API and database up and running so that we can actually call the API while the tests run.

Docker

To get all of this working as part of a GitHub Actions workflow, we need to have our API and database running in Docker.

For the API, we need to put together a Dockerfile that will build and host our application. I am using what is known as a multi-stage build.

To build the application, we need to copy all of the source files into the Docker container, but we don't need them once the application is built. So the second stage is a separate image that just hosts the built application.

FROM mcr.microsoft.com/dotnet/sdk:7.0 AS build
WORKDIR /app

# 1. Copy project files
COPY src/GitHubActionsDemo.Api/*.csproj ./GitHubActionsDemo.Api/
COPY src/GitHubActionsDemo.Api.Sdk/*.csproj ./GitHubActionsDemo.Api.Sdk/
COPY src/GitHubActionsDemo.Persistance/*.csproj ./GitHubActionsDemo.Persistance/
COPY src/GitHubActionsDemo.Service/*.csproj ./GitHubActionsDemo.Service/

# 2. Run dotnet restore
WORKDIR /app/GitHubActionsDemo.Api
RUN dotnet restore

# 3. Copy the rest of the files
WORKDIR /app
COPY src/GitHubActionsDemo.Api/. ./GitHubActionsDemo.Api/
COPY src/GitHubActionsDemo.Api.Sdk/. ./GitHubActionsDemo.Api.Sdk/
COPY src/GitHubActionsDemo.Persistance/. ./GitHubActionsDemo.Persistance/
COPY src/GitHubActionsDemo.Service/. ./GitHubActionsDemo.Service/

# 4. Build release application
WORKDIR /app/GitHubActionsDemo.Api
RUN dotnet publish -c Release -o out

FROM mcr.microsoft.com/dotnet/aspnet:7.0
WORKDIR /app
EXPOSE 5275/tcp
ENV ASPNETCORE_URLS http://*:5275

# 5. Copy the release application and run
COPY --from=build /app/GitHubActionsDemo.Api/out ./
ENTRYPOINT ["dotnet", "GitHubActionsDemo.Api.dll"]

This Dockerfile is doing quite a lot, so let's break it down.

  1. Copy project files - first we copy across the project files. This is so we can make the best use of Docker's layer caching when we run dotnet restore. Downloading all the packages from NuGet can take a while, so doing it first means the layer can be cached and we won't need to run it again unless we add another dependency.
  2. Run dotnet restore - this will download any of the dependencies we are using.
  3. Copy the rest of the files - we now need all the source files so we can build the application.
  4. Build release application - build and publish the production version of the application.
  5. Copy the release application and run - here we copy just the build files over to a new container so we can run the application with a small image.

We then set up a docker-compose file that contains our API and MySQL database.

version: '3.8'
services:
  db:
    image: mysql:8.0
    cap_add:
      - SYS_NICE
    hostname: db
    restart: always
    environment:
      - MYSQL_DATABASE=library
      - MYSQL_RANDOM_ROOT_PASSWORD=1
      - MYSQL_USER=dbuser
      - MYSQL_PASSWORD=libraryDbPassword
    ports:
      - '3306:3306'
    volumes:
      - db:/var/lib/mysql
      - ./db/init.sql:/docker-entrypoint-initdb.d/init.sql

  api:
    image: githubactionsdemo.api:${VERSION}
    build: .
    restart: always
    depends_on:
      - db
    ports:
      - '5200:5275'
    environment:
      - ASPNETCORE_URLS=http://*:5275
      - API_DbSettings__ConnectionString=Server=db;Database=library;Uid=dbuser;Pwd=libraryDbPassword;

volumes:
  db:
    driver: local

We can then spin this up in the GitHub Actions runner and run our integration tests against it.

GitHub Actions Workflow

To be able to use GitHub Actions, we need to create a workflow file in a .github/workflows folder in our project.

I have called mine build-and-test.yml but you can name it anything.

name: Build, Test and Push

on: [push]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - name: Fetch unshallow
        run: git fetch --prune --tags --unshallow
      - name: Install GitVersion
        uses: gittools/actions/gitversion/setup@v0.9.7
        with:
          versionSpec: '5.x'
      - name: Determine Version
        id: gitversion
        uses: gittools/actions/gitversion/execute@v0.9.7
      - name: Setup dotnet
        uses: actions/setup-dotnet@v3
        with:
          dotnet-version: '7.0.x'
      - name: Install dependencies
        run: dotnet restore
      - name: Build
        run: dotnet build
      - name: Run Unit Tests
        run: dotnet test --filter Category=Unit --no-restore --verbosity normal
      - name: Copy Unit Test Results
        run: mkdir TestResults; cp test/**/TestResults/*.Unit.Tests.trx TestResults/
        shell: bash
      - name: Start containers
        run: docker-compose -f "docker-compose.yml" up -d --build
        env:
          VERSION: ${{ steps.gitversion.outputs.nuGetVersion }}
      - name: Wait for docker containers to setup
        run: sleep 30s
        shell: bash
      - name: Run Integration Tests
        run: dotnet test --filter Category=Integration --no-restore --verbosity normal
        env:
          BASE_URL: http://localhost:5200
      - name: Copy Integration Test Results
        run: cp test/**/TestResults/*.Integration.Tests.trx TestResults/
        shell: bash
      - name: Test Report
        uses: dorny/test-reporter@v1
        if: success() || failure()
        with:
          name: Test Results
          path: TestResults/*.trx
          reporter: dotnet-trx
      - name: Push to ECR
        if: github.ref == 'refs/heads/main'
        id: ecr
        uses: jwalton/gh-ecr-push@v1
        with:
          access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          region: us-east-1
          image: githubactionsdemo.api:${{ steps.gitversion.outputs.nuGetVersion }}

For simplicity, I have included everything in one job. It is possible to create multiple jobs in GitHub Actions, but it gets a bit more complicated.

Each job runs on its own fresh runner, so if you need to share files between jobs you have to save them as artifacts.
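For reference, passing files between two jobs looks roughly like this, using the upload-artifact and download-artifact actions. The artifact name and paths here are just examples:

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: dotnet publish -c Release -o out
      - uses: actions/upload-artifact@v3
        with:
          name: published-app
          path: out/

  deploy:
    needs: build
    runs-on: ubuntu-latest
    steps:
      - uses: actions/download-artifact@v3
        with:
          name: published-app
          path: out/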

Let's go through this in sections so we can see what it is doing.

Check out the code and determine version

- uses: actions/checkout@v3
- name: Fetch unshallow
  run: git fetch --prune --tags --unshallow
- name: Install GitVersion
  uses: gittools/actions/gitversion/setup@v0.9.7
  with:
    versionSpec: '5.x'
- name: Determine Version
  id: gitversion
  uses: gittools/actions/gitversion/execute@v0.9.7

As part of tagging my Docker image, I wanted to get the version using GitVersion. I found out the hard way that you also need to fetch the tags in order for GitVersion to work.
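An alternative that should achieve the same thing is to ask the checkout action for the full history up front, which also brings down the tags:

- uses: actions/checkout@v3
  with:
    fetch-depth: 0 # 0 fetches all history for all branches and tags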

Build code using dotnet

- name: Setup dotnet
  uses: actions/setup-dotnet@v3
  with:
    dotnet-version: '7.0.x'
- name: Install dependencies
  run: dotnet restore
- name: Build
  run: dotnet build

Here we set up the dotnet version we are using and then run restore and build on our code.

Run unit tests and copy results

- name: Run Unit Tests
  run: dotnet test --filter Category=Unit --no-restore --verbosity normal
- name: Copy Unit Test Results
  run: mkdir TestResults; cp test/**/TestResults/*.Unit.Tests.trx TestResults/
  shell: bash

I added an xUnit trait ([Trait("Category", "Unit")]) to my test classes so that I could run the unit and integration tests separately.

I am using a test reporter to show the test results in GitHub Actions. However, even though I am filtering out the integration tests, it still produces a test results file containing the skipped tests. To avoid the file being overwritten when I run the integration tests, I have to copy the results to a separate folder.
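An alternative I haven't used here would be to give each run its own results directory, so nothing gets overwritten in the first place. dotnet test supports this directly; you would then point the test reporter at TestResults/**/*.trx instead:

- name: Run Unit Tests
  run: dotnet test --filter Category=Unit --no-restore --logger trx --results-directory TestResults/Unit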

Spin up docker and wait for it to complete

- name: Start containers
  run: docker-compose -f "docker-compose.yml" up -d --build
  env:
    VERSION: ${{ steps.gitversion.outputs.nuGetVersion }}
- name: Wait for docker containers to setup
  run: sleep 30s
  shell: bash

Using the docker-compose file we created earlier we can spin up our API and database ready for our integration tests.

I have added a sleep here to make sure that our database and API are ready. I am sure there is a nicer way to do this, such as pinging the API to see if it is up yet.
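For example, a step along these lines should do the trick, polling one of the existing endpoints until the API responds instead of waiting a fixed amount of time. This is just a sketch that I haven't wired into the workflow above:

- name: Wait for API to be ready
  shell: bash
  run: |
    # Poll the authors endpoint for up to ~60 seconds
    for i in {1..30}; do
      if curl -fsS 'http://localhost:5200/authors?page=1&pageSize=1' > /dev/null; then
        exit 0
      fi
      sleep 2
    done
    echo 'API did not start in time' >&2
    exit 1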

Run integration tests and copy results

- name: Run Integration Tests
  run: dotnet test --filter Category=Integration --no-restore --verbosity normal
  env:
    BASE_URL: http://localhost:5200
- name: Copy Integration Test Results
  run: cp test/**/TestResults/*.Integration.Tests.trx TestResults/
  shell: bash

As with the unit tests, we can filter by category so only the integration tests run. You can see here that I am setting the BASE_URL for the integration tests to the same port that I used in the docker-compose file.

Setting it up this way means that the Docker version of the API runs on a different port to the one used when running it locally on your machine. This means you won't get a clash when you run the API locally while the Docker containers are running.

I then copy the integration test results into the TestResults folder ready for the test reporter.

Display test results

- name: Test Report
  uses: dorny/test-reporter@v1
  if: success() || failure()
  with:
    name: Test Results
    path: TestResults/*.trx
    reporter: dotnet-trx

I am using the test reporter action from the GitHub Actions Marketplace to show my results. It works with a number of other test result formats as well, such as JUnit and Mocha JSON.

This gives you a nice-looking test report in GitHub.

(Screenshot: the Test Results report in GitHub)

Push to ECR

Finally, I have set up an action to push the tagged Docker image for my API up to AWS ECR. However, this step only runs on the main branch, as I have a conditional in place on it.

I found a really good GitHub Actions conditional cheatsheet by Michael Currin that you might want to check out too.

- name: Push to ECR
  if: github.ref == 'refs/heads/main'
  id: ecr
  uses: jwalton/gh-ecr-push@v1
  with:
    access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
    secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
    region: us-east-1
    image: githubactionsdemo.api:${{ steps.gitversion.outputs.nuGetVersion }}

This uses the version from the GitVersion step earlier, as well as an AWS access key and secret stored in GitHub Secrets for the project.

For those of you who are supporting my YouTube channel and newsletter, the code for this project is in this repo: cicd-github-actions.


📨 Are you looking to level up your skills in the tech industry?

My weekly newsletter is written for engineers like you, providing you with the tools you need to excel in your career. Join here for free →
