James Eastham

How to Share Azure DevOps Build Pipelines

In my company, we deploy many different projects for many different clients. This leaves us with a long list of Azure DevOps projects, each containing numerous repositories.

There are also only a small number of people in the company who understand the Azure Pipelines syntax. However, there are plenty of developers who would benefit massively from using a pipeline even without understanding the internals of creating one.

Almost all of the projects are written in .NET and are either class libraries or web APIs: very similar codebases, in identical languages.

Previously, we held .yml files in source control and just copied them whenever we set up a new repo. Tweaking our standard build pipeline became an impossible task, because every copy had to be updated by hand.

Luckily, I read the release notes for Azure DevOps. Relatively recently, an extremely exciting new feature was pushed out: the shared build pipeline.

The DRY programmer in me got rather excited.

DRY Pipelines

Azure DevOps has the answer.

It is possible to consume a .yml file from a completely separate project within the same Azure DevOps organisation.

We can create one single 'BuildTools' project that holds a standard set of build pipelines.

This standard set of build pipelines covers a number of different templates:

  • .NET Core Web API
  • .NET Core Class Library (for NuGet)
  • .NET Framework Class Library
  • Angular web portal

Internally, all of the pipelines support unit testing, integration testing and the publishing of the required artifacts.

Here's a sample of the .NET Core Web API pipeline:

```yaml
parameters:
  requiredCodeCoverage: 70
  dockerFilePath: ''
  dockerBuiltImageTag: ''
  dockerRunArguments: ''

jobs:
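# Job 1: unit tests with code coverage.
# This job runs on every execution of the pipeline, whatever the branch.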
- job: run_unit_tests
  displayName: Run Unit Tests
  pool:
    vmImage: 'windows-latest'
  steps:
  - task: UseDotNet@2
    inputs:
      packageType: 'sdk'
      version: '3.1.100'
  - task: DotNetCoreCLI@2
    inputs:
      command: 'test'
      projects: 'test/**UnitTest**/*.csproj'
      arguments: '/p:CollectCoverage=true /p:CoverletOutputFormat=cobertura /p:threshold=${{ parameters.requiredCodeCoverage }} /p:thresholdType=line /p:thresholdStat=total'
      testRunTitle: 'Run All Unit Tests'
  # Generate the report using ReportGenerator (https://github.com/danielpalme/ReportGenerator)
  # First install the tool on the machine, then run it
  - script: |
      dotnet tool install -g dotnet-reportgenerator-globaltool
      reportgenerator -reports:$(Build.SourcesDirectory)/**/coverage.cobertura.xml -targetdir:$(Build.SourcesDirectory)/CodeCoverage -reporttypes:"HtmlInline_AzurePipelines;Cobertura"
    displayName: Create Code coverage report

  # Publish the code coverage result (summary and web site)
  # The summary allows to view the coverage percentage in the summary tab
  # The web site allows to view which lines are covered directly in Azure Pipeline
  - task: PublishCodeCoverageResults@1
    displayName: 'Publish code coverage'
    inputs:
      codeCoverageTool: Cobertura
      summaryFileLocation: '$(Build.SourcesDirectory)/CodeCoverage/Cobertura.xml'
      reportDirectory: '$(Build.SourcesDirectory)/CodeCoverage'

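# Job 2: integration tests. Builds the Docker image from the supplied Dockerfile,
# starts a container from it, then runs the integration test projects against it.
# Only runs when the source branch is a release/* branch.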
- job: run_integration_tests
  displayName: Run Integration Tests
  condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/heads/release/'))
  steps:
  - task: UseDotNet@2
    inputs:
      packageType: 'sdk'
      version: '3.1.100'
  - task: DockerInstaller@0
    inputs:
      dockerVersion: '17.09.0-ce'
  - task: Docker@2
    inputs:
      command: 'build'
      Dockerfile: '${{ parameters.dockerFilePath }}'
      arguments: '-t ${{ parameters.dockerBuiltImageTag }}'
      buildContext: $(Build.SourcesDirectory)
  - bash: 'docker run -d ${{ parameters.dockerRunArguments }} ${{ parameters.dockerBuiltImageTag }} '
    workingDirectory: $(Build.SourcesDirectory)
  - task: DotNetCoreCLI@2
    inputs:
      command: 'test'
      projects: 'test/**IntegrationTest**/*.csproj'
      testRunTitle: 'Run All Integration Tests'
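
# Job 3: build and publish. Depends on the integration tests and, like them,
# only runs on release/* branches.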
- job: build_and_publish
  displayName: Build and publish app files
  dependsOn: [ run_integration_tests ]
  condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/heads/release/'))
  steps:
  - task: UseDotNet@2
    inputs:
      packageType: 'sdk'
      version: '3.1.100'
  - task: DotNetCoreCLI@2
    inputs:
      command: publish
      arguments: '--configuration $(BuildConfiguration) -r win10-x64 --output $(Build.ArtifactStagingDirectory) /p:Version="$(Build.SourceBranchName)" /p:InformationalVersion="$(Build.SourceBranchName)"'
      zipAfterPublish: True

  # this code takes all the files in $(Build.ArtifactStagingDirectory) and uploads them as an artifact of your build.
  - task: PublishBuildArtifacts@1
    inputs:
      pathtoPublish: '$(Build.ArtifactStagingDirectory)'
```

There are three key steps to this build pipeline.

1. Unit Tests with Code Coverage

Dependencies

  • N/A. Unit tests run every single time the pipeline runs

Unit tests are, in my opinion, one of the fundamental building blocks of writing stable and robust software. Having coverage of the key use cases of an application gives an excellent starting point for refactoring and improving a codebase.

So, for every single run of this build pipeline, any project whose path contains the word UnitTest is run as a test project.

By default, code coverage of 70% is required. I'm not a stickler for 100% code coverage, but I do think it's important to ensure a reasonable amount of the codebase is covered by tests.

For that reason, I default to 70% but allow an override if needed.
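
As a sketch, here's what that override could look like from a consuming pipeline (the full consumption syntax is covered later in this post; the 85 is just an illustrative value):

```yaml
- template: Build\DotNetCoreWebApi.yml@templates
  parameters:
    requiredCodeCoverage: 85   # tighten the default 70% line coverage threshold
```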

2. Integration tests

Dependencies

  • Branch. Only run when the build branch is a release branch

```yaml
- job: run_integration_tests
  displayName: Run Integration Tests
  condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/heads/release/'))
```

For any kind of automated deployment, especially when working with the complexity of distributed systems, integration tests are a must.

Integration tests are what truly show whether an application is working exactly as required in a production-like environment.

The integration tests only run conditionally; in this case, only when the build branch name begins with release/.

I've written in the past about the structure of a git repo, and we follow a git-flow methodology. Whenever we are ready to push a release, we create a branch named release/version_number.

Our integration tests are quite heavyweight once the Docker builds are included. To keep our development velocity high, we don't run the full suite of integration tests on every commit; they only run on a release build.
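
To make the condition concrete, here is roughly how it evaluates for a couple of hypothetical branch names:

```yaml
# refs/heads/release/1.4.0  -> startsWith(...) is true,  so the integration tests run
# refs/heads/feature/login  -> startsWith(...) is false, so the job is skipped
condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/heads/release/'))
```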

3. Publish

Dependencies

  • Branch. Only run when the build branch is a release branch
  • Previous Step. Only run if the run_integration_tests job completes successfully.

```yaml
- job: build_and_publish
  displayName: Build and publish app files
  dependsOn: [ run_integration_tests ]
  condition: and(succeeded(), startsWith(variables['Build.SourceBranch'], 'refs/heads/release/'))
```

The publish job is a relatively simple one: it is concerned only with building and then publishing the application files ready for a release.

It is the publish job that generates my artifact to be released to the world.
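
One detail worth calling out in the publish arguments: $(Build.SourceBranchName) is the final segment of the branch path, so the release branch name doubles as the version stamp. Assuming a hypothetical branch named release/1.2.0:

```yaml
# On refs/heads/release/1.2.0, $(Build.SourceBranchName) resolves to '1.2.0',
# so the published assemblies are versioned with the release number.
arguments: '--configuration $(BuildConfiguration) -r win10-x64 --output $(Build.ArtifactStagingDirectory) /p:Version="$(Build.SourceBranchName)" /p:InformationalVersion="$(Build.SourceBranchName)"'
```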

Parameters

You'll also notice that at the very top of the standard build pipeline there is a parameters section.

Parameters work much like variables: they can be set from the pipeline that consumes this standard one.

As you can see, I have four that are relatively self-explanatory.

  1. requiredCodeCoverage: 70. I don't want to force the same code coverage settings on all consumers, so this can be overridden. Without an override, though, I still want to ensure coverage is tracked
  2. dockerFilePath: ''. Path to the Dockerfile that should be used by the integration tests
  3. dockerBuiltImageTag: ''. The tag to use when building the Docker image. This is important to ensure the build and the container run both use the same image name
  4. dockerRunArguments: ''. Allows extra parameters to be passed when running the Docker image. Normally used for environment variables

Pipeline Consumption

Now that the base pipeline is set up, how the hell is it consumed? Well, that, my friends, is the magic part.

Take a look at this example build pipeline from one of my actual client code repositories (names changed for privacy reasons).

```yaml
trigger:
  branches:
    exclude:
    - 'master'
resources:
  repositories:
    - repository: templates
      type: git
      name: BuildTools/templates

stages:
- stage: Build
  jobs:
  - template: Build\DotNetCoreWebApi.yml@templates
    parameters:
      dockerRunArguments: --env-file test/MyProject.webapi/env.list -p 8100:80
      requiredCodeCoverage: 70
      dockerFilePath: src/MyProject.webapi/Dockerfile
      dockerBuiltImageTag: myclient/webapi
```

19 lines of YAML, and completely reusable. For all my clients, I can simply copy and paste this same pipeline and tweak the parameters.

To give you a bit more detail, there are two SUPER important sections.

Resources

```yaml
resources:
  repositories:
    - repository: templates
      type: git
      name: BuildTools/templates
```

The resources section is what creates the link between the project you are in and the one that houses the build pipelines.

In this instance, the standard pipelines live in a project called BuildTools, in a repository called templates; hence the name BuildTools/templates.

The repository property is just a name used internally by the rest of this pipeline. It could have been 'lookhowamazingthispipelineis' if I'd really wanted.

Template

```yaml
  - template: Build\DotNetCoreWebApi.yml@templates
```

The second important piece is the actual step of the pipeline. This tells the current pipeline exactly which yml file to use in the external project repo.

In this case, I'm using a .yml file named DotNetCoreWebApi.yml that resides in the 'Build' folder.

The @templates part links back to the name of the repository in the resources section.

If I had named my resource 'lookhowamazingthispipelineis', then this line would read as:

```yaml
  - template: Build\DotNetCoreWebApi.yml@lookhowamazingthispipelineis
```

Summary

And that my friends, is the magic of shared build pipelines.

If I want to introduce an extra step into my standard pipeline (SonarQube, maybe?), it's done in one place and all of my clients get the same settings.

A little bit of time commitment upfront to get the standard builds working pays off with a much faster CI/CD setup, especially if you have a lot of projects that are all very similar.
