I've been wanting to experiment with CI pipelines for a long time, and this weekend I was finally able to spend some time on it. I set up one on Azure DevOps for a Node.js API and it was a lot of fun! So I decided to write down what I learned and share it with you.
In this article I'll tell you about some steps I included in my CI pipeline to get the most out of it.
I'll use Azure DevOps Pipelines and Node.js in this example, but the same steps can be applied to other JavaScript frameworks, like Angular.
About Continuous Integration
Continuous integration is the process of safely integrating code changes into a common repository. To achieve this we define a CI pipeline containing all the tasks that must run automatically every time a new change needs to be integrated. A basic CI pipeline has 2 main tasks: build and test. The more robust a CI pipeline is, the safer our integrations become.
Basic setup
The basic setup for a Node.js CI pipeline consists of 3 steps:
- Install Node.js
- Install node modules (run `npm install`)
- Run tests
There's a really good article by @sinedied that I read to get started with CI. I recommend you check it out if you're new to CI:
The ultimate (free) CI/CD for your open-source projects — Yohan Lasorsa for ITNEXT, Aug 16 '19
If you followed the steps in @sinedied's article, you should've ended up with:
- A Node.js app with Jest setup as testing framework
- An Azure DevOps pipeline that runs the tests, defined in your Node.js app, in different OSs with different Node versions (using a build matrix).
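If you used the build matrix approach, the relevant part of the pipeline looks roughly like this (the job names, OS images, and Node versions below are illustrative, not the exact values from that article):

```yaml
strategy:
  matrix:
    linux_node_12:
      imageName: 'ubuntu-latest'
      nodeVersion: '12.x'
    windows_node_12:
      imageName: 'windows-latest'
      nodeVersion: '12.x'
    macos_node_12:
      imageName: 'macOS-latest'
      nodeVersion: '12.x'

pool:
  vmImage: $(imageName)
```

Each matrix entry becomes its own job, and the `nodeVersion` variable it defines is what the `NodeTool@0` task consumes later on.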
Here’s how I implemented the basic integration steps on my Azure DevOps pipeline:
```yaml
steps:
# Install Node.js
- task: NodeTool@0
  inputs:
    versionSpec: $(nodeVersion)
  displayName: 'Install Node.js'

# Install node modules.
- script: |
    npm install
  displayName: 'NPM Install'

# Runs the `test` script that I included in my package.json
- task: Npm@1
  inputs:
    command: custom
    customCommand: 'test'
  displayName: Run Tests
```
Now let's add some steps to our pipeline!
Find problems in your code with ESLint
The first thing I wanted to achieve was clean code: I wanted to make sure that each new commit follows certain coding standards before it can be integrated. That's where ESLint comes in.
According to ESLint's About page:
"JavaScript, being a dynamic and loosely-typed language, is especially prone to developer error. Without the benefit of a compilation process, JavaScript code is typically executed in order to find syntax or other errors. Linting tools like ESLint allow developers to discover problems with their JavaScript code without executing it."
So here's how we can use ESLint in our CI pipeline:
Install and setup ESLint
In your Node.js app run `npm install eslint --save-dev`.

Now run `./node_modules/.bin/eslint --init` to generate your ESLint config file. The CLI will ask you a few questions so it can set up ESLint according to your needs.

If you want to customize ESLint even further you can edit the config file `.eslintrc.js`. Also, check out the advanced configuration guide.
Add ESLint script to your package.json
Once ESLint is set up to our satisfaction, we can create a script that analyzes all our files and prints any issues it finds.
Here's how my script looks:
```json
"scripts": {
  "lint": "./node_modules/.bin/eslint ./"
}
```
To make sure everything works, run `npm run lint` in your terminal.
Add a new step to your pipeline
Now what I want is to execute my `lint` script in my pipeline, so if it fails I can check the pipeline execution results and fix the issues with my code before integrating the changes.
To achieve that in Azure DevOps, we need to add a new task to our YAML:
```yaml
# This task uses NPM to run the `lint` script that I included in my package.json
- task: Npm@1
  inputs:
    command: custom
    customCommand: 'run lint'
  displayName: Run ESLint
```
I wanted my integration to fail if the ESLint check failed, so I added this task as early in the pipeline as I could (right after installing the dependencies). That way, if there's an issue with the code, the whole pipeline fails and the job stops, freeing up the agent that runs the jobs so it can pick up other pipelines that might be queued.
Check the official docs to learn more about Azure Pipelines agents.
If you don't want your whole pipeline to fail when ESLint fails, add the following to the task: `continueOnError: true`.
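For example, the lint task with that property set would look like this (same task as above, just with one extra line):

```yaml
- task: Npm@1
  inputs:
    command: custom
    customCommand: 'run lint'
  displayName: Run ESLint
  # Report lint failures but let the rest of the pipeline continue.
  continueOnError: true
```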
So here's how our YAML looks right now (only the `steps` section):
```yaml
steps:
# Install Node.js
- task: NodeTool@0
  inputs:
    versionSpec: $(nodeVersion)
  displayName: 'Install Node.js'

# Install node modules.
- script: |
    npm install
  displayName: 'NPM Install'

# Uses NPM to run the `lint` script that I included in my package.json
- task: Npm@1
  inputs:
    command: custom
    customCommand: 'run lint'
  displayName: Run ESLint
  # Uncomment the following line if you *don't* want the pipeline to fail when ESLint fails.
  #continueOnError: true

# Runs the `test` script that I included in my package.json
- task: Npm@1
  inputs:
    command: custom
    customCommand: 'test'
  displayName: Run Tests
```
Better reports for test results
When we execute the previous pipeline, the tests run and the integration fails if any of them fails. I can read the details of the executed tests in the logs, which is great! But what if I told you that you can get detailed test results, with charts and filters, without having to dig through the logs?
To achieve that, we need to ask Jest to generate an XML report that we then hand to Azure through a task. Since this XML has a standard format, Azure can use it to display those nice charts and filters.
This will help us identify and analyze the cause of a failure faster.
Generate the XML report
To generate the XML report we need to install jest-junit (`npm install jest-junit --save-dev`). This package allows us to generate the XML report in the JUnit standard format.
Then we need a new script that’ll execute all tests and also generate the XML test results.
```json
"scripts": {
  "test-ci": "jest --ci --reporters=default --reporters=jest-junit"
}
```
(Keeping `--reporters=default` alongside `jest-junit` preserves the normal console output in the logs while also writing the XML file.)
By default, this will generate a new file `junit.xml` in the project's root folder.
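If you'd rather not have the report land in the project root, jest-junit can be configured through a `jest-junit` key in `package.json` (the directory and file names below are just an example):

```json
{
  "jest-junit": {
    "outputDirectory": "test-results",
    "outputName": "junit.xml"
  }
}
```

If you change the location, remember to point the publish task at the new path.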
> Why don't we just update the original `test` script? You could do that, but then every time you run `npm test` locally it'll generate the `junit.xml` file on your machine.
Update the pipeline
First update the “Run tests” task to use the new script:
```yaml
# Runs the `test-ci` script that I included in my package.json
- task: Npm@1
  inputs:
    command: custom
    customCommand: 'run test-ci'
  displayName: Run Tests
```
And finally add a new step at the bottom of the script:
```yaml
# Publish test results
- task: PublishTestResults@2
  inputs:
    testResultsFormat: 'JUnit'
    testResultsFiles: 'junit.xml'
    mergeTestResults: true
    testRunTitle: 'Jest Unit Tests'
  displayName: Publish test results
```
Done! The next time you execute the pipeline you'll see the nicely formatted test results in the "Tests" tab.
Code coverage report
The code coverage report is another thing we can generate along with our test results and publish in our Azure Pipelines results.
This report tells us how much of our code is exercised by running the tests.
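To see what "exercised" means, consider a function with two branches. This toy example is mine, not from the article: if your tests only ever pass even numbers, the coverage report will flag the odd branch as uncovered.

```javascript
// A tiny function with two branches.
function isEven(n) {
  if (n % 2 === 0) {
    return true; // covered by a test that passes an even number
  }
  return false; // flagged as uncovered until a test also passes an odd number
}

// A test suite that only ever does this exercises half the function:
console.log(isEven(2)); // prints: true
```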
The procedure to include this report is similar to the previous one.
Generate the report
To make sure the code coverage report is generated we need to update our test script again.
```json
"scripts": {
  "test-ci": "jest --ci --reporters=default --reporters=jest-junit --coverage --coverageReporters=cobertura"
}
```
Update the pipeline
Add a new step at the bottom of the script:
```yaml
# Publish code coverage report
- task: PublishCodeCoverageResults@1
  inputs:
    codeCoverageTool: 'Cobertura'
    summaryFileLocation: 'coverage/cobertura-coverage.xml'
    failIfCoverageEmpty: true
  displayName: Publish code coverage results
```
That's it. Execute the integration pipeline again to try it. You should now see a new tab called "Code Coverage".
Did I miss anything?
Do you know any other interesting CI tasks to add to this list? Please share them with me in the comments!
Top comments (5)
Awesome article!
It's hard to find good sources on how to set up Actions CI (I just learned how last week), especially with the YAML-based syntax.
Minor Nits:
Use `npm ci` instead of `npm install`. It installs the fixed versions of dependencies from package-lock.json, which should prevent "works on my machine" config differences.
Also, pretty sure you don't have to reference `node_modules/.bin` for dependency scripts. The dependency name alone in an npm script should work.
If you need to run a dependency from the CLI, npx makes this easier. Ex: `npx eslint --init`
Didn't know about `npm ci`, great tip! About `./node_modules/.bin/eslint`, I found it odd too, but I copied it from the eslint docs 🤷. Thank you Evan!
Not a point-by-point "this is what you're wrong about" 😅, but I found this gem this morning. The Gitter.im source repo.
I know enough GitLab CI and Make to read my way through it and... wow. This blows me away ♥️
gitlab.com/gitlab-org/gitter/webapp
Amazing article. It solved a lot of my issues.
One thing: the module may be jest-junit instead of jest-unit.
This might help someone.
Is there a better way to view the ESLint results in the form of a report, similar to the Jest test report?