DEV Community

Bearded JavaScripter

Posted on • Originally published at Medium

Set up your next Angular 9 Project for CI/CD on Bitbucket Pipelines

Continuous Integration and Continuous Deployment are practices of a good Software Engineering Team and are usually a sign of a healthy codebase. They also encourage us to approach Software Development from a TDD standpoint.

Pipelines are a feature of Bitbucket that allow us to run CI/CD when committing/merging into designated branches on the repository. This automates much of what developers were doing before the days of CI/CD.

To learn more about Pipelines, check out the official Bitbucket Pipelines documentation.

In this article, I’ll show how to get started on running Pipelines on your next Angular Project. This article assumes that you’ll be using the usual Karma/Jasmine test suite that comes with Angular.

The usual method for a CI/CD Workflow is as follows:

  1. Run unit tests
  2. Run integration tests
  3. Build for Production
  4. Deploy

Here’s the link to the repo and the commit where I made the changes described in this article. Also, you may have to install karma-spec-reporter if it isn’t already installed. Run npm install karma-spec-reporter --save-dev.

Some Background

Pipelines in Bitbucket are triggered by a bitbucket-pipelines.yml file. I’ll get into the configuration of that file in a bit, but for now, it’s important to note that Pipelines run in a Docker environment. While you don’t need to know about Docker, it helps a bunch in understanding what’s happening. The configuration file is also similar to a Dockerfile.

The goal is to run our tests using a Headless Browser (preferably Chrome Headless) inside the Docker environment. That means:

  1. All our usual testing commands have to be available inside the container
  2. Node has to be in the container
  3. Chrome has to be in the container

Let’s start by addressing the first issue.

Setting the Stage

There are a few things that we need to change in our Karma configuration file as well as our package.json file. You should update your karma.conf.js file to look like the following:
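Here is a minimal sketch of the relevant configuration — the plugin list assumes the default Angular CLI setup plus karma-spec-reporter, and your generated file will contain additional defaults:

```javascript
// karma.conf.js
module.exports = function (config) {
  config.set({
    basePath: '',
    frameworks: ['jasmine', '@angular-devkit/build-angular'],
    plugins: [
      require('karma-jasmine'),
      require('karma-chrome-launcher'),
      require('karma-spec-reporter'),
      require('@angular-devkit/build-angular/plugins/karma')
    ],
    reporters: ['spec'],
    port: 9876,
    colors: true,
    logLevel: config.LOG_INFO,
    autoWatch: true,
    // Custom launcher: ChromeHeadless plus flags that let Chrome
    // run inside a container without a sandbox or GPU.
    customLaunchers: {
      ChromiumNoSandbox: {
        base: 'ChromeHeadless',
        flags: ['--no-sandbox', '--disable-gpu']
      }
    },
    browsers: ['ChromiumNoSandbox'],
    singleRun: false
  });
};
```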

Notice how we added “ChromiumNoSandbox” to our browsers list. In reality, this can be any name we want (as long as it isn’t the name of another browser). We also specified the configuration for ChromiumNoSandbox under customLaunchers: it uses the ChromeHeadless browser as its base, along with some flags.

Now we’ll need to edit the scripts portion of our package.json file. This is how it should look:
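A sketch of the scripts section — the entries other than test:local and test:prod are the Angular CLI defaults:

```json
{
  "scripts": {
    "ng": "ng",
    "start": "ng serve",
    "build": "ng build --prod",
    "test:local": "ng test --browsers Chrome",
    "test:prod": "ng test --browsers ChromiumNoSandbox --watch=false",
    "lint": "ng lint",
    "e2e": "ng e2e"
  }
}
```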

We’ll run test:local when we want to test locally and test:prod in our bitbucket-pipelines.yml file. Notice how we also changed the npm build command to “ng build --prod”.

Using the “--browsers Chrome” flag on our test:local npm script causes the Chrome UI to be launched. This is fine for testing on our machine locally. However, we won’t be able to do this in our production Pipelines since we won’t have access to a GUI (remember, Docker container). Your commands are run in a CLI automatically. This is why we need the Headless Browser for production.

Additionally, by default, Karma will always listen for changes so that it can re-run tests. In Bitbucket Pipelines, we don’t want this since we need to move on to subsequent steps in the pipeline and build minutes are limited. Watching for changes indefinitely is disabled by the “--watch=false” flag on our production test script.

Configuring Bitbucket Pipelines

As I mentioned earlier, there needs to be a bitbucket-pipelines.yml file at the root of our Angular project so that Bitbucket knows to enable Pipelines.
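A minimal sketch of the file, matching the walkthrough below (the step name is arbitrary):

```yaml
image: rastasheep/alpine-node-chromium:12-alpine

pipelines:
  branches:
    '{master}':
      - step:
          name: Test and Build
          caches:
            - node
          script:
            - npm install
            - npm run test:prod
            - npm run build
          artifacts:
            - dist/**
```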

I’ll translate what the above file is saying:


Under branches, ‘{master}’ tells Bitbucket to trigger Pipelines whenever a commit is pushed or a pull request is merged into the master branch. If you want this effect on other branches, replace master with that branch’s name.


‘step’ tells Bitbucket that this is the first step in the Pipeline. Pipelines can have multiple steps and can even run in parallel to speed up the overall process. Check out more on steps here.


‘name’ is the name of the current step (pretty self-explanatory).


Strap in, this part merits some explanation. If you’re unfamiliar with Docker and containerization in general, then this article provides an amazing insight. But I’ll still try to simplify as much as possible :)

A container is simply an execution environment. This environment needs a base image, which is just some pre-installed software and some commands available at the command line. So for example, a container that has the node image as its base has node, npm and npx installed and available at the command line.

Bitbucket Pipelines run inside a Docker container and you are allowed to specify whatever image you want (as long as it’s available publicly such as on DockerHub).

In the context of the bitbucket-pipelines.yml file, we specified the rastasheep/alpine-node-chromium:12-alpine image. That’s a lot to digest from the name alone. What it simply means is:

  1. The image was made by rastasheep (a huge shoutout and thank you to him for this Docker image)

  2. Alpine Linux is the base of this image

  3. This image has node 12 and Chromium installed.

This is good news as this gives us Chrome Headless inside our container! It also allows us to run those npm test scripts that we set up earlier since node 12 is installed as well.


‘cache’ tells Bitbucket to save certain resources instead of having to re-download them all the time. This ultimately saves time on Pipeline execution.

The ‘cache’ field works with multiple languages. In this case, we’re telling Bitbucket to cache our node_modules folder after the pipeline completes.


‘script’ defines a series of instructions to be run once the pipeline has been set up. Pipeline setup simply involves pulling the specified image and copying the files from the latest commit that triggered the pipeline into the Docker container.

  1. We’ll run “npm install” as the first command to create a node_modules folder inside our docker container.

  2. Then we’ll run “npm run test:prod” to allow Karma to run our Jasmine tests in the Headless Browser.

  3. Finally we’ll run “npm run build” to trigger the Angular production build.

Artifacts (optional)

‘artifacts’ allows us to store build outputs in the pipeline’s Docker container and carry them forward into subsequent steps. This is useful for multi-step pipelines, for example when a series of parallel steps (such as integration tests and deployment) requires the build artifacts.
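In the yml file, the artifacts field takes glob patterns relative to the project root; for an Angular production build, something like:

```yaml
artifacts:
  - dist/**
```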

One final point to note is that any error thrown will stop the pipeline and cause it to fail. Make sure that the bitbucket-pipelines.yml file is configured correctly and all your tests are passing!

Enable Pipelines

Now that your changes have been made and your bitbucket-pipelines.yml file is at the root of your repository, you have to manually enable Pipelines.

This can be done by clicking “Pipelines” in the left side menu and scrolling to the bottom, where you’ll see your .yml file. After a few seconds of Bitbucket checking the .yml file for syntax errors, you should see the “Enable” button turn green. Clicking it will trigger a pipeline build and enable Pipelines on your repository.


At this point, our tests are passing and the production build works. Now all that’s left is to deploy our application. Deployment can happen in a number of ways:

  1. You can use the scp command to deploy to a remote web server (not really recommended)

  2. You can push the build artifacts to another repository.
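As a sketch of the first option — the host, user, and target path here are hypothetical, and an SSH key would need to be configured in the repository settings for scp to authenticate:

```yaml
- step:
    name: Deploy
    script:
      # Copy the production build to a (hypothetical) remote web server.
      - scp -r dist/my-app/* user@example.com:/var/www/my-app
```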

The second option requires a little more setup, which I won’t go into now. But it does involve using an image that also has git installed, and configuring credentials for the account that owns the repository housing the build artifacts.


And that’s it! You’ve successfully set up a CI/CD environment for Angular using Bitbucket Pipelines! I’ve only included the basic default tests that come with generated components and services to keep the demo short and simple.

Let me know what you think of this article and I’ll try to improve it as much as I can :)
