
Daniel Donbavand


Deploying a Dockerized .NET Core application with Bitbucket pipelines and AWS

This post was first published on my blog site

Introduction:

I recently started refactoring my side project. The refactor included converting my .NET 4.5 Web API into a .NET Core 2.0 Web API, which I wanted to run in a Docker container on an AWS Linux instance. After spending too much time manually building and pushing code up to my Docker container on AWS, I decided I needed a continuous delivery pipeline.

There are many tools that help you achieve CI/CD. My project is stored in a Bitbucket account, and I had noticed that Bitbucket launched a new feature called Pipelines back in October 2016.

Bitbucket Pipelines offers a continuous integration and delivery service built right into Bitbucket. It made it easy for me to build my .NET Core application and to build, tag and push my Docker image to my AWS ECR repository. I was then able to register a new task definition and update my service in AWS to make use of it. The build pipeline kicks off as soon as I merge changes into my master branch.

This blog post shows how I have set up my Bitbucket pipeline: it builds my .NET Core application; builds, tags and pushes my Docker image to my AWS ECR repository; and then registers a new task definition and updates my service.

What are Bitbucket Pipelines

Bitbucket Pipelines lets you build, test and deploy applications without having to set up an expensive and time-consuming CI/CD server. Bitbucket uses a Docker image as the base environment in which your commands are run.

Bitbucket Pipelines is driven by a bitbucket-pipelines.yml file added to your project; it holds all the commands the pipeline should run when it kicks off. The default image used when the build pipeline starts is atlassian/default-image:latest, which allows you to run Linux commands such as “sudo apt-get update”. By simply adding the option "docker: true" inside the bitbucket-pipelines.yml file, your pipeline gains Docker support and can run Docker commands.
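
A minimal bitbucket-pipelines.yml illustrating just those two settings might look like this (the echo step is a placeholder for your own commands):

image: atlassian/default-image:latest

options:
  docker: true

pipelines:
  default:
    - step:
        script:
          - echo "your build commands go here"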

When I push my C# .NET Core project up to my Bitbucket account, Bitbucket Pipelines registers that I have pushed code changes and kicks off the build pipeline.

I love that I don’t have to switch between multiple applications to build and deploy my code. With Bitbucket Pipelines, I can push my code up to my master branch and that’s it; within a short time the code has been built and deployed to my AWS EC2 instance.

What am I using Bitbucket pipelines for?

My project is a .NET Core 2.0 application that runs in a Linux Docker container on AWS. The first cut of my build pipeline does the following:

  • Download and install the .NET Core SDK
  • Restore and publish my .NET Core application
  • Download and install the AWS CLI
  • Log into AWS
  • Build my Docker image
  • Tag my Docker image
  • Push my Docker image up to my AWS ECR repository
  • Register a new task definition in AWS
  • Update my AWS service to make use of the new task definition.

As you can probably see, I am doing a lot in the pipeline script; downloading and installing the AWS CLI and .NET Core SDK each time the pipeline runs is time consuming and wasteful. Stage 2 of building my Bitbucket pipeline involves creating a Docker image that contains the AWS CLI, and moving the .NET Core restore and publish steps into my Dockerfile. Rather than jumping straight to the end result, I want to show how I am doing all of these tasks first, as this could be useful for others.

Bitbucket Pipelines Code - Stage 1

Let's take a look at the bitbucket-pipelines.yml file and work through what each command is doing.

It’s important to mention here that the bitbucket-pipelines.yml file is very picky about indentation and placement, so make sure you align everything correctly.

image: atlassian/default-image:latest

options:
  docker: true

pipelines:
  branches:
    master:
      - step:
          script:
          #.NET Core         
          # Register the Microsoft Product key as trusted.
          - curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
          - sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg

          # Set up host package feed.
          - sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-trusty-prod trusty main" > /etc/apt/sources.list.d/dotnetdev.list'

          # Install .NET Core SDK.
          - sudo apt-get install apt-transport-https
          - sudo apt-get update
          - sudo apt-get -y install dotnet-sdk-2.0.0

          # Restore, Publish and Release the .NET core project
          - dotnet restore ./projectName.sln && dotnet publish ./projectName.sln -c Release -o ../obj/Docker/publish

          # AWS CLI

          # Download the AWS CLI bundle
          - curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"

          # Unzip and install the AWS CLI (jq is installed too, for parsing JSON later)
          - sudo apt-get install jq
          - unzip awscli-bundle.zip
          - ./awscli-bundle/install -b ~/bin/aws
          - export PATH=~/bin:$PATH

          # Login to AWS
          - export LOGIN=$(aws ecr get-login)
          - $LOGIN

          # Build my docker image
          - docker build -t projectName .

          # Tag and push my docker image to ECR
          - docker tag projectName:latest xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:latest
          - docker push xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:latest

          # Register the ECS task definition and capture the version
          - export IMAGE_NAME=xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:latest
          - export TASK_VERSION=$(aws ecs register-task-definition --family ECRFamilyName --container-definitions "[{\"name\":\"ExampleName\",\"image\":\"$IMAGE_NAME\",\"portMappings\":[{\"containerPort\":80,\"hostPort\":80,\"protocol\":\"tcp\"}],\"memory\":128,\"essential\":true}]" | jq --raw-output '.taskDefinition.revision')

          # Set ECS service to desired count 0  
          - aws ecs update-service --cluster default --service ecrServiceName --desired-count 0

          # Set ECS service to desired count 1 and assign the new task-definition 
          - aws ecs update-service --cluster default --service ecrServiceName --task-definition ECRFamilyName:$TASK_VERSION --desired-count 1

Let's take a closer look at each part of the script and see what it is doing.

First, I tell the build pipeline which image I want to use for my build, and I enable Docker support.

image: atlassian/default-image:latest

options:
  docker: true

You can run different builds on different branches with Bitbucket Pipelines. Here I specify that the following script runs on my master branch.

pipelines:
  branches:
    master:
      - step:
          script:
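
For example, you could run a different script against another branch alongside master; the branch names and steps below are hypothetical placeholders:

pipelines:
  branches:
    master:
      - step:
          script:
            - echo "build and deploy"
    develop:
      - step:
          script:
            - echo "build and run tests only"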

Next we install the .NET Core SDK so we can build our project. In order to download the .NET Core SDK, we need to register the Microsoft product key and set up the host package feed. If you don’t do this, you won’t be able to install dotnet-sdk-2.0.0.

     #.NET Core         
          # Register the Microsoft Product key as trusted.
          - curl https://packages.microsoft.com/keys/microsoft.asc | gpg --dearmor > microsoft.gpg
          - sudo mv microsoft.gpg /etc/apt/trusted.gpg.d/microsoft.gpg

          # Set up host package feed.
          - sudo sh -c 'echo "deb [arch=amd64] https://packages.microsoft.com/repos/microsoft-ubuntu-trusty-prod trusty main" > /etc/apt/sources.list.d/dotnetdev.list'

          # Install .NET Core SDK.
          - sudo apt-get install apt-transport-https
          - sudo apt-get update
          - sudo apt-get -y install dotnet-sdk-2.0.0

Next, I restore my .NET Core project and publish it with the Release configuration.

          # Restore, Publish and Release the .NET core project
          - dotnet restore ./projectName.sln && dotnet publish ./projectName.sln -c Release -o ../obj/Docker/publish
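
The -o ../obj/Docker/publish output path lines up with the kind of Dockerfile the Visual Studio Docker tooling generates, which simply copies that published folder into the image. A sketch of what my Stage 1 Dockerfile looks like is below; treat the image tag and project name as illustrative assumptions rather than my exact file:

# Runtime-only image: the app was already published by the pipeline step above
FROM microsoft/aspnetcore:2.0
WORKDIR /app
COPY obj/Docker/publish .
ENTRYPOINT ["dotnet", "projectName.dll"]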

The next stage is to download and install the AWS CLI (along with jq, which is used later to parse the task definition response).

          # AWS CLI          
          # Download the AWS CLI bundle
          - curl "https://s3.amazonaws.com/aws-cli/awscli-bundle.zip" -o "awscli-bundle.zip"

          # Unzip and install the AWS CLI (jq is installed too, for parsing JSON later)
          - sudo apt-get install jq
          - unzip awscli-bundle.zip
          - ./awscli-bundle/install -b ~/bin/aws
          - export PATH=~/bin:$PATH

Next, we log into AWS so we can push to the ECR repository where my project lives. The aws ecr get-login command returns a ready-made docker login command, which we capture in a variable and then execute.

          # Login to AWS
          - export LOGIN=$(aws ecr get-login)
          - $LOGIN
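
One thing the script doesn't show is how the AWS CLI gets its credentials. My assumption for a setup like this: define AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY and AWS_DEFAULT_REGION as secured repository variables in your Bitbucket Pipelines settings; the AWS CLI reads them from the environment, so aws ecr get-login just works. The equivalent for testing the same flow locally (values are placeholders):

export AWS_ACCESS_KEY_ID=AKIA................
export AWS_SECRET_ACCESS_KEY=........................................
export AWS_DEFAULT_REGION=us-east-1
$(aws ecr get-login)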

The next steps are the standard build, tag and push for the Docker image.

          # Build my docker image
          - docker build -t projectName .

          # Tag and push my docker image to ECR
          - docker tag projectName:latest xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:latest
          - docker push xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:latest
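
As a side note, Bitbucket Pipelines exposes the commit hash as the BITBUCKET_COMMIT environment variable, so if you wanted traceable image tags instead of always overwriting latest, you could tag and push like this (a variation, not what I'm running):

          # Tag and push using the commit hash for traceability
          - docker tag projectName:latest xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:$BITBUCKET_COMMIT
          - docker push xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:$BITBUCKET_COMMIT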

Once we have our new Docker image, we want to register a new task definition ready to use on our service.

          # Register the ECS task definition and capture the version
          - export IMAGE_NAME=xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:latest
          - export TASK_VERSION=$(aws ecs register-task-definition --family ECRFamilyName --container-definitions "[{\"name\":\"ExampleName\",\"image\":\"$IMAGE_NAME\",\"portMappings\":[{\"containerPort\":80,\"hostPort\":80,\"protocol\":\"tcp\"}],\"memory\":128,\"essential\":true}]" | jq --raw-output '.taskDefinition.revision')
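
The escaped inline JSON is hard to read. Because the AWS CLI accepts file:// references for JSON parameters, an equivalent approach (a sketch using the same placeholder names as above) is to keep a container-definitions.json file in the repository:

[
  {
    "name": "ExampleName",
    "image": "xxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/projectName:latest",
    "portMappings": [
      { "containerPort": 80, "hostPort": 80, "protocol": "tcp" }
    ],
    "memory": 128,
    "essential": true
  }
]

and register it with:

          - export TASK_VERSION=$(aws ecs register-task-definition --family ECRFamilyName --container-definitions file://container-definitions.json | jq --raw-output '.taskDefinition.revision')

This only works cleanly here because the image URI is the static :latest tag; if the tag changed per build, you would need to template the file first.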

The final step is to update our service: spin it down to zero tasks, then spin it back up with the new task definition assigned.

          # Set ECS service to desired count 0  
          - aws ecs update-service --cluster default --service ecrServiceName --desired-count 0

          # Set ECS service to desired count 1 and assign the new task-definition 
          - aws ecs update-service --cluster default --service ecrServiceName --task-definition ECRFamilyName:$TASK_VERSION --desired-count 1
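
If you want the pipeline to confirm the deployment rather than fire and forget, the AWS CLI has a built-in waiter that could be added as a final step (same placeholder service name as above):

          # Block until the service reaches a steady state with the new task definition
          - aws ecs wait services-stable --cluster default --services ecrServiceName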

Note:
In order to spin my service down to zero, update the task definition on the service, and spin it back up, I’ve set the service’s minimum healthy percent to 0% so that I don’t need to run multiple instances (cost savings). If this were a production service, I would run multiple instances and release to a single instance first, draining connections from the other instances before releasing to them.

I would also be using an ALB, but because this isn’t a production application (yet), short outages are fine.
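
For reference, that minimum healthy percent setting can also be applied from the CLI instead of the console; a sketch with the same placeholder names:

          # Allow ECS to take the service to zero healthy tasks during a deployment
          - aws ecs update-service --cluster default --service ecrServiceName --deployment-configuration maximumPercent=100,minimumHealthyPercent=0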

Bitbucket Pipelines Code - Stage 2

I have removed a lot of the unneeded code from the bitbucket-pipelines.yml file by moving the dotnet restore and publish steps into my Dockerfile, which builds the app in an SDK image and then copies the published app into a runtime image, all in the same Dockerfile.

“.NET Core Docker images can now use multi-stage build using a new feature from Docker that allows multiple FROM lines to be used in one Dockerfile. The multi-stage build feature was recently introduced into the Docker client Stable channel. Using this feature, you can build a .NET Core app in an SDK image (AKA 'build image') and then copy the published app into a runtime image all in the same Dockerfile. To see an example of this in practice, check out the .NET Docker Samples Repo.”
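
To make that concrete, here is a minimal multi-stage Dockerfile sketch for a .NET Core 2.0 Web API; the image tags, project name and paths are illustrative assumptions, not my exact file:

# Build stage: restore and publish inside the SDK image
FROM microsoft/dotnet:2.0-sdk AS build
WORKDIR /src
COPY . .
RUN dotnet restore ./projectName.sln
RUN dotnet publish ./projectName.sln -c Release -o /app/publish

# Runtime stage: copy only the published output into a smaller runtime image
FROM microsoft/aspnetcore:2.0
WORKDIR /app
COPY --from=build /app/publish .
ENTRYPOINT ["dotnet", "projectName.dll"]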

Also, by using a Docker image that already contains the AWS CLI, I remove the need to install it as part of my pipeline.
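
If you would rather control that image yourself instead of using a community one, a rough sketch of a pipeline image with the AWS CLI and jq preinstalled (base image and package choices are assumptions):

FROM ubuntu:16.04

# Install the AWS CLI via pip, plus jq for parsing JSON responses
RUN apt-get update && \
    apt-get install -y python-pip jq curl unzip && \
    pip install awscli && \
    rm -rf /var/lib/apt/lists/*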

Below is my final version.

image: aaithal/aws-sdk

options:
  docker: true

pipelines:
  branches:
    master:
      - step:
          script:          
          # Login to AWS
          - export LOGIN=$(aws ecr get-login)
          - $LOGIN

          # Build my docker image
          - docker build -t imageName .

          # Tag and push my docker image to ECR
          - docker tag imageName:latest xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/imageName:latest
          - docker push xxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/imageName:latest

          # Register the ECS task definition and capture the version
          - export IMAGE_NAME=xxxxxxxxxxxxxxxx.dkr.ecr.us-east-1.amazonaws.com/imageName:latest
          - export TASK_VERSION=$(aws ecs register-task-definition --family ECRName --container-definitions "[{\"name\":\"xxxxxxx-xxxxx-xxx-xxxx-xxxxxxxxxxxxx\",\"image\":\"$IMAGE_NAME\",\"portMappings\":[{\"containerPort\":80,\"hostPort\":80,\"protocol\":\"tcp\"}],\"memory\":128,\"essential\":true}]" | jq --raw-output '.taskDefinition.revision')

          # Set ECS service to desired count 0  
          - aws ecs update-service --cluster default --service serviceName --desired-count 0

          # Set ECS service to desired count 1 and assign the new task-definition 
          - aws ecs update-service --cluster default --service serviceName --task-definition taskDefinitionName:$TASK_VERSION --desired-count 1

Summary:

There are many tools you can use to achieve CI/CD, and this is just one of them. For a side project, I found Bitbucket Pipelines easy to use; it lives in the same place as my source code, and it is also very cheap.

Being able to push my code to master and then forget about it makes developing my project a lot easier. It was also fun to learn and investigate a CI/CD tool that I hadn’t used before.

If you have any questions, comments or advice, I would love to hear from you; please feel free to email me or get in touch via Twitter.
