Introduction
This blog post will cover infrastructure deployment on AWS using CloudFormation in combination with Azure DevOps.
You might ask: why? There are brilliant Code family tools available on AWS, and if the toolset already available on the best cloud ;) doesn't suit you, you could pick up something else.
The answer is pretty simple - I was forced to use it :). I jumped into a multi-cloud project where the team was already heavily using Azure DevOps, so there was no chance to introduce another CI/CD toolset - I had to align. In this blog post, I want to share my experience and offer a 'quick start' guide on how to set up the environment to start deploying your AWS infrastructure using CloudFormation templates. Maybe you are in the same position as I was.
Azure DevOps project configuration
If you already have an Azure DevOps project in place, you can skip this part; if not, log in to the tool and create a new project as in the picture below. Enter a project name and a meaningful description. You can also set additional options like the visibility of your project (public/private) and the version control system.
After initial project creation, you should create a code repository for your project:
The next step is to create and configure the pipeline:
You have to decide where your code will be stored. In our case, it will be the "Azure Repos Git" option:
Pick the code repository created in the previous step:
On the next screen, you can decide to create a new pipeline YAML file or select an existing one from your code repository. In this example, I'm choosing the first option:
On the last screen, review your newly created pipeline YAML file and hit the "Save and run" button:
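The generated starter pipeline is minimal; at the time of writing it looks roughly like this (the exact content and default branch may differ):

trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:
- script: echo Hello, world!
  displayName: 'Run a one-line script'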
Important: you might end up with an error message similar to: "No hosted parallelism has been purchased or granted. To request a free parallelism grant, please fill out the following form https://aka.ms/azpipelines-parallelism-request".
This applies only to new accounts. Visit the link from the error message and submit the request form, providing your name, email address, and your Azure DevOps organization name:
The reason behind this is as follows:
Over the past few months, the situation has gotten substantially worse, with a high percentage of new public projects in Azure DevOps being used for crypto mining and other activities we classify as abusive. In addition to taking an increasing amount of energy from the team, this puts our hosted agent pools under stress and degrades the experience of all our users – both open-source and paid.
After successful pipeline creation you should see a similar view:
Installation of AWS Toolkit for Azure DevOps extension
At the current stage, our project configuration doesn't allow us to work with AWS services. To make this work, we need to install the AWS Toolkit for Azure DevOps extension. This extension adds tasks that let us work with AWS services such as:
- Amazon S3
- AWS Elastic Beanstalk
- Amazon Elastic Container Registry
- AWS CodeDeploy
- AWS Lambda
- AWS CloudFormation
- Amazon Simple Queue Service
- Amazon Simple Notification Service
- AWS Systems Manager
- AWS Secrets Manager
- AWS CLI
In this chapter, I would like to focus only on the steps necessary to make this work, but if you would like to know more about the toolkit, please visit the official AWS docs.
To install the Toolkit, visit https://marketplace.visualstudio.com/items?itemName=AmazonWebServices.aws-vsts-tools and hit the "Get it free" button:
On the next screen, choose your Organization and click the "Install" button:
In your pipeline edit view, on the right-hand side, you have the 'Tasks' panel. Filter it by typing 'AWS' and you should now see additional building blocks related to AWS services that you can use in your pipeline configuration.
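As a quick illustration of those building blocks, here is a sketch of the toolkit's AWS CLI task, which simply runs an AWS CLI command on the build agent (the task name and inputs reflect the extension docs at the time of writing; the 'AWS' service connection referenced here is created in the next section):

- task: AWSCLI@1
  inputs:
    awsCredentials: 'AWS'
    regionName: 'eu-west-1'
    awsCommand: 's3'     # top-level AWS CLI command
    awsSubCommand: 'ls'  # subcommand, i.e. 'aws s3 ls'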
Set up Service Connection to AWS on Azure DevOps
For now, we have a DevOps organization, a code repository, and a starter pipeline. We still need a service connection between Azure DevOps and the AWS account where we will deploy our infrastructure. To do this, we can create an IAM user with programmatic access enabled. Please note that the service connection expects long-lived AWS credentials consisting of an access-key and secret-key pair. You can also define Assume Role credentials to scope down the access.
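If you prefer to keep that IAM user under version control too, a minimal CloudFormation sketch could look like the one below. The user name and policy are illustrative assumptions only; for real use, scope the permissions down to exactly what your pipeline deploys, and note that exposing the secret key in stack outputs is acceptable for a throwaway demo at best:

AWSTemplateFormatVersion: '2010-09-09'
Description: Illustrative IAM user for the Azure DevOps service connection
Resources:
  PipelineUser:
    Type: AWS::IAM::User
    Properties:
      UserName: azure-devops-pipeline  # hypothetical name
      Policies:
        - PolicyName: PipelineDeployAccess
          PolicyDocument:
            Version: '2012-10-17'
            Statement:
              # Deliberately broad for the demo; narrow this down in real projects
              - Effect: Allow
                Action:
                  - cloudformation:*
                  - s3:*
                  - ec2:*  # needed because the demo stack creates network resources
                Resource: '*'
  PipelineUserKey:
    Type: AWS::IAM::AccessKey
    Properties:
      UserName: !Ref PipelineUser
Outputs:
  AccessKeyId:
    Value: !Ref PipelineUserKey
  SecretAccessKey:
    Value: !GetAtt PipelineUserKey.SecretAccessKey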
Navigate to "Project settings" in the lower-left corner of the screen, then to "Pipelines -> Service connections", and click "Create service connection". A new panel should pop up on the right-hand side. Choose "AWS" and click "Next":
On the next screen, provide the connection details. At minimum, that's the Access Key ID and Secret Access Key of your IAM user, but as mentioned earlier, you could use Assume Role credentials as well. When done, click "Save":
Pipeline file configuration
Now it's time to edit the azure-pipelines.yml file for our CloudFormation deployments. You could use the web interface and the tasks added by the AWS Toolkit extension as building blocks, or edit it with the code editor of your choice (like VS Code) if you know the syntax.
I will create two tasks:
- Upload the CloudFormation template to an S3 bucket
- Create or update a CloudFormation stack to deploy the infrastructure
The code for the first task:
trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:
- task: S3Upload@1
  inputs:
    awsCredentials: 'AWS'
    regionName: 'eu-west-1'
    bucketName: 'bolewski-cfn'
    sourceFolder: 'source'
    globExpressions: '**'
    targetFolder: 'AWSCommunityBuilders'
    createBucket: true
General fields:
- trigger: specifies which branches cause a build to run; in our case, "master"
- pool: the agent pool to use; I'm using the default "ubuntu-latest" image
- steps: the section where our task definitions live
For the S3 Upload task:
- awsCredentials: the credentials we created for the service connection
- regionName: the region of the S3 bucket where we want to upload our CloudFormation template
- bucketName: our target S3 bucket name
- sourceFolder: the folder in our code repository from which we want to upload files
- globExpressions: glob patterns selecting which files to upload
- targetFolder: the prefix (folder) in the S3 bucket where the files will be uploaded
- createBucket: whether to create the bucket if it doesn't exist
There are of course more options, but I highly encourage you to explore them on your own.
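For example, at the time of writing the task also exposes inputs such as filesAcl (a canned ACL applied to the uploaded objects) and flattenFolders; treat the names below as a sketch and verify them against the extension's task reference for your version:

- task: S3Upload@1
  inputs:
    awsCredentials: 'AWS'
    regionName: 'eu-west-1'
    bucketName: 'bolewski-cfn'
    sourceFolder: 'source'
    globExpressions: '**'
    targetFolder: 'AWSCommunityBuilders'
    createBucket: true
    filesAcl: 'private'    # canned ACL for the uploaded objects
    flattenFolders: false  # keep the source folder structure in the object keys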
The code for the second task:
- task: CloudFormationCreateOrUpdateStack@1
  inputs:
    awsCredentials: 'AWS'
    regionName: 'eu-west-1'
    stackName: 'AzureDevOpsDemo'
    templateSource: 's3'
    s3BucketName: 'bolewski-cfn'
    s3ObjectKey: 'AWSCommunityBuilders/network.yml'
For the CloudFormation task:
- awsCredentials: the credentials we created for the service connection
- regionName: the region where we want to deploy our CloudFormation template
- stackName: the CloudFormation stack name
- templateSource: the source location for our template; in this case, the S3 bucket
- s3BucketName: the S3 bucket name
- s3ObjectKey: the key (location) of our template in the S3 bucket
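One more thing worth knowing: if your template creates IAM resources, CloudFormation requires you to acknowledge that explicitly (the CAPABILITY_IAM / CAPABILITY_NAMED_IAM acknowledgements). The task exposes boolean inputs for this; the names below reflect the extension docs at the time of writing, so verify them against your version:

- task: CloudFormationCreateOrUpdateStack@1
  inputs:
    awsCredentials: 'AWS'
    regionName: 'eu-west-1'
    stackName: 'AzureDevOpsDemo'
    templateSource: 's3'
    s3BucketName: 'bolewski-cfn'
    s3ObjectKey: 'AWSCommunityBuilders/network.yml'
    capabilityIAM: true       # acknowledges CAPABILITY_IAM
    capabilityNamedIAM: true  # acknowledges CAPABILITY_NAMED_IAM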
After some tweaks (moving the repeated values into pipeline variables), the final code looks like this:
trigger:
- master

pool:
  vmImage: ubuntu-latest

variables:
  credentials: 'AWS'
  region: 'eu-west-1'
  bucket: 'bolewski-cfn'

steps:
- task: S3Upload@1
  inputs:
    awsCredentials: $(credentials)
    regionName: $(region)
    bucketName: $(bucket)
    sourceFolder: 'source'
    globExpressions: '**'
    targetFolder: 'AWSCommunityBuilders'
    createBucket: true
- task: CloudFormationCreateOrUpdateStack@1
  inputs:
    awsCredentials: $(credentials)
    regionName: $(region)
    stackName: 'AzureDevOpsDemo'
    templateSource: 's3'
    s3BucketName: $(bucket)
    s3ObjectKey: 'AWSCommunityBuilders/network.yml'
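The network.yml template itself isn't shown in this post, so here is a minimal, illustrative sketch of what source/network.yml could contain; the resource names and CIDR ranges are assumptions for the example:

AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal network stack (illustrative)
Resources:
  Vpc:
    Type: AWS::EC2::VPC
    Properties:
      CidrBlock: 10.0.0.0/16
      EnableDnsSupport: true
      EnableDnsHostnames: true
  PublicSubnet:
    Type: AWS::EC2::Subnet
    Properties:
      VpcId: !Ref Vpc
      CidrBlock: 10.0.0.0/24
      AvailabilityZone: !Select [0, !GetAZs '']
Outputs:
  VpcId:
    Value: !Ref Vpc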
Summary
As you can see, deploying CloudFormation templates is possible even with tools provided by a competitor cloud provider. If you are stuck with tools you don't like, don't panic, there is always a solution ;). I highly encourage you to play with it on your own, because nothing can replace hands-on experience. This was a very simple example built from scratch, but I hope you find it useful.
If you have any feedback or questions, please leave a comment or drop me a message.