On the 3rd of October, I gave a meetup-style presentation at the AWS Community Day in the Netherlands. The topic was how to secure CDK pipelines in an enterprise organization. This blog describes the slides I presented.
Working in an enterprise organization brings extra security requirements to the table: talks with the CISO, adhering to security standards, extra network requirements. None of these come out of the box with CDK pipelines. This talk goes over my experiences implementing security best practices for CDK pipelines in an enterprise organization. Think of creating constructs and adding aspects, but also preventive measures before pushing code.
So, talking about enterprises: first, let's sketch the landscape.
Enterprises often consist of many departments: think of the Cloud Center of Excellence (a platform team), CISO, networking, firewall, Linux and IAM teams. A start-up, in contrast, often consists of a single team.

The enterprise also has processes in place. For example, a company in the financial sector needs to adhere to regulations and the banker's oath. There are also rules to follow: ServiceNow tickets need to be created, there is waiting time before execution, and work needs to fit in the sprint. Start-ups, in contrast, just want to start and build fast.

The account and networking setup is often also more complex in an enterprise. There are cost centers, and a production account only becomes available once a performance test has been executed and signed off by the CISO. Start-ups, on the other hand, have so-called credit card accounts. By this I mean: oh, we need an extra AWS account, let's just create one.

Networking also often differs between start-ups and enterprises. In an enterprise, for example, only private networking may be allowed, with everything routed to on-premises.
And because you often work in sprints, there are stakeholders to deal with.

To start with "me", the developer. There is a product owner who decides what gets prioritized. We need to adhere to the rules of the Cloud Center of Excellence, which is responsible for the landing zone and sets the boundaries for developers. And there is the CISO: the team you want to keep happy at all times.
Within an enterprise, the CIA (confidentiality, integrity and availability) rating is often used to approve services. Every service is checked and, depending on its CIA rating, has to adhere to certain rules; think of a bucket that needs to be encrypted with a KMS customer-managed key (CMK). All of this is written down in a so-called service catalogue.

All services deployed by developers need to adhere to the rules in the service catalogue: if they do, they are compliant; if not, they are non-compliant. The compliance status is checked via Security Hub (standards and controls) and extra custom AWS Config rules.

The CISO team often also sets a baseline percentage of services that needs to be compliant.
Besides the security posture, there are also DevOps rules to follow. Think of: everything should be deployed via code, and code should always be tested, following the test-driven development (TDD) approach. Code also needs to be secure; no one wants to be the person responsible for a data leak.

Another rule is to never push to main, so use branching and pull requests. On a pull request, a colleague needs to check your code, to adhere to the four-eyes principle.
CDK Pipelines is used to deploy the CDK application across multiple accounts (DTAP). Out of the box, CDK Pipelines is not compliant with all the CISO requirements from the service catalogue:
- CodeBuild projects should run in a VPC
- Everything needs to be encrypted with a KMS CMK
- Packages must come from the internal repository
- Four-eyes principle, so an approval process on the repository (pull requests)
- Code must be tested (run pytest)
- Manual approvals are missing between UAT and production
The code is stored somewhere; in our case that is CodeCommit, as we are not allowed to use GitHub. So what can we use to implement the four-eyes principle and make sure that code is tested before release?
The flow diagram above shows what we want to implement. A developer creates a pull request against the main or develop branch. For the main branch two approvals are needed, while the develop branch only needs one. When a pull request is created, an automated process checks the code via a CodeBuild project and runs all your defined tests. If the tests pass, the codechecker updates the pull request with an automated approval. For the develop branch you can then merge your code; for the main branch you still need your colleague's approval.
So let's create a construct for the codechecker process. Luckily, there already is a construct on constructs.dev, the so-called Construct Hub for CDK, which implements this process: cdk-pull-request-check, created by CloudComponents.
This construct creates all the approval templates, the CodeBuild project, Lambda functions and notifications on your behalf. These services need to be compliant too, so create aspects that make the CodeBuild project and Lambda functions run inside your VPC.
This checks the boxes for the "four-eyes principle" and "code is tested".
Let's create a construct for the secure CDK pipeline. As per the service catalogue, the CodeBuild projects need to run inside the VPC. We also need to use the internal package repository, so we need a partial buildspec and credentials in place.
Another service catalogue item is that buckets need to have bucket logging and versioning enabled. Since the pipeline uses buckets to store the artifacts, we need to build an aspect for that. We also want key rotation enabled on the KMS keys.
Summarized: we use CDK to deploy our application, so everything is code. With the approval process (codechecker) of CloudComponents from the Construct Hub in place, we meet the requirements that code is tested and that an extra pair of eyes is needed for merging to the main branch.
But what about secure code and never push to main directly?
Another cool construct from the Construct Hub is cdk-nag. cdk-nag is a CDK variant of cfn_nag, the well-known CloudFormation linting tool. With cdk-nag you embed this linting for AWS security best practices right in your code. The nicest thing of all is that you become aware of security issues before deployment.
When I start a new project with a customer, I always begin by discussing the rules and guidelines of how we are going to work: a so-called team guideline. When you want to make sure everyone commits to the rules, you can also lock down the repositories in the IAM role that is assumed.
This checks the box for "never push to main".
Looking at the security posture of the enterprise, with all the measures, constructs and aspects in place, we are creating code that adheres to the security standards written down in the service catalogue. Following this approach will result in a 95%+ Security Hub compliance score.
As an extra, you can also implement pre-commit: a super handy tool that runs checks before code is committed to the repository. If the checks fail, the code cannot be committed.
The hooks (modules) we use are the following:
- check-merge-conflict: makes sure no code is committed with nasty merge conflict blocks in it
- check-json: checks that JSON files are correctly formatted
- check-yaml: checks that YAML files are correctly formatted
- detect-aws-credentials: checks that no AWS credentials are hardcoded
- end-of-file-fixer: makes sure files end with a single newline
- trailing-whitespace: checks that there is no trailing whitespace in your code
- black: Python code formatter
- pytest: runs pytest before committing code
Here is an example of the `.pre-commit-config.yaml`:

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v4.4.0  # example tag; pin the release you use
    hooks:
      - id: check-merge-conflict
      - id: check-json
      - id: check-yaml
      - id: detect-aws-credentials
      - id: end-of-file-fixer
      - id: trailing-whitespace
  - repo: https://github.com/psf/black
    rev: 23.3.0  # example tag; pin the release you use
    hooks:
      - id: black
  - repo: local
    hooks:
      - id: pytest
        name: Check pytest unit tests pass
        entry: pytest --cov --cov-config=setup.cfg
        language: system
        types: [python]
        pass_filenames: false
```