
Securing Your Terraform Pipelines with Conftest, Regula, and OPA

Anthony Barbieri ・ 5 min read

A common choice for infrastructure-as-code is Terraform from HashiCorp. It has providers for all the major cloud platforms as well as many other popular services.

This empowers developers to describe complex architectures as code, configuring virtual machines, storage, and the other elements their solutions require.

With platforms such as AWS, there are platform-level controls that can be enabled to prevent certain actions from being taken. However, granular rules usually require continuous compliance scanning with AWS Config, Cloud Custodian, or a commercial product.

While continuous compliance is important to an effective cloud security strategy, shifting this feedback further to the left (of the pipeline) helps prevent misconfigurations from being introduced into the environment in the first place.

Conftest, Regula, and OPA

To enable policy-as-code, HashiCorp offers Sentinel, but it can only be used with their enterprise products. A great open source alternative is Open Policy Agent, or OPA for short.

OPA is a flexible policy engine that can be used for microservices, Kubernetes, CI/CD, and various other use cases. It can be thought of as a generic way to authorize or block actions. Policies are written in the Rego language.

Conftest is a utility built on OPA, specifically designed to parse and test various configuration formats such as JSON, TOML, YAML, HCL, and Dockerfiles.

Regula is another utility that leverages OPA, evaluating Terraform against industry best-practice configurations. It can also be used with Conftest, as described here. Regula includes a library that can be used to create additional custom rules.

Policy-as-code in action

Now that we have all these tools in our toolkit, we can review an example. We'll be using a rule from the Regula library that checks AWS EBS volumes to ensure they are encrypted.

# Rules must always be located right below the `rules` package.
package rules.my_simple_rule

# Simple rules must specify the resource type they will police.
resource_type = "aws_ebs_volume"

# Simple rules must specify `allow` or `deny`.  For this example, we use
# an `allow` rule to check that the EBS volume is encrypted.
default allow = false
allow {
  input.encrypted == true
}

As the comments note, this is a "simple rule": you specify a resource type and a condition that all resources of that type must meet. The resource type ties back to the resource definition in Terraform.
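To make the evaluation concrete, here is a minimal Python sketch of the same logic (this is an illustration, not how Regula is implemented): the rule receives one resource's attributes and allows it only when encrypted is explicitly true, so a missing attribute fails just like the Rego default.

```python
# Rough Python analogue of the Rego `allow` rule above (names illustrative).
def allow(resource: dict) -> bool:
    # Mirrors `default allow = false` plus `input.encrypted == true`:
    # only an explicit True passes; a missing attribute fails the check.
    return resource.get("encrypted") is True

print(allow({"encrypted": True}))   # True  - encrypted volume passes
print(allow({"size": 40}))          # False - missing attribute fails
print(allow({"encrypted": False}))  # False - explicitly unencrypted fails
```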

One of the test files provides a mix of resources that should pass and fail.

provider "aws" {
  region = "us-east-2"
}

resource "aws_ebs_volume" "good" {
  availability_zone = "us-west-2a"
  size              = 40
  encrypted         = true
}

resource "aws_ebs_volume" "missing" {
  availability_zone = "us-west-2a"
  size              = 40
}

resource "aws_ebs_volume" "bad" {
  availability_zone = "us-west-2a"
  size              = 40
  encrypted         = false
}

This Terraform file can be converted to a JSON representation of the plan using the terraform plan and terraform show commands, as shown below.

terraform init 
terraform plan -out=tfplan.binary
terraform show -json tfplan.binary > tf-plan.json

Below is a snippet from the resulting JSON.

  "format_version": "0.1",
  "terraform_version": "0.12.18",
  "planned_values": {
    "root_module": {
      "resources": [
        {
          "address": "aws_ebs_volume.bad",
          "mode": "managed",
          "type": "aws_ebs_volume",
          "name": "bad",
          "provider_name": "aws",
          "schema_version": 0,
          "values": {
            "availability_zone": "us-west-2a",
            "encrypted": false,
            "size": 40,
            "tags": null
          }
        },

The full contents can be seen here. This is another test file for the Regula library, where the Terraform plan has been saved as a variable to ensure the rules work as expected. Since this is all still Rego, the opa test command can still be used for this verification.

As you can see, Regula takes some of the heavy lifting off of the rule developer by abstracting away some of the complexity of the full plan.
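To see what that heavy lifting looks like, here is a hedged Python sketch of the kind of traversal Regula performs: walk the plan's planned_values and flag any aws_ebs_volume that is not encrypted. The plan dict is a trimmed stand-in for tf-plan.json (real plans can also nest resources under child_modules, which this sketch ignores).

```python
# Trimmed stand-in for the tf-plan.json shown above (illustrative only).
plan = {
    "planned_values": {
        "root_module": {
            "resources": [
                {"address": "aws_ebs_volume.good", "type": "aws_ebs_volume",
                 "values": {"encrypted": True, "size": 40}},
                {"address": "aws_ebs_volume.missing", "type": "aws_ebs_volume",
                 "values": {"size": 40}},
                {"address": "aws_ebs_volume.bad", "type": "aws_ebs_volume",
                 "values": {"encrypted": False, "size": 40}},
            ]
        }
    }
}

# Flag every EBS volume whose planned `encrypted` value is not exactly True.
failing = [
    r["address"]
    for r in plan["planned_values"]["root_module"]["resources"]
    if r["type"] == "aws_ebs_volume" and r["values"].get("encrypted") is not True
]
print(failing)  # → ['aws_ebs_volume.missing', 'aws_ebs_volume.bad']
```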

An advanced rule example

In addition to "simple rules", Regula also supports a more powerful rule format in which multiple resources can be referenced. A great example of this is ensuring AWS VPCs have flow logs enabled, and Regula provides a rule for that use case. The contents are below.

package rules.vpc_flow_log

import data.fugue

resource_type = "MULTIPLE"
controls = {
  "CIS_2-9",
  "NIST-800-53_AC-4",
  "NIST-800-53_SC-7a",
  "NIST-800-53_SI-4a.2",
  "REGULA_R00003",
}

# VPC flow logging should be enabled when VPCs are created. AWS VPC Flow Logs provide visibility into network traffic that traverses the AWS VPC. Users can use the flow logs to detect anomalous traffic or insight during security workflows.

# every flow log in the template
flow_logs = fugue.resources("aws_flow_log")
# every VPC in the template
vpcs = fugue.resources("aws_vpc")

# VPC is valid if there is an associated flow log
is_valid_vpc(vpc) {
    vpc.id == flow_logs[_].vpc_id
}

policy[p] {
  resource = vpcs[_]
  not is_valid_vpc(resource)
  p = fugue.deny_resource(resource)
} {
  resource = vpcs[_]
  is_valid_vpc(resource)
  p = fugue.allow_resource(resource)
}

This rule determines whether a VPC is valid by ensuring there is a flow log resource that references it. The fugue.resources function collects all resources of each type.

Lines such as resource = vpcs[_] act as for loops, iterating over each resource in the list. The is_valid_vpc function uses the same feature.

The policy section (also known as a rule) appends to the allowed or denied list that feeds the rest of the Regula library. This rule also makes use of the import statement to reference functions from other files (fugue.resources, for example).
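The cross-resource check can be sketched in Python like this (an illustration with made-up resource data, not Regula's internals): collect both resource types, then allow a VPC only when some flow log references its id, mirroring vpc.id == flow_logs[_].vpc_id.

```python
# Illustrative stand-ins for `fugue.resources("aws_vpc")` and
# `fugue.resources("aws_flow_log")` (data is made up for the example).
vpcs = [{"id": "vpc-logged"}, {"id": "vpc-silent"}]
flow_logs = [{"vpc_id": "vpc-logged"}]

def is_valid_vpc(vpc: dict) -> bool:
    # Mirrors `vpc.id == flow_logs[_].vpc_id`: true if ANY flow log matches.
    return any(fl.get("vpc_id") == vpc["id"] for fl in flow_logs)

# Mirrors the two-body `policy` rule: each VPC is either allowed or denied.
policy = {vpc["id"]: ("allow" if is_valid_vpc(vpc) else "deny") for vpc in vpcs}
print(policy)  # → {'vpc-logged': 'allow', 'vpc-silent': 'deny'}
```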

What about Conftest?

Conftest's value here is enabling a CI-friendly experience. This example (from the Conftest repository) shows the output of a failed scan of Kubernetes templates. Conftest also has some great features around sharing policies.

$ conftest test deployment.yaml
FAIL - deployment.yaml - Containers must not run as root
FAIL - deployment.yaml - Deployments are not allowed

2 tests, 0 passed, 0 warnings, 2 failures

In this model, the build breaks on failure before any resources are deployed. This also shows the flexibility of OPA and Conftest, which could be reused for additional files beyond Terraform.
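The break-the-build behavior boils down to an exit code, sketched here in Python (a simplified stand-in for what conftest does, with hypothetical failure messages): print each failure and return non-zero when any exist, so the pipeline stops before terraform apply runs.

```python
def gate(failures: list) -> int:
    """Return the exit code a CI step should use: 0 on pass, 1 on any failure."""
    for message in failures:
        print(f"FAIL - {message}")
    # A non-zero exit code is what actually breaks the build in most CI systems.
    return 1 if failures else 0

# Hypothetical failure messages, echoing the conftest output above.
code = gate([
    "deployment.yaml - Containers must not run as root",
    "deployment.yaml - Deployments are not allowed",
])
print(code)  # → 1
```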

Conclusion

The use of infrastructure-as-code languages will only continue to grow. The tools mentioned in this post give security and operations teams a great way to codify their requirements as policy-as-code. Terraform, Conftest, and OPA provide wide coverage of the toolsets developers use to deliver their solutions.
