<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Bradley Matola</title>
    <description>The latest articles on DEV Community by Bradley Matola (@bradasaurusrex1).</description>
    <link>https://dev.to/bradasaurusrex1</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F842675%2Fdaa48ecc-7175-41a7-a27f-dded742abf39.jpeg</url>
      <title>DEV Community: Bradley Matola</title>
      <link>https://dev.to/bradasaurusrex1</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/bradasaurusrex1"/>
    <language>en</language>
    <item>
      <title>Managing Cloud Build Triggers with Terraform</title>
      <dc:creator>Bradley Matola</dc:creator>
      <pubDate>Tue, 12 Apr 2022 21:14:18 +0000</pubDate>
      <link>https://dev.to/bradasaurusrex1/managing-cloud-build-triggers-with-terraform-43mo</link>
      <guid>https://dev.to/bradasaurusrex1/managing-cloud-build-triggers-with-terraform-43mo</guid>
      <description>&lt;p&gt;On a recent project, I set up several services to run on &lt;a href="https://cloud.google.com/run"&gt;GCP Cloud Run&lt;/a&gt;. To make managing the infrastructure as painless as possible, I set up a Terraform repo that managed it. That way, I could add new services by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Linking the new service repo as a &lt;a href="https://cloud.google.com/source-repositories/docs"&gt;Cloud Source Repository&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Adding the build trigger in the Terraform repo&lt;/li&gt;
&lt;li&gt;Adding the Cloud Run resources in the Terraform repo&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;Setting up the Build Trigger&lt;/h2&gt;

&lt;p&gt;This was made manageable by a few modules I set up. First, a &lt;code&gt;serviceAgent&lt;/code&gt; module to wrap the IAM permissions needed by the infrastructure build:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "google_service_account" "agent" {
    account_id = var.id
    display_name = var.display_name
}

# SA setting this up needs access rights
resource "google_service_account_iam_binding" "impersonator" {
    service_account_id = google_service_account.agent.name
    role = "roles/iam.serviceAccountUser"
    members = concat(
        ["serviceAccount:${var.setup_sa_email}"],
        var.impersonators)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, a more involved build module encapsulates a Cloud Build trigger that pushes a Docker image to Artifact Registry. It can be thought of in three parts. First, some permissions for the service agent running the build:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# do git pull
resource "google_sourcerepo_repository_iam_binding" "builder" {
    project = var.project
    repository = var.build_config.repo_name
    role = "roles/source.reader"
    members = ["serviceAccount:${var.builder_email}"]
}

# read and write docker images to registry
resource "google_artifact_registry_repository_iam_member" "writer" {
  provider = google-beta
  project = var.project
  location = var.location
  repository = var.docker_config.repo_id
  role = "roles/artifactregistry.writer"
  member = "serviceAccount:${var.builder_email}"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Second, a storage bucket to hold the build logs:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "google_storage_bucket" "build_logs" {
    name = "${var.project}-${var.bucket_name}"
    location = "US"
    force_destroy = true
}

resource "google_storage_bucket_iam_binding" "admin" {
    bucket = google_storage_bucket.build_logs.name
    role = "roles/storage.admin"
    members = ["serviceAccount:${var.builder_email}"]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And finally the trigger itself:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;data "google_sourcerepo_repository" "infrastructure" {
    name = var.build_config.repo_name
}

resource "google_cloudbuild_trigger" "infra_trigger" {
    name = var.build_config.trigger_name
    description = var.build_config.trigger_description
    service_account = "projects/${var.project}/serviceAccounts/${var.builder_email}"
    filename = "cloudbuild.yaml"

    trigger_template {
      branch_name = "main"
      repo_name = data.google_sourcerepo_repository.infrastructure.name
    }

    substitutions = merge({
        _LOG_BUCKET_URL = google_storage_bucket.build_logs.url
        _DEV_IMAGE_NAME = "${var.location}-docker.pkg.dev/${var.project}/${var.docker_config.repo_name}/${var.docker_config.image_name}"
    }, var.build_variables)
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, I'm setting the log bucket URL and image name as substitutions that the repo itself can reference in its &lt;code&gt;cloudbuild.yaml&lt;/code&gt; file. This way the Cloud Run service can reference the same image name.&lt;/p&gt;

&lt;p&gt;Putting this all together, we can set up a new build with:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;locals {
    location = "us-central1"
    repo_name = "game-scorer-project"
    api_image = "scoring-api"
}

resource "google_artifact_registry_repository" "primary" {
    provider = google-beta

    project = var.project
    location = local.location
    repository_id = local.repo_name
    format = "DOCKER"
}

module "api_builder" {
    source = "../modules/serviceAgent"

    id = "scoring-api-builder"
    display_name = "Scoring API Builder"
    setup_sa_email = var.builder
}

module "api_build" {
    source = "../modules/build"

    project = var.project
    location = local.location
    builder_email = module.api_builder.email
    bucket_name = "api-build-logs"

    docker_config = {
        repo_id = google_artifact_registry_repository.primary.id
        repo_name = local.repo_name
        image_name = local.api_image
    }

    build_config = {
        repo_name = "bitbucket_brmatola_scoring-api"
        trigger_name = "scoring-api-trigger"
        trigger_description = "Scoring API Build"
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Where &lt;code&gt;var.project&lt;/code&gt; and &lt;code&gt;var.builder&lt;/code&gt; reference the GCP project and service account running the infrastructure build, respectively.&lt;/p&gt;
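
&lt;p&gt;For reference, the module inputs above assume variable declarations along these lines (a sketch; the descriptions are mine, not pulled from the actual repo):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variable "project" {
    type = string
    description = "GCP project ID hosting the builds and services"
}

variable "builder" {
    type = string
    description = "Email of the service account running the infrastructure build"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;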

&lt;p&gt;This configures a Cloud Build trigger on our API repo (bitbucket_brmatola_scoring-api). The repo itself must then have a &lt;code&gt;cloudbuild.yaml&lt;/code&gt; file that publishes a Docker image:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;steps:
  - id: 'Build Docker Image With SHA Hash'
    name: 'gcr.io/cloud-builders/docker'
    entrypoint: 'bash'
    args: [ '-c', 'docker build -f GameScoring/Dockerfile -t ${_DEV_IMAGE_NAME}:$COMMIT_SHA .' ]

  - id: 'Tag Docker Image with dev tag'
    name: 'gcr.io/cloud-builders/docker'
    args: [ 'tag', '${_DEV_IMAGE_NAME}:$COMMIT_SHA', '${_DEV_IMAGE_NAME}:dev' ]

  - id: 'Update dev tag with image to run'
    name: 'gcr.io/cloud-builders/docker'
    args: [ 'push', '${_DEV_IMAGE_NAME}:dev']

images: ['${_DEV_IMAGE_NAME}:$COMMIT_SHA']

logsBucket: '$_LOG_BUCKET_URL'
options:
  logging: GCS_ONLY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h2&gt;Running the Service on Cloud Run&lt;/h2&gt;

&lt;p&gt;Once we have our build continuously deploying Docker images, we want to actually &lt;em&gt;run&lt;/em&gt; the image in Cloud Run. This process is &lt;a href="https://cloud.google.com/build/docs/deploying-builds/deploy-cloud-run"&gt;documented here&lt;/a&gt;, but boils down to running a &lt;code&gt;gcloud&lt;/code&gt; command after publishing the image. To do so, however, we'll need a Cloud Run instance to deploy to.&lt;/p&gt;

&lt;p&gt;We'll set up a service account to run the Cloud Run instance, as well as some IAM permissions to let the build service account deploy the service:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module "api_runner" {
    source = "../modules/serviceAgent"

    id = "scoring-api-runner"
    display_name = "Scoring API Runner"
    setup_sa_email = var.builder
    impersonators = ["serviceAccount:${module.api_builder.email}"]
}

resource "google_cloud_run_service_iam_binding" "runner" {
    location = local.location
    service = google_cloud_run_service.api.name
    role = "roles/run.developer"
    members = ["serviceAccount:${module.api_builder.email}"]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The key here is that both the service build and infrastructure build service accounts need the &lt;code&gt;iam.serviceAccountUser&lt;/code&gt; role on the service account that the Cloud Run service runs as. Additionally, the service build account needs the &lt;code&gt;run.developer&lt;/code&gt; role in order to deploy the service.&lt;/p&gt;
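
&lt;p&gt;Spelled out in terms of the &lt;code&gt;serviceAgent&lt;/code&gt; module from earlier, the binding on the runner account effectively comes out to something like this (a sketch, with the module inputs inlined):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# what module "api_runner" produces, with its inputs inlined
resource "google_service_account_iam_binding" "impersonator" {
    # the scoring-api-runner service account
    service_account_id = google_service_account.agent.name
    role = "roles/iam.serviceAccountUser"
    members = [
        # setup_sa_email: the infrastructure build SA
        "serviceAccount:${var.builder}",
        # impersonators: the service build SA
        "serviceAccount:${module.api_builder.email}",
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;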

&lt;p&gt;Then, if we want our service to be available to users, we'll need to give the &lt;code&gt;run.invoker&lt;/code&gt; role to everyone:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "google_cloud_run_service_iam_binding" "builder" {
    location = local.location
    service = google_cloud_run_service.api.name
    role = "roles/run.invoker"
    members = ["allUsers"]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, we need to define the actual service:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "google_cloud_run_service" "api" {
    name = "scoring-api-service"
    location = local.location

    template {
        spec {
            service_account_name = module.api_runner.email
            containers {
                image = "${local.location}-docker.pkg.dev/${var.project}/${google_artifact_registry_repository.primary.repository_id}/${local.api_image}:dev"
            }
        }
    }

    traffic {
        percent = 100
        latest_revision = true
    }

    lifecycle {
        ignore_changes = [
            metadata.0.annotations
        ]
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Finally, in our service build we modify our &lt;code&gt;cloudbuild.yaml&lt;/code&gt; file to actually deploy the service:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;steps:
  - id: 'Build Docker Image With SHA Hash'
    name: 'gcr.io/cloud-builders/docker'
    entrypoint: 'bash'
    args: [ '-c', 'docker build -f GameScoring/Dockerfile -t ${_DEV_IMAGE_NAME}:$COMMIT_SHA .' ]

  - id: 'Tag Docker Image with dev tag'
    name: 'gcr.io/cloud-builders/docker'
    args: [ 'tag', '${_DEV_IMAGE_NAME}:$COMMIT_SHA', '${_DEV_IMAGE_NAME}:dev' ]

  - id: 'Update dev tag with image to run'
    name: 'gcr.io/cloud-builders/docker'
    args: [ 'push', '${_DEV_IMAGE_NAME}:dev']

  - id: 'Deploy dev tag to Cloud Run'
    name: 'gcr.io/cloud-builders/gcloud'
    args: ['run', 'deploy', '${_CLOUD_RUN_NAME}', '--image', '${_DEV_IMAGE_NAME}:dev', '--region', 'us-central1']

images: ['${_DEV_IMAGE_NAME}:$COMMIT_SHA']

logsBucket: '$_LOG_BUCKET_URL'
options:
  logging: GCS_ONLY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, &lt;code&gt;_CLOUD_RUN_NAME&lt;/code&gt; is another substitution configured in the infrastructure, so the build can simply reference it.&lt;/p&gt;
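
&lt;p&gt;That substitution can be wired through the build module's &lt;code&gt;build_variables&lt;/code&gt; input from earlier, along these lines (a sketch; only the new argument is shown):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module "api_build" {
    # ...same arguments as before, plus:
    build_variables = {
        _CLOUD_RUN_NAME = google_cloud_run_service.api.name
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;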

</description>
    </item>
    <item>
      <title>Automating Infrastructure on GCP using Terraform</title>
      <dc:creator>Bradley Matola</dc:creator>
      <pubDate>Mon, 11 Apr 2022 18:44:06 +0000</pubDate>
      <link>https://dev.to/bradasaurusrex1/automating-infrastructure-on-gcp-using-terraform-48n8</link>
      <guid>https://dev.to/bradasaurusrex1/automating-infrastructure-on-gcp-using-terraform-48n8</guid>
      <description>&lt;p&gt;When I'm working on a project on GCP, I try to automate everything. This includes both the deployment of services as well as the provisioning of builds to do so for &lt;em&gt;new&lt;/em&gt; services.&lt;/p&gt;

&lt;p&gt;If I'm adding a service to a project, I don't want to waste a lot of time remembering how to configure a CI/CD pipeline for it. I want to copy an existing module and get something that works exactly the same as my other services.&lt;/p&gt;

&lt;p&gt;To set this up, I have a Terraform repo that provisions the builds and resources needed to run everything in my GCP project. Pushing code to that repo deploys new builds and services. Practically speaking, on a push to that repo I run &lt;code&gt;terraform apply&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Each service in my project is itself continuously deployed. Once I have this setup up and running, I can update and add services with minimal configuration work in the GCP console itself - I just push code to the appropriate repo to change a service, add a service, or modify the infrastructure a service runs on.&lt;/p&gt;

&lt;p&gt;While this sounds like it's overcomplicating things (and it is), with the help of a project starter repo and some common modules it's easy to get working. Once it's up and running, I can add a new running service to my project by:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Linking the repo for the service to a &lt;a href="https://cloud.google.com/source-repositories/docs"&gt;Cloud Source Repository&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;Configuring a &lt;code&gt;cloudbuild.yaml&lt;/code&gt; file in the repo&lt;/li&gt;
&lt;li&gt;Adding the &lt;a href="https://cloud.google.com/build"&gt;Cloud Build Trigger&lt;/a&gt; for the project in Terraform&lt;/li&gt;
&lt;li&gt;Adding the resources to run the service in Terraform&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This post will walk through how to set up a new GCP project for continuous infrastructure deployment using Terraform.&lt;/p&gt;

&lt;h2&gt;Setting up a New GCP Project&lt;/h2&gt;

&lt;p&gt;To begin, we set up a new project in GCP. I'm assuming we're doing this as a personal project. If we have an organization in our environment, there is more &lt;a href="https://registry.terraform.io/modules/terraform-google-modules/bootstrap/google/latest"&gt;mature tooling&lt;/a&gt; available.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--auFE7Zj1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h6kbvcix974fyzji8a6q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--auFE7Zj1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/h6kbvcix974fyzji8a6q.png" alt="Making a new Project in the GCP Console" width="880" height="593"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, link your soon-to-be infrastructure repo as a Cloud Source Repository in your project. I &lt;a href="https://cloud.google.com/source-repositories/docs/mirroring-repositories"&gt;mirror&lt;/a&gt; a private repo from Bitbucket, but you could work out of GitHub or Cloud Source itself as well.&lt;/p&gt;

&lt;p&gt;For this example, I'm linking to &lt;a href="https://bitbucket.org/brmatola/build-builder-starter/src/main/"&gt;this&lt;/a&gt; repo that I clone as a starter. It has some Terraform modules I'll use to get started and a &lt;code&gt;cloudbuild.yaml&lt;/code&gt; file that outputs "hello" to verify that my build trigger is executing.&lt;br&gt;
Once you have this, take note of the "repository name". We'll need it to link up the build pipeline.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--E1Q29Dif--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/62gjbypsywnxihzzygum.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--E1Q29Dif--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/62gjbypsywnxihzzygum.png" alt="A source repository" width="880" height="710"&gt;&lt;/a&gt;&lt;/p&gt;
&lt;h2&gt;Adding an Automated Build&lt;/h2&gt;

&lt;p&gt;I use &lt;a href="https://bitbucket.org/brmatola/infra/src/main/"&gt;this&lt;/a&gt; infrastructure starter to kickstart new projects. To do so, I open up a Cloud Shell in the project and clone the repo. Then, I &lt;code&gt;cd&lt;/code&gt; into the &lt;code&gt;src&lt;/code&gt; directory and modify the &lt;code&gt;terraform.tfvars&lt;/code&gt; file to reference the project and source repo that will be managing infrastructure for the project (the one we just set up):&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--65igCMow--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/enem725qv101s84pn1zl.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--65igCMow--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/enem725qv101s84pn1zl.png" alt="Modifying the terraform.tfvars file in the console" width="880" height="628"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Then, I run:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;code&gt;terraform init&lt;/code&gt;&lt;/li&gt;
&lt;li&gt;&lt;code&gt;terraform apply&lt;/code&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And I've got a continuous deployment pipeline for infrastructure ready to go. You can (and should!) check to see what that starter is initializing, but the highlights are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;A service account to run the build&lt;/li&gt;
&lt;li&gt;A build trigger that runs on changes to our input repo&lt;/li&gt;
&lt;li&gt;A storage bucket for the logs&lt;/li&gt;
&lt;li&gt;A storage bucket for the Terraform state&lt;/li&gt;
&lt;li&gt;Appropriate permissions to let the service account use the above&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We can see the output of this as a build trigger in Cloud Build:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--oTVIHTpO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zj4zm2xgtl0aytxzd1t0.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--oTVIHTpO--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/zj4zm2xgtl0aytxzd1t0.png" alt="The cloud build trigger" width="880" height="158"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;And we can manually run it or trigger it by pushing code to the underlying repo!&lt;/p&gt;
&lt;h2&gt;Setting up the Repo to Run Terraform&lt;/h2&gt;

&lt;p&gt;You should consult the documentation for how to properly structure your infrastructure repo, but if you're looking for a quick and dirty setup to get moving, you can set your &lt;code&gt;cloudbuild.yaml&lt;/code&gt; to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;steps:

- id: 'Terraform Init'
  name: 'hashicorp/terraform:1.0.0'
  entrypoint: 'sh'
  args: 
  - '-c'
  - |
        cd src
        terraform init

- id: 'Terraform Plan'
  name: 'hashicorp/terraform:1.0.0'
  entrypoint: 'sh'
  args: 
  - '-c'
  - |
        cd src
        terraform plan

- id: 'Terraform Apply'
  name: 'hashicorp/terraform:1.0.0'
  entrypoint: 'sh'
  args: 
  - '-c'
  - |
        cd src
        terraform apply -auto-approve

logsBucket: '$_LOG_BUCKET_URL'
options:
  logging: GCS_ONLY
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Here, the log bucket URL comes from a substitution variable set by the project starter.&lt;/p&gt;

&lt;p&gt;Then, point the backend at the storage bucket that was created for the Terraform state:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
    backend "gcs" {
        bucket = "subtle-app-346916-tf-state"
    }
}

terraform {
    required_version = "~&amp;gt; 1.0.0"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once you've got this, updating your terraform code and pushing the repo should update the infrastructure in your project.&lt;/p&gt;
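
&lt;p&gt;As a quick smoke test, you might add a trivial resource, push, and watch the build apply it (the bucket name here is illustrative, and I'm assuming a &lt;code&gt;project&lt;/code&gt; variable like the starter's):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "google_storage_bucket" "smoke_test" {
    name = "${var.project}-smoke-test"
    location = "US"
    force_destroy = true
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;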

</description>
    </item>
  </channel>
</rss>
