<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Fran</title>
    <description>The latest articles on DEV Community by Fran (@franciscogm).</description>
    <link>https://dev.to/franciscogm</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F814187%2Fe4d241cb-3df8-447d-bc7a-456dc8502cd5.png</url>
      <title>DEV Community: Fran</title>
      <link>https://dev.to/franciscogm</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/franciscogm"/>
    <language>en</language>
    <item>
      <title>AWS CLI SSO made easy</title>
      <dc:creator>Fran</dc:creator>
      <pubDate>Tue, 16 Apr 2024 16:39:45 +0000</pubDate>
      <link>https://dev.to/franciscogm/aws-cli-sso-made-easy-3bh9</link>
      <guid>https://dev.to/franciscogm/aws-cli-sso-made-easy-3bh9</guid>
      <description>&lt;p&gt;As the pretty self-explanatory title reads, this is AWS CLI IAM Identity Center (SSO) authentication made easy. Without walking you through unneeded details that may only confuse you.&lt;/p&gt;

&lt;p&gt;For additional context, the post describes how to authenticate users with AWS IAM Identity Center to obtain credentials for AWS Command Line Interface (CLI) commands via the SSO token provider configuration, as recommended by AWS. With this configuration, your AWS SDK or tool can automatically retrieve refreshed authentication tokens.&lt;/p&gt;




&lt;h2&gt;Getting started&lt;/h2&gt;

&lt;p&gt;First of all, you have to configure your SSO session, 'linking' your CLI to the AWS IAM Identity Center instance login page.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure sso
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above command triggers a wizard that guides you through configuring an &lt;code&gt;sso-session&lt;/code&gt; and a &lt;code&gt;profile&lt;/code&gt;. The important parameters you need to fill in are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;sso-session&lt;/code&gt; = &lt;em&gt;your_session_name&lt;/em&gt;&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;sso_start_url&lt;/code&gt; = &lt;em&gt;&lt;a href="https://my-sso-portal.awsapps.com/start"&gt;https://my-sso-portal.awsapps.com/start&lt;/a&gt;&lt;/em&gt; (taken from the IAM Identity Center login portal UI) &lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;profile&lt;/code&gt; = &lt;strong&gt;&lt;em&gt;default&lt;/em&gt;&lt;/strong&gt; &lt;u&gt;NOTE&lt;/u&gt;: Use the &lt;strong&gt;&lt;em&gt;default&lt;/em&gt;&lt;/strong&gt; profile name so that every time you log in to your AWS SSO session, CLI commands automatically run using your commonly used (or only) &lt;code&gt;profile&lt;/code&gt; role. (In my case, &lt;em&gt;FullAccess&lt;/em&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;code&gt;sso_role_name&lt;/code&gt; = &lt;em&gt;leave as suggested or give it a descriptive name&lt;/em&gt;. (In my case, &lt;em&gt;FullAccess&lt;/em&gt;)&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;The other parameters can be left as default.&lt;/p&gt;
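
&lt;p&gt;For reference, the wizard prompts look roughly like the following (all values are illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;$ aws configure sso
SSO session name (Recommended): my-sso
SSO start URL [None]: https://my-sso-portal.awsapps.com/start
SSO region [None]: us-east-1
SSO registration scopes [sso:account:access]:
...
CLI default client Region [None]: us-east-1
CLI default output format [None]: json
CLI profile name [FullAccess-123456789000]: default
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;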

&lt;blockquote&gt;
&lt;p&gt;Typically, &lt;u&gt;and this is where I couldn't get full clarity from the AWS documentation&lt;/u&gt;, you only have to configure one &lt;code&gt;sso-session&lt;/code&gt;, as there is usually a single IAM Identity Center instance or AWS Organization you need access to. If you had to access multiple AWS Organizations or IAM Identity Center portals, you would need to configure additional SSO sessions.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;After introducing all the required information in the configuration wizard, this is how my &lt;em&gt;~/.aws/config&lt;/em&gt; file looks:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8rlu8kew3fpyigm4x3lx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8rlu8kew3fpyigm4x3lx.png" alt="Image description" width="788" height="236"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You can now configure different &lt;code&gt;profiles&lt;/code&gt; for the different accounts and/or roles you have access to within the IAM Identity Center instance (AWS Organization). I am going to configure two additional &lt;code&gt;profiles&lt;/code&gt;.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure sso
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Running the above command again walks you through the same wizard as before. &lt;u&gt;The main difference now is that you have already configured an &lt;code&gt;sso-session&lt;/code&gt;, so that session name is offered as the default session within which the new &lt;code&gt;profile&lt;/code&gt; is created&lt;/u&gt;. You could also manually type the name of an existing &lt;code&gt;sso-session&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0moq05cd0sro97gregdf.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F0moq05cd0sro97gregdf.png" alt="Image description" width="207" height="21"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This is how my &lt;em&gt;~/.aws/config&lt;/em&gt; file looks now:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdq85jwxvd13fy9bhrqqt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/cdn-cgi/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fdq85jwxvd13fy9bhrqqt.png" alt="Image description" width="786" height="458"&gt;&lt;/a&gt;&lt;/p&gt;
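
&lt;p&gt;In text form, a &lt;em&gt;~/.aws/config&lt;/em&gt; with one SSO session and several profiles looks roughly like this (account IDs, role names and regions are illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;[sso-session my-sso]
sso_start_url = https://my-sso-portal.awsapps.com/start
sso_region = us-east-1
sso_registration_scopes = sso:account:access

[default]
sso_session = my-sso
sso_account_id = 123456789000
sso_role_name = FullAccess
region = us-east-1
output = json

[profile read-only]
sso_session = my-sso
sso_account_id = 123456789000
sso_role_name = ReadOnlyAccess
region = us-east-1

[profile dev-account-admin]
sso_session = my-sso
sso_account_id = 123456789011
sso_role_name = DevAccess
region = us-east-1
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;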

&lt;blockquote&gt;
&lt;p&gt;You could specify a different &lt;code&gt;sso-session&lt;/code&gt; if you wanted to access another AWS Organization or IAM Identity Center portal. A standalone SSO session (without a profile) can be configured with:&lt;/p&gt;


&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws configure sso-session
&lt;/code&gt;&lt;/pre&gt;

&lt;/blockquote&gt;




&lt;p&gt;Alright, let’s recap!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;I now have one &lt;code&gt;sso-session&lt;/code&gt; (to my AWS Organization or IAM Identity Center instance) and three &lt;code&gt;profiles&lt;/code&gt;:&lt;/li&gt;
&lt;/ul&gt;

&lt;ol&gt;
&lt;li&gt;
&lt;strong&gt;default&lt;/strong&gt; - FullAccess to the &lt;u&gt;123456789000&lt;/u&gt; account&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;read-only&lt;/strong&gt; - ReadOnlyAccess role to the &lt;u&gt;123456789000&lt;/u&gt; account&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;dev-account-admin&lt;/strong&gt; - DevAccess role to the &lt;u&gt;123456789011&lt;/u&gt; account&lt;/li&gt;
&lt;/ol&gt;

&lt;ul&gt;
&lt;li&gt;A &lt;code&gt;profile&lt;/code&gt; can be thought of as an AWS &lt;em&gt;account+role&lt;/em&gt; tuple.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;We can now proceed to using the AWS CLI.&lt;/p&gt;




&lt;h2&gt;Log in to the sso-session&lt;/h2&gt;

&lt;p&gt;The first time, and whenever the token expires, you can either log in using your &lt;em&gt;default&lt;/em&gt; &lt;code&gt;profile&lt;/code&gt; and its &lt;code&gt;sso-session&lt;/code&gt; (if you named a &lt;em&gt;default&lt;/em&gt; &lt;code&gt;profile&lt;/code&gt;)&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws sso login
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;or specify the &lt;code&gt;sso-session&lt;/code&gt; in case you had configured multiple SSO sessions.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws sso login --sso-session &amp;lt;session-name&amp;gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
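
&lt;p&gt;You can also log in by &lt;code&gt;profile&lt;/code&gt;; the CLI resolves the associated &lt;code&gt;sso-session&lt;/code&gt; from the profile configuration (profile name illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws sso login --profile dev-account-admin
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;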






&lt;h2&gt;Running CLI commands&lt;/h2&gt;

&lt;p&gt;Once logged in to the &lt;code&gt;sso-session&lt;/code&gt;, you can run CLI commands as usual without re-authenticating until the token expires. When it does (typically after 8 hours), run the &lt;code&gt;aws sso login&lt;/code&gt; command again to refresh it.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Example&lt;/em&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 ls
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Notice that, if you don’t specify a &lt;code&gt;profile&lt;/code&gt;, the command runs under your &lt;em&gt;default&lt;/em&gt; &lt;code&gt;profile&lt;/code&gt;, if you have one configured. The above command lists the S3 buckets &lt;u&gt;in the 123456789000 account&lt;/u&gt;.&lt;/p&gt;

&lt;p&gt;If you want to run the CLI command in the scope of a different account or particular role, you need to specify the appropriate &lt;code&gt;profile&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws s3 ls --profile dev-account-admin
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The above command will list the S3 buckets &lt;u&gt;in the 123456789011 account.&lt;/u&gt;&lt;/p&gt;
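
&lt;p&gt;If you are running several commands against the same non-default &lt;code&gt;profile&lt;/code&gt;, you can set the &lt;code&gt;AWS_PROFILE&lt;/code&gt; environment variable once instead of repeating the &lt;code&gt;--profile&lt;/code&gt; flag:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;export AWS_PROFILE=dev-account-admin
aws s3 ls
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;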




&lt;h2&gt;Log out from the sso-session&lt;/h2&gt;

&lt;p&gt;You can log out from your current &lt;code&gt;sso-session&lt;/code&gt; before the token expires by running&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws sso logout
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;






&lt;h2&gt;Wait, what profile am I using?&lt;/h2&gt;

&lt;p&gt;You can also check which identity (account and role) you are currently using by running&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;aws sts get-caller-identity
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
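
&lt;p&gt;The output shows the account and the assumed SSO role; for the &lt;em&gt;dev-account-admin&lt;/em&gt; &lt;code&gt;profile&lt;/code&gt;, it would look similar to this (identifiers are illustrative):&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "UserId": "AROAEXAMPLEROLEID:user@example.com",
    "Account": "123456789011",
    "Arn": "arn:aws:sts::123456789011:assumed-role/AWSReservedSSO_DevAccess_0123456789abcdef/user@example.com"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;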






&lt;p&gt;I hope this post has helped you to easily get going with the AWS CLI using the IAM Identity Center token provider credentials. In any case, if you want to understand this in more detail (or get lost in the weeds :P), you can refer to the &lt;a href="https://docs.aws.amazon.com/cli/latest/userguide/sso-configure-profile-token.html"&gt;AWS documentation&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>sso</category>
      <category>iamidentitycenter</category>
      <category>cli</category>
    </item>
    <item>
      <title>Terraform modules and GitHub Actions to deploy secure cloud infrastructure</title>
      <dc:creator>Fran</dc:creator>
      <pubDate>Tue, 16 Aug 2022 15:09:45 +0000</pubDate>
      <link>https://dev.to/franciscogm/terraform-modules-and-github-actions-to-deploy-secure-cloud-infrastructure-3epe</link>
      <guid>https://dev.to/franciscogm/terraform-modules-and-github-actions-to-deploy-secure-cloud-infrastructure-3epe</guid>
      <description>&lt;p&gt;Countless organizations struggle with standardizing the provisioning of cloud resources, eventually resulting in a cloud infrastructure chaos. Resources are created either programmatically or manually from the cloud provider console, without a prior automated security configuration review. Hence, multiple vulnerabilities are introduced into cloud environments, ultimately generating an extra effort for Cloud Security teams, who will then have to work on the remediation steps.&lt;/p&gt;

&lt;p&gt;This post aims to standardize the deployment of resources to &lt;strong&gt;AWS&lt;/strong&gt; (although it can be extrapolated to any other cloud provider) in a secure and automated fashion by reusing pre-defined &lt;strong&gt;Terraform modules&lt;/strong&gt; and &lt;a href="https://docs.github.com/en/actions"&gt;GitHub Actions&lt;/a&gt;. Essentially, as popularly said, "&lt;em&gt;shifting left&lt;/em&gt;", moving security sooner in the development process to prevent insecure resources from being deployed in the first place.&lt;/p&gt;

&lt;p&gt;Following this approach, all Cloud Security teams need to worry about is defining the appropriate security best-practices for the different cloud resource types, and writing up the Terraform modules with such security features pre-enforced. The GitHub Actions CI/CD will take care of the rest. If the resource is not compliant with the defined rules, it will not be deployed, period. &lt;/p&gt;

&lt;p&gt;Refer to &lt;a href="https://learn.hashicorp.com/collections/terraform/modules"&gt;HashiCorp's documentation&lt;/a&gt; to learn more about creating reusable Terraform modules.&lt;/p&gt;

&lt;p&gt;Enough theory, let's go into action!&lt;/p&gt;




&lt;h2&gt;Creating our reusable Terraform module&lt;/h2&gt;

&lt;p&gt;We will need a GitHub repository to store our Terraform modules. Let's call it &lt;em&gt;&lt;code&gt;aws-terraform-modules&lt;/code&gt;&lt;/em&gt;. For this post's purpose, let's pretend that we are creating a module for deploying AWS Systems Manager Parameter Store parameters. Within our repository, we will create a directory called &lt;em&gt;&lt;code&gt;ssm-parameters&lt;/code&gt;&lt;/em&gt;. Inside that directory, at a minimum, we will need two files: &lt;code&gt;variables.tf&lt;/code&gt; and &lt;code&gt;main.tf&lt;/code&gt;.&lt;/p&gt;

&lt;p&gt;Here is an example of the &lt;code&gt;variables.tf&lt;/code&gt; file:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;variable "name" {
    type = string
    description = "Display name of the SSM Parameter Store parameter name."

    validation {
        condition = (length(var.name) &amp;gt;= 1 &amp;amp;&amp;amp; length(var.name) &amp;lt;= 2048 &amp;amp;&amp;amp; can(regex("^/(test|uat|prod)/", var.name)))
        error_message = "SSM parameter names must be between 1 (min) and 2048 (max) characters long and follow the naming convention /(test|uat|prod)/."
    }
}

variable "description" {
    type = string
    description = "Description of the SSM Parameter Store parameter as viewed in the AWS console."
}

variable "value" {
    type = string
    description = "Value of the SSM Parameter Store parameter. If parameter type is SecureString, the value should be retrieved from a GitHub secret."
}

variable "key_id" {
    type = string
    description = "Customer managed KMS key id or arn used to encrypt the SSM Parameter Store parameter if type is SecureString."
    default = "alias/aws/ssm"
}

variable "tier" {
    type = string
    description = "Tier of the SSM Parameter Store parameter. Valid types are Standard, Advanced and Intelligent-Tiering."
    default = "Standard"

    validation {
        condition = (contains(["Standard", "Advanced", "Intelligent-Tiering"], var.tier))
        error_message = "Unsupported SSM parameter tier. Valid tiers: [Standard, Advanced, Intelligent-Tiering]."
    }
}

variable "tags" {
    type = map(string)
    description = "Key-value map of resource tags to be associated with the SSM Parameter Store parameter."

    validation {
      condition = (
                    contains(keys(var.tags), "owner") &amp;amp;&amp;amp;
                    contains(keys(var.tags), "env")
      )
      error_message = "Incomplete or invalid set of tags specified for the SSM Parameter Store parameter. Tags for every resource are required to have the following keys:\n\t- owner\n\t- env."
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;a href="https://www.terraform.io/language/expressions/custom-conditions#input-variable-validation"&gt;validation&lt;/a&gt; block can be used to specify custom conditions, based on our security standards, and produce error messages if the condition evaluates to false. Additionally, you can provide a default value for optional variables, as seen in the snippet above.&lt;/p&gt;
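
&lt;p&gt;As a hypothetical example, if a caller passed a &lt;code&gt;name&lt;/code&gt; without the required environment prefix, &lt;code&gt;terraform plan&lt;/code&gt; would fail with an error similar to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;Error: Invalid value for variable

  name = "my-parameter"

SSM parameter names must be between 1 (min) and 2048 (max) characters
long and follow the naming convention /(test|uat|prod)/.
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;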

&lt;p&gt;As you would expect, a &lt;code&gt;main.tf&lt;/code&gt; file example is as follows:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_ssm_parameter" "ssm_parameter" {
    name        = var.name
    value       = var.value
    description = var.description
    type        = "SecureString"
    key_id      = var.key_id
    tier        = var.tier
    tags        = var.tags
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;All we are doing there is calling the Terraform resources as specified in HashiCorp's documentation. In this case, the &lt;a href="https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/ssm_parameter"&gt;aws_ssm_parameter&lt;/a&gt; resource.&lt;/p&gt;

&lt;p&gt;That's all we need to configure our reusable Terraform module. In our example, we are basically enforcing a naming convention and the use of specific tags, as well as only allowing the creation of encrypted parameters (using the "SecureString" type).&lt;/p&gt;
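
&lt;p&gt;Optionally, the module can expose outputs for its consumers. For instance, a hypothetical &lt;code&gt;outputs.tf&lt;/code&gt; returning the created parameter's name and ARN:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;output "name" {
    value       = aws_ssm_parameter.ssm_parameter.name
    description = "Name of the created SSM Parameter Store parameter."
}

output "arn" {
    value       = aws_ssm_parameter.ssm_parameter.arn
    description = "ARN of the created SSM Parameter Store parameter."
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;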

&lt;p&gt;Let's now jump onto how to use it to deploy new AWS SSM parameters from GitHub Actions!&lt;/p&gt;




&lt;h2&gt;Using our Terraform module&lt;/h2&gt;

&lt;p&gt;Calling our reusable Terraform module from another GitHub repository where we maintain the actual AWS resources code is as simple as the following:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module "test-ssm-parameter" {
    source      = "github.com/&amp;lt;org-name&amp;gt;/aws-terraform-modules//ssm-parameters"
    name        = "/test/test-parameter"
    description = "CloudSec test SSM Parameter Store parameter."
    value       = "test-ssm-parameter-value"
    tags        = {
        owner    = "org-owner",
        env      = "test"
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Note that, in the sample snippet, we are only specifying values for the required variables. Optional variables not explicitly assigned (&lt;code&gt;key_id&lt;/code&gt; and &lt;code&gt;tier&lt;/code&gt;) take their default value, and the parameter &lt;code&gt;type&lt;/code&gt; is hardcoded to &lt;em&gt;SecureString&lt;/em&gt; by the module. We could, of course, override those optional variables, as long as the new values meet the pre-defined validation criteria.&lt;/p&gt;
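
&lt;p&gt;For instance, a hypothetical caller could bump the parameter tier while keeping the default KMS key, as long as the value passes validation:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;module "prod-ssm-parameter" {
    source      = "github.com/&amp;lt;org-name&amp;gt;/aws-terraform-modules//ssm-parameters"
    name        = "/prod/prod-parameter"
    description = "CloudSec prod SSM Parameter Store parameter."
    value       = "prod-ssm-parameter-value"
    tier        = "Advanced"
    tags        = {
        owner    = "org-owner",
        env      = "prod"
    }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;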

&lt;p&gt;Easy, right? Let's take a look at how we can automatically test that our SSM parameter configuration is valid!&lt;/p&gt;




&lt;h2&gt;Configuring our GitHub Workflows&lt;/h2&gt;

&lt;p&gt;The last piece of the puzzle is configuring our Terraform CI/CD pipeline. Assuming our organization follows a &lt;a href="https://gitversion.net/docs/reference/modes/mainline"&gt;mainline development&lt;/a&gt; model, the CI/CD workflow would look similar to the following:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;A &lt;code&gt;terraform validate&lt;/code&gt; and &lt;code&gt;terraform plan&lt;/code&gt; are triggered every time a developer opens a new Pull Request (PR) to the &lt;code&gt;main&lt;/code&gt; branch, or when an open PR is updated. This step essentially verifies that our Terraform configuration is correct and compliant with the validation rules defined for our module's variables, and outlines the &lt;code&gt;terraform plan&lt;/code&gt;. If the checks fail, we will see a detailed error of what's wrong so we can go and update our PR to fix it, triggering a new Workflow. And, of course, we won't be able to merge our PR and, thus, deploy our AWS resources, until our configuration is compliant.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;If the plan looks as expected and all the validation checks pass successfully, we shall be able to merge our PR. As soon as the PR is merged, a &lt;code&gt;terraform apply&lt;/code&gt; is kicked off, making the corresponding changes in our AWS cloud environment. This ensures that everything that is on our main branch is in sync with what is deployed to production.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;So, how do we translate those two steps into GitHub Workflows?&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;You will need a GitHub Personal Access Token (PAT) with rights to pull the code from the &lt;code&gt;aws-terraform-modules&lt;/code&gt; repository and AWS access keys with the appropriate permissions to deploy the resources to AWS. We are storing such credentials in GitHub Secrets.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;Knowing that &lt;a href="https://docs.github.com/en/actions/using-workflows/about-workflows"&gt;GitHub Workflows&lt;/a&gt; are defined in the &lt;code&gt;.github/workflows&lt;/code&gt; directory within your repository, here are the templates of the two Workflows that we will be creating:&lt;/p&gt;

&lt;p&gt;&lt;code&gt;terraform-plan.yml&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Terraform Plan
on:
  pull_request:
    branches:
      - $default-branch

env:
  TF_VERSION: 1.1
  TF_VAR_aws-access-key: ${{ secrets.AWS_ACCESS_KEY_ID }}
  TF_VAR_aws-secret-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  TF_VAR_terraform-state-bucket: 's3-terraform-state-files'
  TF_VAR_terraform-state-bucket-namespace: /
  TF_VAR_terraform-state-bucket-key: 'aws-env/terraform.tfstate'
  TF_VAR_terraform-state-bucket-aws-region: 'us-east-1'
  TF_VAR_terraform-dynamodb-state-locking-table-name: 'dynamodb-terraform-state-files'

jobs:
  terraform:
    name: Plan
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: GitHub Auth
        run:  
          git config --global url."https://oauth2:${GITHUB_TOKEN}@github.com/&amp;lt;org-name&amp;gt;".insteadOf "https://github.com/&amp;lt;org-name&amp;gt;" 
        env:  
          GITHUB_TOKEN: ${{ secrets.GH_PAT }} # GitHub secret names cannot start with GITHUB_

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1.2.1
        with:
          terraform_version: ${{ env.TF_VERSION }}

      - name: Terraform Get Update
        run: terraform get -update

      - name: Terraform Init
        run: |
          terraform init \
          -backend-config="dynamodb_table=${{ env.TF_VAR_terraform-dynamodb-state-locking-table-name }}" \
          -backend-config="access_key=${{ secrets.AWS_ACCESS_KEY_ID }}" \
          -backend-config="secret_key=${{ secrets.AWS_SECRET_ACCESS_KEY }}" \
          -backend-config="bucket=${{ env.TF_VAR_terraform-state-bucket }}" \
          -backend-config="key=${{ env.TF_VAR_terraform-state-bucket-key }}" \
          -backend-config="region=${{ env.TF_VAR_terraform-state-bucket-aws-region }}"

      - name: Terraform Validate
        run: terraform validate

      - name: Terraform Plan
        run: terraform plan
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;terraform-apply.yml&lt;/code&gt;:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Terraform Apply
on:
  push:
    branches:
      - $default-branch

env:
  TF_VERSION: 1.1
  TF_VAR_aws-access-key: ${{ secrets.AWS_ACCESS_KEY_ID }}
  TF_VAR_aws-secret-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  TF_VAR_terraform-state-bucket: 's3-terraform-state-files'
  TF_VAR_terraform-state-bucket-namespace: /
  TF_VAR_terraform-state-bucket-key: 'aws-env/terraform.tfstate'
  TF_VAR_terraform-state-bucket-aws-region: 'us-east-1'
  TF_VAR_terraform-dynamodb-state-locking-table-name: 'dynamodb-terraform-state-files'

jobs:
  terraform:
    name: Apply
    runs-on: ubuntu-latest

    steps:
      - name: Checkout
        uses: actions/checkout@v2

      - name: GitHub Auth
        run:  
          git config --global url."https://oauth2:${GITHUB_TOKEN}@github.com/&amp;lt;org-name&amp;gt;".insteadOf "https://github.com/&amp;lt;org-name&amp;gt;" 
        env:  
          GITHUB_TOKEN: ${{ secrets.GH_PAT }} # GitHub secret names cannot start with GITHUB_

      - name: Setup Terraform
        uses: hashicorp/setup-terraform@v1.2.1
        with:
          terraform_version: ${{ env.TF_VERSION }}

      - name: Terraform Get Update
        run: terraform get -update

      - name: Terraform Init
        run: |
          terraform init \
          -backend-config="dynamodb_table=${{ env.TF_VAR_terraform-dynamodb-state-locking-table-name }}" \
          -backend-config="access_key=${{ secrets.AWS_ACCESS_KEY_ID }}" \
          -backend-config="secret_key=${{ secrets.AWS_SECRET_ACCESS_KEY }}" \
          -backend-config="bucket=${{ env.TF_VAR_terraform-state-bucket }}" \
          -backend-config="key=${{ env.TF_VAR_terraform-state-bucket-key }}" \
          -backend-config="region=${{ env.TF_VAR_terraform-state-bucket-aws-region }}"

      - name: Terraform Validate
        run: terraform validate

      - name: Terraform Plan
        run: terraform plan

      - name: Terraform Apply
        run: terraform apply -auto-approve
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Once we have the two GitHub Workflows configured in our repository, all the previously explained magic will happen automatically, exclusively allowing secure and compliant cloud infrastructure to be launched to production.&lt;/p&gt;

&lt;p&gt;Say goodbye to fighting with developers over adherence to cloud security configuration guidelines, and to remediating vulnerabilities after they have been introduced into your cloud environment. In addition, you won't need to grant excessive privileges to your AWS developers any more, as everything will be deployed from the CI/CD pipeline ;)&lt;/p&gt;

</description>
      <category>aws</category>
      <category>githubactions</category>
      <category>terraform</category>
      <category>cicd</category>
    </item>
    <item>
      <title>Automate AWS access key rotation with GitHub Actions</title>
      <dc:creator>Fran</dc:creator>
      <pubDate>Mon, 14 Feb 2022 16:23:00 +0000</pubDate>
      <link>https://dev.to/franciscogm/automate-aws-access-key-rotation-with-github-actions-37k8</link>
      <guid>https://dev.to/franciscogm/automate-aws-access-key-rotation-with-github-actions-37k8</guid>
      <description>&lt;p&gt;From a security perspective, there is no need to explain why rotating your &lt;a href="https://docs.aws.amazon.com/IAM/latest/UserGuide/id_credentials_access-keys.html"&gt;AWS access keys&lt;/a&gt; regularly is "key" ;), regardless of whether you are bound by laws and regulations. In any case, you can read a quick rationale in this AWS &lt;a href="https://aws.amazon.com/blogs/security/how-to-rotate-access-keys-for-iam-users/"&gt;post&lt;/a&gt; .&lt;/p&gt;

&lt;p&gt;On the other hand, if you have worked with AWS IAM for a while, you have probably struggled with access key rotation. This isn't a tough task at all. In fact, the AWS post above shows how easily we can rotate a user's AWS keys from the CLI. No additional complication doing so from the AWS console. As long as we have the required permissions, this is easy peasy. The problem comes when we have to automate this for all IAM users in our organization and share their new access keys in a secure fashion.&lt;/p&gt;

&lt;p&gt;There are other solutions out there to solve this problem. From using AWS native services like CloudWatch and Lambda (where you still have the new-key-secure-share issue with the user), to custom scripts that you execute and rotate the access key on the user's machine. The latter meaning that we have to rely on the user to update their keys, something that my experience has taught me to avoid :)&lt;/p&gt;

&lt;p&gt;In this post, I will explain how to automate key rotation for IAM users with access keys older than 90 days using &lt;a href="https://github.com/features/actions"&gt;GitHub Actions&lt;/a&gt; and a python script. I won't get into the weeds about how I have accomplished this for my organization's specific needs, but you'll get the idea ;)&lt;/p&gt;

&lt;p&gt;First, like for any GitHub Workflow, we need to define our trigger. In our case, we want the workflow to run daily.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Daily AWS access key rotation check

on:
  #Every day at 5:55 AM UTC (23:55 CST) '55 5 * * *'
  schedule:
    - cron:  '55 5 * * *'
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Then, we need to define our elevated account's access keys as environment variables, which will be used to rotate every IAM user's access keys on their behalf. I have those variables stored in GitHub Secrets.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;And finally, we get to the fun part of the workflow, where we write the steps that let the GitHub macOS runner execute our Python script smoothly. We need to check out our code so the runner can access it, inject the secrets into our script, and install Python and boto3.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;jobs:
  AWS-key-rotation:
    name: Quarterly AWS access key rotation
    runs-on: macos-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v2

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.DEVOPS_TF_AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.DEVOPS_TF_AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - uses: actions/setup-python@v2
        with:
          python-version: 3

      - name: Install boto3
        run: |
          pip install boto3

      - name: Secret Injection MAILBOX credentials
        run: |
          sed -i "" 's/MAILBOX_EMAIL/${{ secrets.MAILBOX_EMAIL }}/' scripts/rotate-aws-keys.py
          sed -i "" 's/MAILBOX_PASSWORD/${{ secrets.MAILBOX_PASSWORD }}/' scripts/rotate-aws-keys.py
          cat scripts/rotate-aws-keys.py

      - name: Run AWS key rotation script
        run: |
          chmod +x scripts/rotate-aws-keys.py
          python3 scripts/rotate-aws-keys.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The complete GitHub Workflow will look like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Daily AWS access key rotation check

on:
  #Every day at 5:55 AM UTC (23:55 CST) '55 5 * * *'
  schedule:
    - cron:  '55 5 * * *'

env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}

jobs:
  AWS-key-rotation:
    name: Quarterly AWS access key rotation
    runs-on: macos-latest
    steps:
      - name: Checkout Code
        uses: actions/checkout@v2

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v1
        with:
          aws-access-key-id: ${{ secrets.DEVOPS_TF_AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.DEVOPS_TF_AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - uses: actions/setup-python@v2
        with:
          python-version: 3

      - name: Install boto3
        run: |
          pip install boto3

      - name: Secret Injection MAILBOX credentials
        run: |
          sed -i "" 's/MAILBOX_EMAIL/${{ secrets.MAILBOX_EMAIL }}/' scripts/rotate-aws-keys.py
          sed -i "" 's/MAILBOX_PASSWORD/${{ secrets.MAILBOX_PASSWORD }}/' scripts/rotate-aws-keys.py
          cat scripts/rotate-aws-keys.py

      - name: Run AWS key rotation script
        run: |
          chmod +x scripts/rotate-aws-keys.py
          python3 scripts/rotate-aws-keys.py
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;That was straightforward, right? We now have a workflow that runs every day. Let's tell the Python script what to do!&lt;br&gt;
The following libraries need to be imported:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;import datetime
from datetime import date
import dateutil
from dateutil import parser
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
import boto3
from botocore.exceptions import ClientError
iam_client = boto3.client('iam')
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The following steps detail what the script does:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Iterate through all the IAM users in our AWS account. We need pagination because the ListUsers API returns at most 100 users per call. 

&lt;ul&gt;
&lt;li&gt;Check the key age for each user with an active access key.

&lt;ul&gt;
&lt;li&gt;If the key has been active for 90 or more days, rotate it (delete the old key and create a new one).&lt;/li&gt;
&lt;li&gt;If the key has been active for 83 days, send a one-week email reminder.&lt;/li&gt;
&lt;li&gt;If the key has been active for 89 days, send a one-day email reminder.
&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;try:
    marker = None
    paginator = iam_client.get_paginator('list_users')
    # Need to use a paginator because by default API call only returns 100 records
    for page in paginator.paginate(PaginationConfig={'PageSize': 100, 'StartingToken': marker}):
        print("Next Page : {} ".format(page['IsTruncated']))
        u = page['Users']
        for user in u:
            keys = iam_client.list_access_keys(UserName=user['UserName'])
            for key in keys['AccessKeyMetadata']:
                active_for = date.today() - key['CreateDate'].date()
                # With active keys older than 90 days
                if key['Status']=='Active' and active_for.days &amp;gt;= 90:
                    print (user['UserName'] + " - " + key['AccessKeyId'] + " - " + str(active_for.days) + " days old. Rotating.")
                    delete_key(key['AccessKeyId'], user['UserName'])
                    create_key(user['UserName'])
                # Send a notification email 7 days before rotation
                elif key['Status']=='Active' and active_for.days == 83:
                    send_email("MAILBOX_EMAIL", "MAILBOX_PASSWORD", "recipient_email", subject_1_week, body_1_week)
                    print ("Email sent to " + user['UserName'] + " warning of key rotation in a week.")
                # Send a notification email 1 day before rotation
                elif key['Status']=='Active' and active_for.days == 89:
                    send_email("MAILBOX_EMAIL", "MAILBOX_PASSWORD", "recipient_email", subject_1_day, body_1_day)
                    print ("Email sent to " + user['UserName'] + " warning of key rotation tomorrow.")

except ClientError as e:
    print("An error has occurred attempting to rotate user %s access keys." % user['UserName'])
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
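&lt;p&gt;The reminder subjects and bodies (&lt;code&gt;subject_1_week&lt;/code&gt;, &lt;code&gt;body_1_week&lt;/code&gt;, &lt;code&gt;subject_1_day&lt;/code&gt;, &lt;code&gt;body_1_day&lt;/code&gt;) are referenced above but not shown; a minimal sketch, with wording that is purely illustrative, could be:&lt;/p&gt;

```python
# Illustrative reminder messages; adjust the wording to your organization.
subject_1_week = "Reminder: your AWS access key will be rotated in one week"
body_1_week = "Your AWS access key is 83 days old and will be rotated automatically once it reaches 90 days."
subject_1_day = "Reminder: your AWS access key will be rotated tomorrow"
body_1_day = "Your AWS access key is 89 days old and will be rotated automatically tomorrow."
```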



&lt;p&gt;Since I wanted to be nice, I send users a reminder a week before and a day before their key expires (you don't have to, and can shorten the script by dropping those conditional branches). That means sending a few emails, so I defined a &lt;code&gt;send_email&lt;/code&gt; function for the task. In my case, I used Office365.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def send_email(sender, password, recipient, subject, body):
    mimemsg = MIMEMultipart()
    mimemsg['From']=sender
    mimemsg['To']=recipient
    mimemsg['Subject']=subject
    mimemsg.attach(MIMEText(body, 'html'))
    connection = smtplib.SMTP(host='smtp.office365.com', port=587)
    connection.starttls()
    connection.login(sender,password)
    connection.send_message(mimemsg)
    connection.quit()
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;At this point, we are only missing the &lt;code&gt;delete_key&lt;/code&gt; and &lt;code&gt;create_key&lt;/code&gt; functions.&lt;br&gt;
The &lt;code&gt;delete_key&lt;/code&gt; function could hardly be simpler.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Delete an specified access key for a user
def delete_key(access_key, username):
    try:
        iam_client.delete_access_key(UserName=username, AccessKeyId=access_key)
        print("%s has been deleted." % (access_key))
    except ClientError as e:
        print("The access key with id %s cannot be found" % access_key)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;code&gt;create_key&lt;/code&gt; isn't much harder. We simply need to collect the newly created AWS access key id and secret access key, as well as the IAM user's email address (depending on your organization, you may derive it from the username, read it from a tag, etc.), so we can share the rotated key with the corresponding end user via email.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;# Create a new AWS access key and share it with the user via email
def create_key(username):
    access_key_metadata = iam_client.create_access_key(UserName=username)['AccessKey']
    access_key = access_key_metadata['AccessKeyId']
    secret_key = access_key_metadata['SecretAccessKey']
    recipient_email = "end_user_email"
    subject = "AWS API Key rotation"
    body = """\
        &amp;lt;html&amp;gt;
        &amp;lt;head&amp;gt;&amp;lt;/head&amp;gt;
        &amp;lt;body&amp;gt;
            &amp;lt;p&amp;gt;Include your message body here, including your access_key and secret_key&amp;lt;/p&amp;gt;
        &amp;lt;/body&amp;gt;
        &amp;lt;/html&amp;gt;
        """
    send_email("MAILBOX_EMAIL", "MAILBOX_PASSWORD", recipient_email, subject, body)
    print("New access key (%s) created for %s" % (access_key, username))
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
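&lt;p&gt;If your organization stores each user's email address as an IAM user tag, you could look it up instead of hard-coding the recipient. A hypothetical helper (the &lt;code&gt;Email&lt;/code&gt; tag key is an assumption; the tag list has the shape returned by boto3's &lt;code&gt;list_user_tags&lt;/code&gt;):&lt;/p&gt;

```python
# Find an email address in a list of IAM user tags, e.g. the 'Tags' list
# returned by iam_client.list_user_tags(UserName=username).
# The 'Email' tag key is an assumption; use whatever key your org defines.
def email_from_tags(tags, tag_key="Email"):
    for tag in tags:
        if tag["Key"] == tag_key:
            return tag["Value"]
    return None
```

&lt;p&gt;&lt;code&gt;create_key&lt;/code&gt; could then pass the result to &lt;code&gt;send_email&lt;/code&gt; as the recipient.&lt;/p&gt;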



&lt;p&gt;As promised, I wasn't going to get into the weeds with org specifics. But, as you can imagine, you will need email account credentials and an encryption tool to share the updated keys securely with each user. In my scenario, I used &lt;a href="https://www.virtru.com/"&gt;Virtru&lt;/a&gt; and Office365 email rules. You might also want to include instructions in your email body explaining to end users how to copy and paste the new access key into their &lt;code&gt;~/.aws/credentials&lt;/code&gt; file to avoid running into &lt;code&gt;AccessDenied&lt;/code&gt; errors.&lt;/p&gt;
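&lt;p&gt;Those instructions boil down to overwriting one profile in the INI-formatted credentials file. A small helper the end user could run locally, sketched here with an assumed &lt;code&gt;default&lt;/code&gt; profile and file path:&lt;/p&gt;

```python
import configparser
import os

# Write a new access key pair into an AWS credentials file (INI format).
# The profile name and file path are assumptions; adjust to your setup.
def update_credentials_file(access_key, secret_key, profile="default",
                            path=os.path.expanduser("~/.aws/credentials")):
    config = configparser.ConfigParser()
    config.read(path)  # silently ignores a missing file
    if not config.has_section(profile):
        config.add_section(profile)
    config[profile]["aws_access_key_id"] = access_key
    config[profile]["aws_secret_access_key"] = secret_key
    with open(path, "w") as f:
        config.write(f)
```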

&lt;p&gt;At this point:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;You are complying with security regulations and best practices by periodically rotating AWS access keys&lt;/li&gt;
&lt;li&gt;You aren't chasing your developers to rotate their keys (and praying that these keys don't get compromised in the meantime!)&lt;/li&gt;
&lt;li&gt;Developers won't be running into permission issues attempting to rotate their keys themselves&lt;/li&gt;
&lt;li&gt;It becomes the developers' responsibility to update their local credentials file with the shared keys&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;And that's all, folks! I hope this solution relieves your AWS access key 90-day rotation headache.&lt;/p&gt;

&lt;p&gt;Maintaining a secure cloud environment isn't that complicated if you are determined to keep it secure :)&lt;/p&gt;

</description>
      <category>aws</category>
      <category>githubactions</category>
      <category>python</category>
    </item>
  </channel>
</rss>
