<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Husain Yusuf</title>
    <description>The latest articles on DEV Community by Husain Yusuf (@mastercam123).</description>
    <link>https://dev.to/mastercam123</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F970093%2F93287a95-c250-42e0-a3f6-ae12d36d7c2f.jpeg</url>
      <title>DEV Community: Husain Yusuf</title>
      <link>https://dev.to/mastercam123</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/mastercam123"/>
    <language>en</language>
    <item>
      <title>🚀 Streamlining Terraform PR Automation with Atlantis on ECS Fargate</title>
      <dc:creator>Husain Yusuf</dc:creator>
      <pubDate>Sun, 06 Apr 2025 19:42:15 +0000</pubDate>
      <link>https://dev.to/aws-builders/streamlining-terraform-pr-automation-with-atlantis-on-ecs-fargate-5c21</link>
      <guid>https://dev.to/aws-builders/streamlining-terraform-pr-automation-with-atlantis-on-ecs-fargate-5c21</guid>
      <description>&lt;p&gt;Have you ever experienced these Terraform headaches?&lt;/p&gt;

&lt;p&gt;😱 Team members running Terraform commands from their laptops with &lt;strong&gt;different versions&lt;/strong&gt;&lt;br&gt;
😱 Your colleague is on vacation and you need to take over their work, and you ask yourself, &lt;strong&gt;"what was deployed?"&lt;/strong&gt;&lt;br&gt;
😱 &lt;strong&gt;No visibility&lt;/strong&gt; into what changes are being applied&lt;br&gt;
😱 &lt;strong&gt;Manual processes&lt;/strong&gt; for reviewing and applying Terraform changes&lt;br&gt;
😱 &lt;strong&gt;Forgetting&lt;/strong&gt; to apply changes after they're approved&lt;/p&gt;

&lt;p&gt;Here comes &lt;strong&gt;Atlantis&lt;/strong&gt;. Atlantis solves all these problems by providing a centralized, automated workflow that integrates directly with your Git repositories. It creates a standardized process for Terraform changes that everyone follows.&lt;/p&gt;
&lt;h3&gt;
  
  
  🤔 &lt;strong&gt;&lt;em&gt;What is Atlantis?&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://www.runatlantis.io/" rel="noopener noreferrer"&gt;Atlantis&lt;/a&gt; is an awesome tool that automates your Infrastructure as Code (IaC) processes by integrating directly with your Git workflow. When someone creates a pull request with Terraform changes, Atlantis automatically runs terraform plan and comments the results right on the PR. Once approved, it can run terraform apply too! &lt;/p&gt;

&lt;p&gt;In my opinion, one of the greatest benefits of Atlantis is that it doesn't introduce a new user interface - instead, it integrates seamlessly with your existing version control system, allowing teams to perform code reviews and Terraform operations through the same familiar interface.&lt;/p&gt;
&lt;h3&gt;
  
  
  🚀 &lt;strong&gt;&lt;em&gt;Why Atlantis can benefit you&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;✔️ &lt;strong&gt;Enhanced Collaboration with GitOps Approach&lt;/strong&gt;&lt;br&gt;
Atlantis brings collaboration to the forefront of infrastructure management by integrating with version control systems. Teams can review, comment on, and approve Terraform changes directly within pull requests, creating a centralized platform for feedback and reviews.&lt;/p&gt;

&lt;p&gt;✔️ &lt;strong&gt;Increased Efficiency&lt;/strong&gt;&lt;br&gt;
By automating Terraform commands, Atlantis saves time and reduces the risk of human error. The tool handles the execution of &lt;code&gt;terraform plan&lt;/code&gt; and &lt;code&gt;terraform apply&lt;/code&gt;, allowing teams to focus on higher-level tasks.&lt;/p&gt;

&lt;p&gt;✔️ &lt;strong&gt;Infrastructure State Management&lt;/strong&gt;&lt;br&gt;
Atlantis handles the isolation and management of Terraform state files, using &lt;a href="https://www.runatlantis.io/docs/locking.html" rel="noopener noreferrer"&gt;locking&lt;/a&gt; to prevent conflicts when multiple developers change infrastructure simultaneously. &lt;/p&gt;
&lt;h3&gt;
  
  
  🔍 Atlantis Workflow for Terraform Automation
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs3chl2y8vl9ypio84oiv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs3chl2y8vl9ypio84oiv.png" alt="atlantis-workflow-1" width="800" height="239"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This diagram illustrates the &lt;strong&gt;Terraform Plan&lt;/strong&gt; phase of the Atlantis workflow:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A developer creates a pull request with Terraform code changes&lt;/li&gt;
&lt;li&gt;GitHub sends a webhook notification to Atlantis&lt;/li&gt;
&lt;li&gt;Atlantis automatically runs &lt;code&gt;terraform plan&lt;/code&gt; on the proposed changes&lt;/li&gt;
&lt;li&gt;The plan output is returned to Atlantis&lt;/li&gt;
&lt;li&gt;Atlantis comments the plan results directly in the PR&lt;/li&gt;
&lt;li&gt;An approver reviews the PR and the Terraform plan output&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This automation eliminates the need for developers to run plans locally and manually share the results, creating a standardized review process.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm16py70u78skai42li3w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fm16py70u78skai42li3w.png" alt="atlantis-workflow-2" width="800" height="229"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;This diagram illustrates the &lt;strong&gt;Terraform Apply&lt;/strong&gt; phase of the Atlantis workflow that happens after approval:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;After approval, a developer comments &lt;code&gt;atlantis apply&lt;/code&gt; on the PR&lt;/li&gt;
&lt;li&gt;Atlantis executes &lt;code&gt;terraform apply&lt;/code&gt; to implement the changes&lt;/li&gt;
&lt;li&gt;The apply output is returned to Atlantis&lt;/li&gt;
&lt;li&gt;Atlantis comments the results in the PR&lt;/li&gt;
&lt;li&gt;The PR can now be merged, completing the workflow&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;In this blog, I want to show you how to deploy Atlantis on AWS ECS Fargate using Terraform and set up the webhook for Atlantis!&lt;/p&gt;

&lt;p&gt;You may ask: why AWS ECS Fargate? In my opinion, running Atlantis on AWS ECS Fargate offers several advantages:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Serverless&lt;/strong&gt;: No need to manage EC2 instances &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Scalable&lt;/strong&gt;: Automatically scales based on demand &lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Isolated&lt;/strong&gt;: Runs in its own container for better security&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cost-effective&lt;/strong&gt;: Pay only for what you use&lt;/li&gt;
&lt;/ul&gt;
&lt;h3&gt;
  
  
  📋 Prerequisites
&lt;/h3&gt;

&lt;p&gt;To begin, you will need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;AWS Account with appropriate permissions&lt;/li&gt;
&lt;li&gt;Terraform installed&lt;/li&gt;
&lt;li&gt;AWS VPC with public and private subnets&lt;/li&gt;
&lt;li&gt;GitHub repository with Terraform code&lt;/li&gt;
&lt;li&gt;GitHub App ID and GitHub App Installation ID (to create these resources, you can refer to these links: &lt;a href="https://docs.github.com/en/apps/creating-github-apps/registering-a-github-app/registering-a-github-app" rel="noopener noreferrer"&gt;GitHub App&lt;/a&gt;, &lt;a href="https://docs.github.com/en/apps/using-github-apps/installing-your-own-github-app" rel="noopener noreferrer"&gt;Install GitHub App&lt;/a&gt;)&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://docs.github.com/en/apps/creating-github-apps/authenticating-with-a-github-app/managing-private-keys-for-github-apps" rel="noopener noreferrer"&gt;GitHub App private key&lt;/a&gt; &lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.runatlantis.io/docs/configuring-webhooks.html" rel="noopener noreferrer"&gt;Webhook secret&lt;/a&gt; for GitHub repository &lt;/li&gt;
&lt;li&gt;Route53 domain (optional but recommended for HTTPS)&lt;/li&gt;
&lt;/ul&gt;
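&lt;p&gt;For the webhook secret prerequisite, Atlantis just needs a long random string shared between GitHub and the server. As one hedged way to generate it, using only the Python standard library:&lt;/p&gt;

```python
# Generate a random webhook secret for the GitHub-to-Atlantis webhook.
# 32 random bytes (64 hex characters) is more than enough entropy.
import secrets

webhook_secret = secrets.token_hex(32)
print(webhook_secret)
```

&lt;p&gt;Use the same value in the GitHub App webhook settings and in the &lt;code&gt;github_webhook_secret&lt;/code&gt; Terraform variable.&lt;/p&gt;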
&lt;h3&gt;
  
  
  🚶‍♂️ Implementation
&lt;/h3&gt;

&lt;p&gt;For this, I created a &lt;a href="https://github.com/mastercam123/atlantis_terraform" rel="noopener noreferrer"&gt;public repository&lt;/a&gt; to give you an example of how I deployed Atlantis on AWS ECS Fargate.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;atlantis-deployment/
├── main.tf
├── github_webhook.tf
├── secrets.tf
├── variables.tf
├── outputs.tf
├── provider.tf
├── terraform.example-tfvars
└── tf-modules/
    └── github-repository-webhook/
        ├── main.tf
        ├── variables.tf
        ├── outputs.tf
        ├── version.tf
        └── README.md

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;The &lt;code&gt;main.tf&lt;/code&gt; file contains the core infrastructure for deploying Atlantis on AWS ECS Fargate. It creates a CloudWatch log group for the Atlantis server. I used &lt;a href="https://registry.terraform.io/modules/terraform-aws-modules/atlantis/aws/latest" rel="noopener noreferrer"&gt;this Terraform module&lt;/a&gt; to deploy Atlantis on ECS Fargate.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;data "aws_caller_identity" "current" {}
data "aws_region" "current" {}

################################################################################
# Atlantis Module to run Atlantis on AWS Fargate
################################################################################

### ECS Atlantis CloudWatch Group
resource "aws_cloudwatch_log_group" "atlantis" {
  name              = "/ecs/atlantis"
  retention_in_days = 7
}

### Atlantis Server
module "atlantis" {
  depends_on = [aws_cloudwatch_log_group.atlantis]
  source     = "terraform-aws-modules/atlantis/aws"
  version    = "4.4.0"

  vpc_id          = var.vpc_id
  service_subnets = var.private_subnets_id
  name            = "atlantis"


  ## Atlantis ECS Container Definition
  atlantis = {
    environment = [
      {
        name  = "ATLANTIS_REPO_ALLOWLIST"
        value = join(",", var.atlantis_repo_allowlist)
      },
      {
        name  = "ATLANTIS_GH_APP_ID"
        value = var.github_app_id
      },
      {
        name  = "ATLANTIS_WEB_BASIC_AUTH"
        value = var.web_basic_auth
      },
      {
        name  = "ATLANTIS_WEB_USERNAME"
        value = var.web_username
      },
      {
        name  = "ATLANTIS_WEB_PASSWORD"
        value = var.web_password
      },
      {
        name  = "ATLANTIS_LOG_LEVEL"
        value = "debug"
      },
      {
        name  = "ATLANTIS_WRITE_GIT_CREDS"
        value = true
      },
      {
        name  = "ATLANTIS_ATLANTIS_URL"
        value = join("", ["https://", var.route53_record_name])
      },
      {
        name  = "ATLANTIS_GH_APP_INSTALLATION_ID"
        value = var.github_app_installation_id
      }
    ]
    secrets = [
      {
        name      = "ATLANTIS_GH_WEBHOOK_SECRET"
        valueFrom = aws_secretsmanager_secret_version.github_webhook_secret.arn
      },
      {
        name      = "ATLANTIS_GH_APP_KEY"
        valueFrom = aws_secretsmanager_secret_version.github_app_private_key.arn
      }
    ]
    log_configuration = {
      logDriver = "awslogs"
      options = {
        awslogs-group         = aws_cloudwatch_log_group.atlantis.name
        awslogs-region        = data.aws_region.current.name
        awslogs-stream-prefix = "atlantis"
      }
    }
  }

  ## ECS Service
  service = {
    enable_execute_command  = true
    task_exec_iam_role_name = "atlantis-task-execution-role"
    task_exec_secret_arns   = [aws_secretsmanager_secret_version.github_app_private_key.arn, aws_secretsmanager_secret_version.github_webhook_secret.arn]
    # Provide Atlantis permission necessary to create/destroy resources
    tasks_iam_role_name     = "atlantis-tasks-role"
    tasks_iam_role_policies = var.tasks_iam_role_policies

  }
  ## ALB
  alb_subnets             = var.public_subnets_id
  certificate_domain_name = var.certificate_domain_name
  route53_record_name     = var.route53_record_name
  route53_zone_id         = var.route53_zone_id
  create_alb              = var.create_alb
}

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
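&lt;p&gt;To illustrate the inputs the module call above expects, here is a minimal sketch of a tfvars file. The variable names are taken from the module call; all values are placeholders, not real resources:&lt;/p&gt;

```hcl
# Placeholder values only -- substitute your own IDs and domains.
vpc_id                     = "vpc-0123456789abcdef0"
private_subnets_id         = ["subnet-0aaaa1111bbbb2222", "subnet-0cccc3333dddd4444"]
public_subnets_id          = ["subnet-0eeee5555ffff6666", "subnet-0aaaa7777bbbb8888"]
atlantis_repo_allowlist    = ["github.com/your-org/*"]
github_app_id              = "123456"
github_app_installation_id = "12345678"
route53_record_name        = "atlantis.example.com"
route53_zone_id            = "Z0123456789ABCDEFGHIJ"
certificate_domain_name    = "atlantis.example.com"
create_alb                 = true
```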



&lt;p&gt;The &lt;code&gt;github_webhook.tf&lt;/code&gt; file creates the GitHub webhook integration for Atlantis. It uses a local module from &lt;code&gt;./tf-modules/github-repository-webhook&lt;/code&gt;. This configuration ensures that GitHub can send notifications to Atlantis when pull requests are created or updated.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;################################################################################
# GitHub repository webhook
################################################################################

module "github_repository_webhooks" {
  depends_on = [module.atlantis]
  source     = "./tf-modules/github-repository-webhook"

  repositories = var.github_repositories

  webhook_url    = "${module.atlantis.url}/events"
  webhook_secret = aws_secretsmanager_secret_version.github_webhook_secret.secret_string
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Optionally, you can configure the repository webhook manually in GitHub. Follow this &lt;a href="https://www.runatlantis.io/docs/configuring-webhooks.html" rel="noopener noreferrer"&gt;guide&lt;/a&gt; to set it up.&lt;/p&gt;

&lt;p&gt;The &lt;code&gt;secrets.tf&lt;/code&gt; file creates AWS Secrets Manager and SSM Parameter Store resources to store sensitive information.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;################################################################################
# Atlantis Server configuration and secrets
################################################################################
resource "aws_ssm_parameter" "atlantis_web_password" {
  name  = "/atlantis/web_password"
  type  = "SecureString"
  value = var.web_password
}
resource "aws_secretsmanager_secret" "github_app_private_key" {
  name        = "github_app_private_key"
  description = "GitHub App private key"
}
resource "aws_secretsmanager_secret_version" "github_app_private_key" {
  secret_id     = aws_secretsmanager_secret.github_app_private_key.id
  secret_string = var.github_app_private_key
}
resource "aws_secretsmanager_secret" "github_webhook_secret" {
  name        = "github_webhook_secret"
  description = "GitHub Webhook Secret"
}
resource "aws_secretsmanager_secret_version" "github_webhook_secret" {
  secret_id     = aws_secretsmanager_secret.github_webhook_secret.id
  secret_string = var.github_webhook_secret
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Optionally, you can configure Atlantis server settings on the &lt;a href="https://www.runatlantis.io/docs/server-side-repo-config.html#enabling-server-side-config" rel="noopener noreferrer"&gt;repo side&lt;/a&gt;. Check whether your use case requires this additional setup.&lt;/p&gt;

&lt;p&gt;‼️ &lt;strong&gt;&lt;em&gt;Please adjust your own Terraform variables.&lt;/em&gt;&lt;/strong&gt; The values provided in the repo are just dummies to guide you in using the project.&lt;/p&gt;

&lt;p&gt;Run the following commands to initialize and apply your Terraform configuration:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform init
terraform plan
terraform apply

&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;&lt;strong&gt;🎉 &lt;em&gt;Congratulations on making it this far&lt;/em&gt;&lt;/strong&gt;! Now I need to remind you to think about your Atlantis security. At this &lt;a href="https://www.runatlantis.io/docs/security.html" rel="noopener noreferrer"&gt;link&lt;/a&gt;, Atlantis documents what should be done to &lt;strong&gt;improve the security of your Atlantis deployment&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Once the deployment is complete, you can verify that Atlantis is running by accessing the Atlantis URL or (if you are not using a Route53 domain) the DNS name of the Application Load Balancer. Below is an example of the Atlantis UI.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp73c41er3vmj255tqd5l.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fp73c41er3vmj255tqd5l.png" alt="Atlantis" width="800" height="386"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  🏗️ Test Atlantis using Pull Request in GitHub repository
&lt;/h3&gt;

&lt;p&gt;To test Atlantis in a GitHub pull request, start by creating a new branch in your repository (e.g. &lt;code&gt;dev&lt;/code&gt;) and adding a simple Terraform configuration. Push this branch and open a pull request. Atlantis will automatically detect the PR and run &lt;code&gt;terraform plan&lt;/code&gt;, posting the results as a comment. Review the plan output in the PR comments. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffl9pbw2yrph83zvuskos.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffl9pbw2yrph83zvuskos.png" alt="PR atlantis" width="702" height="1255"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;If you're satisfied with the changes, comment &lt;code&gt;atlantis apply&lt;/code&gt; on the PR. Atlantis will then execute &lt;code&gt;terraform apply&lt;/code&gt; and post the results. Once the changes are applied successfully, you can merge the PR. This process allows you to verify that Atlantis is correctly integrated with your GitHub repository and automates your Terraform workflow through pull requests.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft3axe7y18ipwdf6jv6jx.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ft3axe7y18ipwdf6jv6jx.png" alt="PR Atlantis Apply" width="719" height="747"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;🎯 Conclusion – Less Headache managing Terraform!&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;In my opinion, Atlantis provides a powerful solution for automating Terraform workflows through pull requests. By integrating with version control systems, it enhances collaboration, improves security, and ensures consistency in infrastructure changes 🤖. It remains a valuable tool for teams looking to streamline their infrastructure as code processes. For me, it is a tool that reduces the risk of human error while improving team collaboration.&lt;/p&gt;

&lt;h4&gt;
  
  
  🔔 &lt;strong&gt;Stay tuned for my next blog&lt;/strong&gt;.
&lt;/h4&gt;

&lt;p&gt;My next topic will be about running Atlantis to manage Terraform infrastructure across multiple AWS accounts. See you soon!&lt;/p&gt;

</description>
      <category>terraform</category>
      <category>devops</category>
      <category>gitops</category>
      <category>aws</category>
    </item>
    <item>
      <title>🛡️ Centralized Backup Solution in AWS Organization - Because One Backup is never enough!</title>
      <dc:creator>Husain Yusuf</dc:creator>
      <pubDate>Sun, 02 Mar 2025 11:26:21 +0000</pubDate>
      <link>https://dev.to/mastercam123/centralized-backup-solution-in-aws-organization-because-one-backup-is-never-enough-1op6</link>
      <guid>https://dev.to/mastercam123/centralized-backup-solution-in-aws-organization-because-one-backup-is-never-enough-1op6</guid>
      <description>&lt;p&gt;&lt;strong&gt;Data loss&lt;/strong&gt;, whether due to &lt;strong&gt;&lt;em&gt;accidental deletion&lt;/em&gt;&lt;/strong&gt;, &lt;strong&gt;&lt;em&gt;cyberattacks&lt;/em&gt;&lt;/strong&gt;, or &lt;strong&gt;&lt;em&gt;system failures&lt;/em&gt;&lt;/strong&gt;, can be catastrophic for any organization.&lt;/p&gt;

&lt;p&gt;Imagine waking up one day and realizing that your backups have mysteriously vanished. 😱 Maybe someone accidentally deleted them (oops), or worse, a cyberattack wiped them out. Not cool, right?&lt;/p&gt;

&lt;p&gt;Enter the AWS Central Backup Account – the superhero 🦸‍♂️ of backups! With this setup, all backups from your AWS Organization are automatically copied to a dedicated AWS account, ensuring an extra layer of protection. No more heart attacks over lost data! 💾✨&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;🔑 &lt;em&gt;Why You Need This in Your Life&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;✔️ 🚀 &lt;strong&gt;Ultimate Backup Resilience&lt;/strong&gt; – Even if a backup is deleted in a member account, a copy is safe in the Central Backup Account. Crisis averted! &lt;br&gt;
✔️ 🧐 &lt;strong&gt;Compliance Made Easy&lt;/strong&gt; – Need to meet regulations like GDPR or DORA? Centralized backups make audits a breeze!&lt;br&gt;
✔️ 📝  &lt;strong&gt;Automate Everything&lt;/strong&gt; – AWS Backup Plans take care of everything, so you can relax while your backups work for you. &lt;br&gt;
✔️ 🔒 &lt;strong&gt;Backup Security&lt;/strong&gt; – Protect your backups with Customer Managed KMS Keys and Backup Vault policy! &lt;br&gt;
✔️ 📢 &lt;strong&gt;Automated Alerts &amp;amp; Monitoring&lt;/strong&gt; – Get instant notifications if something goes wrong, so you can fix it before your boss finds out! 😅&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;🤔 &lt;em&gt;The Problem This Solves&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;🚨 &lt;strong&gt;Backups can be lost!&lt;/strong&gt; Accidental deletions, cyberattacks, or Murphy’s Law can strike at any time. With this setup, you always have a spare copy.&lt;br&gt;
🚨 &lt;strong&gt;Manually copying backups is painful!&lt;/strong&gt; We automate everything so you never have to worry about forgetting to copy your backups.&lt;br&gt;
🚨 &lt;strong&gt;Visibility on backup failures is crucial!&lt;/strong&gt; AWS EventBridge + Lambda + SNS work together to notify you immediately when something goes wrong.&lt;br&gt;
🚨 &lt;strong&gt;AWS Managed Keys don’t work for cross-account backups!&lt;/strong&gt; (at least as of this writing, February 2025) That’s why we use Customer Managed KMS Keys to securely share encrypted backups across accounts.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;&lt;em&gt;🔍 Centralized Backup Solution Architecture&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxsijs6jbt21aw8igrsjj.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxsijs6jbt21aw8igrsjj.png" alt="Central backup architecture" width="800" height="641"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The diagram illustrates a &lt;strong&gt;multi-account&lt;/strong&gt; AWS backup strategy, ensuring backups are automatically copied from application accounts to a dedicated central backup account for enhanced security and disaster recovery. &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;🛠 Components in the Architecture&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;🚀 &lt;strong&gt;Application Account (Source Account)&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Hosts the resources to be backed up, e.g. Amazon RDS databases and Amazon EBS volumes.&lt;/li&gt;
&lt;li&gt;Uses AWS Managed Keys or Customer Managed Keys (KMS) to encrypt the snapshots of these resources.&lt;/li&gt;
&lt;li&gt;Implements an AWS Backup Plan to schedule automatic snapshots.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚀 &lt;strong&gt;Backup Vaults&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Temporary Backup Vault: Stores the initial backup before copying it to the Primary Backup Vault.&lt;/li&gt;
&lt;li&gt;Primary Backup Vault: Stores the final backup within the application account, encrypted with a Customer Managed Key (CMK) to enable cross-account copy operations.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚀 &lt;strong&gt;AWS Backup Copy Jobs&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Copy Job 1: Copies the backup from the Temporary Backup Vault to the Primary Backup Vault in the same AWS account.

&lt;/li&gt;
&lt;li&gt;Copy Job 2 (Cross-account copy job) triggered from Lambda: Copies the backup from the Primary Backup Vault to the Central Backup Account.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;🚀 &lt;strong&gt;AWS Lambda &amp;amp; Amazon EventBridge&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;EventBridge triggers Lambda functions after each copy job completes.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Lambda initiates the cross-account copy job once the backup reaches the Primary Backup Vault.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Lambda deletes backups from the Temporary Backup Vault after the cross-account copy completes successfully.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;Lambda sends notifications to alert admins of backup failures.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;
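&lt;p&gt;The EventBridge-to-Lambda glue above can be sketched as follows. This is a minimal sketch, not the actual function: the vault names, ARNs, and event field names are illustrative assumptions, and a real handler would pass the returned parameters to &lt;code&gt;boto3.client("backup").start_copy_job(**params)&lt;/code&gt;.&lt;/p&gt;

```python
# Sketch: derive cross-account copy-job parameters from an AWS Backup
# "Copy Job State Change" event. Field names below are assumptions --
# verify them against the events your account actually emits.

CENTRAL_VAULT_ARN = "arn:aws:backup:eu-west-1:111111111111:backup-vault:central-vault"  # placeholder
COPY_ROLE_ARN = "arn:aws:iam::222222222222:role/backup-copy-role"  # placeholder

def build_copy_job_params(event):
    """Return start_copy_job kwargs for a completed copy job, else None."""
    detail = event["detail"]
    if detail.get("state") != "COMPLETED":
        return None  # only react to successful copies into the Primary Vault
    return {
        "RecoveryPointArn": detail["destinationRecoveryPointArn"],
        "SourceBackupVaultName": "primary-vault",  # placeholder vault name
        "DestinationBackupVaultArn": CENTRAL_VAULT_ARN,
        "IamRoleArn": COPY_ROLE_ARN,
    }

# Example event, truncated to the fields used above
sample_event = {
    "source": "aws.backup",
    "detail-type": "Copy Job State Change",
    "detail": {
        "state": "COMPLETED",
        "destinationRecoveryPointArn": "arn:aws:ec2:eu-west-1::snapshot/snap-0123456789abcdef0",
    },
}

params = build_copy_job_params(sample_event)
print(params)
```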

&lt;p&gt;🚀 &lt;strong&gt;Parameter Store&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Stores backup tag settings used by Lambda functions.&lt;/p&gt;

&lt;p&gt;🚀 &lt;strong&gt;Central Backup Account&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;A dedicated AWS account used for long-term storage of backups.&lt;br&gt;
Contains a Backup Vault, where cross-account copies from the application accounts are stored.&lt;br&gt;
Uses a Customer Managed Key (CMK) to encrypt the backups securely.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;&lt;em&gt;📝 Prerequisites&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;✅ An AWS Organization with multiple accounts.&lt;br&gt;
✅ Enable cross-account monitoring in AWS Backup from the management account. The steps are described &lt;a href="https://docs.aws.amazon.com/aws-backup/latest/devguide/manage-cross-account.html#backup-delegatedadmin:~:text=User%20Guide.-,Enabling%20cross%2Daccount%20management,-Before%20you%20can" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;br&gt;
✅ A dedicated AWS Backup account for centralized backups that has already been registered as a delegated administrator for backup. You can find the setup steps &lt;a href="https://docs.aws.amazon.com/aws-backup/latest/devguide/manage-cross-account.html#backup-delegatedadmin:~:text=Register%20a%20member%20account%20as%20a%20delegated%20administrator%20account" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;br&gt;
✅ Ensure cross-account backup is supported and enabled for your resource types. Check &lt;a href="https://docs.aws.amazon.com/aws-backup/latest/devguide/backup-feature-availability.html#:~:text=Backup%20Vault%20Lock-,Feature%20availability%20by%20resource,-To%20use%20AWS" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;&lt;em&gt;🚀 Step-by-Step Deployment&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;h4&gt;
  
  
  &lt;strong&gt;&lt;em&gt;🤖 Deployment in central backup account&lt;/em&gt;&lt;/strong&gt;
&lt;/h4&gt;

&lt;p&gt;Step 1: Create a backup vault in the Central Backup Account to store backup copies from member accounts &lt;br&gt;
📍 Go to AWS Backup → Create a Backup Vault where member accounts will store their backup copies.&lt;br&gt;
📍 Update the Backup Vault Policy → Allow the role in the member account to send copies into the backup vault.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffp7pg7zx1a7hh1b4krqt.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Ffp7pg7zx1a7hh1b4krqt.png" alt="Backup Vault Policy" width="800" height="492"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;📍 Create a backup policy that will be applied across the AWS Organization → An example can be found &lt;a href="https://docs.aws.amazon.com/organizations/latest/userguide/orgs_manage_policies_backup_syntax.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;&lt;em&gt;👨🏼‍🏫 Deployment in member account&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Step 1: Set Up a Customer Managed KMS Key 🔑&lt;/strong&gt;&lt;br&gt;
📍 Go to AWS KMS → Create a Customer Managed Key (CMK).&lt;br&gt;
📍 Update the Key Policy to allow access from the Central Backup Account. → Please refer to this &lt;a href="https://docs.aws.amazon.com/kms/latest/developerguide/key-policy-modifying-external-accounts.html" rel="noopener noreferrer"&gt;link&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 2: Configure AWS Backup in Each Member Account 🏗️&lt;/strong&gt;&lt;br&gt;
📍 Create a Temporary Backup Vault and Primary Backup Vault.&lt;br&gt;
📍 Set up an AWS Backup Plan Rule to back up tagged resources into the Temporary Backup Vault.&lt;br&gt;
📍 Configure the Backup Plan Rule to copy backups to the Primary Backup Vault.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F36sw7quactiz1rvam9fw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F36sw7quactiz1rvam9fw.png" alt="Image description" width="758" height="560"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 3: Deploy a Lambda Function and EventBridge to Handle Backup Copy Jobs 🤖&lt;/strong&gt;&lt;br&gt;
📍 Create an EventBridge Rule for successful copy job from Temporary Vault to Primary Vault.&lt;br&gt;
📍 Create an AWS Lambda function triggered by EventBridge. → The function runs a copy job from the Primary Backup Vault to the Central Backup Vault in the central backup account.&lt;br&gt;
📍 If the event is a successful copy from Temporary to Primary Vault, Lambda copies it to the Central Backup Account.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Step 4: Set Up EventBridge to Watch for Backup Jobs Failures 👀&lt;/strong&gt;&lt;br&gt;
📍 Create an EventBridge Rule for failed backup, copy, or restore jobs (so you know when something’s broken).&lt;br&gt;
📍 Create an SNS Topic and subscribe your email (or Slack, or any preferred endpoint supported by SNS).&lt;br&gt;
📍 Add SNS as the EventBridge target to send the notifications.&lt;br&gt;
📍 Get real-time alerts before disaster strikes!&lt;/p&gt;
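&lt;p&gt;If you manage the Step 4 alerting with Terraform instead of the console, the rule and target might look like the sketch below. The detail-types and states are based on the AWS Backup event shapes but should be verified against your own events, and the SNS topic (&lt;code&gt;aws_sns_topic.backup_alerts&lt;/code&gt;) is an assumed resource defined elsewhere.&lt;/p&gt;

```hcl
# Sketch only: match failed AWS Backup, copy, and restore jobs
resource "aws_cloudwatch_event_rule" "backup_failures" {
  name        = "backup-job-failures"
  description = "Notify on failed AWS Backup jobs"

  event_pattern = jsonencode({
    source        = ["aws.backup"]
    "detail-type" = ["Backup Job State Change", "Copy Job State Change", "Restore Job State Change"]
    detail = {
      state = ["FAILED", "ABORTED", "EXPIRED"]
    }
  })
}

resource "aws_cloudwatch_event_target" "backup_failures_sns" {
  rule = aws_cloudwatch_event_rule.backup_failures.name
  arn  = aws_sns_topic.backup_alerts.arn # assumed SNS topic
}
```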

&lt;h3&gt;
  
  
  &lt;strong&gt;🎯 Conclusion – Your Backups Just Got Smarter!&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;By implementing this Centralized AWS Backup Solution, you’ve just leveled up your cloud game. No more “oops, my backup is gone” moments, no more compliance headaches, and no more manual backup drudgery.&lt;/p&gt;

&lt;p&gt;🚀 Automation? Check.&lt;br&gt;
🔒 Security? Check.&lt;br&gt;
📢 Notifications? Check.&lt;/p&gt;

&lt;p&gt;So what are you waiting for? Get started today! 🎉 Your future self will thank you! &lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;&lt;em&gt;‼️ Things to consider&lt;/em&gt;&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;🔔 &lt;strong&gt;The time when AWS Backup runs the backup job&lt;/strong&gt;. In AWS Backup, RDS backups aren't allowed within an hour before the RDS maintenance window or the RDS automated backup window. Therefore, be sure that your backup plans for RDS databases are scheduled more than an hour apart from the RDS maintenance window and the RDS automated backup window. &lt;/p&gt;

</description>
      <category>aws</category>
      <category>cloudcomputing</category>
      <category>cloud</category>
      <category>devops</category>
    </item>
    <item>
      <title>🥳 How I prepared for AWS DevOps Engineer Professional (DOP-C02) Exam</title>
      <dc:creator>Husain Yusuf</dc:creator>
      <pubDate>Sun, 01 Dec 2024 10:26:19 +0000</pubDate>
      <link>https://dev.to/mastercam123/how-i-prepared-for-aws-devops-engineer-professional-dop-c02-exam-362m</link>
      <guid>https://dev.to/mastercam123/how-i-prepared-for-aws-devops-engineer-professional-dop-c02-exam-362m</guid>
      <description>&lt;p&gt;Passing the AWS DevOps Engineer Professional exam was a significant milestone in my career and marked my &lt;strong&gt;7th AWS certification&lt;/strong&gt;. It required dedication, strategic planning, and a deep understanding of AWS services and DevOps principles. &lt;br&gt;
This blog post isn’t just a success story—it’s a roadmap for anyone preparing for the exam. Whether you're just getting started or are already knee-deep in AWS, I hope my experiences and tips will inspire you to take on this certification with confidence.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Why Take the AWS DevOps Engineer Professional Exam?&lt;/strong&gt; 😯
&lt;/h3&gt;

&lt;p&gt;The AWS DevOps Engineer - Professional certification is more than just a badge. It’s a validation of your ability to design, implement, and manage robust CI/CD pipelines, optimize cloud infrastructure, and ensure security and scalability in production environments.&lt;/p&gt;

&lt;p&gt;Here’s why you should consider it:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Demonstrates Advanced Skills and Knowledge&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This certification validates your ability to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Build robust, scalable, and secure CI/CD pipelines.&lt;/li&gt;
&lt;li&gt;Automate complex cloud architectures.&lt;/li&gt;
&lt;li&gt;Design fault-tolerant and resilient applications.&lt;/li&gt;
&lt;li&gt;Implement best practices for monitoring, governance, and compliance.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It showcases your expertise in bridging the gap between development and operations, a critical skill in modern cloud environments.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Career Growth&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;It demonstrates expertise in DevOps best practices, which are in high demand. In my opinion, roles like Cloud DevOps Engineer, Site Reliability Engineer (SRE), and Cloud Architect often require or prefer professionals with this certification. &lt;br&gt;
Additionally, achieving it demonstrates to employers and peers that you’ve mastered advanced DevOps practices and cloud strategies. It sets you apart as someone capable of handling complex cloud infrastructure challenges.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Focus on Real-World Use Cases&lt;/strong&gt;&lt;br&gt;
Unlike some certifications that are purely theoretical, the DevOps Engineer Professional exam is heavily use-case-driven. By preparing for this exam, you learn how to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Solve practical challenges in cloud infrastructure.&lt;/li&gt;
&lt;li&gt;Optimize deployments for performance, cost, and security.&lt;/li&gt;
&lt;li&gt;Apply DevOps methodologies like automation, scaling, and resilience.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;4. Encourages a Deeper Understanding of AWS Services&lt;/strong&gt;&lt;br&gt;
Preparing for this certification forces you to go beyond the basics and develop an advanced understanding of AWS services, including:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;How to integrate multiple AWS services into complex architectures.&lt;/li&gt;
&lt;li&gt;Best practices for CI/CD, automation, and monitoring.&lt;/li&gt;
&lt;li&gt;The trade-offs and limitations of various AWS tools in different scenarios.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;This knowledge enhances your problem-solving skills and helps you make better decisions in real-world projects.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. A Confidence Booster&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Tackling one of the most challenging AWS certifications proves your mettle as a cloud and DevOps professional.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;My Study Resources 📚&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Preparing for this exam was no easy feat—it required a mix of structured study, hands-on projects, and real-world experience. Here’s how I approached it:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://www.udemy.com/course/aws-certified-devops-engineer-professional-hands-on/" rel="noopener noreferrer"&gt;1. Stephane Maarek’s Course Udemy&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
A fantastic starting point! Stephane’s content is well-structured and directly aligned with the exam’s objectives. It helps you build a strong foundation.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://learn.cantrill.io/p/aws-certified-devops-engineer-professional" rel="noopener noreferrer"&gt;2. Adrian Cantrill’s Course&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
Adrian’s material dives deep into the inner workings of AWS services, giving you a more nuanced understanding. This course is invaluable for grasping the "why" behind AWS architectures.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://portal.tutorialsdojo.com/product/aws-certified-devops-engineer-professional-practice-exams/" rel="noopener noreferrer"&gt;3. Practice Exams by Jon Bonso&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
These mock exams are essential. The structure, difficulty, and content are very similar to the real exam, making them great tools for gauging readiness.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://explore.skillbuilder.aws/learn/course/internal/view/elearning/14810/aws-certified-devops-engineer-professional-official-practice-exam-dop-c02-english" rel="noopener noreferrer"&gt;4. AWS Skill Builder&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
I utilized AWS’s official practice exams on Skill Builder to validate my knowledge and identify weak areas. These tests are great for understanding AWS's preferred solutions and best practices.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://docs.aws.amazon.com/whitepapers/latest/introduction-devops-aws/introduction-to-devops.html?did=wp_card&amp;amp;trk=wp_card" rel="noopener noreferrer"&gt;5. AWS Whitepapers and FAQs&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
Reading AWS whitepapers, particularly those on CI/CD, monitoring, automation, and security, provided insights into best practices and design principles. The FAQs for services like CodePipeline, CloudFormation, and CloudWatch were also extremely helpful.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Do some hands-on projects&lt;/strong&gt;&lt;br&gt;
My work as an AWS Cloud Consultant gave me invaluable hands-on experience with AWS services. Real projects taught me how services are used, their limitations, and the kind of decisions required in practical scenarios.&lt;br&gt;
If you don't have this chance yet, spin up your own projects in your own AWS account.&lt;/p&gt;

&lt;p&gt;Key takeaway: Don’t just rely on courses. Build pipelines, set up monitoring, deploy scalable architectures, and troubleshoot real-world problems. The exam is heavily focused on use cases, and hands-on experience helps bridge the gap between theory and application.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Key Topics Covered in the Exam ⏳&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The exam dives deep into DevOps practices and AWS services. Based on my experience, here are some key areas you’ll need to master:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. CI/CD Pipelines&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Services: AWS CodePipeline, CodeBuild, CodeDeploy, CodeCommit, and GitOps strategies.&lt;br&gt;
Use case: Building scalable and automated CI/CD pipelines for production workloads.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Monitoring and Logging&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Services: CloudWatch, EventBridge, X-Ray, and CloudTrail.&lt;br&gt;
Use case: Setting up centralized logging and monitoring for large-scale applications.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. High Availability and Disaster Recovery&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Services: Elastic Load Balancing, Auto Scaling, Route 53, Blue/Green Deployment, AWS Backup, and multi-region architectures.&lt;br&gt;
Use case: Designing fault-tolerant architectures.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Infrastructure as Code (IaC)&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Services: CloudFormation and AWS CDK.&lt;br&gt;
Use case: Automating infrastructure deployment and management.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Security and Governance&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Services: IAM, SCPs, AWS Config, Trusted Advisor, GuardDuty and Secrets Manager.&lt;br&gt;
Use case: Implementing least privilege access, threat detection and enforcing compliance.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Networking&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Services: VPC, Security Group and Transit Gateway.&lt;br&gt;
Use case: Managing secure, scalable, and cost-efficient networks.&lt;/p&gt;

&lt;h3&gt;
  
  
  Lessons Learned 💪
&lt;/h3&gt;

&lt;p&gt;Hands-On Is a Game-Changer&lt;br&gt;
Understanding AWS concepts is one thing; applying them in real scenarios is another. Building CI/CD pipelines, setting up monitoring systems, and troubleshooting issues in a real-world environment taught me far more than any course.&lt;/p&gt;

&lt;h3&gt;
  
  
  The Exam Is All About Use Cases
&lt;/h3&gt;

&lt;p&gt;The questions often involve multiple AWS services working together to solve a specific problem. It’s crucial to understand how services interact and which service fits a given scenario best.&lt;br&gt;
It's important to mention that &lt;strong&gt;all the answer options could be implemented&lt;/strong&gt;. You need to read carefully which &lt;strong&gt;requirement&lt;/strong&gt; is mentioned in the question (e.g., cost-efficiency, reduced management overhead, or high availability).&lt;/p&gt;

&lt;h3&gt;
  
  
  Final Thoughts 🏅
&lt;/h3&gt;

&lt;p&gt;This certification is challenging, but with the right preparation, anyone can achieve it. More importantly, certifications are not just about passing an exam—they’re about building a deeper understanding of cloud practices and enhancing your ability to solve real-world problems.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Remember&lt;/strong&gt; ‼️‼️&lt;br&gt;
Certification is great, but hands-on experience is what truly sets you apart. Dive into projects, experiment with services, and push yourself to apply what you learn.&lt;/p&gt;

&lt;p&gt;To everyone considering the AWS DevOps Engineer - Professional exam: go for it! The journey will not only add value to your career but also significantly elevate your cloud expertise.&lt;/p&gt;

&lt;p&gt;Good luck, and happy learning! 🌟&lt;/p&gt;

</description>
      <category>aws</category>
      <category>devops</category>
      <category>certification</category>
      <category>cloudcomputing</category>
    </item>
    <item>
      <title>🤩How I prepared for AWS Solutions Architect Professional (SAP-C02) Exam</title>
      <dc:creator>Husain Yusuf</dc:creator>
      <pubDate>Fri, 18 Oct 2024 16:13:06 +0000</pubDate>
      <link>https://dev.to/mastercam123/how-i-prepared-for-aws-solutions-architect-professional-sap-c02-exam-3b5e</link>
      <guid>https://dev.to/mastercam123/how-i-prepared-for-aws-solutions-architect-professional-sap-c02-exam-3b5e</guid>
      <description>&lt;p&gt;Passing the AWS Solutions Architect Professional exam was one of the most challenging yet rewarding experiences in my cloud journey ✈️ ☁️ . This truly pushed me to deepen my understanding of AWS services and architectures. Throughout this journey, I leveraged a variety of resources and hands-on experiences that helped me succeed. I’ll share here the study materials, strategies, and tips that helped me pass the exam, along with my thoughts on why real-world experience is crucial for success 😉.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Why Take the AWS Solutions Architect Professional Exam❓&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The AWS Solutions Architect Professional exam is a highly valued certification in the cloud industry, and here are several reasons why pursuing this certification can be beneficial:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Validates Advanced AWS Knowledge 🧑‍🔧&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;This certification is an excellent way to showcase your deep expertise in designing scalable, fault-tolerant, and cost-efficient systems on AWS. It goes beyond the fundamentals, testing your ability to solve complex architectural challenges using multiple AWS services.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Distinguishes You as a Cloud Expert 👩‍🔬&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;For those seeking a career in cloud architecture, this certification demonstrates to employers that you possess both broad and in-depth knowledge of the AWS ecosystem, making you a valuable asset for architecting solutions for enterprises. You'll feel more confident working on enterprise-level architectures, designing solutions that incorporate multiple services, high availability, disaster recovery, and security best practices.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Prepares You for Real-World Scenarios 🏃‍♀️&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;The exam focuses on solving use-case-based scenarios that mirror the challenges you might face in the real world. You’re not just tested on individual services but rather on how to combine services effectively to meet specific business needs.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Opens Up Career Opportunities 🧑‍🚀&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;As cloud adoption grows, so does the demand for cloud architects with advanced knowledge. Earning this certification can help unlock career growth opportunities and tt is especially relevant for senior-level positions like Cloud Architect, DevOps Engineer, and Cloud Solutions Consultant.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Proves Mastery of AWS Best Practices 🏋️&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;AWS puts a strong emphasis on best practices, scalability, cost optimization, and security in its exam questions. This certification validates that you understand not just how to use services, but how to design architectures that are in line with AWS best practices.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Endless Learning Opportunity 🚴&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Pursuing this certification also exposes you to new services and features that AWS continuously rolls out. In my preparation, I encountered services and use cases I hadn’t worked with before, which expanded my knowledge base and prepared me to design better solutions for future projects.&lt;/p&gt;

&lt;p&gt;Now that you are aware of the benefits of taking this exam, I will give you a deeper insight into my study resources.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;My Study Resources 📚&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;To prepare for the exam, I used a combination of online courses, practice exams, AWS whitepapers, and hands-on experience. I invested around 10 hours per week over roughly 2 months of preparation.&lt;br&gt;
Here’s a breakdown of the resources I used:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. &lt;a href="https://www.udemy.com/course/aws-solutions-architect-professional/" rel="noopener noreferrer"&gt;Stephane Maarek’s Course (Udemy)&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
Stephane’s course was my main guide for preparing specifically for the exam. His content is structured to mirror the domains of the AWS Solutions Architect Professional certification. His course and practice exams give you excellent coverage of the required topics.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. &lt;a href="https://learn.cantrill.io/p/aws-certified-solutions-architect-professional" rel="noopener noreferrer"&gt;Adrian Cantrill’s Course&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
While Stephane’s course is great for exam-focused preparation, Adrian Cantrill’s course provided deeper insights into AWS services. Adrian’s content goes beyond exam requirements, explaining the architecture and intricacies of AWS services, giving you an edge in understanding how AWS services work.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. &lt;a href="https://portal.tutorialsdojo.com/courses/aws-certified-solutions-architect-professional-practice-exams/" rel="noopener noreferrer"&gt;Jon Bonso’s Practice Exams&lt;/a&gt; (Tutorials Dojo)&lt;/strong&gt;&lt;br&gt;
Jon Bonso’s practice exams are invaluable. They are extremely similar to the actual AWS exam in terms of question style, difficulty, and structure. I used both Jon’s and Stephane’s practice exams, which helped build my confidence for exam day.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. &lt;a href="https://skillbuilder.aws/exam-prep/solutions-architect-professional" rel="noopener noreferrer"&gt;AWS Skill Builder&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
AWS offers its own set of practice exams through AWS Skill Builder, which gave me the chance to validate whether I was ready. The questions are similar to what you can expect in the actual exam and give a good feel for how well-prepared you are.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;&lt;a href="https://aws.amazon.com/whitepapers/?whitepapers-main.sort-by=item.additionalFields.sortDate&amp;amp;whitepapers-main.sort-order=desc&amp;amp;awsf.whitepapers-content-type=*all&amp;amp;awsf.whitepapers-global-methodology=*all&amp;amp;awsf.whitepapers-tech-category=*all&amp;amp;awsf.whitepapers-industries=*all&amp;amp;awsf.whitepapers-business-category=*all" rel="noopener noreferrer"&gt;5. AWS Whitepapers&lt;/a&gt;&lt;/strong&gt;&lt;br&gt;
AWS whitepapers were a key resource for best practices. They helped me deepen my understanding of AWS architectural frameworks and gave insight into cost optimization, security, and scalability strategies. The whitepapers that I found particularly useful included the &lt;strong&gt;Well-Architected Framework, Security Best Practices,&lt;/strong&gt; and &lt;strong&gt;High Availability Architectures.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. Do some hands-on projects&lt;/strong&gt;&lt;br&gt;
The more you work on real projects, the easier it will be to understand the scenarios and architectural challenges presented in the exam. Spin up your own projects in your own AWS account or participate in client projects if possible.&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;AWS Services Covered in the Exam ⏳&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;The AWS Solutions Architect Professional exam covers a vast array of services, with questions that typically combine at least three services in one use case. The exam tests not only your knowledge of services but also your ability to apply them in various real-world scenarios. &lt;/p&gt;

&lt;p&gt;Many of the exam questions present real-world scenarios where you need to identify the best architecture or AWS service to meet the specific requirements (e.g., &lt;em&gt;scalability, cost-efficiency, security, or disaster recovery&lt;/em&gt;). For instance, a question might ask how to implement a highly available and fault-tolerant architecture across multiple AWS regions using services like &lt;strong&gt;S3, Route 53,&lt;/strong&gt; and &lt;strong&gt;CloudFront&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;Here’s an overview of some of the key services that appeared in the exam:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Compute:&lt;/strong&gt;&lt;br&gt;
EC2, Lambda, Elastic Beanstalk&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Containers &amp;amp; Orchestration:&lt;/strong&gt;&lt;br&gt;
ECS, EKS, AWS Fargate, ECR&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Storage:&lt;/strong&gt;&lt;br&gt;
S3, EFS, EBS, Amazon FSx, Amazon Glacier&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;4. Networking:&lt;/strong&gt;&lt;br&gt;
VPC, AWS Transit Gateway, Direct Connect and VPN, API Gateway (how to secure these resources).&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;5. Security &amp;amp; Identity:&lt;/strong&gt;&lt;br&gt;
IAM, AWS Organizations, Service Control Policies (SCPs), AWS SSO, Cross-Account Roles, AWS Managed AD, KMS (key policy), System Manager (Parameter Store, Session Manager, Automation, Run Command), ACM&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;6. DevOps Tools:&lt;/strong&gt;&lt;br&gt;
CodeBuild, CodePipeline, CodeDeploy&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;7. Data Services:&lt;/strong&gt;&lt;br&gt;
Amazon RDS, DynamoDB, Redshift, Athena, Amazon ElastiCache (Redis), Amazon Neptune&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;8. Content Delivery &amp;amp; Analytics:&lt;/strong&gt;&lt;br&gt;
CloudFront, QuickSight, Amazon WorkSpaces&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;9. Monitoring and Notification Services:&lt;/strong&gt;&lt;br&gt;
CloudWatch, CloudTrail (Organization), SNS, EventBridge Rules, AWS Config&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;10. High Availability &amp;amp; Disaster Recovery:&lt;/strong&gt;&lt;br&gt;
Route 53, Elastic Load Balancers (Application Load Balancer and Network Load Balancer), Auto Scaling, AWS Backup&lt;/p&gt;

&lt;h3&gt;
  
  
  Hands-On Experience Is a Must
&lt;/h3&gt;

&lt;p&gt;While courses and practice exams are important, the most valuable part of my preparation came from hands-on experience.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Here’s why hands-on experience matters: 🏅 🏅&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Context &amp;amp; Problem-Solving&lt;/strong&gt;: In the exam, most questions revolve around real-world scenarios. Having worked on AWS implementations, I was able to draw from my experience to solve complex questions on topics like hybrid architectures, scalability, and high availability.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Service Limitations&lt;/strong&gt;: Some AWS services have limitations that aren’t obvious from reading the documentation. Only by using these services in real-world projects do you truly understand how they behave under different conditions.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Design Trade-offs&lt;/strong&gt;: Working on projects helped me better understand how to make trade-offs in architectural designs. The exam often presents several valid solutions, and your experience helps you choose the best one based on performance, cost, and scalability considerations.&lt;/p&gt;&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Final Tips for Preparing for the AWS Solutions Architect Professional Exam
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Start with Courses, But Don’t Rely Only on Them&lt;/strong&gt;: Courses like Stephane Maarek’s will give you a strong foundation, but don’t stop there. Dive deeper into services and AWS best practices.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Do More Hands-On Labs&lt;/strong&gt;: The more you work on real projects, the easier it will be to understand the scenarios and architectural challenges presented in the exam. Spin up your own projects on AWS or participate in client projects if possible.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Take Practice Exams&lt;/strong&gt;: Jon Bonso’s and Stephane’s practice exams are fantastic for testing your knowledge. Don’t just take them once—review your mistakes and learn from them. AWS’s official practice exams are also a great way to measure your readiness.&lt;/p&gt;&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;&lt;strong&gt;Focus on Domain Mastery&lt;/strong&gt;: The AWS Solutions Architect Professional exam covers a wide range of topics. The exam domains include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Design for Organizational Complexity&lt;/li&gt;
&lt;li&gt;Design for New Solutions&lt;/li&gt;
&lt;li&gt;Continuous Improvement for Existing Solutions&lt;/li&gt;
&lt;li&gt;Accelerate Workload Migration and Modernization&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;&lt;p&gt;&lt;strong&gt;Review Exam Readiness on AWS Skill Builder&lt;/strong&gt;: AWS’s Skill Builder platform offers exam readiness courses and practice exams that are valuable in fine-tuning your understanding and confidence before the actual test.&lt;/p&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  My Final Thoughts 🎯
&lt;/h3&gt;

&lt;p&gt;Getting certified is not just about passing an exam—it’s about taking a leap in your cloud journey and gaining a deep understanding of how to design complex, scalable, and secure architectures using AWS. &lt;/p&gt;

&lt;p&gt;However, remember that certifications are just one part of the equation. Hands-on experience is what truly pushes your learning curve. Working on real-world projects not only prepares you for the exam but also gives you insights into how AWS services behave, their limitations, and best use cases. Through hands-on experience, you’ll develop a deeper understanding of architectural trade-offs, service integrations, and how to meet the needs of diverse business scenarios.&lt;/p&gt;

&lt;p&gt;So, I encourage you to take on this certification challenge, but don’t stop there. Get your hands dirty, dive into projects, and continuously apply what you’ve learned. Have fun and good luck on your cloud journey! 👍 👍 👏&lt;/p&gt;

</description>
      <category>aws</category>
      <category>certification</category>
      <category>cloud</category>
      <category>cloudpractitioner</category>
    </item>
    <item>
      <title>Securely Push Image from GitHub to AWS with GitHub Action</title>
      <dc:creator>Husain Yusuf</dc:creator>
      <pubDate>Wed, 09 Oct 2024 10:11:58 +0000</pubDate>
      <link>https://dev.to/mastercam123/securely-deploying-code-from-github-to-aws-with-github-action-j3e</link>
      <guid>https://dev.to/mastercam123/securely-deploying-code-from-github-to-aws-with-github-action-j3e</guid>
      <description>&lt;p&gt;🚀 In modern cloud-native development, automating deployments is key to ensuring fast, secure, and reliable application delivery. &lt;strong&gt;GitHub Actions&lt;/strong&gt;, a powerful CI/CD tool, allows you to automate your workflows directly from your GitHub repository. When combined with &lt;strong&gt;AWS&lt;/strong&gt; and &lt;strong&gt;OpenID Connect (OIDC)&lt;/strong&gt;, you can securely deploy your code without managing long-lived AWS credentials and maintaining high security standards. &lt;/p&gt;

&lt;h3&gt;
  
  
  Here are the key reasons why OIDC is beneficial:
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Eliminates Long-Lived Credentials&lt;/strong&gt;: 
OIDC allows GitHub Actions to authenticate with AWS without storing permanent credentials like 
&lt;code&gt;AWS_ACCESS_KEY_ID&lt;/code&gt; and &lt;code&gt;AWS_SECRET_ACCESS_KEY&lt;/code&gt; in your GitHub repository. This reduces the risk of credential leakage and simplifies credential management.
&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Improves Security&lt;/strong&gt;: By using short-lived tokens generated during each workflow run, OIDC minimizes the attack surface compared to static credentials.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Simplifies Access Management&lt;/strong&gt;: OIDC enables federated identity management, allowing you to control access permissions centrally in AWS IAM. You can define trust relationships and conditions under which GitHub tokens are considered valid, streamlining access control.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Supports Conditional Access&lt;/strong&gt;: You can configure policies in AWS IAM to restrict access based on specific conditions, such as the repository or branch from which a workflow is triggered. This granular control helps enforce security best practices.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integrates Easily with GitHub Actions&lt;/strong&gt;: GitHub provides official actions like &lt;code&gt;aws-actions/configure-aws-credentials&lt;/code&gt; that simplify the integration process by automatically exchanging OIDC tokens for AWS temporary credentials during workflow execution.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In this blog post, we’ll walk through how to set up a seamless deployment from GitHub to AWS using GitHub Actions and OIDC.&lt;br&gt;
I set up an example use case using Amazon Elastic Container Registry (ECR) as our image registry and an IAM role that will be assumed through OIDC. After the resources in AWS are created, we'll use GitHub Actions to build a Docker image of a Flask (Python) application and push it to Amazon ECR, securing the connection to AWS with OpenID Connect (OIDC).&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fagz3teblzd7od0t0cr7q.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fagz3teblzd7od0t0cr7q.png" alt="Architecture of the GitHub Integration to AWS" width="800" height="280"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Prerequisites&lt;/strong&gt;
&lt;/h3&gt;

&lt;ul&gt;
&lt;li&gt;A GitHub repository containing a Flask application. There is a simple &lt;a href="https://github.com/mastercam123/flask-simple" rel="noopener noreferrer"&gt;flask app&lt;/a&gt; as an example.&lt;/li&gt;
&lt;li&gt;An AWS account with access to ECR.&lt;/li&gt;
&lt;li&gt;AWS CLI installed and configured in the GitHub runner. (I am using a GitHub-hosted runner, which already has the AWS CLI installed.)&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 1: Set Up AWS Resources&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;1. Create ECR Repository&lt;/strong&gt;:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Open the AWS Management Console and navigate to ECR.&lt;/li&gt;
&lt;li&gt;Click on Create repository.&lt;/li&gt;
&lt;li&gt;Name your repository (e.g., flask-app) and create it.&lt;/li&gt;
&lt;/ul&gt;
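&lt;p&gt;If you prefer the CLI, the same repository can be created with a single command (the repository name and region here are examples, not part of the console walkthrough above):&lt;/p&gt;

```
aws ecr create-repository \
    --repository-name flask-app \
    --image-scanning-configuration scanOnPush=true \
    --region us-east-1
```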

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy97oaoru8y83bqdjxx7k.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy97oaoru8y83bqdjxx7k.PNG" alt="ECR Repository creation" width="800" height="428"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;2. Add an Identity Provider in AWS&lt;/strong&gt;&lt;br&gt;
Next, create an Identity Provider in AWS. The following steps are based on the GitHub documentation &lt;a href="https://docs.github.com/en/actions/security-for-github-actions/security-hardening-your-deployments/configuring-openid-connect-in-amazon-web-services" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Go to IAM in the AWS Management Console.&lt;/li&gt;
&lt;li&gt;In the left navigation pane, click on &lt;strong&gt;Identity providers&lt;/strong&gt; and choose &lt;strong&gt;Add provider&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;For Provider type, select OpenID Connect.&lt;/li&gt;
&lt;li&gt;For Provider URL, enter &lt;code&gt;https://token.actions.githubusercontent.com&lt;/code&gt;. The Provider URL is the GitHub OpenID Connect endpoint for authentication requests.&lt;/li&gt;
&lt;li&gt;For Audience, enter &lt;code&gt;sts.amazonaws.com&lt;/code&gt;. Audience is a value that identifies the application that is registered with an OpenID Connect provider.&lt;/li&gt;
&lt;li&gt;Click &lt;strong&gt;Add provider&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
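&lt;p&gt;The console clicks above map to a single AWS CLI call. A sketch (the thumbprint shown is GitHub's published value; recent AWS releases validate GitHub's certificate chain directly and effectively ignore it, but older CLI versions still require the parameter):&lt;/p&gt;

```shell
# Create the GitHub OIDC identity provider (equivalent to the console steps)
aws iam create-open-id-connect-provider \
  --url "https://token.actions.githubusercontent.com" \
  --client-id-list "sts.amazonaws.com" \
  --thumbprint-list "6938fd4d98bab03faadb97b34396831e3780aea1"
```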

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh284ns3e28opb6osdqac.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh284ns3e28opb6osdqac.PNG" alt="Created Provider" width="800" height="690"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;3. Create an IAM Role for OIDC&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create a new role with the Web identity type.&lt;/li&gt;
&lt;li&gt;Select GitHub as the provider and specify your repository details.&lt;/li&gt;
&lt;li&gt;Attach policies that allow write access to ECR (In this case I am using AWS-managed Policy &lt;code&gt;AmazonEC2ContainerRegistryPowerUser&lt;/code&gt;).&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27903bin8l832r6womd3.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F27903bin8l832r6womd3.PNG" alt="Created Role" width="800" height="998"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create trust relationships for the role. This allows only specific branches or tags to assume the role.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "Federated": "&amp;lt;Arn of Identity provider&amp;gt;"
            },
            "Action": "sts:AssumeRoleWithWebIdentity",
            "Condition": {
                "StringLike": {
                    "token.actions.githubusercontent.com:sub": [
                        "repo:&amp;lt;GitHub organization name&amp;gt;/&amp;lt;GitHub repo name&amp;gt;:ref:refs/heads/main",
                        "repo:&amp;lt;GitHub organization name&amp;gt;/&amp;lt;GitHub repo name&amp;gt;:ref:refs/heads/dev*",
                        "repo:&amp;lt;GitHub organization name&amp;gt;/&amp;lt;GitHub repo name&amp;gt;:ref:refs/tags/v*"
                    ],
                    "token.actions.githubusercontent.com:aud": "sts.amazonaws.com"
                }
            }
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Replace the placeholders with the correct values:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;code&gt;Arn of Identity Provider&lt;/code&gt; - ARN of the Identity Provider that you created in the previous step.&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;GitHub organization name&lt;/code&gt; - your GitHub Organization name&lt;/li&gt;
&lt;li&gt;
&lt;code&gt;GitHub repo name&lt;/code&gt; - your GitHub Repository name&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;You can use the &lt;strong&gt;StringLike&lt;/strong&gt; condition in the IAM trust policy to allow more flexible pattern matching with wildcards. Unlike &lt;strong&gt;StringEquals&lt;/strong&gt;, which requires an exact match, &lt;strong&gt;StringLike&lt;/strong&gt; lets you use patterns like &lt;code&gt;*&lt;/code&gt; to match multiple variations.&lt;br&gt;
In this case, to allow deployment from branches starting with &lt;code&gt;dev&lt;/code&gt; (like dev, develop, etc.) and any version tag that follows semantic versioning (e.g., v1.0, v2.1.3), set up the condition like this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Branches&lt;/strong&gt;: Use StringLike with &lt;code&gt;dev*&lt;/code&gt; to match any branch that begins with dev.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Tags&lt;/strong&gt;: Use StringLike with &lt;code&gt;v*&lt;/code&gt; to match any tag that starts with v, allowing flexibility for any versioning scheme that follows the semantic versioning convention.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 2: Update GitHub Repository Secrets&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;1. Add Repository Secrets:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In your GitHub repository, go to Settings &amp;gt; Secrets and variables &amp;gt; Actions.&lt;/li&gt;
&lt;li&gt;Create the following secrets:&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;strong&gt;AWS_REGION&lt;/strong&gt; = The AWS Region where you deploy your resources.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;AWS_ROLE&lt;/strong&gt; = ARN of the IAM role that the GitHub Actions workflow will assume.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkrelwm6p05hod5xd54qg.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fkrelwm6p05hod5xd54qg.PNG" alt="GitHub secrets" width="800" height="589"&gt;&lt;/a&gt;&lt;/p&gt;
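&lt;p&gt;If you use the GitHub CLI, the same secrets can be created from a terminal. A sketch with placeholder values (substitute your own Region and role ARN):&lt;/p&gt;

```shell
# Placeholder values -- substitute your own AWS Region and IAM role ARN
gh secret set AWS_REGION --body "us-east-1"
gh secret set AWS_ROLE --body "arn:aws:iam::123456789012:role/github-actions-ecr-push"
```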

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 3: Create GitHub Actions Workflow&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;1. Create Workflow File:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;In the GitHub repository, create a &lt;code&gt;.github/workflows/build_image.yml&lt;/code&gt; file.
&lt;/li&gt;
&lt;/ul&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;name: Deploy to AWS

on:
  push:
    branches:
      - main

jobs:
  build-and-deploy:
    runs-on: ubuntu-latest

    permissions:
      id-token: write
      contents: read

    steps:
      - name: Checkout code
        uses: actions/checkout@v4

      - name: Configure AWS Credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: ${{ secrets.AWS_ROLE }}
          role-session-name: ecr-push-registry-${{ github.run_number }}
          aws-region: ${{ secrets.AWS_REGION }}

      - name: Log in to Amazon ECR
        uses: aws-actions/amazon-ecr-login@v2
        id: login-ecr

      - name: Build, tag, and push Docker image to Amazon ECR
        env:
          ECR_URL: ${{ steps.login-ecr.outputs.registry }}
          REPOSITORY: flask-app
          VERSION_TAG: ${{ github.sha }}
        run: bash ./build_image.sh

      - name: Log out of Amazon ECR
        if: always()
        run: docker logout ${{ steps.login-ecr.outputs.registry }}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;This is the overview of the workflow code:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Name&lt;/strong&gt;: The workflow is named "Deploy to AWS".&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Trigger&lt;/strong&gt;: The workflow is triggered by a push event on the main branch.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Jobs&lt;/strong&gt;: The workflow consists of a single job named build-and-deploy which runs on an ubuntu-latest virtual machine.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Permissions&lt;/strong&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;id-token: write:&lt;/strong&gt; This permission allows the workflow to request an OpenID Connect (OIDC) token, which is used to authenticate with AWS securely.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;contents: read:&lt;/strong&gt; This permission allows the workflow to read repository contents, which is necessary for checking out the code.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;li&gt;

&lt;p&gt;&lt;strong&gt;Steps&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Checkout Code:&lt;/strong&gt;  This step checks out the repository code so that subsequent steps have access to it.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Configure AWS Credentials:&lt;/strong&gt; This step configures AWS credentials using OIDC. It assumes a role specified by the secret &lt;code&gt;AWS_ROLE&lt;/code&gt; and sets the AWS region using the secret &lt;code&gt;AWS_REGION&lt;/code&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Log in to Amazon ECR:&lt;/strong&gt; This step logs into Amazon Elastic Container Registry (ECR) so that Docker images can be pushed. The step ID is set to &lt;code&gt;login-ecr&lt;/code&gt;, which allows referencing its outputs in later steps.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Build, Tag, and Push Docker Image to Amazon ECR:&lt;/strong&gt; This step sets the environment variables &lt;code&gt;ECR_URL&lt;/code&gt; (the ECR registry URL taken from the login step's output), &lt;code&gt;REPOSITORY&lt;/code&gt;, and &lt;code&gt;VERSION_TAG&lt;/code&gt; (a unique tag for the Docker image based on the commit SHA, &lt;code&gt;github.sha&lt;/code&gt;), then executes a shell script (&lt;code&gt;build_image.sh&lt;/code&gt;) that builds, tags, and pushes the Docker image to ECR.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Log out of Amazon ECR:&lt;/strong&gt; Logs out from Amazon ECR so that Docker credentials are not left behind on the runner. This step always runs (&lt;code&gt;if: always()&lt;/code&gt;) regardless of whether previous steps succeeded or failed.&lt;/li&gt;
&lt;/ul&gt;


&lt;/li&gt;

&lt;/ul&gt;
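&lt;p&gt;The &lt;code&gt;build_image.sh&lt;/code&gt; script itself is not shown in the workflow. A minimal sketch of what it might contain, assuming only the three environment variables the workflow step exports:&lt;/p&gt;

```shell
#!/usr/bin/env bash
# build_image.sh -- hypothetical sketch; the workflow exports ECR_URL,
# REPOSITORY, and VERSION_TAG before invoking this script.
set -euo pipefail

IMAGE="${ECR_URL}/${REPOSITORY}:${VERSION_TAG}"

# Build the image, tag it with the commit SHA, and push it to ECR
docker build -t "${IMAGE}" .
docker push "${IMAGE}"

echo "Pushed ${IMAGE}"
```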

&lt;h3&gt;
  
  
  &lt;strong&gt;Step 4: Test the GitHub Action workflow&lt;/strong&gt;
&lt;/h3&gt;

&lt;p&gt;Once you have configured GitHub Actions, you can test the integration by pushing a change to your GitHub repository. GitHub Actions should automatically trigger the build-and-deploy workflow, using OpenID Connect to authenticate with AWS.&lt;/p&gt;
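&lt;p&gt;Assuming the run succeeds, you can also confirm from the AWS CLI that the image landed in ECR (using the &lt;code&gt;flask-app&lt;/code&gt; repository created in Step 1):&lt;/p&gt;

```shell
# List the image tags currently stored in the repository
aws ecr describe-images \
  --repository-name flask-app \
  --query 'imageDetails[].imageTags' \
  --output table
```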

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9sclrd7see53l4io83jy.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F9sclrd7see53l4io83jy.PNG" alt="GitHub Workflow" width="800" height="476"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;And there you have it, folks! We've successfully sent our trusty Flask app on a whirlwind adventure to the cloud ☁️, all thanks to the magic of GitHub Actions and AWS. With Docker as its suitcase and OIDC as its security blanket, our app is now living its best life.&lt;/p&gt;

&lt;p&gt;Stay tuned for my next blog post, where we'll dive even deeper into the world of cloud automation and deployment magic. See you then! 😎&lt;/p&gt;

</description>
      <category>githubactions</category>
      <category>aws</category>
      <category>cicd</category>
      <category>devops</category>
    </item>
  </channel>
</rss>
