<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Thakur Rishabh Singh</title>
    <description>The latest articles on DEV Community by Thakur Rishabh Singh (@thakurrishabh).</description>
    <link>https://dev.to/thakurrishabh</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F599949%2Fcbbbb02b-d1c7-4b4d-8824-8e978ff6f96e.jpeg</url>
      <title>DEV Community: Thakur Rishabh Singh</title>
      <link>https://dev.to/thakurrishabh</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/thakurrishabh"/>
    <language>en</language>
    <item>
      <title>AWS Step Functions workflow for an ETL Job on COVID-19 and deploying it with Terraform (#CloudGuruChallenge Series) (Part 2/3)</title>
      <dc:creator>Thakur Rishabh Singh</dc:creator>
      <pubDate>Mon, 26 Apr 2021 03:00:45 +0000</pubDate>
      <link>https://dev.to/thakurrishabh/aws-step-functions-workflow-for-an-etl-job-on-covid-19-and-deploying-it-with-terraform-cloudguruchallenge-series-part-2-3-6f7</link>
      <guid>https://dev.to/thakurrishabh/aws-step-functions-workflow-for-an-etl-job-on-covid-19-and-deploying-it-with-terraform-cloudguruchallenge-series-part-2-3-6f7</guid>
      <description>&lt;h1&gt;
  
  
  This post is about creating an AWS Step Functions workflow to process an ETL job on daily COVID-19 counts and deploying the infrastructure with Terraform
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Project Overview&lt;/li&gt;
&lt;li&gt;Architecture&lt;/li&gt;
&lt;li&gt;Step Functions Workflow

&lt;ul&gt;
&lt;li&gt;What is AWS Step Functions?&lt;/li&gt;
&lt;li&gt;Creating Step Functions Workflow &lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Deploying Infrastructure with Terraform&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  1. Project Overview
&lt;/h3&gt;

&lt;p&gt;This project is the second part in the series &lt;a href="https://acloudguru.com/blog/engineering/cloudguruchallenge-python-aws-etl" rel="noopener noreferrer"&gt;#CloudGuruChallenge – Event-Driven Python on AWS&lt;/a&gt;. Here we deploy an AWS Step Functions workflow along with the other components required for error handling. The workflow will process an ETL job on daily COVID-19 counts, which will be demonstrated in the final part of this series. Note: appropriate permissions must be configured in IAM for the code below to work. &lt;br&gt;
        To automate the process, Terraform is used for IaC (Infrastructure as Code) and AWS CodePipeline for CI/CD. Setting up Terraform and CodePipeline is discussed in detail in &lt;a href="https://dev.to/thakurrishabh/deploying-infrastructure-on-aws-with-terraform-and-aws-codepipeline-series-1-3-cloudguruchallenge-5n"&gt;Part-1&lt;/a&gt; of the series.&lt;/p&gt;
&lt;h3&gt;
  
  
  2. Architecture
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa0p04qelw0lbyd11i8be.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fa0p04qelw0lbyd11i8be.png" alt="Screenshot from 2021-04-25 17-15-28"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The above architecture works as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;An EventBridge rule triggers the Step Functions workflow once daily at 1 PM.&lt;/li&gt;
&lt;li&gt;On success, the workflow sends an email to the owner through Amazon SNS.&lt;/li&gt;
&lt;li&gt;On failure, another EventBridge rule matches the failed event and forwards it to SNS, SQS, and CloudWatch Logs. A notification is emailed to the owner through SNS.&lt;/li&gt;
&lt;li&gt;An EventBridge rule triggers a Lambda function once daily at 7 AM. This function retrieves the failed events from SQS and re-triggers the Step Functions workflow with an input containing all the failed dates.&lt;/li&gt;
&lt;/ul&gt;
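&lt;p&gt;The retry Lambda in the last bullet could look roughly like the sketch below. The message shape (the original workflow input as a JSON string under detail.input with a "date" key), the queue URL, and the state machine ARN are all assumptions for illustration, not the exact code used in this project.&lt;/p&gt;

```python
import json


def extract_failed_dates(messages):
    """Collect the execution dates carried by failed-event messages.

    Assumption: each SQS message body is the EventBridge event for a
    failed execution, with the original workflow input available as a
    JSON string under detail.input.
    """
    dates = []
    for msg in messages:
        body = json.loads(msg["Body"])
        original_input = json.loads(body["detail"]["input"])
        if "date" in original_input:
            dates.append(original_input["date"])
    return dates


def lambda_handler(event, context):
    # Queue URL and state machine ARN are placeholders; real code
    # should also delete the messages it has processed.
    import boto3
    sqs = boto3.client("sqs")
    sfn = boto3.client("stepfunctions")
    resp = sqs.receive_message(QueueUrl="YOUR-QUEUE-URL",
                               MaxNumberOfMessages=10)
    dates = extract_failed_dates(resp.get("Messages", []))
    if dates:
        sfn.start_execution(
            stateMachineArn="YOUR-STATE-MACHINE-ARN",
            input=json.dumps({"failed_dates": dates}),
        )
    return {"retried": dates}
```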
&lt;h3&gt;
  
  
  3. Step Functions Workflow
&lt;/h3&gt;
&lt;h6&gt;
  
  
  What is AWS Step Functions?
&lt;/h6&gt;

&lt;p&gt;AWS Step Functions is a serverless orchestration service: it coordinates other serverless services into event-driven workflows, with the steps exchanging data in JSON format.&lt;/p&gt;
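&lt;p&gt;To make the JSON hand-off concrete, here is a rough, simplified illustration (written for this post, not the service's actual implementation) of how a ResultPath of "$.result" merges a task's output into the workflow state so that later states can inspect it:&lt;/p&gt;

```python
import json


def apply_result_path(state_input, task_output):
    """Teaching-purposes mimic of ResultPath "$.result": the task's raw
    output is merged into the state's JSON document under the "result"
    key instead of replacing the whole document."""
    merged = dict(state_input)
    merged["result"] = task_output
    return merged


# A status-checking task returns {"status": "Update"}; after ResultPath
# a Choice state can test $.result.status while $.date is preserved.
state = apply_result_path({"date": "2021-04-25"}, {"status": "Update"})
print(json.dumps(state))
```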
&lt;h6&gt;
  
  
  Creating Step Functions Workflow
&lt;/h6&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fheuvr2wddpkwqtjmipi3.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fheuvr2wddpkwqtjmipi3.png" alt="stepfunctions_graph (1)"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The above workflow works as follows:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Each rectangular block represents a Lambda function task, except for CheckETLStatus and Notify, which are a Choice state and an SNS task, respectively.&lt;/li&gt;
&lt;li&gt;The workflow starts by retrieving the ETL status, which is either initial load or update.&lt;/li&gt;
&lt;li&gt;For an initial load, all the data is retrieved as a batch from the source.&lt;/li&gt;
&lt;li&gt;For an update, only a single row of data is retrieved from the source (the case count for the current day).&lt;/li&gt;
&lt;li&gt;The retrieved data is transformed and loaded by the Transform and Load tasks, respectively.&lt;/li&gt;
&lt;li&gt;Finally, an email notifies the owner that the ETL job succeeded.&lt;/li&gt;
&lt;/ul&gt;
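&lt;p&gt;The GetETLStatus task's decision could be sketched like this. The check itself (counting rows in the target table) is an assumption for illustration; only the returned shape, with a status of InitialLoad or Update, is what the Choice state relies on.&lt;/p&gt;

```python
def get_etl_status(table_row_count):
    """Simplified stand-in for the GetETLStatus Lambda: an empty target
    table means a full initial load, otherwise a daily update."""
    if table_row_count == 0:
        return {"status": "InitialLoad"}
    return {"status": "Update"}


def lambda_handler(event, context):
    # In practice row_count would come from the target data store;
    # reading it from the event is a placeholder.
    row_count = event.get("row_count", 0)
    return get_etl_status(row_count)
```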

&lt;p&gt;The code to create the above workflow is given below.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
  "StartAt": "GetETLStatus",
  "States": {
    "GetETLStatus": {
        "Type": "Task",
        "Resource": "YOUR-LAMBDA-ARN",
        "Next": "CheckETLStatus",
        "TimeoutSeconds": 3,
        "ResultPath": "$.result"
    },
    "CheckETLStatus": { 
        "Type": "Choice",
        "Choices": [
          {
            "Variable": "$.result.status",
            "StringEquals": "InitialLoad",
            "Next": "InitialLoad"
          },
          {
            "Variable": "$.result.status",
            "StringEquals": "Update",
            "Next": "Update"
          }
        ],
        "Default": "InitialLoad"
     },
    "InitialLoad": {
        "Type": "Task",
        "Resource": "YOUR-LAMBDA-ARN",
        "Next": "Transform",
        "TimeoutSeconds": 3,
        "ResultPath": "$.result"
    },
    "Update": {
        "Type": "Task",
        "Resource": "YOUR-LAMBDA-ARN",
        "Next": "Transform",
        "TimeoutSeconds": 3,
        "ResultPath": "$.result"
    },
    "Transform": {
        "Type": "Task",
        "Resource": "YOUR-LAMBDA-ARN",
        "Next": "Load",
        "TimeoutSeconds": 3,
        "ResultPath": "$.result"
    },
    "Load": {
        "Type": "Task",
        "Resource": "YOUR-LAMBDA-ARN",
        "TimeoutSeconds": 3,
        "ResultPath": "$.result",
        "Next": "Notify"
    },
    "Notify": {
        "Type": "Task",
        "Resource": "arn:aws:states:::sns:publish",
        "Parameters": {
          "TopicArn": "YOUR-TOPIC-ARN",
          "Message.$": "$"
        },
        "End": true
    }
  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;
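&lt;p&gt;Before wiring the definition into Terraform, it can help to sanity-check it locally. This small checker is written for this post (it is not part of any AWS SDK): it verifies the definition is valid JSON and that StartAt and every Next/Default target name a defined state, which catches the most common typos.&lt;/p&gt;

```python
import json


def check_definition(definition_json):
    """Return the list of transition targets that don't exist as states;
    an empty list means the definition is self-consistent."""
    doc = json.loads(definition_json)       # must be valid JSON
    states = doc["States"]
    targets = [doc["StartAt"]]
    for state in states.values():
        targets += [state[k] for k in ("Next", "Default") if k in state]
        for choice in state.get("Choices", []):
            targets.append(choice["Next"])
    return [t for t in targets if t not in states]


# Tiny two-state example; run it against your full definition instead.
sample = json.dumps({
    "StartAt": "A",
    "States": {
        "A": {"Type": "Task", "Resource": "arn", "Next": "B"},
        "B": {"Type": "Succeed"},
    },
})
assert check_definition(sample) == []
```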



&lt;h3&gt;
  
  
  4. Deploying Infrastructure with Terraform
&lt;/h3&gt;

&lt;p&gt;The code to deploy the above infrastructure with Terraform is shown below. For more details on setting up Terraform and CodePipeline, visit &lt;a href="https://dev.to/thakurrishabh/deploying-infrastructure-on-aws-with-terraform-and-aws-codepipeline-series-1-3-cloudguruchallenge-5n"&gt;Part-1&lt;/a&gt; of the series. I have omitted the code for the other two scheduled events, as I will show them in the next part of this series.&lt;/p&gt;

&lt;p&gt;state_machine.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_sfn_state_machine" "sfn_state_machine" {
  name     = "YOUR-STATE-MACHINE-NAME"
  role_arn = "YOUR-ROLE-ARN"

  definition = &amp;lt;&amp;lt;EOF
{
  "StartAt": "GetETLStatus",
  "States": {
    "GetETLStatus": {
      "Type": "Task",
      "Resource": "YOUR-LAMBDA-ARN",
      "Next": "CheckETLStatus",
      "TimeoutSeconds": 3,
      "ResultPath": "$.result"
    },
    "CheckETLStatus": {
      "Type": "Choice",
      "Choices": [
        {
          "Variable": "$.result.status",
          "StringEquals": "InitialLoad",
          "Next": "InitialLoad"
        },
        {
          "Variable": "$.result.status",
          "StringEquals": "Update",
          "Next": "Update"
        }
      ],
      "Default": "InitialLoad"
    },
    "InitialLoad": {
      "Type": "Task",
      "Resource": "YOUR-LAMBDA-ARN",
      "Next": "Transform",
      "TimeoutSeconds": 3,
      "ResultPath": "$.result"
    },
    "Update": {
      "Type": "Task",
      "Resource": "YOUR-LAMBDA-ARN",
      "Next": "Transform",
      "TimeoutSeconds": 3,
      "ResultPath": "$.result"
    },
    "Transform": {
      "Type": "Task",
      "Resource": "YOUR-LAMBDA-ARN",
      "Next": "Load",
      "TimeoutSeconds": 3,
      "ResultPath": "$.result"
    },
    "Load": {
      "Type": "Task",
      "Resource": "YOUR-LAMBDA-ARN",
      "TimeoutSeconds": 3,
      "ResultPath": "$.result",
      "Next": "Notify"
    },
    "Notify": {
      "Type": "Task",
      "Resource": "arn:aws:states:::sns:publish",
      "Parameters": {
        "TopicArn": "YOUR-SNS-TOPIC-ARN",
        "Message.$": "$"
      },
      "End": true
    }
  }
}
EOF

depends_on = [
  aws_sns_topic.ETLJobStatus,
  aws_lambda_function.state_machine_lambdas
]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;sqs.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_sqs_queue" "Events_DLQ" {
  name = "Events_DLQ_T"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;sns.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_sns_topic" "ETLJobStatus" {
  name = "ETLJobStatus_T"
}

resource "aws_sns_topic" "ETLErrorMessages" {
  name = "ETLErrorMessages_T"
}

resource "aws_sns_topic_subscription" "ETLJobStatus_target" {
  topic_arn = aws_sns_topic.ETLJobStatus.arn
  protocol  = "email"
  endpoint  = "YOUR-EMAIL-ID"

  depends_on = [
    aws_sns_topic.ETLJobStatus
  ]
}

resource "aws_sns_topic_subscription" "ETLErrorMessages_target" {
  topic_arn = aws_sns_topic.ETLErrorMessages.arn
  protocol  = "email"
  endpoint  = "YOUR-EMAIL-ID"

  depends_on = [
    aws_sns_topic.ETLErrorMessages
  ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;cloudwatch.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;#EventBridge Events

resource "aws_cloudwatch_event_rule" "state_machine_events_failed" {
  name        = "state_machine_events_failed_t"
  description = "This event is triggered when the state machine fails."

  event_pattern = &amp;lt;&amp;lt;EOF
{
  "source": ["aws.states"],
  "detail-type": ["Step Functions Execution Status Change"],
  "detail": {
    "status": ["FAILED"],
    "stateMachineArn": ["${aws_sfn_state_machine.sfn_state_machine.arn}"]
  }
}
EOF

depends_on = [
  aws_sfn_state_machine.sfn_state_machine
]
}

#EventBridge Event Targets

resource "aws_cloudwatch_event_target" "sns" {
  rule      = aws_cloudwatch_event_rule.state_machine_events_failed.name
  target_id = "SendToSNS"
  arn       = aws_sns_topic.ETLErrorMessages.arn

  depends_on = [
    aws_cloudwatch_event_rule.state_machine_events_failed,
    aws_sns_topic.ETLErrorMessages
  ]
}

resource "aws_cloudwatch_event_target" "sqs" {
  rule      = aws_cloudwatch_event_rule.state_machine_events_failed.name
  target_id = "SendToSQS"
  arn       = aws_sqs_queue.Events_DLQ.arn

  depends_on = [
    aws_cloudwatch_event_rule.state_machine_events_failed,
    aws_sqs_queue.Events_DLQ
  ]
}

resource "aws_cloudwatch_event_target" "cloudwatch_logs" {
  rule      = aws_cloudwatch_event_rule.state_machine_events_failed.name
  target_id = "SendToCloudwatchLogs"
  arn       = aws_cloudwatch_log_group.log_group.arn

  depends_on = [
    aws_cloudwatch_event_rule.state_machine_events_failed,
    aws_cloudwatch_log_group.log_group
  ]
}

#Cloudwatch Log Group

resource "aws_cloudwatch_log_group" "log_group" {
  name = "state_machine_events_failed_t"
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  5. Conclusion
&lt;/h3&gt;

&lt;p&gt;In this post we have seen how to build a Step Functions workflow for an ETL job and deploy it with Terraform. AWS Glue might be a better fit for an ETL job, but I wanted to explore Step Functions, so the challenge was adapted to accommodate it. In the final part of this series I'll combine everything to complete the &lt;a href="https://acloudguru.com/blog/engineering/cloudguruchallenge-python-aws-etl" rel="noopener noreferrer"&gt;#CloudGuruChallenge – Event-Driven Python on AWS&lt;/a&gt;: I'll run the ETL job on daily COVID-19 cases and visualize the results in Amazon QuickSight.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>serverless</category>
      <category>terraform</category>
    </item>
    <item>
      <title>Deploying Infrastructure on AWS with Terraform and AWS CodePipeline (#CloudGuruChallenge Series) (Part 1/3)</title>
      <dc:creator>Thakur Rishabh Singh</dc:creator>
      <pubDate>Sun, 11 Apr 2021 03:15:57 +0000</pubDate>
      <link>https://dev.to/thakurrishabh/deploying-infrastructure-on-aws-with-terraform-and-aws-codepipeline-series-1-3-cloudguruchallenge-5n</link>
      <guid>https://dev.to/thakurrishabh/deploying-infrastructure-on-aws-with-terraform-and-aws-codepipeline-series-1-3-cloudguruchallenge-5n</guid>
      <description>&lt;h1&gt;
  
  
  This post is about creating a CI/CD pipeline on AWS using CodePipeline, which deploys infrastructure on AWS using Terraform.
&lt;/h1&gt;

&lt;h2&gt;
  
  
  Contents
&lt;/h2&gt;

&lt;ol&gt;
&lt;li&gt;Project Overview&lt;/li&gt;
&lt;li&gt;Setting up Terraform&lt;/li&gt;
&lt;li&gt;Setting up AWS CodePipeline

&lt;ul&gt;
&lt;li&gt;Source stage&lt;/li&gt;
&lt;li&gt;Terraform Plan step&lt;/li&gt;
&lt;li&gt;Manual Approval step&lt;/li&gt;
&lt;li&gt;Terraform Apply stage&lt;/li&gt;
&lt;li&gt;Deploy stage&lt;/li&gt;
&lt;/ul&gt;
&lt;/li&gt;
&lt;li&gt;Final View Of the Pipeline&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  1. Project Overview
&lt;/h3&gt;

&lt;p&gt;This project is the first part in the series &lt;a href="https://acloudguru.com/blog/engineering/cloudguruchallenge-python-aws-etl" rel="noopener noreferrer"&gt;#CloudGuruChallenge – Event-Driven Python on AWS&lt;/a&gt;. Here we deploy an S3 bucket and a Lambda function. The Lambda function will be part of an AWS Step Functions workflow developed in the next part of this series, and the S3 bucket is used to store the Lambda deployment package.&lt;br&gt;
        To automate the process, Terraform is used for IaC (Infrastructure as Code) and AWS CodePipeline for CI/CD.&lt;/p&gt;
&lt;h3&gt;
  
  
  2. Setting up Terraform
&lt;/h3&gt;

&lt;p&gt;The following are the required steps to start working with Terraform on AWS:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Create an S3 bucket to store the Terraform state file.&lt;/li&gt;
&lt;li&gt;Create a DynamoDB table with &lt;strong&gt;on-demand&lt;/strong&gt; capacity and a partition key named LockID.&lt;/li&gt;
&lt;/ul&gt;
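&lt;p&gt;If you prefer scripting these two bootstrap resources over clicking through the console, the sketch below builds the lock-table specification Terraform's S3 backend expects (on-demand billing, a string LockID partition key). The boto3 calls in the comment are one possible way to apply it; the bucket name is a placeholder.&lt;/p&gt;

```python
def lock_table_spec(table_name):
    """DynamoDB table spec for Terraform state locking: on-demand
    billing and a LockID string partition key."""
    return {
        "TableName": table_name,
        "BillingMode": "PAY_PER_REQUEST",
        "AttributeDefinitions": [
            {"AttributeName": "LockID", "AttributeType": "S"}
        ],
        "KeySchema": [
            {"AttributeName": "LockID", "KeyType": "HASH"}
        ],
    }


# Applying it (bucket name is a placeholder; outside us-east-1 the
# bucket call also needs a CreateBucketConfiguration):
#   import boto3
#   boto3.client("s3").create_bucket(Bucket="YOUR-BUCKET-NAME")
#   boto3.client("dynamodb").create_table(
#       **lock_table_spec("terraform-state-lock"))
```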

&lt;p&gt;The above steps configure Terraform with S3 as the backend. The provider.tf and backends.tf files are shown below.&lt;/p&gt;

&lt;p&gt;provider.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~&amp;gt; 3.0"
    }
  }
}

# Configure the AWS Provider
provider "aws" {
  region = "us-east-1"
}


&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;backends.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;terraform {
  backend "s3" {
    bucket = "YOUR-BUCKET-NAME"
    key    = "terraform.tfstate"
    region = "YOUR-REGION-NAME"
    dynamodb_table = "terraform-state-lock"

  }
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Create a sample lambda_function.py and zip it, in the same directory as the .tf files, under the name lambda_function_payload.zip. The IaC for S3 and Lambda can then be written as shown below:&lt;/p&gt;

&lt;p&gt;S3.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_s3_bucket" "lambda_s3_buckets" {
    bucket = "YOUR-BUCKET-NAME"
    acl    = "private"
    force_destroy = true

}

resource "aws_s3_bucket_object" "object" {
    bucket = "YOUR-BUCKET-NAME"
    key    = "YOUR-PATH-TO-STORE-LAMBDA-DEPLOYMENT"
    source = "lambda_function_payload.zip"

    depends_on = [
    aws_s3_bucket.lambda_s3_buckets,
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Lambda.tf&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;resource "aws_lambda_function" "state_machine_lambdas" {
    function_name = "YOUR-FUNCTION-NAME"
    role          = "ROLE-ARN"
    handler       = "lambda_function.lambda_handler" 
    s3_bucket = "BUCKET-NAME_WITH-LAMBDA-DEPLOYMENT"
    s3_key    = "PATH-TO-LAMBDA-DEPLOYMENT"

    runtime = "python3.8"
    depends_on = [
    aws_s3_bucket_object.object,
    ]
}
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Store all the .tf files in a folder named Terraform (the buildspec commands below use -chdir=Terraform).&lt;/p&gt;

&lt;h3&gt;
  
  
  3. Setting up AWS CodePipeline
&lt;/h3&gt;

&lt;p&gt;AWS CodePipeline will be used for CI/CD (Continuous Integration/Continuous Delivery); the AWS free tier allows one free pipeline per month. Our pipeline consists of five stages: Source, Build (Terraform Plan), Build (Manual Approval), Terraform Apply, and Deploy.&lt;/p&gt;

&lt;h4&gt;
  
  
  Source Stage
&lt;/h4&gt;

&lt;p&gt;Here GitHub is used as the source repository for the pipeline. The configuration for this stage is shown below.&lt;/p&gt;

&lt;p&gt;Stage 1: Source Configuration&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbpwbibfisq6x0ca1d7td.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbpwbibfisq6x0ca1d7td.png" alt="Screenshot from 2021-04-10 19-57-32"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2upjcrx3306nft42j6n.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fe2upjcrx3306nft42j6n.png" alt="Screenshot from 2021-04-10 19-57-57"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;GitHub (Version 2) is selected as the source code repository; you can choose others such as AWS CodeCommit. For GitHub you need to create a connection, which is a straightforward process in the console.&lt;/li&gt;
&lt;li&gt;Set the repository name and branch name; the pipeline will be triggered whenever there is a code change in that branch of the repository.&lt;/li&gt;
&lt;/ol&gt;
&lt;h4&gt;
  
  
  Build Stage(Terraform Plan Step)
&lt;/h4&gt;

&lt;p&gt;This part of the pipeline consists of two build actions, a manual approval step, and a deploy step. The first build runs terraform plan. AWS CodeBuild is used for creating the build projects. To set this up, add a new action group under the build stage of the pipeline as shown below.&lt;/p&gt;

&lt;p&gt;Adding a build action to CodePipeline&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy94y7ejpuhp06jeoqjq6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fy94y7ejpuhp06jeoqjq6.png" alt="Screenshot from 2021-04-10 20-10-49"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6169g117gt3izeq26mn6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F6169g117gt3izeq26mn6.png" alt="Screenshot from 2021-04-10 20-11-05"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The project must be an AWS CodeBuild project, which can be created by clicking Create project; this opens the wizard shown below.&lt;/p&gt;

&lt;p&gt;CodeBuild Project For Terraform Plan&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgdmcpemnxgguqg1t8quu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fgdmcpemnxgguqg1t8quu.png" alt="Screenshot from 2021-04-10 20-13-23"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs6jvz7bro6sz3gbx57ep.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs6jvz7bro6sz3gbx57ep.png" alt="Screenshot from 2021-04-10 20-13-47"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs6lmxw7w3wkkrfctv9ox.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fs6lmxw7w3wkkrfctv9ox.png" alt="Screenshot from 2021-04-10 20-14-26"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpsu5n8sh0azultlim37i.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fpsu5n8sh0azultlim37i.png" alt="Screenshot from 2021-04-10 20-15-06"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8til4e3h1zhho5yxkxxa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F8til4e3h1zhho5yxkxxa.png" alt="Screenshot from 2021-04-10 20-15-33"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The important configuration choices above are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Selecting 3 GB of memory with 2 vCPUs (included in the free tier).&lt;/li&gt;
&lt;li&gt;The service role ARN, which gives Terraform permission to provision AWS resources. It must carry all the permissions Terraform needs, such as access to S3, Lambda, etc.&lt;/li&gt;
&lt;li&gt;The environment variable TF_COMMAND_P, which is used in the buildspec file.&lt;/li&gt;
&lt;li&gt;The path to the buildspec.yml file, which contains the build commands.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;After configuring the above, the Git repository must contain a buildspec file with the necessary commands. It is shown below.&lt;/p&gt;

&lt;p&gt;Terraform_Plan.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: 0.1

phases:

  install:
    commands:
      - "apt install unzip -y"
      - "wget https://releases.hashicorp.com/terraform/0.14.10/terraform_0.14.10_linux_amd64.zip"
      - "unzip terraform_0.14.10_linux_amd64.zip"
      - "mv terraform /usr/local/bin/"
  pre_build:
    commands:
      - terraform -chdir=Terraform init -input=false

  build:
    commands:
      - terraform -chdir=Terraform $TF_COMMAND_P -input=false -no-color

  post_build:
    commands:
      - echo terraform $TF_COMMAND_P completed on `date`
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Manual Approval Step
&lt;/h4&gt;

&lt;p&gt;Create a manual approval build action after the Terraform plan action as shown below.&lt;/p&gt;

&lt;p&gt;Manual Approval Build Stage&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fct922lvhky6oggx3q7ft.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fct922lvhky6oggx3q7ft.png" alt="Screenshot from 2021-04-10 20-31-22"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6u3c6s4s2gd8wvm8duy.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fx6u3c6s4s2gd8wvm8duy.png" alt="Screenshot from 2021-04-10 20-31-39"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;You must create an SNS topic with a subscription to your email address to receive notifications about approvals and rejections.&lt;/p&gt;
&lt;h4&gt;
  
  
  Terraform Apply Stage
&lt;/h4&gt;

&lt;p&gt;Follow a similar process as the Terraform Plan step for the Terraform Apply build stage, with the following buildspec file.&lt;/p&gt;

&lt;p&gt;Terraform_Apply.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: 0.1

phases:

  install:
    commands:
      - "apt install unzip -y"
      - "wget https://releases.hashicorp.com/terraform/0.14.10/terraform_0.14.10_linux_amd64.zip"
      - "unzip terraform_0.14.10_linux_amd64.zip"
      - "mv terraform /usr/local/bin/"
  pre_build:
    commands:
      - terraform -chdir=Terraform init -input=false

  build:
    commands:
      - terraform -chdir=Terraform $TF_COMMAND_A -input=false -auto-approve

  post_build:
    commands:
      - echo terraform $TF_COMMAND_A completed on `date`
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h4&gt;
  
  
  Deploy Stage
&lt;/h4&gt;

&lt;p&gt;The final stage is Deploy, where CodeBuild deploys the code to Lambda via the S3 bucket. Create a folder in your repo called lambda_code and store lambda_function.py in it. Then create a new CodeBuild project; all the steps are the same as before. The buildspec file for the project is shown below.&lt;/p&gt;

&lt;p&gt;deploy_state_machine_code.yml&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;version: 0.1

phases:

  pre_build:
    commands:
      - mkdir -p ./lambda_code/zipped
      - zip -r -j lambda_code/zipped/lambda-function-payload.zip lambda_code/*

  build:
    commands:
      - aws s3 sync ./lambda_code/zipped s3://YOUR-BUCKET-NAME/YOUR-S3-KEY --delete

      - aws lambda update-function-code --function-name YOUR-FUNCTION-NAME --s3-bucket YOUR-BUCKET-NAME --s3-key YOUR-S3-KEY-TO-FILE

  post_build:
    commands:
      - echo state_machine_code was deployed to lambda from S3 bucket on `date`
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;h3&gt;
  
  
  4. Final View Of the Pipeline
&lt;/h3&gt;

&lt;p&gt;After the above steps the pipeline should look something like the one shown below.&lt;/p&gt;

&lt;p&gt;Final View Of the Pipeline&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7zltkicd5hvulrmzd84j.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F7zltkicd5hvulrmzd84j.png" alt="Screenshot from 2021-04-13 13-53-25"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0g2socer345wvql7r4h.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fc0g2socer345wvql7r4h.png" alt="Screenshot from 2021-04-13 13-53-47"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7tfoezg99pp7r4fobsw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh7tfoezg99pp7r4fobsw.png" alt="Screenshot from 2021-04-13 13-54-07"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Pushing to GitHub will trigger the pipeline, and the build process can be viewed through the Details option on each build action.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Conclusion
&lt;/h3&gt;

&lt;p&gt;In this post I have covered how to deploy infrastructure on AWS with Terraform through AWS CodePipeline. In the next part of the series I'll show how to set up an AWS Step Functions workflow triggered by a CloudWatch (EventBridge) event. Stay tuned!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>terraform</category>
      <category>devops</category>
      <category>github</category>
    </item>
    <item>
      <title>Portfolio/Resume Serverless Website (Cloud Resume Challenge)</title>
      <dc:creator>Thakur Rishabh Singh</dc:creator>
      <pubDate>Sat, 20 Mar 2021 23:08:03 +0000</pubDate>
      <link>https://dev.to/thakurrishabh/portfolio-resume-serverless-website-cloud-resume-challenge-18ln</link>
      <guid>https://dev.to/thakurrishabh/portfolio-resume-serverless-website-cloud-resume-challenge-18ln</guid>
      <description>&lt;h1&gt;
  
  
  This post is about building a serverless website for your resume and hosting it on AWS.
&lt;/h1&gt;

&lt;h6&gt;
  
  
  Disclaimer: If you want to finish the challenge on your own DO NOT read this post. It'll spoil the fun ;)
&lt;/h6&gt;

&lt;h3&gt;
  
  
  About the challenge
&lt;/h3&gt;

&lt;p&gt;This website was built as part of the &lt;a href="https://cloudresumechallenge.dev/instructions/" rel="noopener noreferrer"&gt;cloud resume challenge&lt;/a&gt; by &lt;a class="mentioned-user" href="https://dev.to/forrestbrazeal"&gt;@forrestbrazeal&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Certification
&lt;/h3&gt;

&lt;p&gt;Certification is a requirement of the challenge. This part was easy for me, as I had already obtained the AWS Certified Solutions Architect Associate certification in December 2020 and pursued this challenge to gain hands-on experience with the AWS cloud.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://www.youracclaim.com/badges/1d35a3c5-9266-4a6c-8848-902f0f05c4a1/public_url" rel="noopener noreferrer"&gt;AWS Solutions Architect Associate&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Contents:
&lt;/h3&gt;

&lt;ol&gt;
&lt;li&gt;Why serverless?&lt;/li&gt;
&lt;li&gt;Website Architecture&lt;/li&gt;
&lt;li&gt;Choosing a source control&lt;/li&gt;
&lt;li&gt;Choosing a CI/CD mechanism&lt;/li&gt;
&lt;li&gt;Infrastructure as code with terraform&lt;/li&gt;
&lt;li&gt;CI/CD for terraform&lt;/li&gt;
&lt;li&gt;Implementing backend with python&lt;/li&gt;
&lt;li&gt;CI/CD for backend&lt;/li&gt;
&lt;li&gt;Building the frontend &lt;/li&gt;
&lt;li&gt;CI/CD for frontend&lt;/li&gt;
&lt;li&gt;Conclusion&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  1. Why Serverless?
&lt;/h3&gt;

&lt;p&gt;The primary reasons are:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Cost&lt;/li&gt;
&lt;li&gt;Scalability&lt;/li&gt;
&lt;li&gt;Availability&lt;/li&gt;
&lt;li&gt;Performance&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;The serverless paradigm offers a way to pay only for what you use. Moreover, the developer need not provision anything beforehand; Amazon takes care of that. Despite its low cost, it still scales to varying workloads &lt;strong&gt;on demand&lt;/strong&gt;. This makes it well suited for experimenting with cloud technologies outside an enterprise organization while still meeting real-world challenges. &lt;strong&gt;I literally pay $0 a month for the entire website&lt;/strong&gt;. My only costs are a custom domain ($1.50/year) and a hosted zone on Route53 ($0.50/month).&lt;/p&gt;

&lt;p&gt;The main downside is that the infrastructure grows increasingly complex and becomes difficult to manage. However, the paradigm is still evolving, and the future may hold something promising. There is also a possibility that it may lose ground to the containerization paradigm (highly debatable).&lt;/p&gt;

&lt;h3&gt;
  
  
  2. Website Architecture
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F55snlt9bfqf0nzw7nh30.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2F55snlt9bfqf0nzw7nh30.png" alt="Screenshot from 2021-03-20 17-13-17"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The above architecture works as follows:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Users request the web pages through the browser.&lt;/li&gt;
&lt;li&gt;The browser sends a request to Route53, which resolves the DNS name and routes the request to the nearest CloudFront edge location.&lt;/li&gt;
&lt;li&gt;CloudFront forwards the request to the S3 bucket that contains the website files and retrieves its contents.&lt;/li&gt;
&lt;li&gt;The S3 bucket with the frontend code is protected with an Origin Access Identity (OAI), which prevents direct access to the bucket.&lt;/li&gt;
&lt;li&gt;The JavaScript code in the frontend sends GET and POST requests to API Gateway to retrieve the number of visitors stored in the database.&lt;/li&gt;
&lt;li&gt;API Gateway forwards the request to Lambda as JSON.&lt;/li&gt;
&lt;li&gt;Lambda identifies the type of request and performs a get/put operation on DynamoDB to store or retrieve the number of visitors.&lt;/li&gt;
&lt;li&gt;The visitor count is then displayed on the website.&lt;/li&gt;
&lt;li&gt;A Git repository on GitHub provides version control and CI/CD through GitHub Actions.&lt;/li&gt;
&lt;li&gt;Terraform deploys the AWS infrastructure.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  3. Choosing a source control
&lt;/h3&gt;

&lt;p&gt;The version control system used for this project is GitHub. The development workflow consists of two branches: master and dev.&lt;/p&gt;

&lt;h3&gt;
  
  
  4. Choosing a CI/CD mechanism
&lt;/h3&gt;

&lt;p&gt;Continuous integration and continuous delivery (CI/CD) is achieved using GitHub Actions. Workflows are defined in YAML files that automate the build, test, and deploy phases of the development process.&lt;/p&gt;

&lt;h3&gt;
  
  
  5. Infrastructure as code with Terraform
&lt;/h3&gt;

&lt;p&gt;The infrastructure required to host the website on AWS is built using Terraform.&lt;/p&gt;

&lt;p&gt;Before getting started it is important to take care of the following aspects:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Security: Terraform needs permission to deploy infrastructure on AWS. Therefore, a user is created with access only to STS (Security Token Service) and nothing else, along with a role that holds the IAM permissions to perform actions on the required AWS services.&lt;/li&gt;
&lt;li&gt;Terraform state: a remote backend is configured using an S3 bucket to store the Terraform state file, and a DynamoDB table stores the state lock. This ensures that the infrastructure declared in the .tf files and the actual deployed infrastructure stay in sync.&lt;/li&gt;
&lt;/ol&gt;
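&lt;p&gt;To illustrate the security setup above, here is a minimal Python/boto3 sketch of how the STS-restricted user can exchange its credentials for the deployment role's temporary credentials. The role ARN and session name are hypothetical; substitute your own.&lt;/p&gt;

```python
def creds_to_env(creds):
    """Map STS temporary credentials to the env vars Terraform reads."""
    return {
        "AWS_ACCESS_KEY_ID": creds["AccessKeyId"],
        "AWS_SECRET_ACCESS_KEY": creds["SecretAccessKey"],
        "AWS_SESSION_TOKEN": creds["SessionToken"],
    }

def assume_deploy_role(role_arn):
    # boto3 is imported lazily; only the STS-only user's keys are needed here.
    import boto3
    resp = boto3.client("sts").assume_role(
        RoleArn=role_arn,  # e.g. the hypothetical terraform-deploy role
        RoleSessionName="terraform-deploy",
    )
    return creds_to_env(resp["Credentials"])
```

&lt;p&gt;Exporting the returned variables before running terraform plan/apply means the long-lived user keys never carry deployment permissions themselves.&lt;/p&gt;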

&lt;p&gt;The following infrastructure components are deployed using Terraform:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;A private S3 bucket that hosts the frontend code of the website.&lt;/li&gt;
&lt;li&gt;A DynamoDB table with on-demand capacity and a primary key ID. A default item is created to store the visitor count.&lt;/li&gt;
&lt;li&gt;A Lambda function with a Python runtime; its code is retrieved from an S3 bucket.&lt;/li&gt;
&lt;li&gt;An API Gateway configured as a Lambda proxy, which forwards GET and POST requests to Lambda as JSON; the Lambda function also responds in JSON.&lt;/li&gt;
&lt;/ol&gt;

&lt;h3&gt;
  
  
  6. CI/CD for terraform
&lt;/h3&gt;

&lt;p&gt;The Terraform code is first pushed to dev. Creating a pull request triggers a GitHub Action that generates a plan and posts it as a comment, as shown below.&lt;/p&gt;

&lt;p&gt;Terraform plan triggered by creating a pull request&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw07t35al8t5cwrdmkici.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw07t35al8t5cwrdmkici.png" alt="Screenshot from 2021-03-20 18-14-40"&gt;&lt;/a&gt; &lt;/p&gt;

&lt;p&gt;Terraform plan displayed as a comment &lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxucl24985hk1y6jxn55c.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fxucl24985hk1y6jxn55c.png" alt="Screenshot from 2021-03-20 18-18-59"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fslqzm590ty5556xyms8d.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fslqzm590ty5556xyms8d.png" alt="Screenshot from 2021-03-20 18-19-26"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Merging the pull request to master applies the plan and deploys the infrastructure as shown below.&lt;/p&gt;

&lt;p&gt;Terraform apply triggered by merging of pull request&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwdwhlmucypsp1u02je2z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fwdwhlmucypsp1u02je2z.png" alt="Screenshot from 2021-03-20 18-07-30"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  7. Implementing backend with python
&lt;/h3&gt;

&lt;p&gt;The backend code is written in Python and deployed to Lambda through CI/CD. It uses the Boto3 SDK to communicate with DynamoDB, and the Lambda function has an IAM role that permits the required DynamoDB actions.&lt;/p&gt;

&lt;p&gt;Dealing with CORS: the Lambda response includes the header Access-Control-Allow-Origin: * to allow cross-origin GET, POST, and OPTIONS requests.&lt;/p&gt;
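&lt;p&gt;A minimal sketch of such a handler is shown here; the table name "visitor-count", the item key, and the attribute name are hypothetical placeholders, not the exact names used in my project. A POST atomically increments the counter with an ADD update expression, while a GET just reads it, and every response carries the CORS headers.&lt;/p&gt;

```python
import json

CORS_HEADERS = {
    "Access-Control-Allow-Origin": "*",
    "Access-Control-Allow-Methods": "GET,POST,OPTIONS",
    "Access-Control-Allow-Headers": "Content-Type",
}

def make_response(status, body):
    """Shape the proxy-integration response that API Gateway expects."""
    return {"statusCode": status, "headers": CORS_HEADERS, "body": json.dumps(body)}

def lambda_handler(event, context):
    # boto3 is available in the Lambda runtime; imported lazily here.
    import boto3
    table = boto3.resource("dynamodb").Table("visitor-count")  # hypothetical name
    method = event.get("httpMethod", "GET")
    if method == "POST":
        # Atomically increment the counter and return the new value.
        resp = table.update_item(
            Key={"ID": "visitors"},
            UpdateExpression="ADD visitor_count :one",
            ExpressionAttributeValues={":one": 1},
            ReturnValues="UPDATED_NEW",
        )
        count = int(resp["Attributes"]["visitor_count"])
    else:
        item = table.get_item(Key={"ID": "visitors"}).get("Item", {})
        count = int(item.get("visitor_count", 0))
    return make_response(200, {"count": count})
```

&lt;p&gt;Because API Gateway is configured as a Lambda proxy, the handler is responsible for the full response shape, including the CORS headers.&lt;/p&gt;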

&lt;h3&gt;
  
  
  8. CI/CD for backend
&lt;/h3&gt;

&lt;p&gt;A GitHub Action is configured to do the following when code is pushed to /backend on the master branch:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Run tests on the Python code to catch bugs.&lt;/li&gt;
&lt;li&gt;Zip the Python code.&lt;/li&gt;
&lt;li&gt;Upload the zip file to an S3 bucket.&lt;/li&gt;
&lt;li&gt;Update the Lambda function code by retrieving the zip from the S3 bucket.&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;This process is shown below:&lt;/p&gt;

&lt;p&gt;CI/CD for backend&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frg7mil716lshrnzb67pr.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frg7mil716lshrnzb67pr.png" alt="Screenshot from 2021-03-20 18-07-30"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  9. Building the frontend
&lt;/h3&gt;

&lt;p&gt;The frontend code is built using HTML, Bootstrap, and JavaScript. This code resides in a private S3 bucket that acts as the origin for CloudFront.&lt;/p&gt;

&lt;h3&gt;
  
  
  10. CI/CD for frontend
&lt;/h3&gt;

&lt;p&gt;A GitHub Action is configured to do the following when code is pushed to /frontend on the master branch:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Upload the code to the S3 bucket.&lt;/li&gt;
&lt;li&gt;Invalidate the CloudFront cache so the latest contents are served from the bucket.&lt;/li&gt;
&lt;/ol&gt;
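&lt;p&gt;The upload-and-invalidate steps above can be sketched in Python with boto3. The bucket and distribution names are hypothetical, and setting an explicit Content-Type per file matters because upload_file otherwise stores objects as binary, which browsers won't render as HTML or CSS.&lt;/p&gt;

```python
import mimetypes
import os
import time

def guess_content_type(filename):
    """Pick a Content-Type so browsers render each file correctly."""
    ctype, _encoding = mimetypes.guess_type(filename)
    return ctype or "application/octet-stream"

def deploy_frontend(src_dir, bucket, distribution_id):
    # Runs inside the CI job; boto3 imported lazily.
    import boto3
    s3 = boto3.client("s3")
    for root, _dirs, files in os.walk(src_dir):
        for name in files:
            path = os.path.join(root, name)
            key = os.path.relpath(path, src_dir)
            s3.upload_file(
                path, bucket, key,
                ExtraArgs={"ContentType": guess_content_type(name)},
            )
    # Invalidate everything so CloudFront serves the fresh files.
    boto3.client("cloudfront").create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/*"]},
            "CallerReference": str(time.time()),
        },
    )
```

&lt;p&gt;The CallerReference just needs to be unique per invalidation request, so a timestamp is sufficient here.&lt;/p&gt;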

&lt;p&gt;This process is shown below:&lt;/p&gt;

&lt;p&gt;CI/CD for frontend&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw173g5p619hgmdxa565z.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fw173g5p619hgmdxa565z.png" alt="Screenshot from 2021-03-20 18-06-06"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  11. Conclusion
&lt;/h3&gt;

&lt;p&gt;It has been an amazing experience completing this challenge. I have learnt a ton, from Terraform to CI/CD to AWS. It is definitely worth it for anyone who is serious about a career in cloud. Thanks to &lt;a class="mentioned-user" href="https://dev.to/forrestbrazeal"&gt;@forrestbrazeal&lt;/a&gt; for this amazing challenge. As for me, I'm on to the next challenge. Stay tuned for another blog post on a serverless project to automate an ETL pipeline on AWS using Python.&lt;/p&gt;

</description>
      <category>serverless</category>
      <category>aws</category>
      <category>cloud</category>
      <category>terraform</category>
    </item>
  </channel>
</rss>
