<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Bansikumar Mendapara</title>
    <description>The latest articles on DEV Community by Bansikumar Mendapara (@bansimendapara).</description>
    <link>https://dev.to/bansimendapara</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F475941%2Fe2dff119-680e-4bc3-9dd1-b52b182906cb.png</url>
      <title>DEV Community: Bansikumar Mendapara</title>
      <link>https://dev.to/bansimendapara</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/bansimendapara"/>
    <language>en</language>
    <item>
      <title>AWS Elastic WordPress Architecture Evolution</title>
      <dc:creator>Bansikumar Mendapara</dc:creator>
      <pubDate>Tue, 06 Oct 2020 15:06:11 +0000</pubDate>
      <link>https://dev.to/bansimendapara/aws-elastic-wordpress-evolution-5563</link>
      <guid>https://dev.to/bansimendapara/aws-elastic-wordpress-evolution-5563</guid>
      <description>&lt;p&gt;In this blog, I am going to evolve the architecture of a popular WordPress web application. I did this demo as a part of the AWS certification course by Adrian Cantrill. You can find this video course &lt;a href="https://learn.cantrill.io/courses/895720/lectures/23746347"&gt;here&lt;/a&gt;. Even the free version of this demo is available &lt;a href="https://github.com/acantril/learn-cantrill-io-labs/tree/master/aws-elastic-wordpress-evolution"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Overview
&lt;/h1&gt;

&lt;p&gt;Our goal here is to learn how to design an architecture for a given scenario. We are going to follow the architectural evolution of a WordPress web application. We will start by manually creating a single instance that runs both the application and the database. In the end, we will have a highly available, scalable and resilient architecture.&lt;/p&gt;

&lt;h1&gt;
  
  
  Steps
&lt;/h1&gt;

&lt;p&gt;I will go through all the steps of this demo that I followed.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;em&gt;Setup WordPress manually&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Automate setup using Launch Template&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Database migration to RDS&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;EFS for local storage&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Elasticity using ASG and ALB&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Upgrade RDS to Aurora cluster [OPTIONAL]&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Cleanup&lt;/em&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;If you skip step 6, you will be able to complete this demo almost entirely within the free tier. If you have any doubts, I have attached a link to more detailed instructions with each step.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 1 - Setup WordPress manually [&lt;a href="https://github.com/acantril/learn-cantrill-io-labs/blob/master/aws-elastic-wordpress-evolution/02_LABINSTRUCTIONS/STAGE1%20-%20Setup%20and%20Manual%20wordpress%20build.md"&gt;GitHub&lt;/a&gt;]
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--WCSkS8UP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/4fh49qt5kzikqpce962r.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--WCSkS8UP--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/4fh49qt5kzikqpce962r.png" alt="Step 1"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;For complete isolation from the existing VPCs in our AWS account, we will set up a new VPC and perform the whole demo inside it. That way, at the end, we can easily clean up our resources and avoid any unnecessary charges. You can find the one-click deployment &lt;a href="https://console.aws.amazon.com/cloudformation/home?region=us-east-1#/stacks/quickcreate?templateURL=https://learn-cantrill-labs.s3.amazonaws.com/aws-elastic-wordpress-evolution/A4LVPC.yaml&amp;amp;stackName=A4LVPC"&gt;here&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;After the CloudFormation stack moves into CREATE_COMPLETE status, you have to launch the EC2 instance. To store configuration information, we will use the SSM Parameter Store and create parameters such as DBUser, DBName, DBEndpoint, DBPassword and DBRootPassword.&lt;/p&gt;
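As an illustration, the same parameters could be created with a short boto3 script instead of the console. The parameter names and values below are placeholders I made up, not the lab's actual ones, and actually running `create_parameters` requires AWS credentials.

```python
# Placeholder parameter names/values for illustration only.
PARAMS = {
    "/A4L/Wordpress/DBUser": ("String", "a4lwordpressuser"),
    "/A4L/Wordpress/DBName": ("String", "a4lwordpressdb"),
    "/A4L/Wordpress/DBEndpoint": ("String", "localhost"),
    "/A4L/Wordpress/DBPassword": ("SecureString", "example-password"),
    "/A4L/Wordpress/DBRootPassword": ("SecureString", "example-root-password"),
}

def create_parameters(ssm=None):
    """Create each parameter, storing the passwords encrypted as SecureString."""
    import boto3  # deferred import; the sketch needs AWS credentials to run
    ssm = ssm or boto3.client("ssm")
    for name, (param_type, value) in PARAMS.items():
        ssm.put_parameter(Name=name, Type=param_type, Value=value, Overwrite=True)
```

Storing the passwords as SecureString keeps them encrypted at rest, while plain String is enough for the non-secret values.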

&lt;p&gt;Now, we will connect to the EC2 instance and install the database and WordPress. For that, we have to install a web server and a database server. We will set up the database and download WordPress. After that, we have to configure the &lt;code&gt;wp-config.php&lt;/code&gt; file, change the file system permissions and create a WordPress user. Next, we will create a database user and configure its permissions.&lt;/p&gt;

&lt;p&gt;At the end of this step, we can test WordPress by creating a post. So far everything is manual, and this setup has many limitations.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 2 - Automate setup using Launch Template [&lt;a href="https://github.com/acantril/learn-cantrill-io-labs/blob/master/aws-elastic-wordpress-evolution/02_LABINSTRUCTIONS/STAGE2%20-%20Automate%20the%20build%20using%20a%20Launch%20Template.md"&gt;GitHub&lt;/a&gt;]
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--6TpRhJ3K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/of2883maxfom464p42l2.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--6TpRhJ3K--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/of2883maxfom464p42l2.png" alt="Step 2"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In step 1, we did all the setup manually. To automate this process, we will create a launch template. Whatever configuration we did in the first step, we will now do in user data. Therefore, when we launch an instance from this template, our WordPress web application will be ready to use right away.&lt;/p&gt;

&lt;p&gt;This removes the overhead of configuring an EC2 instance every time we launch one, and doing it through user data is also less time-consuming.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 3 - Database migration to RDS [&lt;a href="https://github.com/acantril/learn-cantrill-io-labs/blob/master/aws-elastic-wordpress-evolution/02_LABINSTRUCTIONS/STAGE3%20-%20Add%20RDS%20and%20Update%20the%20LT.md"&gt;GitHub&lt;/a&gt;]
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--onPp3EQ4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/v2h7kvnq0rat42jkkh74.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--onPp3EQ4--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/v2h7kvnq0rat42jkkh74.png" alt="Step 3"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;We will split the database functionality out of the EC2 instance by launching an RDS DB instance. To migrate the WordPress data, we will take a backup of the database from the EC2 instance and restore it into RDS. To accommodate this migration, we have to change the DBEndpoint SSM parameter and update the launch template with the necessary commands. &lt;/p&gt;
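The dump-and-restore migration can be sketched from Python by driving the standard MySQL command-line tools. Everything here (database name, credential handling, endpoint) is a hypothetical illustration, not the lab's exact commands.

```python
import subprocess

def dump_cmd(db_name, user, password):
    # mysqldump writes a full SQL dump of the local database to stdout
    return ["mysqldump", "-u", user, "-p" + password, db_name]

def restore_cmd(db_name, user, password, rds_endpoint):
    # mysql replays that dump against the new RDS endpoint
    return ["mysql", "-h", rds_endpoint, "-u", user, "-p" + password, db_name]

def migrate(db_name, user, password, rds_endpoint):
    dump = subprocess.run(dump_cmd(db_name, user, password),
                          check=True, capture_output=True).stdout
    subprocess.run(restore_cmd(db_name, user, password, rds_endpoint),
                   input=dump, check=True)
```

In the demo itself the equivalent commands are run directly on the instance; this sketch just shows the same dump/restore sequence in one place.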

&lt;p&gt;This allows the database to scale independently of the application.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 4 - EFS for local storage [&lt;a href="https://github.com/acantril/learn-cantrill-io-labs/blob/master/aws-elastic-wordpress-evolution/02_LABINSTRUCTIONS/STAGE4%20-%20Add%20EFS%20and%20Update%20the%20LT.md"&gt;GitHub&lt;/a&gt;]
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mTmBXRL6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hz71qrxohgl8d10why3b.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mTmBXRL6--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hz71qrxohgl8d10why3b.png" alt="Step 4"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Currently, any media for WordPress posts is stored locally. Now, we will move this data to EFS, which will allow all instances to access it as shared storage. After successfully creating the file system, we will add the EFS ID to the SSM Parameter Store. To mount the file system, we have to modify the user data in the launch template. &lt;/p&gt;

&lt;p&gt;Now, our application media and UI files are no longer stored locally, so they will not cause any problems when scaling. &lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 5 - Elasticity using ASG and ALB [&lt;a href="https://github.com/acantril/learn-cantrill-io-labs/blob/master/aws-elastic-wordpress-evolution/02_LABINSTRUCTIONS/STAGE5%20-%20Add%20an%20ELB%20and%20ASG.md"&gt;GitHub&lt;/a&gt;]
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--sfsCsda0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/r69xirramjdci201ptk1.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--sfsCsda0--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/r69xirramjdci201ptk1.png" alt="Step 5"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;To add elasticity to our WordPress application, we will add an Auto Scaling group (ASG) and an Application Load Balancer (ALB). We need to add the DNS name of the ALB to the SSM Parameter Store and, to pick up this change, update the user data in the launch template. We can also add scaling policies so that the ASG scales in and out dynamically. &lt;/p&gt;
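One common way to sketch such a scaling policy is target tracking on average CPU utilization. The ASG name, policy name and 40% target below are my illustrative assumptions, not values from the lab.

```python
def scaling_policy_args(asg_name, target_cpu=40.0):
    # arguments for autoscaling.put_scaling_policy:
    # keep the group's average CPU close to target_cpu percent
    return {
        "AutoScalingGroupName": asg_name,
        "PolicyName": "cpu-target-tracking",
        "PolicyType": "TargetTrackingScaling",
        "TargetTrackingConfiguration": {
            "PredefinedMetricSpecification": {
                "PredefinedMetricType": "ASGAverageCPUUtilization"
            },
            "TargetValue": target_cpu,
        },
    }

def attach_policy(asg_name):
    import boto3  # deferred import; needs AWS credentials to actually run
    boto3.client("autoscaling").put_scaling_policy(**scaling_policy_args(asg_name))
```

With target tracking, the ASG both scales out under load and scales back in when the metric drops, so no separate scale-in policy is needed.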

&lt;p&gt;After completing this step, customers no longer connect directly to a single instance, and there is no dependency between the database and any one instance. &lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 6 - Upgrade RDS to Aurora cluster [OPTIONAL] [&lt;a href="https://github.com/acantril/learn-cantrill-io-labs/blob/master/aws-elastic-wordpress-evolution/02_LABINSTRUCTIONS/STAGE6%20-%20Optional%20Aurora.md"&gt;GitHub&lt;/a&gt;]
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--DuNZHbyj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8bckwqnbsvqnzusfk2zw.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--DuNZHbyj--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/8bckwqnbsvqnzusfk2zw.png" alt="Step 6"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;In this step, we will move from RDS to a 3-AZ Aurora cluster to provide a high level of resilience for the WordPress application. You can skip this part if you do not want to incur any charges. To upgrade the database, we will first create a snapshot of the RDS instance and then migrate that snapshot into an Aurora DB cluster. For resilience, we can create additional reader instances. We will then update the DBEndpoint SSM parameter with the Aurora cluster endpoint. To apply these changes, we have to refresh the instances in the ASG.&lt;/p&gt;
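Sketched with boto3, the snapshot-and-migrate sequence could look roughly like this. All identifiers and the SSM parameter name are placeholders, the calls need real AWS credentials, and in practice you would wait for the cluster to become available before reading its endpoint.

```python
def snapshot_id(instance_id, date_str):
    # derive a readable snapshot identifier, e.g. "a4lwordpress-migration-20201006"
    return "{}-migration-{}".format(instance_id, date_str)

def migrate_to_aurora(instance_id, cluster_id, date_str):
    import boto3  # deferred import; needs AWS credentials to actually run
    rds = boto3.client("rds")
    snap = snapshot_id(instance_id, date_str)
    # snapshot the existing RDS instance and wait until it is usable
    rds.create_db_snapshot(DBInstanceIdentifier=instance_id,
                           DBSnapshotIdentifier=snap)
    rds.get_waiter("db_snapshot_available").wait(DBSnapshotIdentifier=snap)
    # restore the MySQL snapshot into a new Aurora cluster
    rds.restore_db_cluster_from_snapshot(
        DBClusterIdentifier=cluster_id, SnapshotIdentifier=snap,
        Engine="aurora-mysql")
    # point the application at the new endpoint via the SSM parameter
    endpoint = rds.describe_db_clusters(
        DBClusterIdentifier=cluster_id)["DBClusters"][0]["Endpoint"]
    boto3.client("ssm").put_parameter(
        Name="/A4L/Wordpress/DBEndpoint", Type="String",
        Value=endpoint, Overwrite=True)
```

Note that a restored cluster has no instances yet; you would still add a writer and optional readers with create_db_instance, then refresh the ASG instances so they pick up the new endpoint.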

&lt;p&gt;Finally, we have a highly available, scalable and resilient architecture for the WordPress web application.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 7 - Cleanup [&lt;a href="https://github.com/acantril/learn-cantrill-io-labs/blob/master/aws-elastic-wordpress-evolution/02_LABINSTRUCTIONS/STAGE7%20-%20CLEANUP.md"&gt;GitHub&lt;/a&gt;]
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;To clean up, we have to delete the load balancer, target group, Auto Scaling group, EFS file system, RDS instance, launch template, RDS snapshot and CloudFormation stack. This deletes all resources created during this demo. &lt;/p&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;I learned how to create this kind of architecture by evolving it step by step. Thank you, Adrian Cantrill, for this demo. I highly recommend his course on &lt;a href="https://learn.cantrill.io/"&gt;learn.cantrill.io&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>aws</category>
      <category>kubernetes</category>
      <category>linux</category>
    </item>
    <item>
      <title>Event-Driven Python on AWS</title>
      <dc:creator>Bansikumar Mendapara</dc:creator>
      <pubDate>Sun, 27 Sep 2020 17:31:53 +0000</pubDate>
      <link>https://dev.to/bansimendapara/cloudguruchallenge-september-60l</link>
      <guid>https://dev.to/bansimendapara/cloudguruchallenge-september-60l</guid>
      <description>&lt;p&gt;Around 15 days ago, A Cloud Guru launched the #CloudGuruChallenge - Event-Driven Python on AWS. It might sound like a simple task, but I faced a few challenges and learned a great deal. This blog will guide you through the whole challenge.&lt;/p&gt;

&lt;h1&gt;
  
  
  Overview of Challenge
&lt;/h1&gt;

&lt;p&gt;&lt;a href="https://www.linkedin.com/in/forrestbrazeal/" rel="noopener noreferrer"&gt;Forrest Brazeal&lt;/a&gt; is the creator of this challenge. The main goal of this challenge is to automate an ETL processing pipeline for COVID-19 data using Python and Cloud services. More about this challenge, you can find &lt;a href="https://acloudguru.com/blog/engineering/cloudguruchallenge-python-aws-etl" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;

&lt;h1&gt;
  
  
  Challenge Steps
&lt;/h1&gt;

&lt;p&gt;Now, I would like to go through all the steps of this challenge that I followed.&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;&lt;em&gt;Transformation&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Load transformed data into the database&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Notify customers when the database gets any updates&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Error handling&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Trigger the function once daily&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Infrastructure as Code&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;CI/CD pipeline and Source control&lt;/em&gt;&lt;/li&gt;
&lt;li&gt;&lt;em&gt;Quicksight dashboard&lt;/em&gt;&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;You can find a full working code in my &lt;a href="https://github.com/bansimendapara/aws_etl" rel="noopener noreferrer"&gt;GitHub repository&lt;/a&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 1 - Transformation
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;One of the requirements of this challenge was to create a separate Python module for the transformation task. Therefore, I created a separate Python module for the data transformation. &lt;/p&gt;

&lt;p&gt;Firstly, I filtered both datasets by country name, because here we want COVID-19 data for the US only. Next, I dropped unnecessary columns where present and converted the dates to the pandas datetime type. Lastly, I joined the two datasets on the report date; to make that happen, I set the date column as the index of each data frame. This function is completely abstracted: it does not care where the data is stored and knows nothing about the database.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def transform(dfNYT,dfJH):
    dfJH = dfJH[dfJH['Country/Region']=='US'].drop(columns='Country/Region')
    dfJH.columns = ['date','recovered']
    dfNYT['date'] = pd.to_datetime(dfNYT['date'],format='%Y-%m-%d')
    dfJH['date'] = pd.to_datetime(dfJH['date'],format='%Y-%m-%d')
    dfNYT.set_index('date', inplace=True)
    dfJH.set_index('date',inplace=True)
    dfJH['recovered'] = dfJH['recovered'].astype('int64')
    dfFinal = dfNYT.join(dfJH, how='inner')
    dfFinal.reset_index(inplace=True)
    return dfFinal
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 2 - Load transformed data into the database
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;To fulfil this task, I used a Lambda function and an RDS PostgreSQL database. PostgreSQL has a database adapter for Python called &lt;a href="https://www.psycopg.org/" rel="noopener noreferrer"&gt;psycopg2&lt;/a&gt;. It allows us to connect to the PostgreSQL database and execute SQL queries. &lt;/p&gt;

&lt;p&gt;I used Lambda environment variables to store necessary values such as the URLs of both CSV files and the database information (endpoint, port, username, password, region). To keep the code structured, I divided this task into small subtasks and created a function for each:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Database connection - connect to database&lt;/li&gt;
&lt;li&gt;First-time data insertion - insert all data at once&lt;/li&gt;
&lt;li&gt;Daily data insertion - insert only the newly available data&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;I wanted to minimize manual interaction with the AWS Management Console. Therefore, the Lambda function first checks whether a table called &lt;code&gt;etl&lt;/code&gt; exists in the database. If it does not, the function creates the table and performs the first-time data insertion. If the &lt;code&gt;etl&lt;/code&gt; table already exists, it checks for a daily data insertion instead. &lt;/p&gt;
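A minimal sketch of that existence check with psycopg2 might look like this; the table schema and column names are my assumptions for illustration, and the connection object would come from the database_connection() helper.

```python
# Assumed schema for the etl table; column names are guesses for illustration.
CREATE_SQL = """
create table etl (
    reportdate date primary key,
    cases      integer,
    deaths     integer,
    recovered  integer
)
"""

def ensure_table(conn):
    """Create the etl table on first run; return True if it already existed."""
    with conn.cursor() as cur:
        # to_regclass returns NULL when the named relation does not exist
        cur.execute("select to_regclass('etl')")
        existed = cur.fetchone()[0] is not None
        if not existed:
            cur.execute(CREATE_SQL)
            conn.commit()
        return existed
```

The return value tells the caller whether to run the first-time insertion or the daily incremental one.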

&lt;p&gt;For efficient data insertion, I tried several methods such as &lt;code&gt;executemany()&lt;/code&gt; and &lt;code&gt;mogrify()&lt;/code&gt;. In the end, I went with the approach below, which builds a single multi-row insert statement.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;for i in dfFinal.index:
    row = (dfFinal.loc[i,'date'], int(dfFinal.loc[i,'cases']),int(dfFinal.loc[i,'deaths']),int(dfFinal.loc[i,'recovered']))
    data.append(row)
records = ','.join(['%s'] * len(data))
query = "insert into etl (reportdate,cases,deaths,recovered) values{}".format(records)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 3 - Notify customers when the database gets any updates
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;For this, I used the AWS SNS service. I created a function in the Lambda code that publishes a message to an SNS topic whenever there is an update in the database.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;def notify(text):
    try:
        sns = boto3.client('sns')
        sns.publish(TopicArn = snsARN, Message = text)
    except Exception as e:
        print("Not able to send SMS due to {}".format(e))
        exit(1)
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 4 - Error handling
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;For a seamless workflow, error handling is important. My code handles several kinds of errors:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If there is any unexpected or malformed input in the data, the transformation function raises an exception and sends a notification.&lt;/li&gt;
&lt;li&gt;If there is any issue while connecting to the database, the &lt;code&gt;database_connection()&lt;/code&gt; function sends a notification with the reason.&lt;/li&gt;
&lt;li&gt;If there is any problem during table creation or data insertion, a notification is sent with the raised exception.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;For the daily insertion, there is no need to insert the whole dataset again; it is better to insert only the new data. For that, I check the maximum (last) report date in the table and insert only the subsequent rows. The function also notifies subscribers of the number of updated rows.&lt;/p&gt;
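The incremental filter can be sketched with pandas as below; the date column name matches the transform step, and the last stored date would come from a query such as select max(reportdate) from etl.

```python
import pandas as pd

def rows_after(dfFinal, last_date):
    """Keep only rows newer than last_date; None means first-time insertion."""
    if last_date is None:
        return dfFinal
    return dfFinal[dfFinal["date"] > pd.Timestamp(last_date)]
```

Only the returned rows are then passed to the multi-row insert shown in Step 2.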

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 5 - Trigger the function once daily
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;These CSV files get updated once a day, so there is no need to check for updates continuously. That is why I configured a CloudWatch Events rule that invokes the Lambda function once a day. You can find a tutorial for this &lt;a href="https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/WhatIsCloudWatchEvents.html" rel="noopener noreferrer"&gt;here&lt;/a&gt;.&lt;/p&gt;
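Such a daily rule can be sketched with boto3 as follows. The rule and target names are placeholders, and running it needs AWS credentials.

```python
# Placeholder rule definition; "rate(1 day)" fires roughly every 24 hours.
RULE = {"Name": "etl-daily", "ScheduleExpression": "rate(1 day)"}

def schedule_daily(function_arn):
    import boto3  # deferred import; needs AWS credentials to actually run
    events = boto3.client("events")
    events.put_rule(**RULE)
    events.put_targets(
        Rule=RULE["Name"],
        Targets=[{"Id": "etl-lambda", "Arn": function_arn}],
    )
```

In practice the Lambda function also needs a resource policy allowing events.amazonaws.com to invoke it (lambda add_permission), which is omitted here.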

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 6 - Infrastructure as Code
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;This was one of the most challenging parts for me, as I had not played around with CloudFormation much. I built up the template by defining one resource at a time in YAML. In the end, it was not that tough a task, as AWS has well-organized, detailed documentation for each type of AWS service. I even ended up creating the RDS instance through CloudFormation, which is why the Lambda function first checks whether the table exists. Overall, this part gave me a much better idea of IaC. You can check my CloudFormation template &lt;a href="https://github.com/bansimendapara/aws_etl/blob/master/template.yaml" rel="noopener noreferrer"&gt;here&lt;/a&gt;. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F30qwb4q1qzk0alatpr6f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F30qwb4q1qzk0alatpr6f.png" alt="Alt Text"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 7 - CI/CD pipeline and Source control
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;Even though this part was optional, I wanted to do it because I did not want to update AWS manually whenever I changed my Python code or infrastructure. For source control, I used GitHub, as it is very handy. To configure the CI/CD pipeline, I went with GitHub Actions. Now, whenever I push a change to my GitHub repository, GitHub Actions runs the pipeline and performs the necessary updates.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  Step 8 - Quicksight dashboard
&lt;/h3&gt;
&lt;/blockquote&gt;

&lt;p&gt;The final task was to generate visualizations of US case counts, recoveries, the death ratio, the daily increase in cases, etc.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzlvat1druot80mvlaza5.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzlvat1druot80mvlaza5.PNG" alt="QuickSight Dashboard image"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fp1ze3b8hqce1mekdddk6.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fp1ze3b8hqce1mekdddk6.PNG" alt="QuickSight Dashboard image"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1xj8x897z4e0gg8iepa8.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F1xj8x897z4e0gg8iepa8.PNG" alt="QuickSight Dashboard image"&gt;&lt;/a&gt;&lt;br&gt;
&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fez6fzqdtgtzwdiysgzgj.PNG" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fez6fzqdtgtzwdiysgzgj.PNG" alt="QuickSight Dashboard image"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h1&gt;
  
  
  Major Challenges and Learning
&lt;/h1&gt;

&lt;p&gt;Throughout this project, I faced many challenges and learned from every one of them.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;For the transformation and database connectivity, I used the pandas and psycopg2 Python modules. These modules are not readily available in Lambda, so I had to add them as Lambda layers. Initially, I faced some issues with that, but eventually I found a curated list of awesome AWS Lambda layers &lt;a href="https://github.com/mthenw/awesome-layers" rel="noopener noreferrer"&gt;here&lt;/a&gt; which made my task easy.&lt;/li&gt;
&lt;li&gt;I wanted to create the table from code only, and at first I could not find a way to do this in Lambda, but after trying a few things I got it working.&lt;/li&gt;
&lt;li&gt;I did not have much experience with CloudFormation, so I first learned by going through various AWS documentation pages and was able to build the template in the end.&lt;/li&gt;
&lt;li&gt;CI/CD pipelines were new to me, but I found them really interesting. I configured the pipeline with GitHub Actions, though I still want to explore AWS CodePipeline for this.&lt;/li&gt;
&lt;/ul&gt;

&lt;h1&gt;
  
  
  Conclusion
&lt;/h1&gt;

&lt;p&gt;Overall, I enjoyed this challenge and learned many things, and many people helped me along the way. I would like to thank &lt;a href="https://www.linkedin.com/in/forrestbrazeal/" rel="noopener noreferrer"&gt;Forrest Brazeal&lt;/a&gt; for creating this amazing challenge. I would really appreciate any feedback on this challenge or blog. Thank you.&lt;/p&gt;

&lt;blockquote&gt;
&lt;h3&gt;
  
  
  &lt;a href="https://github.com/bansimendapara/aws_etl" rel="noopener noreferrer"&gt;GitHub Repository&lt;/a&gt;
&lt;/h3&gt;
&lt;/blockquote&gt;

</description>
      <category>aws</category>
      <category>100daysofcloud</category>
      <category>cloudguruchallenge</category>
    </item>
  </channel>
</rss>
