<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Andy</title>
    <description>The latest articles on DEV Community by Andy (@chandy13).</description>
    <link>https://dev.to/chandy13</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F473649%2F277f72f3-cdfa-4d86-8560-1c7b0aaead0f.jpeg</url>
      <title>DEV Community: Andy</title>
      <link>https://dev.to/chandy13</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/chandy13"/>
    <language>en</language>
    <item>
      <title>ETL Pipeline for COVID-19 data using Python and AWS</title>
      <dc:creator>Andy</dc:creator>
      <pubDate>Thu, 15 Oct 2020 03:25:07 +0000</pubDate>
      <link>https://dev.to/chandy13/etl-pipeline-for-covid-19-data-using-python-and-aws-1j83</link>
      <guid>https://dev.to/chandy13/etl-pipeline-for-covid-19-data-using-python-and-aws-1j83</guid>
      <description>&lt;p&gt;Hey dev.to! Excited to share another project I've been working on. I present to you my &lt;a href="http://ec2-54-205-122-229.compute-1.amazonaws.com/public/dashboards/GBgyIYQle5wJzuPR9wzWUuemsBM95kLM0PYcJSL1?org_slug=default" rel="noopener noreferrer"&gt;Dashboard&lt;/a&gt; for COVID-19 data for Ontario Canada! I created an automated ETL pipeline using Python on AWS infrastructure and displayed it using Redash.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fj2cq3xf2njbxmocpciao.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fj2cq3xf2njbxmocpciao.png" alt="Ontario COVID Data using Redash"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Project Overview
&lt;/h3&gt;

&lt;p&gt;The idea for this project came from A Cloud Guru's monthly &lt;a href="https://acloudguru.com/blog/engineering/cloudguruchallenge-python-aws-etl" rel="noopener noreferrer"&gt;#CloudGuruChallenge&lt;/a&gt;. For September, the goal was to build an automated pipeline using Python that would extract csv data from an online source, transform the data by converting some strings into integers, and load the data into a DynamoDB table. After that, we would display the data in a dashboard. I added a little twist to make it more relevant to me and used data for Ontario, Canada instead!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzitz4vptzqrnenrbyb5w.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fzitz4vptzqrnenrbyb5w.png" alt="Project Diagram"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I was excited to work on this project because I wanted to develop my Python coding skills and also create a useful tool that I can use every day and share with others who are interested! &lt;/p&gt;

&lt;h3&gt;
  
  
  Discovering Trello
&lt;/h3&gt;

&lt;p&gt;Over the last 3 months I've learned that free time is very valuable and often in short supply, so I needed a way to organize my workload and maximize efficiency. I started looking around for tools that could help, beginning with JIRA, which I use at work. Unfortunately, JIRA seemed like overkill for a one-person team, which is when I discovered Trello.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fkvt89x8s5t4q172ufmp8.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fkvt89x8s5t4q172ufmp8.png" alt="My empty Trello board"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I'm such a huge fan of Trello. I love all the customization options to match my workflow, and it's very rewarding, for me at least, to punt a Trello task card over to my completed list. There's still so much more that I can do with it, and I'm excited to dive into some of the automation options, but I don't want to turn this into a Trello blog post so I won't go into too much detail. I created a card for each step listed on the &lt;a href="https://acloudguru.com/blog/engineering/cloudguruchallenge-python-aws-etl" rel="noopener noreferrer"&gt;challenge page&lt;/a&gt; and started working through them!&lt;/p&gt;

&lt;h3&gt;
  
  
  There's so much data
&lt;/h3&gt;

&lt;p&gt;I am a newbie when it comes to this. I've never had to do data manipulation with this much data before, so these were the steps I had the most trouble with. I even broke VSCode a couple of times by iterating through a huge csv file, oops...&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4fb4df1izz9ncezp4nnv.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2F4fb4df1izz9ncezp4nnv.png" alt="So much data"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The first step was to extract the data from a csv source published by the Ontario government. I had trouble initially because there were so many different ways to do this, but I settled on using the csv and requests modules to get it working. &lt;/p&gt;
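&lt;p&gt;In case you're curious, a minimal sketch of that extract step looks something like this. The header names in the parsing test and the exact request details are illustrative, not copied from my actual script:&lt;/p&gt;

```python
import csv
import io

def parse_rows(text):
    """Parse csv text into a list of row dicts keyed by the header row."""
    return list(csv.DictReader(io.StringIO(text)))

def extract_rows(url):
    """Download the csv export and parse it."""
    import requests  # imported here so parse_rows stays usable without it
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    return parse_rows(response.text)
```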

&lt;p&gt;Next we had to transform the data. I created 3 new columns for daily numbers, using loops to calculate the values. This was definitely challenging and caused my VSCode to crash a couple of times: more than once I iterated through the entire dataset instead of filtering it first, and my computer definitely did not like that. A couple of crashes later, I filtered out the irrelevant data and got everything combined and neatly organized into a sweet list. &lt;/p&gt;
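&lt;p&gt;The daily-number transform boils down to taking differences of cumulative totals. Here's a simplified sketch of the idea (the column names are placeholders, not the exact ones from the Ontario csv):&lt;/p&gt;

```python
def add_daily_columns(rows, cumulative_fields):
    """Given rows sorted by date, derive per-day deltas from cumulative totals.

    The field names passed in are whatever the source csv uses; the ones in
    my dataset differ from these illustrative examples.
    """
    previous = {field: 0 for field in cumulative_fields}
    transformed = []
    for row in rows:
        out = dict(row)
        for field in cumulative_fields:
            total = int(row.get(field) or 0)
            out["daily_" + field] = total - previous[field]
            previous[field] = total
        transformed.append(out)
    return transformed
```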

&lt;p&gt;Finally, we had to load the data into a DynamoDB table, and thanks to my experience working on the &lt;a href="https://dev.to/chandy13/my-journey-in-conquering-the-cloud-resume-challenge-2doa"&gt;Cloud Resume Challenge&lt;/a&gt; last month, I was able to complete this quickly. &lt;/p&gt;
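&lt;p&gt;The load step is mostly boto3's batch writer doing the work. A rough sketch, where the table name is a placeholder and the row shape is illustrative:&lt;/p&gt;

```python
def put_items(rows, table):
    """Write rows into DynamoDB; batch_writer buffers them into batched writes."""
    count = 0
    with table.batch_writer() as batch:
        for row in rows:
            batch.put_item(Item=row)
            count += 1
    return count

def load(rows, table_name):
    """Entry point for the ETL job; the table name is a placeholder."""
    import boto3  # deferred so put_items is testable without the AWS SDK
    table = boto3.resource("dynamodb").Table(table_name)
    return put_items(rows, table)
```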

&lt;h3&gt;
  
  
  Everyone, use CloudFormation as much as possible
&lt;/h3&gt;

&lt;p&gt;If you read my last post, you'll know that I am a huge fan of CloudFormation. I try to use it whenever possible, and for this project I deployed everything using only two CloudFormation templates. I can't imagine going back to the days when I deployed my infrastructure manually!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsvwry1m5cutqsebzr0fa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fsvwry1m5cutqsebzr0fa.png" alt="Took less than 2 minutes!"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;I went into this project with the mindset that if I was going to work on AWS, I would use CloudFormation templates for everything I could. Working on this, I learned even more CloudFormation uses, such as configuring CloudWatch events, setting up DynamoDB streams, and connecting a stream as the trigger for a notification Lambda! The best part for me is that after making all the required changes to my code and templates, I just SAM deploy it, go grab some water, and by the time I'm back my entire ETL job is updated!&lt;/p&gt;

&lt;h3&gt;
  
  
  Finishing touches
&lt;/h3&gt;

&lt;p&gt;After everything was deployed on AWS, there were still some tasks to do to ensure everything works and is visualized in a nice way.&lt;/p&gt;

&lt;p&gt;AWS SNS is not something I have worked with a lot, but it's important to this project because it tells me whether my ETL Lambda is being triggered daily and whether I run into any problems loading the data into DynamoDB. The first thing was to set up a notification in my ETL Lambda function that would let me know if there were any errors loading the data. I used a try/except block in my Lambda function that publishes a message to an SNS topic if there are invalid data entries, so I know the data is being regularly updated and is correct. &lt;/p&gt;
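&lt;p&gt;The shape of that try/except is simple. A sketch, where the subject line and message fields are illustrative and sns_client would come from boto3.client("sns") in the real Lambda:&lt;/p&gt;

```python
import json

def notify_on_failure(load_fn, rows, sns_client, topic_arn):
    """Run the load step; publish a summary to SNS and re-raise on any error."""
    try:
        return load_fn(rows)
    except Exception as exc:
        sns_client.publish(
            TopicArn=topic_arn,
            Subject="COVID ETL load failed",
            Message=json.dumps({"error": str(exc), "row_count": len(rows)}),
        )
        raise
```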

&lt;p&gt;Next, I needed to make sure that when there is a successful update I also get a notification, just so I know my table is up to date with today's information. I created a NotifyUpdates.js file and have it run whenever DynamoDB streams reports a successful update to the table. This message tells me how many new rows were added (usually 1 a day) and what the info in those rows is.&lt;/p&gt;
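&lt;p&gt;My actual handler is NotifyUpdates.js in Node, but the idea translates to a few lines in any language. Here's an illustrative Python version: the Records/eventName/NewImage layout is the standard DynamoDB Streams event shape, while the attribute names inside NewImage are placeholders:&lt;/p&gt;

```python
def summarize_inserts(event):
    """Build the notification body from a DynamoDB Streams event."""
    inserts = [
        record["dynamodb"]["NewImage"]
        for record in event.get("Records", [])
        if record.get("eventName") == "INSERT"
    ]
    lines = ["%d new row(s) added today" % len(inserts)]
    for image in inserts:
        lines.append(str(image))
    return "\n".join(lines)
```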

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fuq5jh0xswowhvqhk0ndb.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fuq5jh0xswowhvqhk0ndb.png" alt="Daily Email"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Now for a cool way to display the data. I looked at a couple of different options, and initially the plan was to go with AWS QuickSight, but after playing around with it I learned that, first, it doesn't support DynamoDB, and second, it isn't publicly shareable, so I had to pivot to something else, which is when I discovered Redash!&lt;/p&gt;

&lt;h3&gt;
  
  
  Data is beautiful
&lt;/h3&gt;

&lt;p&gt;Redash is incredibly powerful but also very easy to use especially for someone like me who didn't have any experience querying databases or setting up dashboards. &lt;/p&gt;

&lt;p&gt;The first thing to do is spin up an EC2 instance using the Redash image ID, which I got from their webpage. I quickly added this to my existing CloudFormation template so I can easily deploy and update it when needed.&lt;/p&gt;
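&lt;p&gt;In template terms, the Redash server is just another EC2 resource. A rough fragment of what I mean, where the instance type is a suggestion and the image ID is a placeholder for the region-specific AMI from Redash's setup page:&lt;/p&gt;

```yaml
Resources:
  RedashInstance:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: t2.small          # illustrative size, not a recommendation
      ImageId: ami-REDASH-IMAGE-ID    # placeholder: use the AMI for your region
```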

&lt;p&gt;Once the server was started, I used the web interface to complete the configuration, connected my DynamoDB database, and started querying my data to create visualizations.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwtaedtgq5g93czwt73zu.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fi%2Fwtaedtgq5g93czwt73zu.png" alt="So many visualizations"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;Everything was super simple to pick up, and I had so many options to visualize my data. Designing the dashboard was simple too; I tried to put the most relevant data on screen and fit everything there. If anyone ever needs a dashboard for their database, I highly recommend &lt;a href="https://redash.io/" rel="noopener noreferrer"&gt;Redash&lt;/a&gt;.&lt;/p&gt;

&lt;h3&gt;
  
  
  Things I learned
&lt;/h3&gt;

&lt;p&gt;I'm going to make it a habit to summarize a couple of things that I learned in every project so I can one day look back on these blogs and see my progress!&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Manipulating csv files from internet sources using Python scripts&lt;/li&gt;
&lt;li&gt;Automating jobs using CloudWatch and Lambda with SNS Notifications&lt;/li&gt;
&lt;li&gt;Working with DynamoDB streams and new CloudFormation commands&lt;/li&gt;
&lt;li&gt;Trello is amazing and I should keep using it&lt;/li&gt;
&lt;li&gt;Redash is awesome and I will definitely try to implement this in my future projects.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;There we have it: an automated ETL job that collects Ontario COVID-19 data and displays it in a cool dashboard. I am happy with how everything turned out, and everything I learned I will definitely use in the future. I'm going to try to keep blog posts coming monthly, so thanks for reading my October 2020 post! See you in November!&lt;/p&gt;

</description>
      <category>python</category>
      <category>aws</category>
    </item>
    <item>
      <title>My journey in conquering the cloud resume challenge</title>
      <dc:creator>Andy</dc:creator>
      <pubDate>Sat, 26 Sep 2020 21:58:17 +0000</pubDate>
      <link>https://dev.to/chandy13/my-journey-in-conquering-the-cloud-resume-challenge-2doa</link>
      <guid>https://dev.to/chandy13/my-journey-in-conquering-the-cloud-resume-challenge-2doa</guid>
      <description>&lt;p&gt;Hello dev.to! Welcome to my first of hopefully many posts where I share my experiences taking on cloud challenges and try to level up my cloud skills! I decided to do this because I want to share my experiences from starting as a novice working in the cloud to becoming a cloud expert! &lt;/p&gt;

&lt;p&gt;For my first personal project I decided to take on the &lt;a href="https://cloudresumechallenge.dev/instructions/"&gt;Cloud Resume Challenge&lt;/a&gt;. The reason I picked this challenge is that I have some background in AWS from work and I wanted to learn and work with more AWS technologies like CloudFormation, Lambda, and DynamoDB. Even though I missed the official deadline for the challenge, I wanted to spend some extra time and move at my own pace to complete it. After long hours of trial and error and self-learning, I finally did it!&lt;/p&gt;

&lt;h3&gt;
  
  
  Project Overview
&lt;/h3&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--IYej69Vv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/y93ywrmjebtbdftbrqwa.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--IYej69Vv--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/y93ywrmjebtbdftbrqwa.png" alt="CloudResume Layout"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;The task is to build out an HTML Resume hosted in an S3 bucket, served using CloudFront. It should display a view counter that interacts with an API that triggers a Lambda function to increment a value in a simple DynamoDB table. All of this should be set up using CloudFormation and SAM templates. Finally we should set up a CI/CD pipeline with repos in GitHub that automatically push our changes to production after a commit to the master branch.&lt;/p&gt;

&lt;p&gt;Now, this is completely new territory for me. I knew about some of these technologies from studying for the AWS Solutions Architect Associate exam but had never really worked with any of them directly. This meant I had a lot of learning to do, and thankfully I had access to some great resources such as &lt;a href="https://acloud.guru"&gt;A Cloud Guru&lt;/a&gt;, &lt;a href="https://codeacademy.com"&gt;CodeAcademy&lt;/a&gt;, and of course Google.&lt;/p&gt;

&lt;h3&gt;
  
  
  PDF to HTML + CSS
&lt;/h3&gt;

&lt;p&gt;The first step was to convert my PDF resume and rewrite it in HTML. I have some prior knowledge of HTML from beginner CS courses in university, which came in handy, and I did some quick googling for a CSS guide to get my resume matching what I had in the PDF. With HTML I was also able to embed some awesome certification badges that I earned from completing certification exams and link them to the &lt;a href="https://youracclaim.com"&gt;Acclaim&lt;/a&gt; site, which verifies that my cert is active! &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--wSfUG407--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hm3xhpwyg3ipge6s95kc.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--wSfUG407--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/hm3xhpwyg3ipge6s95kc.png" alt="PDF to HTML"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Introducing CloudChandy.com
&lt;/h3&gt;

&lt;p&gt;The next thing to do was host my HTML resume as a static S3 website, but before that I needed to think of a clever domain name. I spent way too much time on this part, but I finally decided on CloudChandy.com and headed to Amazon Route 53 to register it. A couple of clicks and dollars later, I became the proud owner of CloudChandy.com! Time to put my HTML resume and CSS in an S3 bucket. The key thing I learned here is to make sure your bucket name is the same as the domain you registered, or else you won't be able to navigate to the website. It's also important to turn off the default option that blocks public access in S3 and to apply a bucket policy that grants read-only access to items in the bucket.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::BUCKET/*"
        }
    ]
}
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;



&lt;p&gt;Once that's done, it's time to add the ability to navigate to my site via HTTPS. I used Certificate Manager in AWS to generate an SSL certificate and created a CloudFront distribution to serve my site over HTTPS!&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--aTAq_5-a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/0sv8icw0tcovjo3bswy5.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--aTAq_5-a--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/0sv8icw0tcovjo3bswy5.png" alt="It's Secure!"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Building my first Lambda function
&lt;/h3&gt;

&lt;p&gt;Building out a view counter for my website was definitely my favourite part of this challenge. I had never built anything like this before, so it was challenging, but after completing it my mindset definitely shifted toward how I can make my life easier using code. First I had to put together a simple table in DynamoDB to store the views. I went through the very informative deep dive on &lt;a href="https://learn.acloud.guru/course/4d91ceee-353d-47be-af9e-996ece43dca6/dashboard"&gt;DynamoDB&lt;/a&gt;, and why stop there: I decided to take it a step further and work through the &lt;a href="https://learn.acloud.guru/course/d8a92be0-dbab-4498-a2af-375a7a591ae8/dashboard"&gt;CloudFormation&lt;/a&gt; deep dive to learn how I could leverage it to build the table. &lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--4JYCVIBq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/qlqgssshol083xii7eem.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--4JYCVIBq--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/qlqgssshol083xii7eem.png" alt="First CloudFormation Stack!"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;After successfully creating the table, I could start building a simple view counter using Python and boto3: every time the script runs, it increments the value in my DynamoDB table. After some trial and error I got this working, uploaded it to Lambda, and got it working there too! From there I created an API trigger for the Lambda function that would run my script. Finally it was time to show the world my view counter, so I wrote a simple JavaScript snippet using fetch that invokes my API every time my site is visited. I fought a tough battle with CORS but eventually figured it out. I embedded this into my HTML resume along with a visual for the view counter, and I was surprised to see the views up over 100! &lt;del&gt;I couldn't believe how much traffic I already had for this little site.&lt;/del&gt; Turns out it was just me testing too much...&lt;/p&gt;
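&lt;p&gt;Stripped down, the counter is a single atomic update_item call. A rough sketch, not my exact code: the table key schema and attribute names here are placeholders, and the table object would come from boto3.resource("dynamodb"):&lt;/p&gt;

```python
def increment_views(table, counter_id="resume"):
    """Atomically add 1 to the stored view count and return the new total."""
    response = table.update_item(
        Key={"id": counter_id},
        UpdateExpression="ADD #views :one",
        ExpressionAttributeNames={"#views": "views"},
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(response["Attributes"]["views"])
```

Because ADD is atomic on the DynamoDB side, concurrent visitors can't lose increments the way a read-then-write would.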

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--mU8oHBh1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/gmu55rofelab58kptjrd.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--mU8oHBh1--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/gmu55rofelab58kptjrd.png" alt="Am I this popular?"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  The Beauty of Infrastructure as Code
&lt;/h3&gt;

&lt;p&gt;With all the excitement and adrenaline from getting the view counter up and running, I started thinking: what else can I do with code? That is where I discovered SAM templates! SAM templates are basically how I can deploy Lambda functions, APIs, and many other things using CloudFormation. I instantly thought about converting my view counter into a SAM package and deploying it with CloudFormation. It is surprisingly easy to do, especially after the deep dive on CloudFormation; I was able to quickly put together a template and run the following commands.&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight"&gt;&lt;pre class="highlight plaintext"&gt;&lt;code&gt;sam package \
  --template-file template.yaml \
  --output-template-file sam-template.yml \
  --s3-bucket your-S3-bucket

sam deploy \
  --template-file sam-template.yml \
  --stack-name NAME-OF-STACK \
  --capabilities CAPABILITY_IAM
&lt;/code&gt;&lt;/pre&gt;&lt;/div&gt;
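&lt;p&gt;For reference, a minimal template.yaml for a function like this looks roughly as follows. The resource, handler, and path names are illustrative, not my exact template:&lt;/p&gt;

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Resources:
  ViewCounterFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.handler
      Runtime: python3.8
      Events:
        CountApi:
          Type: Api
          Properties:
            Path: /count
            Method: get
```

The Transform line is what lets CloudFormation expand the short Serverless resource types into the full Lambda, API Gateway, and IAM resources.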



&lt;p&gt;Infrastructure as Code is beautiful: I can run it, go grab a coffee, and by the time I'm back my whole environment is built.&lt;/p&gt;

&lt;h3&gt;
  
  
  CI/CD Pipeline and Testing
&lt;/h3&gt;

&lt;p&gt;The final few steps of the challenge involved writing tests, using GitHub Actions, and building out a CI/CD pipeline for both my back-end and front-end code. In the past I used GitHub as more of a cloud storage solution for my code; I never thought I could create a pipeline where code pushed there gets tested and then deployed to the production environment. I went with two different methods to accomplish this. &lt;/p&gt;

&lt;p&gt;For my front-end code I set up AWS CodePipeline to sync content from my GitHub repo and update my S3 bucket whenever there is an update on GitHub. From there I set up a Lambda function, triggered whenever the S3 bucket is updated, to invalidate my CloudFront cache so the resume updates go live. I know there are better ways to do this now, so I should go back and improve it in the future. &lt;/p&gt;
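&lt;p&gt;The invalidation itself is a single API call. A sketch of that Lambda's core, where the distribution ID is a placeholder and cloudfront_client would come from boto3.client("cloudfront"):&lt;/p&gt;

```python
import time

def invalidate_cache(cloudfront_client, distribution_id):
    """Invalidate all paths so CloudFront serves the freshly synced files."""
    return cloudfront_client.create_invalidation(
        DistributionId=distribution_id,
        InvalidationBatch={
            "Paths": {"Quantity": 1, "Items": ["/*"]},
            # CallerReference must be unique per request; a timestamp works
            "CallerReference": str(time.time()),
        },
    )
```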

&lt;p&gt;For my back-end code I first created some tests for my Python view counter script. This was challenging because previously my "testing" was just running the script and fixing it until it ran, which isn't as rigorous as proper integration and unit testing. I set up a simple integration test for my script, ran pytest locally, and got it passing! The final step was to automate the deployment pipeline, so I created another back-end GitHub repo and set up GitHub Actions using a Python testing template, then added a step so that if the integration test succeeded it would run the SAM package and SAM deploy commands to automatically update my CloudFormation stack! At this point I was feeling amazing: I had just set up a CI/CD pipeline and made it super easy to update my code and push it live in just a few minutes.&lt;/p&gt;
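&lt;p&gt;A workflow along these lines, sketched from memory rather than copied from my repo (the action versions, job name, and install list are illustrative):&lt;/p&gt;

```yaml
name: backend
on:
  push:
    branches: [ master ]
jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: actions/setup-python@v2
        with:
          python-version: '3.8'
      - run: pip install pytest boto3 aws-sam-cli
      - run: pytest
      # The deploy steps only run if pytest above succeeded
      - run: |
          sam package --template-file template.yaml --output-template-file sam-template.yml --s3-bucket your-S3-bucket
          sam deploy --template-file sam-template.yml --stack-name NAME-OF-STACK --capabilities CAPABILITY_IAM
```

Because each step fails the job on a non-zero exit code, the SAM commands never run when the tests fail, which is exactly the gate I wanted.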

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--RBkPY2VT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/v8nkk3979pglxn0g7nh6.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--RBkPY2VT--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/i/v8nkk3979pglxn0g7nh6.png" alt="GitHub SAM Deploy"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;With that I was finally finished with the Cloud Resume Challenge, I gave myself a pat on the back and started writing this blog!&lt;/p&gt;

&lt;h3&gt;
  
  
  Things I learned
&lt;/h3&gt;

&lt;p&gt;This challenge was definitely an excellent learning experience and I could write a whole blog post on what I learned but instead of adding too much content to this post I'll try to keep this part concise.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Lambda is a powerful, cost-effective, and easy way to run your code without spinning up a VM&lt;/li&gt;
&lt;li&gt;CloudFormation is the best way to create and manage AWS infrastructure&lt;/li&gt;
&lt;li&gt;SAM templates help create and manage Serverless technologies&lt;/li&gt;
&lt;li&gt;CI/CD is the best way to manage and work on code because of the ability to review code, collaborate, and revert changes &lt;/li&gt;
&lt;li&gt;GitHub Actions lets me test my code across many types of environments to ensure full functionality anywhere, and it can even be customized to run commands that kick off other processes&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Conclusion
&lt;/h3&gt;

&lt;p&gt;Thanks for making it all the way through this blog, I have a tendency to rattle on when I get to write about something that interests me! If you want to take a look at my work please see below:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://cloudchandy.com"&gt;Andy's Cloud Resume&lt;/a&gt;&lt;/p&gt;

&lt;p&gt;There are always updates to be made, so I'll never truly be done, but I'm proud of what I have now and I'm excited to keep adding new projects and blogs!&lt;/p&gt;

</description>
      <category>aws</category>
      <category>career</category>
      <category>serverless</category>
    </item>
  </channel>
</rss>
