Madhesh Waran

Conquering AWS Cloud Resume Challenge

The Cloud Resume Challenge was created by Forrest Brazeal to upgrade and showcase your cloud skills. Completing it requires you to follow a strict set of requirements that will test and challenge your understanding of the cloud.
I did the AWS Cloud Resume Challenge and this is how it went:

Certification

First, your resume needs to have an AWS certification on it. I earned my AWS Cloud Practitioner certificate using Stephane Maarek's course on Udemy. I think the course and its accompanying practice test are enough if you score above 80% on your first attempt. But if you score below 70%, I would advise working through more practice tests on Udemy and holding off on the exam until you consistently score around 80% on them.

HTML

Your resume needs to be written in HTML. Not a Word doc, not a PDF. I previously knew almost nothing about HTML beyond the little I learned in fifth grade, other than that it is an easy language to pick up. I learned HTML using the w3schools website and The Odin Project and made a simple HTML page for my resume.

CSS

Your resume needs to be styled with CSS. I already had a good grasp of CSS, since most tutorials teach HTML and CSS as a package deal. I didn't want to overthink the design, so I watched a YouTube video on building a resume website and styled my page to match it. I decided I would redesign the site with my own ideas later, when I had free time; this would do for now.

Static Website

Your HTML resume should be deployed online as an Amazon S3 static website. This was the easiest part, with plenty of tutorials and a comprehensive guide in the AWS documentation. I quickly created an S3 bucket, turned on static website hosting, and uploaded my website files. The website endpoint worked fine, and the only troubleshooting involved was combing through AWS documentation and Stack Overflow to make sure I had not granted any unnecessary permissions that could threaten my account's security.

HTTPS

The S3 website URL should use HTTPS for security, which requires Amazon CloudFront. This was where I encountered my very first hiccup. I bought a custom domain name from Namecheap and wanted it to point to my CloudFront distribution. I was excited that my domain only cost a dollar, but I fear the service merits that cheap price.

I wanted that lock icon next to my website, but I learned that validating an SSL certificate from ACM for my cheap domain would take far more effort than if I had purchased the domain through Route 53. The process should have been easy, but since I had not bought the premium DNS package that allows editing host records, I had to find a workaround to validate the certificate, which I did. This stalled my progress for quite a while, but I persevered and pointed the domain at custom nameservers using Route 53, which I explain in detail in the next section.

Aside from getting my SSL certificate from ACM, everything else was a breeze. I quickly set up a CloudFront distribution using my S3 bucket's website endpoint as the origin. This gave me that sweet https:// lock icon I wanted so much.
The resource that helped most during my troubleshooting was this doc:
Alternate Domain Developer Guide

DNS

Point a custom DNS domain name to the CloudFront distribution so your resume can be accessed at that domain. You can use Amazon Route 53 or any other DNS provider for this. I first created a Route 53 hosted zone, which gave me four nameservers. I replaced Namecheap's nameservers with these four, which let me create the records needed to validate my certificate without falling into Namecheap's premium DNS trap.

JavaScript

Your resume webpage should include a visitor counter that displays how many people have accessed the site, which requires writing a bit of JavaScript. Once again, I used the w3schools website to get a feel for the language. I wrote a simple script that calls the API as soon as the page loads and then displays the data from the response. My code was a bit archaic, since I used XMLHttpRequest instead of the fetch() API that is purpose-built for this, but it worked as it was, so I did not change it.
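For comparison, a fetch()-based version of that page-load counter might look like the sketch below. The endpoint URL, element id, and response shape here are placeholders, not my actual code:

```javascript
// Placeholder invoke URL; substitute your own API Gateway endpoint.
const API_URL = "https://example.execute-api.us-east-1.amazonaws.com/prod/count";

// Pull the count out of the API's JSON body; assumes a shape like { "count": 42 }.
function extractCount(data) {
  return Number(data.count);
}

// Call the API once the page has loaded and display the result.
async function updateVisitorCount() {
  const response = await fetch(API_URL);
  const data = await response.json();
  document.getElementById("visitor-count").textContent = extractCount(data);
}

// Guard so this file can also be loaded outside a browser (e.g. for testing).
if (typeof window !== "undefined") {
  window.addEventListener("load", updateVisitorCount);
}
```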

Database

The visitor counter will need to retrieve and update its count in a database somewhere. I was advised to use Amazon’s DynamoDB for this purpose. Creating the table was straightforward and was finished in minutes.
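For reference, a table like this can also be defined with boto3. Everything here — the table name, the key name, the region — is an illustrative placeholder, not necessarily what I used:

```python
# Sketch of a visitor-counter table definition. Table name, key name,
# and region are illustrative placeholders.
TABLE_SPEC = {
    "TableName": "VisitorCount",
    "KeySchema": [{"AttributeName": "id", "KeyType": "HASH"}],
    "AttributeDefinitions": [{"AttributeName": "id", "AttributeType": "S"}],
    "BillingMode": "PAY_PER_REQUEST",  # on-demand billing: no capacity planning
}

def create_table(region="us-east-1"):
    """Create the table; requires AWS credentials to be configured."""
    import boto3  # imported here so TABLE_SPEC can be inspected offline
    client = boto3.client("dynamodb", region_name=region)
    return client.create_table(**TABLE_SPEC)
```

A single-item table with one partition key is all a visitor counter needs, so on-demand billing keeps it effectively free at this scale.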

API

You should not communicate directly with DynamoDB from your JavaScript code. Instead, you will need to create an API that accepts requests from your web app and communicates with the database. I used AWS's API Gateway and Lambda services for this. This part was a struggle, as at first I had no idea what I was supposed to do, so I read many documents and watched many videos to figure it out. Once I felt confident enough in my understanding of API gateways, I decided to stumble around and make it work somehow, since the official AWS documentation confused me and I chose not to rely on it. I first experimented with an HTTP API, which I expected to be cheaper, and got it working. I later switched to a REST API, as it was easier to deploy with CI/CD integration. Integrating with the Lambda function and deploying gave me an API endpoint, which I inserted into my JavaScript code.

Lambda

I created a Lambda function to integrate with the API and the DynamoDB database. You will need to write a bit of code in the Lambda function to access and update the database. I decided to explore Python, a common language for back-end programs and scripts, and its boto3 library for AWS. There were many resources available for writing this code, and since I had never used Python before, I leaned on them as my guide. This one was especially helpful:
Boto3 for Dynamodb
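As an illustration of the kind of handler this section describes — not my exact code; the table name, key, and attribute names are assumptions — an atomic counter update with boto3 might look like:

```python
import json

TABLE_NAME = "VisitorCount"  # hypothetical table name

def build_response(count):
    """Shape the API response; the CORS header lets the S3 site call this API."""
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": int(count)}),
    }

def lambda_handler(event, context):
    import boto3  # available by default in the Lambda Python runtime
    table = boto3.resource("dynamodb").Table(TABLE_NAME)
    # ADD increments atomically, so concurrent visits never lose an update.
    resp = table.update_item(
        Key={"id": "counter"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    return build_response(resp["Attributes"]["visits"])
```

Using an `ADD` update expression means the read and the increment happen in one call, avoiding a race between two visitors loading the page at once.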

Infrastructure as Code

You should not be configuring your API resources – the DynamoDB table, the API Gateway, the Lambda function – manually, by clicking around in the AWS console. Instead, define and deploy them using Terraform. This is called "infrastructure as code," or IaC, and it saves you time in the long run. I had no previous experience with Terraform and went in fresh, with only the official documentation as my guide. Even though I had to rewrite my code repeatedly through trial and error, every time it worked it felt like Christmas. It was simple, and the official guide is the only thing you need to deploy the entire infrastructure. The error logs were very specific, which saved me from wasting time on unnecessary code changes. It took me three days to write the code that automatically provisions all the AWS resources, with only the official guide and no other resources.
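To give a feel for what that Terraform code looks like, here is a rough fragment. Resource names, the runtime, and file paths are illustrative, and the IAM role is assumed to be defined elsewhere in the configuration:

```hcl
# Illustrative fragment only - names and values are placeholders.
resource "aws_dynamodb_table" "visitor_count" {
  name         = "VisitorCount"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "id"

  attribute {
    name = "id"
    type = "S"
  }
}

resource "aws_lambda_function" "visitor_counter" {
  function_name = "visitor-counter"
  runtime       = "python3.12"
  handler       = "lambda_function.lambda_handler"
  filename      = "lambda.zip"                  # zipped handler code
  role          = aws_iam_role.lambda_role.arn  # IAM role defined elsewhere
}
```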

Source Control

You do not want to be updating either your back-end API or your front-end website by making calls from your laptop, though. You want them to update automatically whenever you change the code. This is called continuous integration and deployment, or CI/CD. The first step toward it was creating a GitHub repository for my back-end code.

CI/CD (Back end)

I set up GitHub Actions so that when I push an update to my Terraform template or Python code, it automatically gets packaged and deployed to AWS. I achieved this with a GitHub Action by appleboy.
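I won't reproduce my exact workflow here, but a minimal Terraform deploy workflow looks roughly like the sketch below. Note this uses HashiCorp's official setup-terraform action rather than the appleboy action I mentioned, and the branch name, directory, and secret names are placeholders:

```yaml
name: deploy-backend
on:
  push:
    branches: [main]

jobs:
  terraform:
    runs-on: ubuntu-latest
    env:
      AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
      AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
      AWS_REGION: us-east-1
    steps:
      - uses: actions/checkout@v4
      - uses: hashicorp/setup-terraform@v3
      - run: terraform init
        working-directory: infra   # placeholder path to the .tf files
      - run: terraform apply -auto-approve
        working-directory: infra
```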

CI/CD (Front end)

Create GitHub Actions such that when you push new website code, the S3 bucket automatically gets updated. I used the s3-sync action made by jakejarvis.
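A minimal workflow using that action might look like this; the bucket name, secret names, and source directory are placeholders you would swap for your own:

```yaml
name: deploy-frontend
on:
  push:
    branches: [main]

jobs:
  sync:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Sync the site files to S3, deleting anything removed from the repo.
      - uses: jakejarvis/s3-sync-action@master
        with:
          args: --delete
        env:
          AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          SOURCE_DIR: website   # placeholder: folder containing the site files
```

After this, updating the live resume is just a `git push`; if you also use CloudFront, remember the CDN may serve cached copies until the cache expires or is invalidated.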

Blog post

The final goal of the challenge is to write a blog post about the experience. I was deciding among Dev.to, Hashnode, and Medium. I still intend to create my own blog someday, but I chose Dev.to because I would rather be part of a close-knit, dedicated community than part of a large site churning out articles to chase more and more traffic.
