AWS Cloud Resume Challenge

Who I Am

Hello everyone! My name is Tariq Moore, and I’m a Web Developer and an aspiring Cloud and DevOps Engineer. This is my first time writing a blog post, so I hope you enjoy this deep dive into the Cloud Resume Challenge and my thought process while completing it!

A few weeks ago, I earned my AWS Cloud Practitioner Certification, which provided a high-level overview of key services like S3, CloudFront, Route 53, EC2, and more. I’ve decided to aim for the AWS Developer and Solutions Architect certifications next, but I needed a way to solidify the knowledge I’d gained. In comes the Cloud Resume Challenge, created by Forrest Brazeal.

Cloud Resume Challenge…?

“Cloud Resume Challenge is a 16-step hands-on project designed to help you bridge the gap from cloud certification to cloud job. It incorporates many of the skills that real cloud and DevOps engineers use in their daily work” – Forrest Brazeal


These are the steps to this challenge:

  1. Acquire Certification
  2. HTML
  3. CSS
  4. Static Website Configuration
  5. HTTPS
  6. DNS (Optional, but recommended)
  7. JavaScript
  8. Database
  9. API Creation
  10. Python
  11. Tests
  12. Infrastructure as Code (via Terraform)
  13. Source Control (GitHub)
  14. CI/CD (Backend)
  15. CI/CD (Frontend)
  16. Blog Post Creation

AWS Certification

The first step is to obtain an AWS cert. Any of the certifications will suffice, but for beginners it’s generally recommended to have the AWS Cloud Practitioner Certification for the foundational knowledge. I obtained my Cloud Practitioner cert last month!

Creating the Website

The next step is to create a static resume website using HTML, CSS, and JavaScript. I had deployed a portfolio website back in 2020 using GitHub Pages but decided to start fresh for this challenge. To save development time, I did some research on website templates I could customize later. I found HTML5up.net, which provides fully responsive website templates with source code, and decided on the “Hyperspace” template. After customizing the layout and adding more character to the site (like coffee and bears), I had a fully functional resume site.


Website Deployment

From here we start diving into the AWS services, beginning with S3. I created an S3 bucket and configured the bucket policy for static website hosting. This worked, but the S3 website endpoint serves the site over plain HTTP rather than enforcing HTTPS. On top of that, an S3 bucket lives in a single region rather than being distributed globally: anyone accessing the site would have to connect to my US region, but what if they were in Japan?
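
I set this up by hand at first, but the same configuration can be scripted. Here’s a minimal boto3 sketch of what it amounts to; the bucket name is just a placeholder:

```python
import json
import boto3  # AWS SDK for Python

s3 = boto3.client("s3")
BUCKET = "my-resume-site-bucket"  # placeholder bucket name

# Turn on static website hosting for the bucket.
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Bucket policy that lets anyone read the site's objects.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicReadGetObject",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": f"arn:aws:s3:::{BUCKET}/*",
    }],
}
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```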

To reduce latency and enforce HTTPS, I incorporated CloudFront. With CloudFront, my website files are cached at edge locations around the world, so someone in Japan or South America receives the files from a location closer to them instead of having to reach back to the US region.

Connecting DynamoDB, Lambda, and API Gateway

The challenge asks us to create a section of the website to display how many people viewed our website. To accomplish this, I created a DynamoDB table with one row, an ID column to act as the primary key, and a number column that will be incremented. To increase the number each time someone visits the site, I would need something to pull the value, update it, and return said value to my JS code. In comes the Lambda and API Gateway combo.
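
Here’s a rough sketch of what that Lambda function boils down to. The table name, key, and attribute names are placeholders, not necessarily what I used:

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("visitor-count")  # placeholder table name

def lambda_handler(event, context):
    # Atomically add 1 to the counter and get the updated value back.
    response = table.update_item(
        Key={"id": "visitors"},  # the single item's primary key
        UpdateExpression="ADD #views :inc",
        ExpressionAttributeNames={"#views": "views"},
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = response["Attributes"]["views"]  # DynamoDB hands this back as a Decimal
    return {
        "statusCode": 200,
        "body": json.dumps({"views": int(count)}),
    }
```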

I wrote the function code in Python but ran into issues when returning a response. Here I learned more about status codes, CORS, and returning headers and a JSON-serializable body from Python.

I ran into the headers/body issues because my Python code was returning a “Decimal” value from DynamoDB, which the json module refuses to serialize. After running the code, I would get a JSON “not serializable” error. To fix this, I researched a Decimal encoder class to turn the Decimal into a string.
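
The encoder itself is only a handful of lines; something along these lines does the trick:

```python
import json
from decimal import Decimal

class DecimalEncoder(json.JSONEncoder):
    """Teach json.dumps how to handle DynamoDB's Decimal values."""
    def default(self, obj):
        if isinstance(obj, Decimal):
            return str(obj)  # stringify so the value becomes JSON-serializable
        return super().default(obj)

# Usage: json.dumps({"views": Decimal("42")}, cls=DecimalEncoder)
```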

The next problem I had was with CORS. When accessing the API Gateway URL, I constantly got a ‘No Access-Control-Allow-Origin Header Present’ error. I was sure I had set up the access correctly; the console showed it being allowed, but the error persisted. I redeployed, made small changes, tried a different origin, attempted a wildcard with “*”, and still nothing.
The problem turned out to be how I had created my API Gateway resource in Terraform.

I had defined it as a proxy resource, which sent me down a rabbit hole for a few hours. Eventually, I figured out that with a proxy integration, my Lambda code is responsible for returning the proper headers and body, not the gateway.
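
In practice that means the Lambda return value has to carry the CORS headers itself. A minimal sketch of the response shape, assuming a placeholder origin:

```python
import json

def build_response(body, status_code=200):
    # With a Lambda proxy integration, API Gateway passes this dict through
    # as-is, so the CORS headers must come from the function, not the gateway.
    return {
        "statusCode": status_code,
        "headers": {
            "Access-Control-Allow-Origin": "https://example.com",  # placeholder: the site's domain
            "Access-Control-Allow-Methods": "GET,OPTIONS",
            "Access-Control-Allow-Headers": "Content-Type",
        },
        "body": json.dumps(body),
    }
```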

Infrastructure as Code (IaC)

When creating these AWS services, you shouldn’t be building them in the AWS console. Lambda, API Gateway, and DynamoDB should be created using Infrastructure as Code (IaC), and any changes to those resources should also be made via IaC. This could be done with an AWS SAM template deployed using the SAM CLI, but I opted for the industry-standard Terraform. It didn’t take much beyond a lot of reading through Terraform documentation and guides, since I hadn’t used the tool before!

CI/CD

Creating a CI/CD pipeline was pretty cool. I hadn’t done anything like this before, so initially I used AWS CodePipeline connected to my GitHub repository. The problem is that CodePipeline charges a monthly fee after 30 days for a V1 pipeline and a per-execution charge for V2 pipelines. Both are eligible for the AWS Free Tier, but I decided to try my hand at GitHub Actions for CI/CD instead. This way, I can build a pipeline from scratch for projects outside of AWS on GCP, Azure, etc. The pipeline is set up so that any push to a specific branch updates the files in my S3 bucket and invalidates the cached CloudFront files to reflect the new changes.
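
Under the hood, the deploy job really only does two things: push the updated files to S3 and invalidate the CloudFront cache. Expressed with boto3 rather than the workflow’s own steps, it looks roughly like this (bucket name and distribution ID are placeholders):

```python
import time
import boto3

s3 = boto3.client("s3")
cloudfront = boto3.client("cloudfront")

# Upload an updated site file to the bucket (placeholder names).
s3.upload_file(
    "index.html", "my-resume-site-bucket", "index.html",
    ExtraArgs={"ContentType": "text/html"},
)

# Invalidate the cached copies so CloudFront serves the new version.
cloudfront.create_invalidation(
    DistributionId="E1234567890ABC",  # placeholder distribution ID
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),  # must be unique per request
    },
)
```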


Conclusion

Completing the Cloud Resume Challenge has been an invaluable journey, blending theoretical knowledge with hands-on experience across various AWS services and web development tools. From earning my certification to deploying a dynamic, globally accessible resume site, each step reinforced my skills and unveiled new concepts. This experience has not only solidified my understanding of cloud infrastructure but also prepared me for future projects as I pursue the AWS Developer certification. I am excited to continue this journey and tackle new challenges with the confidence and skills I've gained. Thank you for following along!
