From Attending 504 Meetings to Fixing 403 Errors: A Teacher’s Journey Through the Cloud Resume Challenge

Some background:

As a veteran teacher of nine years, I have attended more meetings than I care to remember: weekly staff meetings (appropriately named “Monday Grind”), parent-teacher conferences, booster club meetings, department meetings, and so many other meetings that can be filed under “this could have been an email.” However, the meetings that I was always glad to be a part of were the ones that were expressly aimed at helping students with qualifying disabilities get the support that they needed in the classroom: namely 504 meetings and ARDs.

In these meetings, the student, their parents, a counselor, an administrator, and a few teachers would sit in a conference room and have an honest conversation about the needs of the student. We would hear feedback from all of the student’s teachers along with the concerns of the student and their parents. Then we would discuss what needed to happen as we moved forward. The goal was always the same - implement the accommodations that were needed for the student to succeed and remove any accommodations that were no longer working or needed.

These accommodations vary widely in their impact on the classroom. Some of the more invasive ones include modified curriculum and reduced assignments. On the other hand, some students only need things like preferential seating, individual checks for understanding, or, one of the most relatable, breaking longer assignments into more manageable chunks.

Although I enjoyed much of my time as a teacher, when my first child was born in May, I realized that a career transition was in my family’s best interest. I am sure that my extended family thought I was crazy for leaving my nine-year career in education to start over from scratch in the cloud. However, with the encouragement of my wife and a few friends, alongside the goal of being a “lifelong learner” that I acquired (read: that was ground into the deepest parts of my soul) over nine years of mandatory professional development, I decided to take the leap. I immediately dove into Adrian Cantrill’s AWS Certified Solutions Architect - Associate course to learn the basics after seeing it recommended in every Reddit post I read.

After working my way through the course, it was time to sit for the exam. With Adrian’s help, I passed and became an AWS Certified Solutions Architect - Associate, with the badge and everything! However, after updating my resume, I realized that I needed some practical experience: enter the Cloud Resume Challenge by Forrest Brazeal.

The Cloud Resume Challenge Overview:

After reading the overview of the project, I was excited to implement the serverless architecture it described and to learn and use many other tricks of the trade for the first time. I especially liked how the challenge spelled out the requirements without prescribing the methodology for meeting them. The teacher in me said, “This challenge gives students a great structure to learn by doing: it is project-based learning at its finest.”

Steps 1 and 16 are pretty standalone. Step 1 is to get any cloud certification. Since I had already passed my AWS certification exam, I thought I was off to a pretty good start. Step 16 is to write a blog post about your experience with the challenge. (Beware the upcoming absurd run-on sentence.) The real challenge is contained in steps 2-15, where a challenger must implement one product: a CSS-styled, statically hosted website written in HTML that uses HTTPS and a custom DNS domain name and, ultimately, displays both your resume and a count of the number of visitors to the site, using JavaScript to communicate with a serverless backend, provisioned with infrastructure as code, that lets it make an API call which invokes a Lambda function to update a database and respond with the new value to be displayed, all of which uses GitHub for source control and GitHub Actions for CI/CD. Thankfully, for you visual learners, instead of an overwhelming string of words we can look at the following architecture:

(Architecture diagram: the S3/CloudFront/Route 53 frontend calling an API Gateway endpoint that invokes a Lambda function backed by DynamoDB)

As someone who was new to cloud, I was happy to see that the ebook that accompanies the challenge had an accommodation baked in to set me up for success. As I moved forward with the challenge, I was proud to remain faithful to one of the strategies that was most often discussed in the 504 meetings; I approached this project in chunks:

1. The Frontend - HTML, CSS, S3 static hosting, HTTPS, and DNS
2. Setting up the API - database, API, and Lambda
3. Integration/Testing - JS and tests
4. Automation/CI - Infrastructure as Code (IaC), source control, and CI/CD

Each of these chunks presented its own challenges and triumphs.

Chunk 1: The Frontend

I found this part of the challenge more or less straightforward. I don’t have a ton of HTML/CSS experience, but I was able to find a workable template pretty fast and edit it to my taste. Well, to be completely honest, I found something that I kind of hated, but I was so excited about the more cloud-focused parts of the challenge that I decided to loop back to it at the end. As I write this blog post, I still have a couple of hours of work left editing a new template that I really like. Luckily, now that the CI/CD pipeline is fully set up, it will be a breeze to implement!

Once I got to the AWS console, I felt more at home immediately. One of the labs from Adrian Cantrill’s course actually had me host an S3 static site with HTTPS and DNS, so I was off to the races with the rest of this chunk.

I registered my domain name in Route 53 (R53), made my (appropriately named) Simple Storage Service (S3) bucket, uploaded my website resources to the bucket, requested a certificate in AWS Certificate Manager (ACM), used ACM to add its validation records to R53, and set up a CloudFront distribution pointed at my S3 bucket to force HTTPS. Overall, I would say this was by far the easiest of the four chunks.

Chunk 2: Setting up the API

This chunk was much more interesting. I started by creating my database in DynamoDB, as recommended in the challenge. I used “website” as the partition key (in case I ever want to add visitor counters to other websites). To keep costs down, I set the table up with on-demand read and write capacity, so I only pay for what I use; as a personal website, I don’t anticipate it consuming much capacity. I also chose an encryption key owned by Amazon DynamoDB to keep things simple.
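
For reference, the same table can also be created programmatically. Here’s a rough boto3 sketch of the setup I just described (I actually clicked through the console, and the table name here is a placeholder):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# Sketch of the table described above; "VisitorCounts" is a placeholder name
dynamodb.create_table(
    TableName="VisitorCounts",
    AttributeDefinitions=[{"AttributeName": "website", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "website", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",  # on-demand capacity: pay only for what you use
)
```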

Next, I used API Gateway to provision a REST API. Since integrating the POST method required a Lambda function to point at, I had to put this on the back burner for a bit.

I then focused my attention on the Lambda function, which I chose to write in Python. The code is rather simple: it increments the website’s VisitorCount attribute in the DynamoDB table by one and returns the new value to eventually be displayed on the website. At first I got an error and realized that I needed to give the Lambda function access to DynamoDB. After a short jaunt into IAM to set up the permissions, I ran the test in the console again. This time I got the expected response from the database, so I knew it was time to finish setting up the API.
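
To give you an idea, here’s a minimal sketch of the kind of handler I’m describing (the table name and key value are placeholders, not necessarily what I used):

```python
import json

import boto3

# Table name and partition key value are placeholders for illustration
table = boto3.resource("dynamodb").Table("VisitorCounts")


def lambda_handler(event, context):
    # Atomically increment the counter and get the updated value back
    response = table.update_item(
        Key={"website": "resume-site"},
        UpdateExpression="ADD VisitorCount :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    new_count = int(response["Attributes"]["VisitorCount"])
    return {
        "statusCode": 200,
        # The function must supply this CORS header itself once the API
        # becomes a proxy integration (see Chunk 4)
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"VisitorCount": new_count}),
    }
```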

I went back to the API Gateway service, added a non-proxy Lambda integration of the POST method to the default resource path, and enabled CORS through the console (which enables the OPTIONS method). Then I had to toy with permissions until the API could communicate with the Lambda function; this was accomplished with a resource policy allowing the API to invoke it.
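
That resource policy can also be attached programmatically. Here’s a hedged boto3 sketch of the same idea (the function name, statement ID, and ARN are placeholders):

```python
import boto3

lambda_client = boto3.client("lambda")

# Grant API Gateway permission to invoke the function
lambda_client.add_permission(
    FunctionName="visitor-counter",
    StatementId="apigateway-invoke",
    Action="lambda:InvokeFunction",
    Principal="apigateway.amazonaws.com",
    SourceArn="arn:aws:execute-api:us-east-1:123456789012:abc123/*/POST/count",
)
```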

Chunk 3: Integration of the front and back ends/Testing the code

This chunk is, quite literally, where it all came together. The JavaScript required to make the API call and display the response on the website was the first thing that tripped me up a little. It was easy enough to set up fetch to make the API call with the POST method and parse the response as JSON, but displaying the data on the website took me a bit to wrap my head around. After reading several blog posts detailing the interaction between JS and HTML, I realized that I could look up a div by its id and use innerHTML to write the data returned from the API call into the page.
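
The resulting script looked something like this sketch (the URL and element id are placeholders):

```javascript
// Placeholder API endpoint; the real one comes from API Gateway
const apiUrl = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/count";

fetch(apiUrl, { method: "POST" })
  .then((response) => response.json())
  .then((data) => {
    // Write the returned count into a div on the page
    document.getElementById("visitor-count").innerHTML = data.VisitorCount;
  })
  .catch((error) => console.error("Counter request failed:", error));
```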

I decided to make a slight modification to the project by using Cypress to test the code. Having no prior Cypress experience, I spent a few days on this step. Luckily, by looking at some examples, I was able to write tests that make sure the Lambda function/API call behaves as expected.
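
Here’s a sketch of the kind of check I mean (file name and env variable are placeholders; it also assumes nobody else hits the counter mid-test):

```javascript
// cypress/e2e/counter.cy.js
describe("visitor counter API", () => {
  it("returns an incremented count on each call", () => {
    cy.request("POST", Cypress.env("apiUrl")).then((first) => {
      expect(first.status).to.eq(200);
      const count = Number(first.body.VisitorCount);
      // A second call should come back exactly one higher
      cy.request("POST", Cypress.env("apiUrl")).then((second) => {
        expect(Number(second.body.VisitorCount)).to.eq(count + 1);
      });
    });
  });
});
```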

Chunk 4: Automation/CI

I found this chunk the most satisfying by far. I started by making another modification to the project. The original challenge encourages challengers to use SAM and CloudFormation to automate the backend. However, knowing it would be a good idea to build experience with a cloud-agnostic IaC tool, I decided to use Terraform.

Terraform is a blast. I hadn’t had much of a chance to work with it before this project but fell in love with it very quickly. I took the entire backend infrastructure (DynamoDB, API Gateway, and Lambda) and converted it to an IaC configuration, using the Terraform documentation and a few blog posts to work through the errors I hit along the way.
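
To give a flavor of what that looked like, here’s a trimmed sketch of the kind of resources involved (all names are placeholders, and plenty of supporting pieces, like the IAM role and the API Gateway resources, are omitted):

```hcl
# DynamoDB table for the visitor counter (on-demand billing)
resource "aws_dynamodb_table" "visitor_counts" {
  name         = "VisitorCounts"
  billing_mode = "PAY_PER_REQUEST"
  hash_key     = "website"

  attribute {
    name = "website"
    type = "S"
  }
}

# The Python Lambda function from Chunk 2, packaged as a zip
resource "aws_lambda_function" "counter" {
  function_name = "visitor-counter"
  runtime       = "python3.9"
  handler       = "lambda_function.lambda_handler"
  filename      = "lambda.zip"
  role          = aws_iam_role.lambda_exec.arn # execution role defined elsewhere
}
```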

Because I set up the new API as a proxy integration with Lambda, I added a little code to the Lambda function to include a CORS header in the response (the Access-Control-Allow-Origin header shown in the handler sketch earlier). I had seen a ton of posts in the Cloud Resume Challenge Discord about CORS errors and was trying my darndest to get the backend set up to avoid them before my first “terraform apply.” However, that was just not in the cards: I immediately got the error.

This was by far the most challenging part of the project for me. I tried so many things. At first, I thought my Terraform code wasn’t setting up CORS right; I just knew I had made a mistake somewhere. However, after going through every line of code having to do with API Gateway letter by letter for the third time, I started looking elsewhere. My savior ended up being a blog post that encouraged me to add CORS headers to API Gateway’s 4XX and 5XX error responses. As soon as I did, my CORS error turned into a 403 error: a different, although much less frustrating, problem.
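
In Terraform terms, that fix looked roughly like this (a sketch; `counter_api` is a placeholder for my actual REST API resource):

```hcl
# Attach a CORS header to API Gateway's default 4XX error responses
resource "aws_api_gateway_gateway_response" "cors_4xx" {
  rest_api_id   = aws_api_gateway_rest_api.counter_api.id
  response_type = "DEFAULT_4XX"

  response_parameters = {
    "gatewayresponse.header.Access-Control-Allow-Origin" = "'*'"
  }
}

# Same treatment for the default 5XX responses
resource "aws_api_gateway_gateway_response" "cors_5xx" {
  rest_api_id   = aws_api_gateway_rest_api.counter_api.id
  response_type = "DEFAULT_5XX"

  response_parameters = {
    "gatewayresponse.header.Access-Control-Allow-Origin" = "'*'"
  }
}
```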

The 403 is a basic access-denied error. I spent more time than I would like to admit combing AWS documentation trying to solve this one. Embarrassingly, the fix was super simple and, true to form, hit me like a ton of bricks while reading yet another blog post: I was making the API call without the stage and resource path in my JavaScript. The moment I added them, the error cleared up, and I was ready to move on.
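
In other words, the difference was something like this (the API ID, stage, and path here are placeholders):

```javascript
// What I had: no stage or resource path, which API Gateway answers with a 403
// const apiUrl = "https://abc123.execute-api.us-east-1.amazonaws.com";

// What it needed: the stage ("prod") and resource path ("count") included
const apiUrl = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/count";
```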

I set up source control by storing my code on GitHub in two repos: one for the frontend and one for the backend. I’m glad I did this halfway through my Terraform journey. Not only did it break things up, it also gave me a good chance to practice GitHub operations.

The final step was to set up the CI/CD (continuous integration/continuous deployment) pipelines with GitHub Actions. This had me working in yet another language: YAML. Apparently, many people get frustrated with YAML’s pickiness about indentation, but VS Code made it pretty easy to see when I had an issue. After reading a number of articles, I was able to knock out a simple workflow. The biggest hurdle I had to jump came in the form of setting up the role for the workflow to assume in order to interact with AWS resources. I used IAM to set up GitHub as an identity provider and then edited the role’s trust relationship to solve the issue.
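
Here’s a trimmed sketch of what the frontend workflow ended up looking like (the bucket name, role ARN, and paths are placeholders):

```yaml
name: deploy-frontend
on:
  push:
    branches: [main]

permissions:
  id-token: write # required for the GitHub OIDC identity provider
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # Assume the IAM role set up via the GitHub identity provider
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::123456789012:role/github-actions-deploy
          aws-region: us-east-1
      - name: Sync site to S3
        run: aws s3 sync ./site s3://my-resume-bucket --delete
```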

I did a bit of trial and error with a ton of commits to get the pipeline working how I wanted it to, but soon enough I completed both of the pipelines and, with that, the project as a whole.

Final Thoughts:

I really enjoyed this project. Here is a link to the website if you'd like to see it. You can find the links to the GitHub repos there if you want to check out the code. I still have a few tweaks that I’d like to make, namely the HTML and CSS. I would like to continue to improve this project if I have time to revisit it in the future: increasing security, changing from a hit counter to an IP address counter to be a true visitor counter, implementing a true test environment separate from production. For now though, I think it is time to move on to a new project. If you have any ideas or suggestions for my next project or want to discuss this one, please leave a comment!
