Gain valuable hands-on experience with AWS! - The Cloud Resume Challenge

A Very Compelling Introduction

A few months ago I attained the AWS Solutions Architect certification. What I was unknowingly doing was completing step one of the Cloud Resume Challenge, issued by A Cloud Guru's Forrest Brazeal. You can find out everything about the challenge here: Cloud Resume Challenge. Essentially, the challenge is to get an AWS cert, turn your resume into a website, and then learn to utilize awesome AWS services, GitHub, and very handy DevOps principles.

The components are as follows:
(Follow the 'count')

  1. Website with a visitor count
  2. JavaScript fetches the visitor count from your API
  3. Your API triggers a Lambda function
  4. The Lambda function gets and updates the count, which is stored in a DynamoDB table
  5. The backend infrastructure is written as code, stored in a GitHub repository, and auto-deployed on code updates with GitHub Actions
  6. The frontend code also lives in a GitHub repository that updates your website when you push changes, again with GitHub Actions

Below are all of the steps of the challenge, but in the order I did them.

1. Certification - AWS Solutions Architect

Studying, learning, memorizing. That's pretty much the gist of this step. Beyond that, I will say that I decided to get this certification because it was time for me to specialize further, and I've always been intrigued by the power of AWS and other cloud services. Also, what better time to study and get certs than when you can't go outside (the current year is 2020)? There are tons of resources online that can prepare you for this cert, but I recommend finding some sort of practice test that you can be confident with.

2. HTML, 3. CSS, 4. S3 Static Website

I did something a little secret for this step. I found something I liked that was already made, copied and pasted it, and changed it to fit my own resume. Feel free to use this concept for your own projects, but keep it on the hush-hush. I did add a few things like my cert badges, and a shine effect when you mouse over them. Once I had my resume as an index.html, creating a bucket in S3 and enabling static website hosting on the bucket is very straightforward, and there are plenty of easy guides for it. On to step 5 with a bucket that can be accessed through a long AWS S3 URL.
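
I did all of this through the console, but just to make the idea concrete, here is a rough boto3 sketch of the same setup (the bucket name is made up, and you'd still need a bucket policy allowing public reads, which I've left out):

```python
import boto3

BUCKET = "my-resume-site-bucket"  # hypothetical name; yours must be globally unique

s3 = boto3.client("s3", region_name="us-east-1")

# Create the bucket (in us-east-1 no LocationConstraint is needed)
s3.create_bucket(Bucket=BUCKET)

# Turn on static website hosting and point it at index.html
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Upload the resume with the right content type so browsers render it as a page
s3.upload_file(
    "index.html", BUCKET, "index.html",
    ExtraArgs={"ContentType": "text/html"},
)
```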

5. HTTPS, 6. DNS

This step makes it so you can type a custom domain name into a browser and get back the site that's in your bucket. Once I was done trying to fit different domain names into my name (jam.es 😔 nah; james.sir, if only I were knighted), I settled on jamessirchia.com like a boring person and bought it through Route 53. Next, because you need HTTPS, you need a certificate, and AWS has their own Certificate Manager for that. This was the first time I did something wrong in the challenge: the certificate has to be in a specific region (us-east-1) for CloudFront to use it. Next was to create a CloudFront distribution that would serve my site. After googling all the settings to be sure, it went smoothly. After tinkering to make sure I could get to my site without www, and making sure it was forcing HTTPS, I was off to #7.
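
Just to make the region gotcha concrete, here is a quick boto3 sketch of the certificate request (the domain is mine; swap in your own):

```python
import boto3

# Certificates that CloudFront uses must be requested in us-east-1,
# no matter where the rest of your resources live.
acm = boto3.client("acm", region_name="us-east-1")

response = acm.request_certificate(
    DomainName="jamessirchia.com",
    SubjectAlternativeNames=["www.jamessirchia.com"],
    ValidationMethod="DNS",  # validated by adding a CNAME record in Route 53
)

print(response["CertificateArn"])  # attach this ARN to the CloudFront distribution
```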

7. Database

At this point I needed a database so I could start with my Lambda function that would be using the number stored in it. I clicked create table, clicked create item, and clicked save. Then I clicked 1,072 more times off to the side of the console just to get the clicks out of my system before I started doing this with Infrastructure as Code in step 12.
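
For reference, here is roughly what those console clicks amount to in boto3 (the table name, key, and item are placeholders, not necessarily what I used):

```python
import boto3

dynamodb = boto3.resource("dynamodb", region_name="us-east-1")

# Create a tiny table whose only job is to hold the visitor count
table = dynamodb.create_table(
    TableName="visitor-count",  # placeholder name
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Seed the single item the Lambda function will increment
table.put_item(Item={"id": "visitors", "count": 0})
```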

8. Python, 11. Tests

I won't describe my Lambda line for line, but it essentially uses Boto3 to make my DynamoDB table a variable in the code, and once it's there you can do things to your table, like update an item in it. I want this code to take whatever number is in my DB, increment it by 1, and return this number to the API that doesn't exist yet. Once I got it working with the database I actually went on to the API step, all the way to Infrastructure as Code, but I'll put the Tests here anyways. Making the test was harder for me than making the actual Lambda. I researched unittest, moto and mock, what "if __name__ == '__main__':" is, and so on. I had everything set up and in my head it was correct, but it wasn't working (Spaghetti Code 101). Then I realized I hadn't even tested whether the fake DynamoDB table was actually working right, and from there I went back piece by piece to make sure the parts of the test were actually working, and I ended up with a fully functional unit test.
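
Here is a stripped-down sketch of the idea, not my exact code: the handler atomically adds 1 to the stored item and returns the new value (module, table, and key names are placeholders):

```python
# visitor_counter.py -- hypothetical module name
import boto3

# The table becomes a variable in the code, like I described above
table = boto3.resource("dynamodb").Table("visitor-count")  # placeholder table name

def lambda_handler(event, context):
    # ADD atomically increments the stored number by 1, and
    # ReturnValues hands us back the updated value
    response = table.update_item(
        Key={"id": "visitors"},
        UpdateExpression="ADD #c :one",
        ExpressionAttributeNames={"#c": "count"},  # 'count' is a DynamoDB reserved word
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return int(response["Attributes"]["count"])
```

And a sketch of the moto-style unit test, assuming the module above. Depending on your moto version the decorator is mock_aws (moto 5+) or one of the older per-service decorators like mock_dynamodb:

```python
import os
import unittest

import boto3
from moto import mock_aws  # older moto versions: from moto import mock_dynamodb

# Give boto3 a region and fake credentials so nothing touches real AWS
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")
os.environ.setdefault("AWS_ACCESS_KEY_ID", "testing")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "testing")


@mock_aws
class TestVisitorCount(unittest.TestCase):
    def setUp(self):
        # Build a fake table that looks like the real one
        dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
        table = dynamodb.create_table(
            TableName="visitor-count",
            KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
            AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
            BillingMode="PAY_PER_REQUEST",
        )
        table.put_item(Item={"id": "visitors", "count": 0})

    def test_increment(self):
        # Import inside the test so the module's boto3 resource is created
        # while the mock is active -- this was my spaghetti-code lesson
        import visitor_counter
        self.assertEqual(visitor_counter.lambda_handler({}, None), 1)
        self.assertEqual(visitor_counter.lambda_handler({}, None), 2)


if __name__ == "__main__":
    unittest.main()
```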

9. API

At this point in the challenge, for this step, I went to API Gateway in the console and clicked my heart out for the last time. I made an API, connected it to my Lambda function, and saw the number I wanted when I went to my API URL, even though it wasn't just the number by itself yet. This API doesn't really matter because one gets remade in step 12. Before this step I googled things about REST APIs and methods like POST vs. GET.
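
If you want a quick sanity check of the endpoint outside the browser, a couple of lines of Python will do it (the URL is a made-up example of what API Gateway hands you, not my real one):

```python
import requests  # third-party library: pip install requests

# Placeholder URL -- API Gateway gives you one in roughly this shape
API_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/Prod/count"

# Hitting the endpoint triggers the Lambda, which increments and returns the count
response = requests.get(API_URL)
print(response.status_code, response.text)
```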

10. Javascript

Since the JavaScript is being used to call the API, I didn't bother with it until this point. I looked at the different ways to call an API for this step, from older things like XMLHttpRequest to newer ways like Fetch and jQuery. This step wasn't too bad, as it's a well-documented thing to do. After testing it, and fishing out just the number I wanted, I was able to see the number that was in the database on my website, and the number incremented every visit thanks to the Lambda function.

12. Infrastructure as Code

This was definitely the coolest part of the challenge so far. Leading up to this requirement, all the other steps like making the DynamoDB table, Lambda function, and API were just learning and testing moments, because all three get remade in this step. Instead of using CloudFormation to describe and create these resources as text, I used AWS's Serverless Application Model (SAM). If you are going to deploy serverless AWS resources, SAM is the way to go. You still write a YAML file like other IaC methods, but it requires fewer details and handles more things for you. For example, when you declare that you want a Lambda function, you describe where the code for the function is, the runtime of the function, and a few other things like that. But then you can also say, "slap an API on this sucker, and here's the path and the method." From that, it makes your Lambda function and an API that calls the function on the method you want. It was really fun to learn. At this point I also retooled my Lambda function to return an API response with headers.

Special thanks to Chris Nagy and his SAM HelloWorld Tutorial, which is a great jumping-off point.
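
Here is roughly what that "return an API response with headers" retooling looks like in the handler: API Gateway's proxy integration expects a dict with a status code, headers, and a string body, and the CORS header is what lets the browser on my domain read the response (names and values here are placeholders, not my exact code):

```python
import json

import boto3

table = boto3.resource("dynamodb").Table("visitor-count")  # placeholder table name

def lambda_handler(event, context):
    response = table.update_item(
        Key={"id": "visitors"},
        UpdateExpression="ADD #c :one",
        ExpressionAttributeNames={"#c": "count"},
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["count"])

    # The shape API Gateway's proxy integration expects back from Lambda
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            "Access-Control-Allow-Origin": "*",  # could be locked down to your domain
        },
        "body": json.dumps({"count": count}),
    }
```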

13. Source Control

This step was about making a frontend and a backend repo in GitHub. I had actually done this earlier in the process, and I also had the repos hooked up to VS Code, which is nice to work in.

14. CI/CD Backend

This is where the fun automation happened. For this I used GitHub Actions. I made an action for my backend repo, so that whenever a change was pushed to it, it would run and deploy those changes to my AWS environment. There is a big marketplace (with free things) for GitHub Actions where people have made it easy to do things like running SAM CLI commands. It was fun to be able to tell GitHub: hey, set up an Ubuntu environment with Python and such, run my unit test, and deploy my SAM code to be actual infrastructure out in the world. I enjoyed finding what I needed to set up the right environment, and the different things I could use. I'm definitely going to keep practicing with CI/CD tools like GitHub Actions and Jenkins, because I want my next job to involve the use of those.

15. CI/CD Frontend

Much like the last step, but for the frontend instead. Rather than creating an environment to deploy SAM, it's about pushing your changes to the S3 bucket from step 4 and invalidating the CloudFront distribution, so it serves the new content instead of the old. This step was very straightforward and easily googlable. The two things that need to happen can be done with regular AWS CLI commands, so if you can get everything working for step 14, this one should be easier.
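
In my workflow these are plain AWS CLI commands; purely as an illustration, here are the same two operations in boto3 (bucket name and distribution ID are placeholders):

```python
import time

import boto3

BUCKET = "my-resume-site-bucket"    # placeholder
DISTRIBUTION_ID = "E1234567890ABC"  # placeholder

# 1. Push the updated page to the S3 bucket from step 4
s3 = boto3.client("s3")
s3.upload_file(
    "index.html", BUCKET, "index.html",
    ExtraArgs={"ContentType": "text/html"},
)

# 2. Invalidate the CloudFront cache so the new content gets served
cloudfront = boto3.client("cloudfront")
cloudfront.create_invalidation(
    DistributionId=DISTRIBUTION_ID,
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),  # any unique string works
    },
)
```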

16. Blog

I'm still working on this step, but once I click publish I should be about done.

This challenge was well worth it

You should absolutely take this challenge if you are interested in the technologies and tools used above. My recommendation is to take the scenic route. Once you find exactly what you need for a piece of the project, stay a while, poke around, and learn more about the subject. If you realize that your GitHub Action can just run on ubuntu-latest, learn about the other options, like a build matrix and what it can be used for. Once you have your SAM template, look into what else can be done with it, and what else could be made with CloudFormation.

All in all, I'm very glad I took the Cloud Resume Challenge and got hands-on experience with all of the technologies listed above. I highly recommend it!

Final product plug: jamessirchia.com
