Completing the Cloud Resume Challenge with 0 Real-world Experience

Introduction

This is my first blog post, covering my experience with the Cloud Resume Challenge: 16 steps ranging from programming, to workflow management, to API creation.

The purpose of the challenge is to learn and showcase common cloud engineering skills needed in the workforce. The challenge itself provides very limited information, just general guidance; it was up to me to come up with the solutions necessary to complete it.

As of writing this, I work in cybersecurity and have been in IT for a few years, but with zero professional cloud experience. I've dabbled in AWS a bit, but have no real projects to speak of.

This is a brief post going over my experience with the challenge.

Note: The headers are numbered to match the steps as presented in the challenge, even though I tackled some of them out of order.

1. Certification

The first step is passing the AWS Cloud Practitioner exam. By the time I started the challenge I was already certified and was working towards my Solutions Architect Associate (which I now hold).

2. HTML

I hesitated to start the challenge mostly because of the front-end development section; I've done a good amount of Python and C# (Unity game development) in the past, but not a lot of front-end work.

I essentially took my resume, which existed as a PDF, and transformed it into HTML and CSS. This process was time-consuming to say the least, as it's mostly trial and error getting elements to line up properly.

3. CSS

As with the HTML, the CSS needs to be structured so that elements align properly. The biggest challenge was getting margins right and moving elements to the right side of the page without breaking everything else. Again, this was mostly trial and error.

4. Static Website

This was an easy step, as I'd done it before while training for AWS certifications: upload the HTML and CSS to an S3 bucket. I also named the bucket gabecalvin.com, so I wouldn't need to recreate it later when I bought my domain.
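For reference, the upload itself is only a couple of boto3 calls. This is a minimal sketch; the file names are just stand-ins for whatever your site uses:

```python
import boto3

s3 = boto3.client("s3")

# ContentType matters: without it the browser may download the file
# instead of rendering it.
s3.upload_file("index.html", "gabecalvin.com", "index.html",
               ExtraArgs={"ContentType": "text/html"})
s3.upload_file("style.css", "gabecalvin.com", "style.css",
               ExtraArgs={"ContentType": "text/css"})
```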

5. HTTPS

Unlike S3, I had never used CloudFront before, and it was fun figuring out how to get a certificate and link everything together. I registered a domain on Route 53, set up CloudFront with an origin pointing to the S3 bucket, and configured an alias record routing gabecalvin.com to CloudFront. By the end, https://gabecalvin.com would route from Route 53, to the nearest CloudFront edge location, to my static website on S3. It's also worth noting that the TLS certificate for my website was a public certificate issued by AWS Certificate Manager (ACM), so it was free of charge.
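For the curious, the certificate request boils down to a single ACM call. This is a sketch rather than my exact setup:

```python
import boto3

# CloudFront only accepts ACM certificates issued in us-east-1.
acm = boto3.client("acm", region_name="us-east-1")

cert = acm.request_certificate(
    DomainName="gabecalvin.com",
    ValidationMethod="DNS",  # proved via a CNAME record in Route 53
)
print(cert["CertificateArn"])
```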

6. DNS

As discussed in the HTTPS section, I used Route 53 as suggested, with the domain gabecalvin.com, and pointed an alias record at the CloudFront distribution.
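The same record can be created programmatically. A minimal sketch, where the zone ID and distribution domain are placeholders:

```python
import boto3

route53 = boto3.client("route53")

# Z2FDTNDATAQYW2 is the fixed hosted zone ID AWS assigns to all
# CloudFront distributions; the other values are placeholders.
route53.change_resource_record_sets(
    HostedZoneId="YOUR_ZONE_ID",
    ChangeBatch={
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "gabecalvin.com",
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": "Z2FDTNDATAQYW2",
                    "DNSName": "dxxxxxxxxxxxxxx.cloudfront.net",
                    "EvaluateTargetHealth": False,
                },
            },
        }]
    },
)
```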

8. Database

I decided to work backwards and start with the back-end programming before touching the JavaScript for my visitor counter. The first step was creating a DynamoDB table with an item to hold the visitor count, which I made a simple integer.
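A minimal sketch of that setup with boto3 (the table, key, and attribute names are illustrative, not necessarily what's in my stack):

```python
import boto3

dynamodb = boto3.resource("dynamodb")

table = dynamodb.create_table(
    TableName="VisitorCounter",
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

# Seed the counter item with a starting value of 0.
table.put_item(Item={"id": "visitorCounter", "visits": 0})
```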

10. Python

Now we jump to the Python code that queries the database and returns the visitor count to the client. This code went through multiple revisions later for optimization, but the basic principle stayed the same: use the boto3 library (the AWS SDK for Python) to access AWS resources, in this case the DynamoDB table we just created.
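A first-pass handler might look something like this (reusing the illustrative names from above):

```python
import json
import boto3

table = boto3.resource("dynamodb").Table("VisitorCounter")

def lambda_handler(event, context):
    # Read the current count, bump it, write it back.
    # NOTE: read-then-write like this can race under concurrent
    # visits; a single atomic update_item (sketched near the end of
    # this post) avoids that.
    item = table.get_item(Key={"id": "visitorCounter"})["Item"]
    visits = int(item["visits"]) + 1
    table.put_item(Item={"id": "visitorCounter", "visits": visits})
    return {
        "statusCode": 200,
        "body": json.dumps({"visits": visits}),
    }
```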

9. API

The API bridges the JavaScript on the page and the Python Lambda function. It's pretty simple to set up, and once it's in place I can send a GET request to the API endpoint and watch the visitorCounter value go up.
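Checking it from a script is a one-liner with the requests library; the endpoint URL and path here are placeholders:

```python
import requests

# Placeholder endpoint; the real one comes from API Gateway.
resp = requests.get("https://abc123.execute-api.us-east-2.amazonaws.com/visits")
print(resp.status_code, resp.json())
```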

7. JavaScript

Back on the front-end, I needed some JavaScript to send a GET request to the API endpoint when the page loads. This is where I hit my first hurdle. The code itself is quite simple, and there are 20 different ways to write it. However, the request would sometimes get blocked, and I eventually tracked this down to CORS (cross-origin resource sharing). I modified the CORS configuration to allow my requests and the issues went away. I tied the JavaScript to the HTML page to display the counter, and the website was complete.
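If you hit the same wall: the fix lives on the server side, not in the JavaScript. I allowed my origin in the API's CORS settings, but you can also return the header straight from the Lambda. A minimal sketch of the latter:

```python
def lambda_handler(event, context):
    return {
        "statusCode": 200,
        # Without this header the browser refuses to hand the response
        # to the page's JavaScript. Prefer your exact origin over "*".
        "headers": {"Access-Control-Allow-Origin": "https://gabecalvin.com"},
        "body": '{"visits": 123}',
    }
```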

11. Tests

This step made little sense to me for this project. I understand unit and integration tests, but for a simple Python function that just increases a value by one, unit tests seemed silly. Instead, I created a complete integration test that reads the current value from the database, sends a GET request to the API, and validates the result, then undoes the increment as a cleanup step. It works well, and I set it up as a Lambda function on AWS with EventBridge and SNS, so it now runs every week and emails me a success or fail message.
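In outline, the test looks like this (endpoint and table names reuse the earlier placeholders; note that requests has to be bundled into the deployment package, since it's not in the Lambda base runtime):

```python
import boto3
import requests

API_URL = "https://abc123.execute-api.us-east-2.amazonaws.com/visits"  # placeholder
table = boto3.resource("dynamodb").Table("VisitorCounter")

def lambda_handler(event, context):
    # 1. Read the current count straight from the database.
    before = int(table.get_item(Key={"id": "visitorCounter"})["Item"]["visits"])

    # 2. Hit the real API, which should bump the counter by exactly one.
    after = requests.get(API_URL, timeout=10).json()["visits"]

    # 3. Validate, then undo the increment as cleanup.
    assert after == before + 1, f"expected {before + 1}, got {after}"
    table.update_item(
        Key={"id": "visitorCounter"},
        UpdateExpression="ADD visits :n",
        ExpressionAttributeValues={":n": -1},
    )
    return {"statusCode": 200, "body": "integration test passed"}
```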

Around the same time I created another weekly function that emails me a different report: how many new visitors the website received that week.

12. Infrastructure as Code

It took me a couple of days to really make progress with IaC, mostly due to some confusing documentation around AWS SAM (Serverless Application Model). I even contemplated switching to plain CloudFormation before going back.

Mostly due to the lack of proper documentation, it was difficult to get the HttpApi syntax correct; the YAML is very picky about CORS configuration on an HttpApi. Luckily I found a GitHub issue that allowed me to troubleshoot my template.
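For anyone fighting the same syntax, the shape the SAM spec expects is roughly this; the resource name is illustrative and your origins will differ:

```yaml
Resources:
  VisitorApi:
    Type: AWS::Serverless::HttpApi
    Properties:
      CorsConfiguration:
        AllowOrigins:
          - "https://gabecalvin.com"
        AllowMethods:
          - GET
        AllowHeaders:
          - Content-Type
```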

13. Source Control

I'd used GitHub very little in the past, so I was excited to finally learn a tool that has been a staple of the dev and open-source scene for so long. I created two repositories: one for the front-end and one for the back-end.

15. CI/CD (Front end)

I started with the front-end, as it seemed easier and would let me learn how GitHub Actions work (which, by the way, are super cool!).

Since I'm in cybersecurity, giving GitHub access to AWS concerned me. Obviously, storing long-lived access keys in the workflow file itself was out of the question. AWS and GitHub generally recommend storing keys in GitHub Secrets and referencing them from the workflow. That's reasonably secure, but I wanted to take it one step further: I created an IAM role and configured it against GitHub's identity provider using OpenID Connect (OIDC). Now the workflow assumes the role and receives temporary credentials.

The actual workflow is pretty simple: it just copies the site files from GitHub to the S3 bucket.
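Put together, the whole workflow is short. This sketch uses the standard aws-actions credentials action; the role ARN and region are placeholders:

```yaml
name: deploy-frontend
on:
  push:
    branches: [main]

permissions:
  id-token: write   # required for the OIDC token exchange
  contents: read

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::123456789012:role/github-deploy  # placeholder
          aws-region: us-east-2
      - run: aws s3 sync . s3://gabecalvin.com --delete --exclude ".git/*"
```

The nice part of the OIDC approach is that there are no long-lived secrets to rotate or leak; the credentials expire on their own.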

14. CI/CD (Back end)

This is where I went off course, and where the majority of my time on this project went. On the surface it's pretty simple: just use SAM within the GitHub Action (which runs on an Ubuntu VM) like you would locally. That part is true, and the deployment isn't the issue... it's the testing.

The goal is to run the integration test I created earlier, prior to deployment. That's a problem because the test requires the infrastructure to exist: I can't test the API without an API, and likewise with the database and Lambda function.

Luckily, SAM has a few features for testing your API and functions locally. The documentation is clear on this: `sam local start-api` spins up a local endpoint for testing the API, and `sam local invoke` runs your Python functions directly.

There were two primary issues. The first was the database: SAM doesn't currently support running a DynamoDB table locally, so I had to rely on Docker to create one.
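The amazon/dynamodb-local image fills that gap; the test setup just points boto3 at the local endpoint. A sketch with the same illustrative names as before:

```python
import boto3

# Start the local table first, e.g.:
#   docker run -d -p 8000:8000 amazon/dynamodb-local
# The SDK requires credentials, but DynamoDB Local doesn't validate them.
dynamodb = boto3.resource(
    "dynamodb",
    endpoint_url="http://localhost:8000",
    region_name="us-east-2",
    aws_access_key_id="fake",
    aws_secret_access_key="fake",
)

table = dynamodb.create_table(
    TableName="VisitorCounter",
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    ProvisionedThroughput={"ReadCapacityUnits": 1, "WriteCapacityUnits": 1},
)
table.wait_until_exists()
table.put_item(Item={"id": "visitorCounter", "visits": 0})
```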

The second issue was networking. Because the Lambda functions run inside a Docker container, and because of the way `sam local start-api` works, I couldn't get my Python function (the integration test from earlier) to communicate with the API, even though it could reach the Docker database just fine.

After banging my head against the wall for a week, I wanted someone else's opinion and created an issue on the SAM CLI GitHub repository. Though I learned a lot, I could never get the GET request to go through properly. Having worked the issue for many hours, my guess is that it comes down to how `sam local start-api` works, since I can call `sam local start-lambda` without issue; the first tests the API layer, whereas the second just calls the app directly.

If you follow the issue I linked, you can see I eventually came to a realization: I didn't need SAM for this. Why did my integration test have to run in SAM at all? I only need a deployed Lambda function for the primary app, not for the test. So I switched to running the test as a plain local Python script, with very few issues. In fact, a local Python script is more efficient than spinning up a Lambda function through SAM. Within the workflow I could now send a GET request to the local API and everything worked as expected. I also made the script raise a Python error when the integration test fails, which stops the deployment.
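The shape of that local runner, in sketch form (the endpoint path is illustrative):

```python
import sys
import requests

# `sam local start-api` serves on port 3000 by default. Running the
# test as a plain script on the runner host sidesteps the container
# networking problem entirely.
LOCAL_API = "http://127.0.0.1:3000/visits"

def main() -> int:
    resp = requests.get(LOCAL_API, timeout=10)
    if resp.status_code != 200:
        print(f"integration test failed: HTTP {resp.status_code}")
        return 1  # nonzero exit fails the workflow step, blocking the deploy
    print("integration test passed:", resp.json())
    return 0

if __name__ == "__main__":
    sys.exit(main())
```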

Here is an example of the output:

*(screenshot of the workflow output)*

Finally, I modified the original function to create the VisitorCounter item if it doesn't exist, so that deploying a fresh stack works end to end. I originally tested the stack in Virginia (us-east-1) and migrated to Ohio (us-east-2) once everything was working. Now I can push a new commit and the stack updates itself.
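One tidy way to get that create-if-missing behavior, sketched here rather than copied from my final code: DynamoDB's ADD action in `update_item` creates the item, and the attribute, if they don't exist yet.

```python
import boto3

table = boto3.resource("dynamodb").Table("VisitorCounter")

def lambda_handler(event, context):
    # ADD treats a missing attribute as 0 and creates the item if it
    # isn't there, and the update is atomic, so this also removes the
    # read-then-write race from the first version of the handler.
    response = table.update_item(
        Key={"id": "visitorCounter"},
        UpdateExpression="ADD visits :one",
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    visits = int(response["Attributes"]["visits"])
    return {"statusCode": 200, "body": f'{{"visits": {visits}}}'}
```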

16. Blog Post

We've now reached the end. I learned a lot through this challenge and had a lot of fun. I have plenty of plans for my website, though I'll keep the resume section the same for the sake of the challenge: I'd like to build a better front-end with a separate page for the resume, and maybe create my own blog. I also need to clean up my code a bit.

Thanks for reading!
