The Cloud Resume Challenge

Overview

The Cloud Resume Challenge by Forrest Brazeal is a project framework that guides participants as they build a resume website fully hosted in the cloud, using many different cloud services along the way. This isn't a step-by-step tutorial that shows you how to do every part; it's more of an assignment that tells you what to implement next but not exactly how. It requires participants to be resourceful and make mistakes, and in the end it helps reinforce the theory they learned in the classroom.

Part 1: Cloud Website

In this section I set up my HTML/CSS, host it in S3, point a CDN at it, assign the CDN a custom domain name, and secure it with an SSL certificate.

HTML, CSS

My HTML, CSS, and JavaScript skills are enough to be dangerous, but I'm no front-end dev. I found a free template and spent admittedly way too much time customizing it the way I like before starting on the main purpose of this challenge: deploying and building on this solution in the cloud.

Set up MFA/IAM

I set up MFA using my phone on my root account and then created an IAM user for everyday work, since it's best practice not to use the root account.

I installed and set up AWS Vault on my local machine. AWS Vault is a tool to securely store and access AWS credentials in a development environment. AWS Vault stores IAM credentials in your operating system's secure keystore and then generates temporary credentials from those to expose to your shell and applications. It's designed to be complementary to the AWS CLI tools.
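As a quick sketch of how that flow looks (the profile name here is a stand-in):

```bash
# Store the IAM user's access keys in the OS secure keystore
aws-vault add my-user

# Run a command with short-lived credentials generated from them
aws-vault exec my-user -- aws sts get-caller-identity
```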

Static Website using S3

I wanted to set up IaC using AWS SAM now instead of later, to get practice using it to set up my services.

First I needed to install the SAM CLI.

Next I initialized a SAM template and then set the IAM policy.
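Roughly, the setup looked like this (the flags are illustrative; the interactive prompts work just as well):

```bash
# Install the SAM CLI, then scaffold a starter project
pip install aws-sam-cli
sam init --name cloud-resume --runtime python3.8 --app-template hello-world --package-type Zip
```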

Once that was created I changed into the new directory and ran sam build, but it had problems finding my python.exe:

  • Build Failed Error: PythonPipBuilder:Validation - Binary validation failed for python, searched for python in following locations : ['C:\Users\mleve\AppData\Local\Microsoft\WindowsApps\python.EXE', 'C:\Users\mleve\AppData\Local\Microsoft\WindowsApps\python3.EXE'] which did not satisfy constraints for runtime: python3.8. Do you have python for runtime: python3.8 on your PATH?
  • I had to install Python 3.8.10 and checkbox the setting to Add Python 3.8 to PATH which fixed the issue

Deploy SAM with 'aws-vault exec my-user --no-session -- sam deploy --guided'

  • I got another error: Error: Failed to create managed resources: An error occurred (InvalidClientTokenId) when calling the CreateChangeSet operation: The security token included in the request is invalid.
  • To resolve this I deleted my old access key, created a new one, and re-initialized my-user in aws-vault

Now if I go to CloudFormation (and switch to the correct region) I will see two new stacks created.

I wanted to create an S3 bucket using SAM, so I added the resource below to the Resources section of my template.yml file, re-ran sam build to rebuild the template, then ran 'aws-vault exec my-user --no-session -- sam deploy'. No need for the --guided part of the command, as those settings were saved to the samconfig.toml file.

  • Use this in the future to do both in one command: sam build && aws-vault exec my-user --no-session -- sam deploy

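The screenshot is gone, but a minimal sketch of what that Resources entry might look like (the logical name is mine):

```yaml
Resources:
  CloudResumeBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: cloud-resume-evenson
```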

The next step was to add onto this config (shown below), but I kept getting ACL access errors:

  • Bucket cannot have ACLs set with ObjectOwnership's BucketOwnerEnforced setting (Service: Amazon S3; Status Code: 400; Error Code: InvalidBucketAclWithObjectOwnership)
    • After some research it turns out "This is a legacy property, and it is not recommended for most use cases. A majority of modern use cases in Amazon S3 no longer require the use of ACLs, and we recommend that you keep ACLs disabled. For more information, see Controlling object ownership in the Amazon S3 User Guide."

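That screenshot didn't survive either; after dropping the ACL property, the bucket config might look roughly like this (index document name assumed):

```yaml
  CloudResumeBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: cloud-resume-evenson
      # No AccessControl/ACL property here -- ACLs stay disabled, which is
      # what the BucketOwnerEnforced object-ownership default expects
      WebsiteConfiguration:
        IndexDocument: index.html
```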

Now on the Properties tab I can see Static Web Hosting is enabled with this URL: http://cloud-resume-evenson.s3-website.us-east-2.amazonaws.com, but on the next build and deploy I got a new error: "API: s3:PutBucketPolicy Access Denied".

CDN

Since I wanted to use IaC to build as much of this as possible instead of using the web interface, I updated my template.yml file with a CloudFront distribution section and ran the combined sam build && sam deploy command to create the CDN. This created a CDN that I could use to access my S3 bucket through an auto-generated domain name that wasn't very pretty (https://d2yhamyo6y5t0x.cloudfront.net).
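For reference, a trimmed sketch of what that distribution section might look like, using the S3 website endpoint as a custom origin (the logical names and cache-policy choice are mine):

```yaml
  CloudResumeDistribution:
    Type: AWS::CloudFront::Distribution
    Properties:
      DistributionConfig:
        Enabled: true
        Origins:
          - Id: s3-website-origin
            DomainName: cloud-resume-evenson.s3-website.us-east-2.amazonaws.com
            CustomOriginConfig:
              OriginProtocolPolicy: http-only   # S3 website endpoints are HTTP-only
        DefaultCacheBehavior:
          TargetOriginId: s3-website-origin
          ViewerProtocolPolicy: redirect-to-https
          CachePolicyId: 658327ea-f89d-4fab-a63d-7e88639e58f6   # managed CachingOptimized policy
```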

Now that I could access the site both through the CDN and through the S3 link, I removed the public access I granted earlier on S3 so that the only way to access the web app is through the CDN.

DNS/HTTPS

To get a nicer looking domain name I went to Route 53 and bought a domain to use for my website.

I also wanted to secure the site, so I went to AWS Certificate Manager to create an SSL certificate for the domain I bought.

I added a Route53 RecordSet to my yaml file with my new domain name and hosted zone ID and pointed it at the CloudFront distribution that fronts the S3 origin.

I needed to allow my custom domain on the CloudFront distro, so I also added a CertificateManager section to the yaml file with the domain name I purchased, creating an SSL certificate for it.
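A sketch of those two additions, with example.com standing in for the real domain:

```yaml
  CloudResumeCertificate:
    Type: AWS::CertificateManager::Certificate
    Properties:
      DomainName: example.com            # stand-in for the domain I bought
      ValidationMethod: DNS

  CloudResumeDNSRecord:
    Type: AWS::Route53::RecordSet
    Properties:
      HostedZoneId: Z0123456789EXAMPLE   # hosted zone created with the domain
      Name: example.com
      Type: A
      AliasTarget:
        DNSName: !GetAtt CloudResumeDistribution.DomainName
        HostedZoneId: Z2FDTNDATAQYW2     # fixed hosted zone ID used for CloudFront aliases

  # The distribution's DistributionConfig also needs:
  #   Aliases: [example.com]
  #   ViewerCertificate:
  #     AcmCertificateArn: !Ref CloudResumeCertificate
  #     SslSupportMethod: sni-only
```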

Originally I created all my infra in us-east-2 and kept getting an error when trying to use SAM to attach a certificate to my CloudFront CDN. I tried creating my certificate in us-east-1 and 2; nothing seemed to work. I deleted my whole stack and restarted deploying everything in us-east-1 the next day and, lo and behold, it works now. (CloudFront only accepts ACM certificates issued in us-east-1, which explains the earlier failures.) IaC already coming in handy.

Part 2: Serverless API

In this section the challenge is to set up the infrastructure so that later I can add JavaScript to the front end to keep track of the number of visitors. To do this I set up a DynamoDB table to store the number of visits, created a POST route on my API Gateway, and wrote some Python in a Lambda function to update the database.

Database

Next I created a DynamoDB table. Since DynamoDB has no real schema, I just needed to define the key schema and its ID attribute.
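A minimal sketch of that table resource (the table and attribute names here are mine):

```yaml
  VisitorCounterTable:
    Type: AWS::DynamoDB::Table
    Properties:
      TableName: visitor-count
      BillingMode: PAY_PER_REQUEST
      AttributeDefinitions:
        - AttributeName: ID
          AttributeType: S
      KeySchema:
        - AttributeName: ID
          KeyType: HASH
```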

I'll test the table once I set up the API.

API

SAM already creates an API Gateway by default, so I just needed to define a POST method on the /visit route.
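In SAM that's just an Events entry on the function; a sketch (paths and names are illustrative):

```yaml
  VisitorCounterFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: visitor_counter/
      Handler: app.lambda_handler
      Runtime: python3.8
      Events:
        VisitPost:
          Type: Api
          Properties:
            Path: /visit
            Method: post
```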

For the Lambda function I used Python since I'm familiar with it.

  • I used the get_item method to return the number of visits from the database
  • I then incremented the counter by 1 and used the put_item method to update the value in the database
  • Finally, in the return section, I return statusCode 200 with the visit_count in the body. I also set the CORS headers to allow all origins so the browser doesn't block the request, since I didn't implement stricter CORS for this project. A sketch of the handler is in the Python section below.

Python

I followed an example written in Golang and translated it to Python.
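Something along these lines, assuming the table and key names from the sketches above:

```python
import json

import boto3

# Hypothetical table name -- adjust to match the template
table = boto3.resource("dynamodb").Table("visitor-count")


def lambda_handler(event, context):
    # Fetch the current count (defaults to 0 if the item doesn't exist yet)
    response = table.get_item(Key={"ID": "visit_count"})
    visit_count = int(response.get("Item", {}).get("visit_count", 0))

    # Increment and write the new value back
    visit_count += 1
    table.put_item(Item={"ID": "visit_count", "visit_count": visit_count})

    # Return 200 with the count; CORS left wide open for this project
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"visit_count": visit_count}),
    }
```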

I went to run a test event in Lambda and got an error. After some research, it seems the issue is with requests/urllib3 support on Python versions >=3.7 and <3.10: newer urllib3 releases require a newer OpenSSL than those Lambda runtimes ship with. I could either pin "requests >= 2.28.2, < 2.29.0" in my requirements.txt, which keeps a urllib3 that works with the older OpenSSL, or upgrade my Python dependency from 3.8 to 3.10 in my yaml file and on my local machine. I chose to update my requirements.txt and that fixed the issue.

Part 3: Front End / Back End Integration

In this section I added some JavaScript to invoke the API Gateway which will trigger the Lambda function and update the count in my database. I also added some unit tests to my Python code.

JavaScript

This next section looks simple but took more time than I thought. I created an AJAX function to call my API endpoint and then update the count in the HTML. At first I didn't realize I needed the jQuery script tag to run the AJAX code. Then I accidentally gave the div and the span the same id, so my "Visitors:" label was also getting removed when the AJAX ran.
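The end result looked something like this (the endpoint URL and element id are placeholders):

```javascript
// Requires the jQuery <script> tag to be loaded first
$.ajax({
  type: "POST",
  url: "https://abc123.execute-api.us-east-1.amazonaws.com/Prod/visit",
  success: function (data) {
    // jQuery parses the JSON body, so pull the number out of the
    // {visit_count: N} object -- inserting the object itself renders
    // as [object Object]
    $("#visit-count").text(data.visit_count); // span id, separate from the div's id
  },
});
```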

Tests

I created a simple unit test that checks for a 200 response code from the Lambda function and that the visit count comes back in the body as a number, as expected.
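A minimal version of that test, assuming the handler lives in app.py:

```python
import json

from app import lambda_handler  # hypothetical module name


def test_lambda_handler_returns_numeric_count():
    # Note: this exercises the real handler, so it hits the real table
    response = lambda_handler({}, None)

    # The function should always answer with HTTP 200
    assert response["statusCode"] == 200

    # ...and the body should carry a numeric visit count
    body = json.loads(response["body"])
    assert isinstance(body["visit_count"], int)
```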

I had to update my JavaScript to parse this object: before, I was sending the raw number, but now I was sending {visit_count: 60}, which displayed as [object Object] in my HTML until the fix.

Part 4: Infrastructure as Code and CI/CD

In this section you were supposed to set up the IaC, but I had been building it from the beginning, and I feel like that helped me incrementally iterate on the template since IaC is a new topic for me. I also set up a CI/CD pipeline to test, build, and deploy my infra on each commit. I did this with GitHub Actions because I was more familiar with it from previous projects and hadn't gotten a chance to use the CodePipeline service AWS provides.

Infrastructure as Code

I've been using AWS SAM, which under the hood builds on CloudFormation, since the beginning. This has come in extremely handy but also caused some headaches. There were times when I needed a fresh restart of my stack to get back to a point where I knew the code was working; here IaC was very helpful, as doing it all manually would have been a pain. On the other hand, there were times when I didn't fully understand why I was getting errors, mainly in CloudFront when trying to update my infra. This had me chasing down some pretty nondescriptive bugs and, in the end, deleting the stack and starting over, adding one piece at a time until I could determine what was causing the error. This was especially true when I was trying to add my Lambda function using SAM: since the error message was generic, I had to re-deploy piece by piece until I narrowed it down to an incorrect CodeUri path.

CI/CD

First I set up GitHub Actions by creating .github/workflows/main.yml in the root directory of my project. I added an access key and secret access key from my IAM user to GitHub Actions secrets so I can reference these variables in my main.yml file.

The first test I automated was the python lambda code.

Next I added a job that depends on the first one passing and runs sam build and sam deploy to deploy the infra.

Finally, in the last block, if the previous ones pass, I automate the front-end upload by having it sync the static files to the S3 bucket.

It's failing in the build and deploy infra block.


Adding --use-container to the sam build step solved this and now it works.
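For reference, a trimmed sketch of what the final workflow might look like (job and path names are mine; it assumes the SAM CLI is available on the runner, which GitHub's ubuntu image provides):

```yaml
# .github/workflows/main.yml
name: deploy
on:
  push:
    branches: [main]

env:
  AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
  AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
  AWS_DEFAULT_REGION: us-east-1

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: "3.8"
      - run: pip install pytest boto3
      - run: pytest

  deploy-infra:
    needs: test
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: sam build --use-container
      - run: sam deploy --no-confirm-changeset --no-fail-on-empty-changeset

  deploy-site:
    needs: deploy-infra
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - run: aws s3 sync ./site s3://cloud-resume-evenson
```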

Conclusion

As someone new to the cloud this project definitely lived up to the "Challenge" part of the name. I want to give credit to the YouTube channels "Open Up The Cloud" and "Cumulus Cycles" because without them this journey would have been much longer and more frustrating. Overall I really enjoyed this project. I feel like I grew a lot as a developer and gained confidence in my cloud skills. I plan to keep this project updated to display for years to come.
