CodyKal

AWS - The Cloud Resume Challenge

Background

I first heard about the Cloud Resume Challenge while scrolling through the r/devops subreddit. It looked like an interesting way to improve my AWS skills. I had recently earned the AWS Solutions Architect – Associate certification and was looking for a way to apply my newfound knowledge in practice.

Knowledge

SAM

It all started with AWS SAM and getting it set up correctly on my Ubuntu machine, which included installing the AWS CLI. I read through some of the SAM documentation and learned the commands to scaffold a simple serverless template. Unfortunately for me, I ran into a few issues, because SAM was using Python 3.9 while my system default on Ubuntu was 3.10. After a few Google searches and lots of typing, I got 3.9 set as my default Python and was up and going!
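For anyone curious, the bootstrap amounted to the standard SAM CLI commands from the docs (shown from memory, so treat this as a sketch):

sam init                 # scaffold the hello-world serverless template
sam build                # build the application
sam deploy --guided      # first deploy; prompts for stack name, region, etc.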


I added an S3 bucket to my template.yaml and created an index.html just to test things out. I found that I couldn't access my website; I was getting a 403 Forbidden error. After searching around, I found this was happening because I had not enabled public access to the bucket: there was no bucket policy attached to it in my template. I set up the bucket policy, deployed my template, and was able to read my simple "Hello World" in my web browser.
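Roughly, the relevant piece of template.yaml looked like this (resource names are illustrative, and newer AWS accounts may also need the bucket's public access block relaxed):

Resources:
  # Static-website bucket; name and settings are illustrative
  ResumeBucket:
    Type: AWS::S3::Bucket
    Properties:
      WebsiteConfiguration:
        IndexDocument: index.html

  # Without a public-read policy like this, requests come back 403 Forbidden
  ResumeBucketPolicy:
    Type: AWS::S3::BucketPolicy
    Properties:
      Bucket: !Ref ResumeBucket
      PolicyDocument:
        Statement:
          - Effect: Allow
            Principal: '*'
            Action: s3:GetObject
            Resource: !Sub '${ResumeBucket.Arn}/*'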

CloudFront and Route 53

Next, I looked into putting a CloudFront distribution in front of my site and pointing Route 53 at it. I ended up buying a domain name directly from Amazon and set up a hosted zone.

After setting up the hosted zone, I edited my template.yaml to point my Route 53 record at CloudFront. After fiddling around and hitting a few errors getting the hosted zones to cooperate, I learned that CloudFront has its own fixed hosted zone ID, Z2FDTNDATAQYW2, which is required when aliasing Route 53 to a distribution.
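In template.yaml that comes out to an alias record something like this (the domain and resource names are placeholders; Distribution is the CloudFront resource defined elsewhere in the template):

  # A-record alias from the domain to the CloudFront distribution.
  # Z2FDTNDATAQYW2 is CloudFront's own fixed hosted zone ID,
  # not the ID of your Route 53 hosted zone.
  SiteRecord:
    Type: AWS::Route53::RecordSet
    Properties:
      HostedZoneName: example.com.     # placeholder domain
      Name: example.com.
      Type: A
      AliasTarget:
        DNSName: !GetAtt Distribution.DomainName
        HostedZoneId: Z2FDTNDATAQYW2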

Certificates, Certificates, Certificates!

I moved on to creating a public certificate to secure traffic coming into my CloudFront distribution. I requested a public certificate through AWS Certificate Manager and verified it using a Route 53 CNAME record. I had quite a few issues getting this to work, and my SAM deployment kept getting rolled back. After searching online, I found I was missing something as simple as the certificate's Amazon Resource Name (ARN) in my distribution's properties, which was causing the deployment to fail. After including the ARN, the deployment succeeded and my website had an SSL certificate attached. But then I ran into another issue.
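The shape of the fix, sketched with placeholder names (note that a certificate used by CloudFront must be requested in us-east-1):

  # DNS-validated public certificate from ACM
  SiteCertificate:
    Type: AWS::CertificateManager::Certificate
    Properties:
      DomainName: example.com          # placeholder
      ValidationMethod: DNS

  Distribution:
    Type: AWS::CloudFront::Distribution
    Properties:
      DistributionConfig:
        # ...origins, behaviors, and other settings elided...
        ViewerCertificate:
          AcmCertificateArn: !Ref SiteCertificate   # the ARN I was missing
          SslSupportMethod: sni-only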

A technical issue

Every time I navigated to my domain name, I got a 504 error. Typing the CloudFront endpoint directly into the browser returned nothing either, yet navigating straight to the S3 bucket's website endpoint worked. Pretty confused, I decided to once again consult the internet for answers.

It turns out that S3 website endpoints do not accept HTTPS traffic from CloudFront; they only support HTTP. Users connecting to the website are still served over HTTPS, but the hop from CloudFront back to S3 is unencrypted. After changing the origin protocol policy in my template and redeploying, I was finally able to pull up my site using my domain name!
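The change is essentially one line in the distribution's origin config; a sketch using the same placeholder names as above (the website-endpoint hostname format also varies slightly by region):

        Origins:
          - Id: S3WebsiteOrigin
            # S3 *website* endpoints only speak plain HTTP,
            # so CloudFront must not attempt HTTPS to the origin
            DomainName: !Sub '${ResumeBucket}.s3-website-${AWS::Region}.amazonaws.com'
            CustomOriginConfig:
              OriginProtocolPolicy: http-only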

Lambda and API Gateway

After setting up CloudFront, I turned to Lambda and API Gateway. I started with a simple JavaScript <script> block in my HTML:

fetch('https://29sm874uv6.execute-api.us-west-2.amazonaws.com/Prod/helloworld')
    .then(response => response.json())
    .then((data) => {
        document.getElementById('usercount').innerText = data.count;
    });


This fetches the visitor count from my API and writes it into the 'usercount' element whenever a user loads the page. For now this was a bare-bones setup just to get things started, but I would revisit it later.

I then set up my DynamoDB table and deployed it with the CloudFormation template. Next, I decided to split my Lambda code into two functions that would both sit behind API Gateway, editing template.yaml to turn the base HelloWorld function that ships with SAM into two separate ones.
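A rough sketch of how that looked in template.yaml; the table, path, and handler names are illustrative, and the second function mirrors the first:

  VisitorTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: ID
        Type: String

  # One of the two functions split out of the SAM HelloWorld sample;
  # GetCountFunction looks the same but points at get_count.lambda_handler
  UpdateCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      CodeUri: backend/
      Handler: update_count.lambda_handler
      Runtime: python3.9
      Events:
        UpdateCount:
          Type: Api
          Properties:
            Path: /updatecount
            Method: get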

I started working on getting DynamoDB set up properly to have items that my JavaScript could pull from to update the visitor counter on my site. Here I ran into quite a few issues.

Problems

The next part of the Cloud Resume Challenge was probably the most difficult for me. I needed my Lambda function to increment the "visitor_count" attribute on the "Visitors" item in DynamoDB. I scraped through a lot of boto3 documentation and Stack Overflow in desperate search of answers, and it seemed like nothing I tried wanted to work. Eventually I found some helpful examples in the boto3 documentation that I was able to fit into my Python code, and it finally worked: the table updated every time I hit the API Gateway endpoint.
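The pattern that ended up working for me was DynamoDB's ADD update expression, which increments atomically. A sketch of the handler, with table and key names as illustrations rather than anything canonical:

import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('VisitorTable')   # illustrative table name

def lambda_handler(event, context):
    # ADD creates visitor_count if it's missing and increments it atomically
    table.update_item(
        Key={'ID': 'Visitors'},
        UpdateExpression='ADD visitor_count :inc',
        ExpressionAttributeValues={':inc': 1},
    )
    return {'statusCode': 200}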

The second function was much easier to set up, since I had learned a lot more about how boto3 interacts with DynamoDB. After writing the Python for Lambda, I had to wire it into my HTML. This part was harder, as I had very little JavaScript knowledge and didn't know how JSON worked. After a lot of trial and error (changing variables, changing my HTML, even completely changing the data type of my "visitor_count" item in DynamoDB) I kept hitting JSON-formatting errors. I finally found more information on JSON and applied it to my Python code, which formatted the data correctly for the JavaScript in my HTML. I got super excited when I saw the visitor counter updating on my page!
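For what it's worth, this JSON problem is a classic: boto3 hands DynamoDB numbers back as Decimal, which json.dumps refuses to serialize. A sketch of the read function along those lines (names illustrative; the CORS header is there so the browser's fetch is allowed through):

import json
import boto3

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('VisitorTable')   # illustrative table name

def lambda_handler(event, context):
    item = table.get_item(Key={'ID': 'Visitors'})['Item']
    # boto3 returns DynamoDB numbers as Decimal; cast before serializing
    count = int(item['visitor_count'])
    return {
        'statusCode': 200,
        'headers': {'Access-Control-Allow-Origin': '*'},
        'body': json.dumps({'count': count}),   # the JS reads data.count
    }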


GitHub Actions and CI/CD

The end was near! I got started setting up GitHub Actions, and finally did something I should have done a long time ago: using Git and getting a repo going on GitHub. I set up a .github/workflows folder and fiddled with the workflow template for a while. I had a few issues where file paths didn't resolve the way I wanted inside the runners; setting an explicit working-directory solved that. I also implemented some basic Python tests for my Lambda functions.
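The test job came out looking roughly like this (directory layout and action versions are illustrative; working-directory was the fix for my path problems):

name: Test backend

on: push

jobs:
  test:
    runs-on: ubuntu-latest
    defaults:
      run:
        working-directory: backend   # illustrative layout
    steps:
      - uses: actions/checkout@v3
      - uses: actions/setup-python@v4
        with:
          python-version: '3.9'
      - run: pip install pytest boto3
      - run: python -m pytest tests/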

I integrated my AWS resources with GitHub Actions and set things up so that if the Python tests passed, another action would deploy the infrastructure to AWS. That finished the backend deployment of my resources! Next, I got the workflow deploying my frontend website to my S3 bucket using Jake Jarvis's s3-sync-action. I tested it after building, and to my surprise it worked on the first try! I made this action depend on my backend actions completing successfully first.
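One way to express that ordering is a second workflow triggered by the first one finishing; a sketch assuming the backend workflow is named "Deploy backend" and the site lives in a frontend/ folder:

name: Deploy frontend

on:
  workflow_run:
    workflows: ["Deploy backend"]   # assumed workflow name
    types: [completed]

jobs:
  sync:
    runs-on: ubuntu-latest
    # only sync if the backend deploy actually succeeded
    if: ${{ github.event.workflow_run.conclusion == 'success' }}
    steps:
      - uses: actions/checkout@v3
      - uses: jakejarvis/s3-sync-action@master
        with:
          args: --delete
        env:
          AWS_S3_BUCKET: ${{ secrets.AWS_S3_BUCKET }}
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          SOURCE_DIR: frontend        # illustrative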

HTML and CSS

Finally, with the backend complete, I started making my website look pretty. I purchased a template from Colorlib that I liked and implemented it. I ended up messing with the HTML and CSS a LOT, trying to make everything look the way I wanted. So, so, so many <div>s to center.

After everything looked good, I pushed to my repo and it all looked amazing! I was so happy to take part in this challenge, even if I am a couple of years late. It taught me so much practical knowledge about AWS architecture and how CI/CD pipelines work. And I can't forget the Infrastructure as Code; lots of work went in there to make everything run correctly. Even a bit of web development!

If you made it all the way through my ranting, thanks for reading, and catch me at my website, codykall.com.
LinkedIn
GitHub
