I first laid eyes on Forrest Brazeal's Cloud Resume Challenge last year while searching for cloud-related projects after earning my AWS Solutions Architect Associate certification.
I saw Brazeal's personalized shoutouts to the people who completed the challenge and felt quite overwhelmed by their amazing websites and success stories. At the time, I thought I could never complete it. How was I supposed to work on a frontend and a backend requiring several skillsets in HTML, JavaScript, Python, testing, and all??!
However, my motivation to attempt the challenge surged last Friday, right before the weekend. I completed it in 3 grueling days, and I am quite proud and excited to see my CI/CD pipeline checks turn green as I write this post!
As of now, Forrest Brazeal's personalized code review offer has expired. Even so, completing this challenge is extremely helpful for learning the most important AWS cloud services inside out. The challenge guarantees that you will understand something about full-stack software development, version control, infrastructure as code, automation, continuous integration and delivery, cloud services and “serverless”, application security, and networking. And also, yes... Googling stuff. :)
Enough chitchat, let's move on to the technical stuff.
Cloud Resume Challenge Requirements:
1. Earn the AWS Cloud Practitioner certification.
2. The resume should be written in HTML.
3. The HTML resume should be styled with CSS.
4. Deploy the HTML resume to an AWS S3 bucket.
5. Serve the static S3 website through AWS CloudFront (which enables HTTPS).
6. Set up a custom domain name for the website.
7. Include a website view counter on the webpage with JavaScript.
8. The counter should retrieve and update the view count in a database (i.e., DynamoDB).
9. Use a web API to fetch/update the view count instead of having the JavaScript talk to the database directly.
10. Write an AWS Lambda function in Python to implement the API.
11. Write tests for the Python code.
12. Use the SAM CLI to deploy the serverless architecture.
13. Implement CI/CD pipelines for the frontend and the backend using GitHub Actions.
Phase 1: S3 Static Website and Custom Domain Name
Firstly, I don't know how to create websites with HTML and CSS from the ground up. Bootstrap has great open-source starter templates, so I borrowed one for my resume. After uploading the files to my S3 bucket, I immediately bought a domain name for my website, and then provisioned an SSL/TLS certificate for the domain (through AWS Certificate Manager) to enable HTTPS.
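For reference, the upload and certificate steps boil down to a couple of AWS CLI calls. A minimal sketch, assuming a placeholder bucket name and domain (not my actual names):

```bash
# Push the Bootstrap-based resume files to the S3 bucket
aws s3 sync ./site s3://resume-site-bucket --delete

# Request a public certificate for the custom domain; certificates used by
# CloudFront must be requested in us-east-1
aws acm request-certificate \
  --domain-name example.com \
  --validation-method DNS \
  --region us-east-1
```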
Phase 2: Setting Up DNS and CloudFront Distribution
AWS's Route 53 is where the DNS magic happens. I created a hosted zone for my domain with the help of the AWS console, but did not add any A records yet. I then created a CloudFront distribution and attached the SSL certificate to it. After that, I created the appropriate alias A records pointing my domain to the CloudFront distribution.
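As a concrete sketch of that last step, an alias A record pointing a domain at a CloudFront distribution looks roughly like this as a Route 53 change batch (the domain and distribution domain are placeholders; Z2FDTNDATAQYW2 is the fixed hosted zone ID used for CloudFront aliases):

```json
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "example.com.",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "d1234abcd.cloudfront.net.",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
```

You can apply it with aws route53 change-resource-record-sets --hosted-zone-id <your-zone-id> --change-batch file://record.json, though the console's alias dropdown does the same thing with less typing.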
Phase 3: AWS Lambda, Web API, DynamoDB and JavaScript counter
The bulk of my time was spent on this stage, and I found it the toughest for understanding the underlying mechanisms and the interactions between the serverless components. I created the DynamoDB table first, then wrote the AWS Lambda function in Python, which updates the count and returns it as a JSON payload. The web API forwards that payload to the JavaScript on the page. Creating these serverless services through the AWS console was easy, but remember: these deployments should be automated if you want to practice CI/CD.
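To make the moving parts concrete, here is a minimal sketch of such a handler. The table name, key, and attribute are placeholder assumptions, not my exact code:

```python
import json
import os

import boto3

# The table name comes from an environment variable so the same code works
# whether the function is created by hand or deployed by SAM ("visitor-count"
# is just a placeholder default).
TABLE_NAME = os.environ.get("TABLE_NAME", "visitor-count")
table = boto3.resource("dynamodb").Table(TABLE_NAME)


def lambda_handler(event, context):
    # Atomically increment the counter and read back the new value.
    response = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD #v :inc",
        ExpressionAttributeNames={"#v": "views"},
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    views = int(response["Attributes"]["views"])  # DynamoDB returns Decimal

    # Shape the response so the web API can hand it straight to the browser.
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"views": views}),
    }
```

The ADD expression creates the attribute on first use and increments it atomically afterwards, so the counter needs no separate read-then-write logic.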
Phase 4: Implementing IaC through CloudFormation and SAM CLI
I really enjoyed deploying my serverless architecture with IaC tools such as AWS CloudFormation and the SAM CLI. They are a real time saver when you have to deploy and configure several serverless services in your architecture, and the SAM CLI gave me many different ways to test my Lambda function and web API. Be sure to get very comfortable reading YAML and JSON templates. There are other ways to provision AWS resources, such as Terraform and the Serverless Framework, but I chose AWS's own tooling since this was a chance to get more familiar with AWS itself.
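To give a flavor of what the template looks like, here is a trimmed-down sketch of a SAM template for this kind of setup. Resource names, the runtime version, and paths are placeholder assumptions rather than my exact file:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Visitor counter backend for the resume site (illustrative sketch)

Resources:
  VisitorCountTable:
    Type: AWS::Serverless::SimpleTable   # DynamoDB table with a default "id" string key
    Properties:
      TableName: visitor-count

  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler        # the Python handler sketched earlier
      Runtime: python3.12
      CodeUri: backend/
      Environment:
        Variables:
          TABLE_NAME: !Ref VisitorCountTable
      Policies:
        - DynamoDBCrudPolicy:            # SAM policy template scoped to the table
            TableName: !Ref VisitorCountTable
      Events:
        CountApi:
          Type: Api                      # API Gateway endpoint in front of the function
          Properties:
            Path: /count
            Method: get
```

With a template like this, deployment boils down to sam build followed by sam deploy (with --guided the first time to capture the stack settings).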
Phase 5: Creating Python Test
Creating the Python test for the Lambda function was very challenging. I used the Pytest, Boto3, and Moto packages to create a mock DynamoDB table and run the Lambda function against it. The learning curve is steep, but after several failed attempts at writing the test, the satisfaction of seeing it pass is unreal!
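For anyone stuck on this step, the core idea fits in a short sketch. It assumes the handler from the earlier snippet lives in app.py and uses Moto 5.x; the table name and key are placeholders:

```python
import boto3
import pytest
from moto import mock_aws


@pytest.fixture()
def counter_table(monkeypatch):
    # Fake region and credentials so boto3 never touches a real account,
    # plus the table name the handler expects.
    monkeypatch.setenv("AWS_DEFAULT_REGION", "us-east-1")
    monkeypatch.setenv("AWS_ACCESS_KEY_ID", "testing")
    monkeypatch.setenv("AWS_SECRET_ACCESS_KEY", "testing")
    monkeypatch.setenv("TABLE_NAME", "visitor-count")
    with mock_aws():
        # Create an in-memory DynamoDB table that mirrors the real schema.
        dynamodb = boto3.resource("dynamodb", region_name="us-east-1")
        dynamodb.create_table(
            TableName="visitor-count",
            KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
            AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
            BillingMode="PAY_PER_REQUEST",
        )
        yield


def test_counter_increments(counter_table):
    # Import inside the test so the handler picks up the mocked environment.
    import app

    first = app.lambda_handler({}, None)
    second = app.lambda_handler({}, None)

    assert first["statusCode"] == 200
    assert '"views": 1' in first["body"]
    assert '"views": 2' in second["body"]
```

Older Moto 4.x releases expose per-service decorators such as mock_dynamodb instead of the single mock_aws entry point.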
Phase 6: Implementing CI/CD with GitHub Actions and Automated Deployment of my Architecture
I learned a lot setting up CI/CD for both my frontend and my backend. I had to refactor my JavaScript code at this stage, since I did not want the web API's URL hard-coded in my webpage, and I also needed the counter to deploy properly whenever I pushed changes to the frontend and backend GitHub repositories.
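The counter logic itself stays small. A minimal sketch, assuming the deployment step swaps a placeholder token for the real endpoint (one way to keep the URL out of the committed source; the element ID and payload shape follow the earlier Lambda sketch):

```javascript
// The placeholder token below is replaced with the real API endpoint during
// the CI/CD deployment step, so the URL never lives in the repository.
const API_URL = "__VISITOR_API_URL__";

async function updateCounter() {
  try {
    const response = await fetch(API_URL);
    const data = await response.json();
    document.getElementById("visitor-count").textContent = data.views;
  } catch (err) {
    console.error("Could not load visitor count:", err);
  }
}

updateCounter();
```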
For automated deployments to AWS resources, I had to create new IAM users with limited, appropriate permissions. Their access keys, generated by AWS when the users were created, were stored as secrets in my GitHub repositories.
Several of my attempts at deploying the CI/CD pipelines for the frontend and the backend failed because of user-policy issues when deploying AWS resources. Make sure you know where to look and what kind of permissions are required to build, update, and delete resources with the SAM CLI and your non-root AWS user credentials. But once again, seeing a successful deployment of your serverless architecture is an indescribable feeling!
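For reference, a stripped-down sketch of what the frontend workflow can look like; the bucket name, region, and secret names are placeholders, not my exact pipeline:

```yaml
name: Deploy frontend

on:
  push:
    branches: [main]

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1

      - name: Sync site to S3
        run: aws s3 sync ./site s3://resume-site-bucket --delete

      - name: Invalidate the CloudFront cache
        run: >
          aws cloudfront create-invalidation
          --distribution-id ${{ secrets.CLOUDFRONT_DISTRIBUTION_ID }}
          --paths "/*"
```

The backend workflow follows the same pattern but runs sam build and sam deploy instead of the S3 sync, with the Pytest suite as a gate before deploying.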
My Thoughts
This was a very challenging but rewarding task, and I am really glad that I pushed through and completed it. I learned a ton about AWS and about deploying serverless architecture through IaC. Beyond that, I learned that testing your application is an essential part of implementing CI/CD pipelines for your code, and that code rewrites and refactoring come with the territory. In hindsight, having the right mindset is what makes automated application/software deployments possible.