About This Project
When I stumbled upon the Cloud Resume Challenge by Forrest Brazeal, I saw it as an exciting opportunity to demonstrate and further develop my skills in cloud computing and serverless architecture. Leveraging my AWS Certified Cloud Practitioner (CCP) certification as a foundation, I set out to build a cloud-powered portfolio website. The project encompassed various AWS services, including S3 for hosting, CloudFront for HTTPS, DynamoDB for data storage, and AWS Lambda for serverless functions. Terraform enabled an Infrastructure as Code (IaC) approach, ensuring consistency and repeatability. A CI/CD pipeline, orchestrated by GitHub Actions, automated the deployment process, delivering seamless updates and maintaining high availability. Through this project, I gained hands-on experience in architecting cloud-native solutions, deploying serverless applications, and applying DevOps best practices.
Steps To Finish Project
- Certification
- HTML
- CSS
- Static Website
- HTTPS
- DNS
- Javascript
- Database
- API
- Python
- Tests
- Infrastructure as Code
- Source Control
- CI/CD (Back end)
- CI/CD (Front end)
- Blog Post
Architecture Overview
Detailed Steps
1. Obtaining an AWS Certification:
I earned the AWS CCP certification in January 2024.
2. Creating an HTML and CSS Resume:
Understanding that the focus of the challenge lay more in the cloud architecture aspect rather than front-end design, I opted for efficiency over perfection. I quickly found a suitable portfolio website template through a simple online search, made some moderate tweaks to personalize it, and moved on, knowing I could always come back later to refine the design.
3. Deployment to Amazon S3:
With the portfolio in place, the next steps involved deploying it to Amazon S3. This step was crucial for hosting the static website and making it accessible to users. Despite the simplicity of S3 hosting, it laid the foundation for the subsequent steps in the project.
4. Implementing HTTPS with CloudFront and Route 53:
Ensuring the security and integrity of the portfolio website, I implemented HTTPS using Amazon CloudFront and Route 53. By leveraging CloudFront's content delivery network (CDN) capabilities, I distributed the website's content globally, improving performance and reliability for users worldwide. Route 53 provided domain name system (DNS) services, enabling me to manage domain names and route traffic efficiently to my CloudFront distribution. Together, these services not only enhanced the website's security but also optimized its delivery, ensuring a seamless browsing experience for visitors.
5. Implementing JavaScript Visitor Counter:
To add functionality beyond the static site, I implemented a JavaScript visitor counter that displays the number of visitors. To store and manage this visitor count data, I turned to Amazon DynamoDB.
6. Utilizing DynamoDB for Visitor Count:
To connect the front end to DynamoDB, I created an API that facilitates communication between my JavaScript code and the database. Each request to this API triggered a Lambda function, written in Python, that read from and updated the DynamoDB table accordingly.
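A minimal sketch of such a Lambda handler, assuming a table named `visitor-count` with an item keyed by `id` (the actual table and attribute names in the project may differ):

```python
import json

# Hypothetical table and key names, used here for illustration only.
TABLE_NAME = "visitor-count"

def lambda_handler(event, context, table=None):
    """Atomically increment the visitor count and return the new value."""
    if table is None:
        # boto3 is imported lazily so the handler logic can be
        # unit-tested without AWS access or credentials.
        import boto3
        table = boto3.resource("dynamodb").Table(TABLE_NAME)

    # ADD performs an atomic in-place increment on the counter attribute.
    response = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visit_count"])
    return {
        "statusCode": 200,
        # CORS header so the browser-side counter can call the API.
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```

Using an atomic `ADD` update expression avoids a read-then-write race when two visitors hit the page at the same time.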
7. Developing Python Code, Testing, and Infrastructure as Code (IaC) Deployment with Terraform:
Python code was developed to handle requests from the API, interact with DynamoDB, and manage the visitor count. Unit tests were written to ensure the reliability and functionality of the Python code. Additionally, Terraform was used to define the AWS Lambda function, including its configuration and dependencies. The Python code, along with the Terraform configuration, allowed for the seamless deployment of the Lambda function to AWS.
Python Function Deployed to AWS Lambda
To automate the testing process, YAML (.yml) files were created to define the unit test configurations. These configurations were integrated into GitHub Actions, ensuring that the unit tests were automatically run upon each code push, providing continuous integration and ensuring the reliability of the deployed Lambda function.
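As an illustration of the unit-testing approach, the DynamoDB table can be stubbed so the handler logic runs without touching AWS. The handler below is a simplified stand-in with hypothetical names, not the project's actual code:

```python
import json
from unittest import mock

def lambda_handler(event, context, table):
    # Simplified stand-in for the real handler; the table is passed
    # in explicitly so the test below can substitute a mock.
    response = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visit_count :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    return {
        "statusCode": 200,
        "body": json.dumps({"count": int(response["Attributes"]["visit_count"])}),
    }

def test_handler_increments_and_returns_count():
    # The mock stands in for the boto3 Table resource.
    fake_table = mock.MagicMock()
    fake_table.update_item.return_value = {"Attributes": {"visit_count": 7}}

    result = lambda_handler({}, None, table=fake_table)

    assert result["statusCode"] == 200
    assert json.loads(result["body"]) == {"count": 7}
    fake_table.update_item.assert_called_once()

test_handler_increments_and_returns_count()
```

Because no real AWS call is made, tests like this run quickly in CI on every push.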
Moreover, IAM roles and policies were attached to the Lambda function to define its permissions and access controls. Specifically, the IAM roles were configured to allow permissions for accessing DynamoDB, enabling smooth interaction with the database.
8. Establishing a CI/CD Pipeline with GitHub Actions:
To automate the deployment process, I established a CI/CD pipeline using GitHub Actions. This pipeline streamlined the development workflow, automatically testing and deploying changes to my portfolio whenever new code was pushed to the repository. Additionally, a workflow was configured to update the associated Amazon S3 bucket with the latest changes on every push.
Frontend .yml file
GitHub Workflow
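A frontend deployment workflow of this kind might look roughly like the sketch below; the bucket name, region, file paths, and branch here are placeholders rather than the project's actual values:

```yaml
# Hypothetical deployment workflow; bucket name, region, and paths
# are placeholders.
name: deploy-frontend

on:
  push:
    branches: [main]
    paths:
      - "website/**"

jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          aws-access-key-id: ${{ secrets.AWS_ACCESS_KEY_ID }}
          aws-secret-access-key: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
          aws-region: us-east-1
      - name: Sync site to S3
        run: aws s3 sync website/ s3://example-resume-bucket --delete
```

Keeping the AWS keys in GitHub repository secrets means no credentials ever live in the repository itself.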
Final Thoughts
In conclusion, the Cloud Resume Challenge has been an invaluable journey for me as an aspiring cloud engineer. Through this project, I've had the opportunity to delve into various AWS services, implement real-world solutions, and gain hands-on experience in cloud architecture and development. Currently pursuing a master's degree in CS with a concentration in enterprise and cloud computing, I'm eager to apply the knowledge gained from this project and continue honing my skills in the dynamic field of cloud technology.
Link to Cloud Portfolio
Find me on LinkedIn
GitHub Repository