Hello, fellow tech enthusiasts! I'm thrilled to share with you a detailed account of my recent journey through the exciting world of Amazon Web Services (AWS). I took on the task of hosting my resume as a static website on the cloud as part of the Cloud Resume Challenge, an initiative designed to showcase competency in cloud computing. You can find out more about the challenge here.
The project began with setting up an Amazon S3 bucket to host the static files for my website, which included HTML, CSS, and JavaScript files. The process of configuring the S3 bucket was quite straightforward: create a bucket, upload the static files, and set the bucket's properties to enable static website hosting.
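I did most of this in the console, but for readers who prefer to script it, here's a minimal sketch of the equivalent boto3 calls. The bucket and file names are placeholders, not the ones I actually used, and you'd still need a public-read bucket policy (or a CloudFront origin access configuration) before the content is reachable:

```python
import boto3

s3 = boto3.client("s3")
bucket = "my-resume-site-example"  # hypothetical bucket name

# Create the bucket (outside us-east-1 you also need a
# CreateBucketConfiguration with a LocationConstraint)
s3.create_bucket(Bucket=bucket)

# Enable static website hosting with index and error documents
s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Upload the static files with the right content types so browsers render them
s3.upload_file("index.html", bucket, "index.html", ExtraArgs={"ContentType": "text/html"})
s3.upload_file("styles.css", bucket, "styles.css", ExtraArgs={"ContentType": "text/css"})
s3.upload_file("script.js", bucket, "script.js", ExtraArgs={"ContentType": "application/javascript"})
```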
To ensure the content of my website was delivered quickly and efficiently, I utilized Amazon CloudFront, a content delivery network (CDN) service. The configuration process was a bit intricate, involving creating a distribution, specifying the S3 bucket as the origin, and configuring settings such as the default root object, although the console wizard simplifies the process considerably.
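If you'd rather see what the wizard is doing under the hood, the sketch below shows roughly the same distribution expressed as a boto3 call. The origin domain, IDs, and comment are placeholders; the cache policy ID is AWS's managed CachingOptimized policy:

```python
import boto3

cloudfront = boto3.client("cloudfront")

# Create a distribution in front of the S3 website endpoint (placeholder values)
cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": "resume-site-2024",  # any unique string
        "Comment": "Static resume site",
        "Enabled": True,
        "DefaultRootObject": "index.html",
        "Origins": {
            "Quantity": 1,
            "Items": [
                {
                    "Id": "resume-s3-origin",
                    # S3 *website* endpoints are plain HTTP custom origins
                    "DomainName": "my-resume-site-example.s3-website-us-east-1.amazonaws.com",
                    "CustomOriginConfig": {
                        "HTTPPort": 80,
                        "HTTPSPort": 443,
                        "OriginProtocolPolicy": "http-only",
                    },
                }
            ],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "resume-s3-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            # Managed "CachingOptimized" cache policy
            "CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6",
        },
    }
)
```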
Next, I put AWS Identity and Access Management (IAM) into action. IAM was instrumental in controlling access to my AWS resources: I created the policies and roles that AWS Lambda and GitHub Actions would use. This was crucial for automating the deployment processes.
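As an illustration, here's a hedged sketch of a least-privilege policy for the deployment workflow: permission to sync files to the bucket and to invalidate the CloudFront cache, nothing more. The bucket and distribution ARNs are placeholders, not my real resources:

```python
import json
import boto3

iam = boto3.client("iam")

# Deployment policy: upload/delete objects in the site bucket and
# create CloudFront invalidations. All ARNs below are placeholders.
deploy_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:PutObject", "s3:DeleteObject", "s3:ListBucket"],
            "Resource": [
                "arn:aws:s3:::my-resume-site-example",
                "arn:aws:s3:::my-resume-site-example/*",
            ],
        },
        {
            "Effect": "Allow",
            "Action": "cloudfront:CreateInvalidation",
            "Resource": "arn:aws:cloudfront::123456789012:distribution/EXAMPLE123",
        },
    ],
}

iam.create_policy(
    PolicyName="resume-site-deploy",
    PolicyDocument=json.dumps(deploy_policy),
)
```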
Speaking of automation, I used GitHub Actions to automate the deployment of the website every time I made changes and pushed them to my GitHub repository. This involved creating a workflow file that defined the steps necessary to deploy the website, including invalidating the CloudFront cache, which ensured the latest content was always served to visitors.
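The workflow itself is a YAML file, but the cache-invalidation step it runs boils down to a single API call. Here's a minimal Python sketch of that call; the distribution ID is a placeholder (in the real workflow it would come from a repository secret):

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

# Invalidate every path so CloudFront fetches the freshly deployed files
cloudfront.create_invalidation(
    DistributionId="EXAMPLE123",  # placeholder distribution ID
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        # CallerReference must be unique per invalidation request
        "CallerReference": str(time.time()),
    },
)
```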
The visitor counter was built using a combination of AWS Lambda, API Gateway, and DynamoDB. I set up a Lambda function to increment a counter in a DynamoDB table each time a visitor accessed the site. Then, I used API Gateway to expose this function as an HTTP endpoint, which was called from the webpage every time it loaded. This part can get a little confusing, so work through it slowly, especially if you will be setting up custom domains for your API Gateway.
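To make the idea concrete, here's a minimal sketch of what such a Lambda handler can look like. The table name, key, and attribute names are assumptions for illustration, not my exact setup:

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("visitor-count")  # hypothetical table name


def lambda_handler(event, context):
    # Atomically increment the counter and read back the new value
    response = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visits"])
    return {
        "statusCode": 200,
        # CORS header so the browser can call the API Gateway endpoint
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```

The webpage then just fetches the API Gateway URL on load and drops the returned count into the page.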
I employed Route 53 for DNS management. After registering a domain and setting up routing policies, I connected it to my CloudFront distribution, making my resume website accessible at a custom domain name. I also created subdomains for the API Gateway custom domains.
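The key piece is an alias record pointing your domain at the distribution. Here's a rough sketch of that change via boto3; the hosted zone ID, domain, and distribution domain name are placeholders (Z2FDTNDATAQYW2 is the fixed hosted zone ID used for CloudFront aliases):

```python
import boto3

route53 = boto3.client("route53")

# Alias the custom domain to the CloudFront distribution (placeholder values)
route53.change_resource_record_sets(
    HostedZoneId="Z0000000EXAMPLE",  # your hosted zone
    ChangeBatch={
        "Changes": [
            {
                "Action": "UPSERT",
                "ResourceRecordSet": {
                    "Name": "resume.example.com",
                    "Type": "A",
                    "AliasTarget": {
                        "HostedZoneId": "Z2FDTNDATAQYW2",  # CloudFront alias zone
                        "DNSName": "d111111abcdef8.cloudfront.net",
                        "EvaluateTargetHealth": False,
                    },
                },
            }
        ]
    },
)
```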
As with all technology choices, there were trade-offs involved. While AWS offers a rich set of features and high scalability, it can be more complex and more expensive than simpler alternatives like Netlify or GitHub Pages for hosting static sites. However, the level of control and flexibility offered by AWS is hard to match.
If I were to embark on this project again, I would consider using the AWS Serverless Application Model (SAM) or Terraform. Both tools provide a more streamlined way to deploy and manage AWS resources, with SAM being particularly useful for building serverless applications and Terraform offering the advantage of being cloud-agnostic.
In conclusion, the Cloud Resume Challenge provided a practical and hands-on way to enhance and showcase my cloud skills. I encourage all developers and cloud enthusiasts to consider taking on this challenge. It's an enriching journey through the AWS ecosystem and a great way to gain real experience with key AWS services. Happy coding, everyone!