Can the Cloud Resume Challenge Teach Me AWS?

Are you tired of sending out resumes and never hearing back from potential employers? Do you want to showcase your skills and experience in a unique and impressive way? Look no further! In this blog post, I will take you on a journey of creating a resume website as part of the AWS Cloud Resume Challenge, an innovative approach to standing out in the competitive job market.

As a job seeker in today's digital age, having a traditional paper resume is no longer enough. Employers are increasingly looking for candidates who can demonstrate their technical expertise and creativity. That's where the AWS Cloud Resume Challenge comes in. It's a hands-on project that allows you to create a resume website using Amazon Web Services (AWS) and show off your skills in a practical and dynamic way.

The first step in my journey was to plan out my resume website. I brainstormed ideas and thought about what sets me apart from other job candidates. I wanted my website to reflect my personality, highlight my professional achievements, and showcase my technical skills. I made a list of the AWS services I wanted to use, such as Amazon S3 for hosting my website, Amazon Route 53 for domain registration, and AWS Lambda for serverless computing.

Once I had a clear vision for my resume website, I set up my AWS account and started building my website using the AWS Console.

Hosting the Website with Amazon S3:

Amazon Simple Storage Service (S3) is a highly scalable and durable object storage service that can be used to host static websites. I created an S3 bucket to store my website's HTML, CSS, and JavaScript files, enabled the static website hosting feature in the S3 console, and set up permissions to make the website files publicly readable. This let me serve my resume website to visitors via the bucket's website endpoint URL.
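To give a concrete (if hypothetical) picture of that setup, here is a minimal boto3 sketch that turns on static website hosting and attaches a public-read bucket policy. The bucket name is a placeholder, and on newer buckets you would also need to relax the Block Public Access settings before the policy can be applied:

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "resume-site-example"  # hypothetical bucket name

# Enable static website hosting with index and error documents.
s3.put_bucket_website(
    Bucket=BUCKET,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Allow public reads of the site's objects.
s3.put_bucket_policy(
    Bucket=BUCKET,
    Policy=json.dumps({
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "PublicReadGetObject",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": f"arn:aws:s3:::{BUCKET}/*",
        }],
    }),
)
```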

Registering a Custom Domain with Amazon Route 53:

Amazon Route 53 is a domain registration and DNS management service that I used to register a custom domain name for my resume website. Following the AWS documentation, I registered a domain name of my choice, connersmith.net, through Route 53. Once the domain was registered, I configured the DNS settings to route traffic to the S3 bucket hosting the website. This involved setting up a Route 53 hosted zone for the domain and adding an alias record that points the domain name at the bucket's S3 website endpoint. It's important to note that DNS changes can take some time to propagate globally, so I had to be patient and wait for the changes to take effect.
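In boto3 terms, that alias record might look something like the sketch below. The hosted zone ID is a placeholder, while Z3AQBSTGFYJSTF is the fixed hosted zone ID AWS publishes for S3 website endpoints in us-east-1; note that this kind of alias only works when the bucket name matches the domain name:

```python
import boto3

route53 = boto3.client("route53")

HOSTED_ZONE_ID = "Z0000000EXAMPLE"     # hypothetical hosted zone for the domain
S3_WEBSITE_ZONE_ID = "Z3AQBSTGFYJSTF"  # S3 website endpoints, us-east-1

route53.change_resource_record_sets(
    HostedZoneId=HOSTED_ZONE_ID,
    ChangeBatch={
        "Comment": "Point the apex domain at the S3 website endpoint",
        "Changes": [{
            "Action": "UPSERT",
            "ResourceRecordSet": {
                "Name": "connersmith.net",
                "Type": "A",
                "AliasTarget": {
                    "HostedZoneId": S3_WEBSITE_ZONE_ID,
                    "DNSName": "s3-website-us-east-1.amazonaws.com",
                    "EvaluateTargetHealth": False,
                },
            },
        }],
    },
)
```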

Implementing Serverless Functions with AWS Lambda:

One of the key components of my resume website was a dynamic visitor counter that keeps track of the number of visitors to my website. To implement this feature, I decided to leverage the power of serverless computing using AWS Lambda.

AWS Lambda is a serverless compute service that allows you to run code in response to events without having to manage any servers. It can be used to create small, focused functions that can perform specific tasks and integrate with other AWS services, such as DynamoDB and API Gateway, to build serverless applications.

To implement the visitor counter, I first created a DynamoDB table to store the visitor count. DynamoDB is a managed NoSQL database service provided by AWS that offers low-latency, scalable performance. I defined the table schema with a simple primary key and provisioned read and write capacity to match the expected workload.
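A minimal sketch of that table, assuming a single item keyed by a string id attribute that holds the running count (the table name is hypothetical):

```python
import boto3

dynamodb = boto3.client("dynamodb")

# One item, keyed by a fixed id, holds the running visitor count.
dynamodb.create_table(
    TableName="visitor-count",
    AttributeDefinitions=[{"AttributeName": "id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "id", "KeyType": "HASH"}],
    ProvisionedThroughput={"ReadCapacityUnits": 1, "WriteCapacityUnits": 1},
)
```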

Next, I created an AWS Lambda function that would be triggered by an API Gateway endpoint whenever a visitor accessed my website. The Lambda function would increment the visitor count in the DynamoDB table and return the updated count through API Gateway so it could be displayed on the website.

I used Python 3.8 as the runtime for my Lambda function and wrote the function logic in Python. The code uses the AWS SDK for Python (boto3) to interact with DynamoDB, specifically the update_item method to atomically increment the visitor count in the table.
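Here is a rough sketch of what such a handler can look like, assuming the hypothetical visitor-count table from above; the real function's names and response shape may differ:

```python
import json
import boto3

dynamodb = boto3.client("dynamodb")
TABLE = "visitor-count"  # hypothetical table name


def lambda_handler(event, context):
    # Atomically increment the counter; ADD creates the attribute if missing.
    response = dynamodb.update_item(
        TableName=TABLE,
        Key={"id": {"S": "site"}},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": {"N": "1"}},
        ReturnValues="UPDATED_NEW",
    )
    count = int(response["Attributes"]["visits"]["N"])

    # Shape the response the way API Gateway's proxy integration expects,
    # with a CORS header so the browser can read it from the website.
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},
        "body": json.dumps({"count": count}),
    }
```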

I then created an API Gateway REST API with an endpoint that triggers the Lambda function when called. I configured the API to pass requests from the website to the Lambda function and return the function's response to the website.

Once the Lambda function and API Gateway were set up, I tested the visitor counter by accessing my website and verifying that the count was incremented in the DynamoDB table and displayed correctly on the website. I also added error handling in the Lambda function to handle cases where the DynamoDB update fails, ensuring the reliability of the visitor counter.
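A quick way to sanity-check the endpoint during this kind of testing is a few lines of Python against the stage's invoke URL (the URL below is a placeholder):

```python
import json
import urllib.request

# Hypothetical invoke URL from the API Gateway stage.
URL = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/count"

with urllib.request.urlopen(URL) as resp:
    payload = json.loads(resp.read())

print("visitor count:", payload["count"])
```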

Throughout the implementation process, I faced some challenges, such as understanding the event-driven nature of Lambda and API Gateway, configuring the permissions and roles for the Lambda function to interact with DynamoDB securely, and troubleshooting issues related to API Gateway integration. However, with careful reading of documentation, experimenting with different configurations, and thorough testing, I was able to overcome these challenges and successfully implement the visitor counter as a serverless function using AWS Lambda.

In conclusion, leveraging the power of serverless computing with AWS Lambda allowed me to implement a dynamic visitor counter for my resume website in a scalable and cost-effective manner. It provided the flexibility to handle varying workloads, automatically scale based on demand, and offload the management of servers, allowing me to focus on the application logic rather than infrastructure management.

Next, I focused on optimizing my resume website for performance and security.

Optimizing Performance with Amazon CloudFront:

Amazon CloudFront is a content delivery network (CDN) that improves website performance by caching and serving content from edge locations around the world. I used CloudFront to distribute my website's content, such as images and CSS files, from these geographically distributed points of presence, which reduced latency and improved load times for visitors in different locations. I configured CloudFront to use my S3 bucket as the origin, the source of the content that CloudFront caches and serves. I also provisioned a TLS certificate through AWS Certificate Manager and attached it to the distribution so that all content is served securely over HTTPS.
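For illustration, a distribution along those lines could be created with boto3 roughly as follows. The origin domain and certificate ARN are placeholders (the certificate must live in us-east-1 for CloudFront), and because S3 website endpoints only speak plain HTTP, the bucket is wired up as a custom origin:

```python
import time
import boto3

cloudfront = boto3.client("cloudfront")

# Hypothetical identifiers.
ORIGIN = "connersmith.net.s3-website-us-east-1.amazonaws.com"
CERT_ARN = "arn:aws:acm:us-east-1:123456789012:certificate/example"

cloudfront.create_distribution(
    DistributionConfig={
        "CallerReference": str(time.time()),
        "Comment": "Resume site",
        "Enabled": True,
        "Aliases": {"Quantity": 1, "Items": ["connersmith.net"]},
        "Origins": {
            "Quantity": 1,
            "Items": [{
                "Id": "s3-website-origin",
                "DomainName": ORIGIN,
                # S3 website endpoints only support HTTP, so use a
                # custom origin rather than an S3 origin.
                "CustomOriginConfig": {
                    "HTTPPort": 80,
                    "HTTPSPort": 443,
                    "OriginProtocolPolicy": "http-only",
                },
            }],
        },
        "DefaultCacheBehavior": {
            "TargetOriginId": "s3-website-origin",
            "ViewerProtocolPolicy": "redirect-to-https",
            "ForwardedValues": {"QueryString": False,
                                "Cookies": {"Forward": "none"}},
            "MinTTL": 0,
        },
        "ViewerCertificate": {
            "ACMCertificateArn": CERT_ARN,
            "SSLSupportMethod": "sni-only",
            "MinimumProtocolVersion": "TLSv1.2_2021",
        },
    },
)
```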

Design and User Experience:

While the technical implementations were crucial, I also paid close attention to the design and user experience of my resume website. I used modern web design principles, such as responsive design, clean layout, and visually appealing typography, to create a professional and polished website. I made sure that the website was accessible, following web accessibility guidelines such as providing alternative text for images, using semantic HTML tags, and ensuring proper color contrast for readability. I also optimized the website for different devices, such as desktops, tablets, and mobile devices, to ensure a consistent experience across different platforms.

DevOps Practices and CI/CD:

In the final stages of creating my resume website, I implemented a robust CI/CD (Continuous Integration/Continuous Deployment) pipeline using GitHub and GitHub Actions. This allowed me to dive deeper into the intricacies of Git version control and gain further expertise in working with Linux and the command line. I set up GitHub Actions workflows to automatically build, test, and deploy my website whenever changes were pushed to the repository, ensuring that the latest version of my website was always deployed to production.
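As a sketch of what the deploy step of such a workflow might run, the script below uploads the built site to S3 and then invalidates the CloudFront cache so the new version is served right away; the bucket name, distribution ID, and site directory are placeholders a real workflow would inject as secrets or variables:

```python
import mimetypes
import time
from pathlib import Path

import boto3

# Hypothetical names; a real workflow passes these in as secrets/vars.
BUCKET = "connersmith.net"
DISTRIBUTION_ID = "E123EXAMPLE"
SITE_DIR = Path("site")

s3 = boto3.client("s3")
cloudfront = boto3.client("cloudfront")

# Upload every file in the site directory with a sensible Content-Type.
for path in SITE_DIR.rglob("*"):
    if path.is_file():
        key = str(path.relative_to(SITE_DIR))
        content_type, _ = mimetypes.guess_type(str(path))
        s3.upload_file(
            str(path), BUCKET, key,
            ExtraArgs={"ContentType": content_type or "binary/octet-stream"},
        )

# Invalidate the CloudFront cache so the new version is served immediately.
cloudfront.create_invalidation(
    DistributionId=DISTRIBUTION_ID,
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),
    },
)
```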

In addition to the CI/CD pipeline, I took on the challenge of converting my entire website's infrastructure to Terraform, a widely used Infrastructure-as-Code (IaC) tool. This required me to define my entire infrastructure, including AWS resources like S3 buckets, CloudFront distributions, API Gateway, Lambda functions, and DynamoDB tables, as code using Terraform configuration files. I learned the ins and outs of Terraform, including its declarative syntax, resource lifecycle management, and state management. I also gained a deep understanding of the benefits of using IaC, such as versioning, reproducibility, and scalability, in managing cloud infrastructure.

This extra challenge of implementing Terraform for my website's infrastructure proved to be highly rewarding. It gave me a comprehensive understanding of managing infrastructure as code, allowing me to treat my infrastructure as a software project with version control, automated testing, and continuous deployment. It also provided me with the ability to easily update and scale my infrastructure, and quickly recover from any issues using Terraform's infrastructure management capabilities. Overall, implementing CI/CD and converting my infrastructure to Terraform greatly enhanced the efficiency, reliability, and scalability of my resume website, and provided me with valuable skills in modern DevOps practices.

Challenges faced along the way:

Learning Curve: As with any new technology or service, there was a learning curve involved in getting familiar with the AWS services and their configuration. I had to spend time reading documentation, watching tutorials, and experimenting with the services to understand how they work and how they can be integrated to create a resume website. It required patience and perseverance to overcome the initial challenges and gain confidence in using the AWS services effectively.

Managing Permissions and Roles: AWS uses a fine-grained permissions model that requires careful management of permissions and roles to ensure that the right level of access is granted to different resources and services. Setting up the correct permissions for the S3 bucket, Lambda functions, API Gateway, and CloudFront distribution required careful planning and configuration to ensure that the website functions properly and securely. Managing IAM roles and policies, understanding resource policies, and troubleshooting permissions issues were some of the challenges I faced along the way.
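To make the least-privilege idea concrete, here is a hedged sketch of attaching an inline policy that grants the Lambda function's execution role only the one DynamoDB action it needs; the role name, account ID, and table ARN are all hypothetical:

```python
import json
import boto3

iam = boto3.client("iam")

# Grant the function's role only the single DynamoDB action it needs,
# scoped to the one table it uses.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "dynamodb:UpdateItem",
        "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/visitor-count",
    }],
}

iam.put_role_policy(
    RoleName="visitor-counter-lambda-role",   # hypothetical role name
    PolicyName="visitor-counter-dynamodb",
    PolicyDocument=json.dumps(policy),
)
```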

Debugging and Troubleshooting: Building a resume website with multiple AWS services involves complex interactions and configurations. Debugging and troubleshooting issues can be challenging, especially when dealing with issues related to API Gateway, Lambda functions, DNS settings, and CloudFront configurations. I had to learn how to use Amazon CloudWatch logs, metrics, and other monitoring tools to diagnose and resolve issues effectively.

Deployment and Testing: Deploying changes to the website and testing the functionality required careful planning and coordination. I had to ensure that changes made in one service, such as updating Lambda function code, were properly tested and deployed without disrupting the overall website functionality. Testing the website on different devices, browsers, and network conditions to ensure a consistent user experience was also a challenge that required thorough testing and iteration.

In conclusion, the AWS Cloud Resume Challenge was a challenging but fulfilling experience. It allowed me to showcase my skills and experience in a creative and innovative way, setting me apart from other job candidates. The journey of creating my resume website using AWS services was a valuable learning experience that not only improved my technical skills but also helped me better understand the capabilities of cloud computing. I am confident that my resume website will impress potential employers and open doors to exciting job opportunities. If you're looking to stand out in the competitive job market, I highly recommend taking on the AWS Cloud Resume Challenge and creating your own stellar resume website using AWS services. Good luck!

Check out my website at https://connersmith.net


This blog post was edited with the assistance of generative AI for professionalism.
