
David O' Connor
A Security-Focused Cloud Resume Challenge

Hi! My name is David O' Connor and I am a Cloud/DevOps engineer based in Ireland. Before transitioning to this field, I worked as a musician and music teacher. Last year I decided on a career change. I have always had a passion for tech, and after completing Cybersecurity and Cloud courses I found a job in Cloud. I am really enjoying working in this area.

It may seem like quite a jump from music to Cloud, but believe it or not they have a lot in common! Both fields require a strong understanding of your architecture, including identifying potential failure points, recognizing symptoms of issues, and troubleshooting problems quickly. Preparation, versatility, and continuous learning are crucial. In music, I had to be able to play genres like jazz, rock, pop, and even classical – sometimes all on the same day! Similarly, in Cloud so far I’ve worked with CloudFormation, Kubernetes, Python, Docker, JavaScript, Terraform, C#, the list goes on!

The Challenge

I came across this challenge while training for a Cloud role with Deloitte Ireland. The training consisted of a three-month course on AWS/DevOps resources, ending with the AWS Certified Cloud Practitioner certification. I enjoyed learning about these technologies and wanted to put my newfound skills to the test – the Cloud Resume Challenge was exactly what I was looking for.

The idea of this challenge is to create and host your own CV in the Cloud using HTML/CSS and an S3 static website. Next, you create a visitor counter using JavaScript, Lambda, an API, and a database. Lastly, you define your resources as Infrastructure as Code and create CI/CD pipelines to automatically deploy them when changes are pushed to your GitHub. A full list of steps can be found here.

Setup

I started the project by setting up an AWS Organisation in my root account. Next, I used org-formation to create dev and production OUs and accounts with budget alarms, password policies, and region restrictions. I also set up SSO with MFA so I could easily and securely access the accounts. I learnt a lot from this step as it closely resembles what you might typically encounter in a professional environment.

Chunk 1: Front-end

I spent some time getting the HTML/CSS exactly as I wanted it before uploading it to S3. I set up a static HTTPS website using S3, CloudFront, and ACM. Next, I purchased a domain, created a Route 53 hosted zone pointing to my CloudFront distribution, and updated my domain to use the provided AWS nameservers.

When I was recreating the front-end with Terraform I decided to improve the overall security. While researching best practices, I came across a very interesting blog on security headers. These HTTP headers help protect against threats like cross-site scripting and clickjacking attacks. Luckily, these headers were very easy to add to CloudFront through Terraform, and I was able to quickly upgrade my site’s rating from an F to an A.
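In Terraform, the headers attach to CloudFront through a response headers policy. Below is a minimal sketch of what that can look like; the resource name, header values, and CSP string are illustrative assumptions, not the project's actual configuration:

```hcl
# Sketch: a CloudFront response headers policy with common security headers.
# Names and header values are illustrative, not copied from the project.
resource "aws_cloudfront_response_headers_policy" "security" {
  name = "resume-security-headers"

  security_headers_config {
    content_type_options {
      override = true # X-Content-Type-Options: nosniff
    }
    frame_options {
      frame_option = "DENY" # blocks clickjacking via framing
      override     = true
    }
    strict_transport_security {
      access_control_max_age_sec = 31536000 # one year of forced HTTPS
      include_subdomains         = true
      preload                    = true
      override                   = true
    }
    content_security_policy {
      content_security_policy = "default-src 'self'" # tighten per site needs
      override                = true
    }
  }
}
```

The policy's `id` is then referenced from the distribution's cache behavior via `response_headers_policy_id`, so CloudFront injects the headers on every response without touching the origin.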

Improved security rating with headers

I also wanted to try to implement DNSSEC. This protocol helps shield against DNS-based attacks by adding cryptographic signatures to DNS records. I found a very helpful guide on implementing it through Terraform: I was able to create a signing key, associate it with Route 53, and configure my domain to use it.
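The key-plus-association step can be sketched roughly as below. This assumes an aliased AWS provider for us-east-1 (Route 53 requires the KMS signing key there) and an existing `aws_route53_zone.main`; all names are illustrative:

```hcl
# Sketch: DNSSEC signing for a Route 53 hosted zone. The provider alias,
# zone reference, and names are assumptions, not the project's exact code.
resource "aws_kms_key" "dnssec" {
  provider                 = aws.us_east_1       # KSK must live in us-east-1
  customer_master_key_spec = "ECC_NIST_P256"     # the spec Route 53 requires
  key_usage                = "SIGN_VERIFY"
  deletion_window_in_days  = 7
}

resource "aws_route53_key_signing_key" "this" {
  hosted_zone_id             = aws_route53_zone.main.id
  key_management_service_arn = aws_kms_key.dnssec.arn
  name                       = "resume-ksk"
}

resource "aws_route53_hosted_zone_dnssec" "this" {
  hosted_zone_id = aws_route53_key_signing_key.this.hosted_zone_id
  depends_on     = [aws_route53_key_signing_key.this]
}
```

Note that the KMS key policy also has to grant the `dnssec-route53.amazonaws.com` service permission to use the key, and the final step is adding the DS record at the domain registrar.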

Lastly, I decided to disable the S3 static website configuration, as it requires public read access. Instead, I limited S3 access to CloudFront using an origin access control. This provides essentially the same functionality as an S3 static website while also following the principle of least privilege.
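In Terraform terms, that means creating an origin access control and a bucket policy that only admits the distribution. A minimal sketch, with bucket and distribution references assumed:

```hcl
# Sketch: lock the S3 bucket down to CloudFront only. The bucket and
# distribution resource names are assumptions.
resource "aws_cloudfront_origin_access_control" "site" {
  name                              = "resume-oac"
  origin_access_control_origin_type = "s3"
  signing_behavior                  = "always" # sign every origin request
  signing_protocol                  = "sigv4"
}

data "aws_iam_policy_document" "allow_cloudfront" {
  statement {
    actions   = ["s3:GetObject"]
    resources = ["${aws_s3_bucket.site.arn}/*"]
    principals {
      type        = "Service"
      identifiers = ["cloudfront.amazonaws.com"]
    }
    # Only this specific distribution may read the bucket.
    condition {
      test     = "StringEquals"
      variable = "AWS:SourceArn"
      values   = [aws_cloudfront_distribution.site.arn]
    }
  }
}

resource "aws_s3_bucket_policy" "site" {
  bucket = aws_s3_bucket.site.id
  policy = data.aws_iam_policy_document.allow_cloudfront.json
}
```

The distribution's S3 origin then references `aws_cloudfront_origin_access_control.site.id` via `origin_access_control_id`, and the bucket needs no public access at all.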

Chunk 2: Back-end

I experimented a lot with this part of the project and was able to make a visitor counter I was quite happy with. However, when I revisited this step in Terraform I couldn’t resist trying Forrest’s mod: a unique-visitor counter.

I wrote a Lambda function in Python that retrieves, hashes, and stores the visitor’s IP address in a DynamoDB table along with a time-to-live value. DynamoDB uses the TTL to delete items after a specified time – in my case one month, meaning the counter tracks monthly unique visitors.
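The hashing-and-TTL step can be sketched like this. It's a minimal sketch: the attribute names, salt handling, and 30-day TTL are my assumptions, not necessarily the exact values used in the project:

```python
import hashlib
import time

# Roughly one month, expressed in seconds for DynamoDB's epoch-based TTL.
TTL_SECONDS = 30 * 24 * 60 * 60


def hash_ip(ip, salt):
    """Salted SHA-256 digest so raw IP addresses are never stored."""
    return hashlib.sha256((salt + ip).encode("utf-8")).hexdigest()


def make_item(ip, salt, now=None):
    """Build the DynamoDB item: hashed IP as the key plus a TTL timestamp."""
    now = int(time.time()) if now is None else now
    return {
        "visitor_hash": hash_ip(ip, salt),
        # DynamoDB deletes the item after this epoch second, so each
        # hashed address only counts once per month.
        "expires_at": now + TTL_SECONDS,
    }
```

In the actual Lambda, the item would be written with boto3's `put_item` and the table's TTL attribute pointed at `expires_at`; hashing first means the table never holds a raw IP.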

Hashed IPs with TTL

I modified my other Lambda function to fetch the count of hashed IPs from this table while also incrementing and returning the value from a hit-count table. I wired these into the POST and GET methods of my API and implemented rate limiting to enhance API security.

Chunk 3: Front-end/Back-end Integration & Testing

I experimented quite a bit with JavaScript and ended up with some code I was happy with. I decided to use Cypress for end-to-end testing, as the guide suggested. I looked at previous examples and wrote tests to confirm that the API Gateway could update and retrieve from the databases correctly.
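A Cypress spec for this kind of check can be sketched as below. It runs under the Cypress runner, and the API URL, page URL, element id, and `count` field in the response are all placeholder assumptions, not the project's actual contract:

```javascript
// cypress/e2e/counter.cy.js — sketch only; URLs and response shape assumed.
const API = "https://api.example.com/count";

describe("visitor counter API", () => {
  it("increments and returns the hit count", () => {
    cy.request("POST", API).then((first) => {
      expect(first.status).to.eq(200);
      // A second hit should come back strictly higher than the first.
      cy.request("POST", API).then((second) => {
        expect(Number(second.body.count)).to.be.greaterThan(
          Number(first.body.count)
        );
      });
    });
  });

  it("renders the count on the page", () => {
    cy.visit("https://example.com");
    cy.get("#visitor-count").should("not.be.empty");
  });
});
```

Exercising the live API Gateway endpoints like this verifies the Lambda, database, and JavaScript wiring end to end rather than each piece in isolation.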

Chunk 4: Automation/CI/CD

Now for the fun part! I started by recreating my resources in Terraform. Next, I researched how to configure a remote backend, eventually settling on S3. Following the challenge’s DevOps mods, I wrote a GitHub Actions workflow to deploy code changes to the dev account and only merge and deploy to production if tests pass. I also worked out how to uniquely name dev resources using the Git commit ID and tear them down after successful tests.
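The dev half of that flow can be sketched as a workflow like the one below. The job names, role ARN, region, and the `suffix` variable convention are illustrative assumptions; the OIDC role-to-assume setup is covered in the next section:

```yaml
# Sketch of the dev pipeline: deploy a commit-suffixed stack, test it,
# tear it down. Names, ARN, and variables are placeholders.
name: deploy-dev
on:
  pull_request:

permissions:
  id-token: write   # needed for OIDC auth to AWS (no stored keys)
  contents: read

jobs:
  deploy-dev:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::111111111111:role/dev-deploy  # placeholder
          aws-region: eu-west-1
      - uses: hashicorp/setup-terraform@v3
      - run: terraform init
      # Suffix dev resources with the short commit SHA so parallel runs
      # never collide, then destroy the stack once tests pass.
      - run: terraform apply -auto-approve -var "suffix=${GITHUB_SHA::7}"
      - run: npx cypress run
      - run: terraform destroy -auto-approve -var "suffix=${GITHUB_SHA::7}"
```

A second job (or workflow) gated on the merge to main would then apply the same configuration against the production account.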

For security, I set up OIDC instead of storing my AWS access keys directly in GitHub. I also enforced signed Git commits and set up CodeQL to scan my code monthly and on pull requests. Lastly, I set up the front-end pipeline to automatically update the site and invalidate the CloudFront cache whenever changes were made.

My site

Architecture of the site using AWS services

Above is a high-level overview of the architecture of my site. You can check it out at davidoconnor.me, and my GitHub repositories can be found here and here.

Reflections and next steps

I thoroughly enjoyed the challenge and learnt a lot in the process. I took my time and explored almost everything suggested in the guide. When met with a problem, I kept trying until I was able to find my way past it – something I always tried to do in music too. Whether learning a difficult tune or how to use complicated new equipment, I always tried to explore all the possibilities and persevere until I finally succeeded.

I particularly enjoyed the Python and Javascript parts of the challenge, and I would like my next project to focus on one of these. In terms of certs I am planning to study for the AWS Solutions Architect Associate next.

Thank you for reading, and thanks to Forrest Brazeal for this really enjoyable challenge!
