Recently, I stumbled across a forum online talking about projects that people had done to demonstrate their skills in the cloud. One commenter linked a post called “The Cloud Resume Challenge by Forrest Brazeal” and it got me curious. Later that day I found myself diving head-first into a new project that really tested my new AWS knowledge to the max!
I started this project because I wanted something on my CV which demonstrated my newly acquired knowledge from passing 3 AWS exams. Not coming from an IT background, I was proud of how far I’d come already this year, after using the Covid-19 lockdown as a chance to switch careers from a job in market research. However, I wanted to get a bit more hands-on with the AWS platform and gain some valuable practical experience.
I was in a good position conceptually as I had passed the two AWS associate exams that would help the most with this project in my opinion (Solutions Architect and Developer) – it was just a case of putting what I’d learnt into practice. Easy right?!
Projects are a great way to solidify what you think you know and having read through the very vague instructions of the challenge, I thought this project would put into practice everything I had learnt so far on this journey.
My website was already hosted in S3, with DNS managed by Route 53, and can be found at www.joshmearsportfolio.com (subtle plug, I know!). However, the challenge requires the site to be served over HTTPS for security. This is where CloudFront came in, and I was able to set it up without too much trouble.
I then pushed my code to my GitHub repository and set up AWS CodePipeline for my front-end work, so that any changes I made to my code were automatically synced to the S3 bucket hosting my website. This was my first practical use of CI/CD tools and I instantly saw the benefits - it was super useful! I didn’t have much experience with git at this point, so this process was also great for learning some of the basic commands and getting used to pushing new versions to my main branch – and it was great seeing my changes appear almost instantly when I refreshed the webpage in my browser!
Next was what I considered the hard part. Not coming from an IT background, I found these steps very daunting despite already passing 3 AWS exams. It’s all very well having a good foundation in the concepts and theoretical knowledge but putting it into practice was going to be much harder – which is precisely why I took up this challenge!
There were a couple of ‘doh!’ moments for me, one being that I didn’t realise you have to press ‘Deploy’ in the console to save any edits to your Lambda code before testing again. Through my studies I was used to having a ‘Save’ button, but the console had since been updated and this change was enough to throw me off course for a while!
After watching a few YouTube videos and doing a lot of googling, I eventually finished my fully functioning website with a serverless backend!
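To give a flavour of what the backend involves, here’s a rough sketch of a visitor-counter Lambda function. The table name, key, and attribute names are illustrative choices for this sketch, not necessarily what I used – the general shape is a DynamoDB atomic counter plus an API Gateway proxy-style response:

```python
import json

def make_response(count):
    """Shape the Lambda output the way API Gateway (proxy integration) expects."""
    return {
        "statusCode": 200,
        "headers": {
            "Content-Type": "application/json",
            # Needed so the S3/CloudFront-hosted frontend can call the API
            "Access-Control-Allow-Origin": "*",
        },
        "body": json.dumps({"count": count}),
    }

def lambda_handler(event, context):
    # boto3 is available in the Lambda runtime; imported inside the handler
    # so the response helper above can be exercised without AWS credentials.
    import boto3
    table = boto3.resource("dynamodb").Table("visitor-count")  # hypothetical table name
    result = table.update_item(
        Key={"id": "site"},
        UpdateExpression="ADD visits :one",  # atomic increment, no read-modify-write race
        ExpressionAttributeValues={":one": 1},
        ReturnValues="UPDATED_NEW",
    )
    return make_response(int(result["Attributes"]["visits"]))
```

The `ADD` update expression increments the counter in a single request, so two visitors hitting the site at the same time can’t overwrite each other’s count.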
But that wasn’t enough. Next was learning how to deploy my backend as Infrastructure-as-Code!
For this, I had to learn how to use AWS SAM (the Serverless Application Model – an extension of CloudFormation for serverless applications). This was my favourite part of the whole project as it required a lot of attention to detail and demonstrated just how powerful Infrastructure-as-Code is. Once you have your template set up, AWS SAM gives you the ability to deploy and remove services with relative ease.
To practise, I used a sample project from the AWS documentation to see how it all worked and to learn the SAM CLI commands. Once I felt comfortable with it, I cracked on with writing my own template file. This required a lot of patience, as YAML indentation isn’t very forgiving! Making sure all of my resources referenced each other in the correct way was also tricky. I learnt so much during this part of the project, and it made taking up this challenge feel worthwhile.
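For anyone wondering what a SAM template for this kind of backend might look like, here’s a minimal sketch – the resource names, runtime, and paths are illustrative, not my exact template:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31
Description: Visitor counter backend (illustrative sketch)

Resources:
  VisitorCountFunction:
    Type: AWS::Serverless::Function
    Properties:
      Handler: app.lambda_handler
      Runtime: python3.9
      CodeUri: backend/
      Policies:
        - DynamoDBCrudPolicy:          # SAM policy template scoping access to one table
            TableName: !Ref VisitorCountTable
      Events:
        GetCount:
          Type: Api                    # creates the API Gateway endpoint for you
          Properties:
            Path: /count
            Method: get

  VisitorCountTable:
    Type: AWS::Serverless::SimpleTable
    Properties:
      PrimaryKey:
        Name: id
        Type: String
```

The `!Ref` between the function’s policy and the table is exactly the kind of cross-referencing I mean above – get one of these wrong and the deploy fails with an error that takes a while to decode.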
Documentation and googling were a lot more fruitful for the SAM part of the challenge, and although it took some time (and plenty of error messages!) I managed to build my template and deploy my backend.
For those of you who haven’t deployed anything with AWS SAM before and are a beginner like me, I recommend the following link to get you started:
This YouTube video is also a great tutorial for understanding the advantages of serverless and diagrams explaining SAM and how to use it. This may be helpful for anyone who hasn’t studied for any of the associate AWS exams yet!
Lastly, I also recommend YouTube channel ‘FooBar Serverless’ for understanding SAM better and learning more on writing a template.yaml file.
Despite managing to deploy my serverless backend, I encountered more problems with my visitor count and integrating the backend with the frontend. API Gateway wasn’t working at first because the output from my Lambda function wasn’t in the correct format (a 502 error). After a couple of long nights googling, I learnt that this isn’t actually a DynamoDB bug: boto3 deliberately returns DynamoDB numbers as Python ‘Decimal’ objects, which the standard json module can’t serialize. This was extremely frustrating, as API Gateway didn’t like the format of my Lambda output when it was expecting pure, simple JSON.

I wanted to persevere and get the right format, and found the simplejson package, which was meant to sort this problem in Python. I figured out how to implement it locally but needed to learn Lambda layers to be able to import the simplejson module into my function. After going down a few rabbit holes and doing some research, I managed to deploy and configure a layer but was still getting an error! Lambda layers are something I am still looking into, but for the purpose of simply getting the visitor count number into the right format, I resigned myself to chucking in some Python code I’d found on Stack Overflow that sorted the problem! It was great experience learning more about Lambda layers though, so I’m happy I came across this problem!
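For anyone hitting the same 502, a common fix along the lines of the Stack Overflow snippet I used is a `default` hook for `json.dumps` that converts Decimals before serialization – the function name here is my own, and whether you convert to `int` or `float` depends on your data:

```python
import json
from decimal import Decimal

def decimal_default(obj):
    """Convert DynamoDB's Decimal values into plain JSON numbers."""
    if isinstance(obj, Decimal):
        # A visitor count is a whole number, so int is safe here;
        # use float for attributes that may hold fractions.
        return int(obj)
    raise TypeError(f"Object of type {type(obj).__name__} is not JSON serializable")

# What boto3 hands back: DynamoDB numbers arrive as Decimal, not int
item = {"id": "site", "visits": Decimal("42")}

# json.dumps(item) on its own raises TypeError; the default hook fixes it
body = json.dumps(item, default=decimal_default)
```

The `default` callable is only invoked for objects `json` can’t handle natively, so the rest of the item serializes as normal.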
Finally, after getting my project fully working, I had to implement CI/CD for my backend. As I had used AWS’s own CI/CD tools for the frontend, I decided to use GitHub Actions for the backend to learn some different tools and get more exposure to different practices. I had used GitHub before but never come across GitHub Actions, so this was all new to me. Again, the documentation was very useful for this process and helped explain what was needed to get a ‘workflow’ set up. It involved writing a template file that would be triggered by a push to my backend repository. After a few late nights I managed to get this working and again realised the benefits of having this set up for future testing and updating.
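A workflow file along these lines gives the general idea – the file path, branch name, and secret names are assumptions for this sketch, not my exact setup:

```yaml
# .github/workflows/backend.yml (illustrative sketch)
name: Deploy backend

on:
  push:
    branches: [main]        # every push to main triggers the workflow

jobs:
  test-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.9'
      - name: Run tests
        run: |
          pip install pytest boto3
          pytest tests/
      - name: Deploy with SAM
        env:
          AWS_ACCESS_KEY_ID: ${{ secrets.AWS_ACCESS_KEY_ID }}
          AWS_SECRET_ACCESS_KEY: ${{ secrets.AWS_SECRET_ACCESS_KEY }}
        run: |
          pip install aws-sam-cli
          sam build
          sam deploy --no-confirm-changeset --no-fail-on-empty-changeset
```

Because the deploy step comes after the test step in the same job, a failing test stops the deployment – which is exactly the safety net you want.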
As part of this final process, I wrote a test in Python using pytest which checked for a 200 status code from my Lambda function. Once the test passed, the whole of my backend infrastructure would be packaged and deployed using a fancy GitHub Action!
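The test itself is only a few lines. In the real thing you’d import the handler from your backend module; here I’ve included a stand-in handler so the sketch is self-contained:

```python
import json

# Stand-in for the real Lambda handler (the actual test would do
# something like `from backend.app import lambda_handler` instead).
def lambda_handler(event, context):
    return {"statusCode": 200, "body": json.dumps({"count": 1})}

def test_handler_returns_200():
    # pytest discovers any function named test_* and runs its assertions
    response = lambda_handler({}, None)
    assert response["statusCode"] == 200
```

Running `pytest` in the workflow picks this up automatically, and a non-200 response fails the build before anything gets deployed.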
I am sat here writing this blog, feeling slightly tired from the whole process, but extremely happy with the finished product and feeling like I have learnt just as much as I did when passing the AWS exams! It is true that building something in the cloud teaches you so much – things that I wouldn’t necessarily have ever learnt from a book or an online course. I would recommend it to anyone looking to gain experience of the AWS tools and to practice their coding too. You can find my finished product here and help me get my visitor count up whilst you’re at it 😉
I’d like to say a huge thanks to Forrest Brazeal for designing the challenge! Next for me? I think I’m going to have a stab at one of A Cloud Guru’s ‘challenges of the month’ and continue learning more about the AWS cloud.