
David WOGLO

Originally published at blog.davidwoglo.me

AWS Cloud Resume Challenge

In this article, I describe how I worked through the AWS Cloud Resume Challenge (CRC), with some light comparisons to Google Cloud based on my own experience. I also wrote an article about the Google Cloud version of this project; you can have a look at it here.

If you are new to all this cloud-related stuff, it is better to take the AWS Cloud Practitioner certification exam first, as recommended in the CRC guide. Whether you are unfamiliar with cloud environments or come from another cloud provider, this exam will help you validate your knowledge of the cloud and your familiarity with the different AWS services.

Personally, I just had to quickly complete the AWS Cloud Practitioner Quest to refresh my knowledge of AWS, as I am quite familiar with the cloud and have already worked with some AWS services. The quest is free; you can try it here.

So much for the intro; let's get to the heart of the matter.

Big picture of deployments

This project is about creating a website hosted on Amazon S3, displaying on it a visitor counter that is computed by an AWS Lambda function and stored in a DynamoDB table, and then automating the whole process (website publication and resource deployment) via a CI/CD pipeline.
My resume page is available at https://aws.davidwoglo.me/


Website

The first step of the project consists of setting up a website, using a completely different approach from the traditional one. Traditionally, we set up a machine or VM, install a web server on it (Apache, Nginx, or whatever you choose), upload the HTML/CSS files, and then proceed to some configuration to make the site available.


Here I'm spared this server preparation task (and the underlying configs). I just uploaded the website files to an Amazon S3 bucket (AWS's object storage service, the equivalent of Cloud Storage on Google Cloud), then did some small configuration in a couple of clicks, just to let S3 know that I want to use this bucket to host a static website, and the website was ready to use. No need for a physical server or additional tools to install. This is the serverless approach, and it is what the rest of the project is based on: I didn't need to use any server or VM.
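For illustration, here is a minimal boto3 sketch of that configuration step (I actually clicked through the console; the bucket name and file names below are placeholders):

```python
import boto3

s3 = boto3.client("s3")
bucket = "aws.example.com"  # placeholder: a bucket named after the site's domain

# Tell S3 that this bucket serves a static website.
s3.put_bucket_website(
    Bucket=bucket,
    WebsiteConfiguration={
        "IndexDocument": {"Suffix": "index.html"},
        "ErrorDocument": {"Key": "error.html"},
    },
)

# Upload a site file with the right content type so browsers render it.
s3.upload_file("index.html", bucket, "index.html",
               ExtraArgs={"ContentType": "text/html"})
```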

In order for my site to be accessible via a user-friendly and secure HTTPS URL, I had to manage the DNS and SSL certificate configuration. I used AWS Certificate Manager to obtain a certificate for my domain, whose ownership had to be verified by email due to some problems with my custom domain provider (the recommended way is to use a CNAME record). Then, to route the DNS traffic, I used Amazon Route 53, and the distribution of website content is sped up by Amazon CloudFront (AWS's CDN service). All these configurations were done manually, separately, and tied together at the end to make things work.
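As a reference, requesting such a certificate programmatically could look like this minimal boto3 sketch (the domain is a placeholder, and I show DNS validation since that's the recommended route, even though I ended up validating by email):

```python
import boto3

# Certificates used by CloudFront must live in us-east-1.
acm = boto3.client("acm", region_name="us-east-1")

response = acm.request_certificate(
    DomainName="aws.example.com",  # placeholder domain
    ValidationMethod="DNS",        # ownership proven via a CNAME record
)
print(response["CertificateArn"])
```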

At this point, let's make a small comparison with how Google Cloud handles this. On Google Cloud, all of this can be included in the creation of a load balancer, where you just have to activate automatic SSL management for HTTPS and the CDN for content caching.

Counting website visitors


My web page includes a visitor counter that displays how many people have accessed the site. To do this, I created an AWS Lambda function, a DynamoDB table, and a REST API. On one side, I wrote Python code that is executed by Lambda; its job is to get the current number of visitors stored in DynamoDB and increment it by 1 each time a visitor accesses my page. On the other side, I added JavaScript code to my site's files; this script fetches the visitor count from the DynamoDB table and displays it on my page. The communication between the JS code and the database is done via a REST API that I set up using Amazon API Gateway.
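Here is a sketch of what such a Lambda function can look like (the table name, key, and attribute names are assumptions for illustration, not necessarily the ones I used):

```python
import json
import boto3

# Assumed table: string partition key "id", numeric attribute "visits".
table = boto3.resource("dynamodb").Table("visitor-count")

def lambda_handler(event, context):
    # ADD atomically increments the counter (and creates it at 1 on the first visit).
    result = table.update_item(
        Key={"id": "resume"},
        UpdateExpression="ADD visits :inc",
        ExpressionAttributeValues={":inc": 1},
        ReturnValues="UPDATED_NEW",
    )
    count = int(result["Attributes"]["visits"])  # DynamoDB returns a Decimal
    return {
        "statusCode": 200,
        "headers": {"Access-Control-Allow-Origin": "*"},  # let the site's JS call it
        "body": json.dumps({"count": count}),
    }
```

On the frontend side, the JavaScript simply calls the API Gateway endpoint that triggers this function and drops the returned count into the page.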

This is the part that gave me headaches when I was doing the project on Google Cloud. I didn't use an API gateway there (because, honestly, I didn't know about it), so I used the open-source Functions Framework for Python, in which I used the client library to communicate with Cloud Firestore, the equivalent of DynamoDB.
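For comparison, a minimal sketch of that Google Cloud setup might look like this (the collection, document, and field names are placeholders):

```python
import functions_framework
from google.cloud import firestore

db = firestore.Client()

@functions_framework.http
def visitor_count(request):
    # Placeholder document holding a numeric "count" field.
    doc = db.collection("stats").document("visitors")
    doc.update({"count": firestore.Increment(1)})  # atomic server-side increment
    count = doc.get().to_dict()["count"]
    # Flask-style response: body, status, headers (CORS so the page's JS can call it).
    return {"count": count}, 200, {"Access-Control-Allow-Origin": "*"}
```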

Automation (CI/CD, IaC, Source Control)


To accelerate and simplify updates to my deployments, whether the website (frontend) or the underlying resources (backend), I needed to set up a CI/CD pipeline. A CI/CD pipeline is a series of steps that must be performed in order to deliver a new version of software.
For the website, that's fine: we can manage it as software (it is composed of files, just as software is). The question you could ask yourself is: what about the cloud resources in the background, since they are not files? This is where Infrastructure as Code (IaC) comes in. But before talking about it, let's see how the CI/CD for the frontend was set up.
I created a source control repository on GitHub where I put the website files, then wrote a workflow file that instructs GitHub Actions on how to update my website every time I push; the deploy step essentially boils down to the sketch below.
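Here is a hedged Python/boto3 sketch of what that deploy step amounts to (the bucket name, file list, and distribution ID are placeholders; in practice the workflow runs the equivalent commands for me):

```python
import mimetypes
import time
import boto3

s3 = boto3.client("s3")
cloudfront = boto3.client("cloudfront")

bucket = "aws.example.com"  # placeholder bucket

# Upload the updated site files with their content types.
for name in ["index.html", "style.css", "counter.js"]:  # placeholder file list
    ctype = mimetypes.guess_type(name)[0] or "binary/octet-stream"
    s3.upload_file(name, bucket, name, ExtraArgs={"ContentType": ctype})

# Invalidate the CloudFront cache so visitors get the new version right away.
cloudfront.create_invalidation(
    DistributionId="E1234567890ABC",  # placeholder distribution ID
    InvalidationBatch={
        "Paths": {"Quantity": 1, "Items": ["/*"]},
        "CallerReference": str(time.time()),  # must be unique per invalidation
    },
)
```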
Now let's talk about the Infrastructure as Code stuff, i.e. how to provision and manage resources through machine-readable definition files rather than through interactive configuration, as is traditionally done.
I used Terraform to define the DynamoDB table, the API Gateway, and the Lambda function configurations in a template, and deployed them with the Terraform CLI. You see, now that we can also manage our infrastructure as software, we can integrate it into a CI/CD pipeline to accelerate the deployment and updating of infrastructure resources.
I proceeded in the same way as for the website to set up the backend pipeline, except that here the GitHub Actions workflow file is a bit more complex.

You can access my frontend repository here and the backend one here.

Well, here is how things went during this project. Thanks for reading :)
Please check below for some useful resources.

Useful resources
