The Capstone project is a formidable task, one that pushes even experienced users to work hard to find a solution.
We find ourselves working at a social research organisation whose website lets users look up various data. Over the past few years, the website has grown in popularity and begun experiencing traffic issues, along with complaints about how slow it is. There have also been attempted ransomware attacks and security breaches. This is where we come in: our job is to design an infrastructure for the company's website that follows best practice and improves upon the existing architecture.
This blog is a record of how I overcame the Capstone project.
What we started with
At the start, we were given a diagram of the current environment and how the company's website was laid out:
This diagram shows how the bastion host has been set up, along with the security groups attached to it across multiple subnets.
This architecture does not follow best practice: it is neither highly available nor able to scale automatically.
The solution
The solution I came up with not only solves the initial problem of being unable to scale automatically, but also makes the site more secure and highly available. The data is stored in a MySQL RDS Multi-AZ database, placed in private subnets across two Availability Zones (AZs). Multi-AZ gives the solution failover: if the primary database fails or becomes unavailable, the standby in the other AZ takes over automatically, so users keep uninterrupted access to the site.
An Application Load Balancer (ALB) sits in front of the Auto Scaling group, distributing traffic across the application instances in each AZ. Admin users can reach the application instances over SSH via the bastion host to access or store data.
We start by downloading the SQL dump file provided by AWS, which contains the data our application and database need in order to run.
We create an internet-facing Application Load Balancer, attach it to the two public subnets, and attach the appropriate security groups. While setting up the Auto Scaling group that sits behind the ALB, we are given the option to span both Availability Zones and their subnets (public and private), which is what makes the infrastructure highly available.
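The ALB and Auto Scaling group steps above can be sketched with the AWS CLI roughly as follows. Every name, subnet ID, security group ID and launch template here is a placeholder I have made up for illustration, not the project's actual values:

```shell
# Internet-facing ALB across the two public subnets
aws elbv2 create-load-balancer \
  --name capstone-alb \
  --scheme internet-facing \
  --type application \
  --subnets subnet-public-a subnet-public-b \
  --security-groups sg-0123456789abcdef0

# Target group the ALB forwards web traffic to
aws elbv2 create-target-group \
  --name capstone-tg \
  --protocol HTTP --port 80 \
  --vpc-id vpc-0123456789abcdef0

# Auto Scaling group spanning both AZs, registered with the target group
aws autoscaling create-auto-scaling-group \
  --auto-scaling-group-name capstone-asg \
  --launch-template LaunchTemplateName=capstone-template \
  --min-size 2 --max-size 4 --desired-capacity 2 \
  --vpc-zone-identifier "subnet-app-a,subnet-app-b" \
  --target-group-arns arn:aws:elasticloadbalancing:REGION:ACCOUNT:targetgroup/capstone-tg/abc123
```

The same configuration can of course be done through the console, which is how I did it; the CLI form just makes the moving parts explicit.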
Once the ALB and the Auto Scaling group had been created, the application instances were launched as part of the same process. After configuring the security groups, I was finally able to reach the website. However, not only did it not look like a proper website, it was not functioning like one either.
The website listed the topics users might want information about, but it could not return any of the underlying data; every lookup ended in a connection error.
The next step was to create a Multi-AZ RDS database: a secure, highly available way to store the data that users query, without the traffic problems of the old setup.
Note: I used a burstable instance class here, as it allows performance to burst above the baseline when demand spikes, and we were already experiencing traffic issues.
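A Multi-AZ MySQL instance on a burstable class can be sketched with the AWS CLI along these lines. The identifier, subnet group, security group, credentials and instance class are all placeholder assumptions:

```shell
# Multi-AZ MySQL instance in the private subnets, not publicly reachable
aws rds create-db-instance \
  --db-instance-identifier capstone-db \
  --engine mysql \
  --db-instance-class db.t3.micro \
  --allocated-storage 20 \
  --multi-az \
  --db-subnet-group-name capstone-private-subnets \
  --vpc-security-group-ids sg-0123456789abcdef0 \
  --master-username admin \
  --master-user-password 'REPLACE_ME' \
  --no-publicly-accessible
```

The `--multi-az` flag is what provisions the synchronous standby in the second Availability Zone and enables automatic failover.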
While the database was being created, I used this time to go into Systems Manager and create the parameters needed in the parameter store:
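Creating the parameters looks like this with the CLI; the parameter names and values here are hypothetical examples of the kind of connection details an application typically reads at startup:

```shell
# Store the database connection details in Parameter Store
aws ssm put-parameter --name /capstone/db/endpoint \
  --type String --value capstone-db.abc123.eu-west-2.rds.amazonaws.com
aws ssm put-parameter --name /capstone/db/name \
  --type String --value countries
aws ssm put-parameter --name /capstone/db/user \
  --type String --value admin
# SecureString encrypts the value with a KMS key
aws ssm put-parameter --name /capstone/db/password \
  --type SecureString --value 'REPLACE_ME'
```

Using `SecureString` for the password keeps it encrypted at rest rather than sitting in plain text in the application's configuration.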
Once the database was created, it was time to connect to it from the application we built earlier. I did this through the bastion host, authenticating with the Access Key and Secret Access Key provided by AWS. From there I could import the SQL dump file, giving the database the data it needed. First I listed the directory contents to confirm that Countrydatadump.sql was present. Then, after running mysql -u username -p database_name < file.sql and entering the password I had set when creating the RDS database, the database successfully ingested all the data.
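The import steps above look roughly like this from the bastion host. The RDS endpoint, username and database name are placeholders; I am also assuming the `-h` flag is used to point the client at the RDS endpoint rather than a local server:

```shell
# Confirm the dump file is present in the working directory
ls | grep Countrydatadump.sql

# Import the dump into the RDS database; you are prompted for the
# master password set when the instance was created
mysql -h capstone-db.abc123.eu-west-2.rds.amazonaws.com \
      -u admin -p countries < Countrydatadump.sql

# Quick sanity check that the tables arrived
mysql -h capstone-db.abc123.eu-west-2.rds.amazonaws.com \
      -u admin -p -e "SHOW TABLES;" countries
```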
Upon successfully launching the application, the last thing to do was to check the website one last time and see if users could access the data they were looking for. After all my hard work, it was finally a success.
The Capstone project really pushed me to think outside the box and utilise all the skills I acquired during my journey as a trainee cloud engineer. This was something I had never experienced before, and I look forward to working on more projects like this in the future. I plan to update my blog throughout my career to not only allow readers to see my work but also to use it as a portfolio to show myself and others how I have evolved over the coming years with each project.
I appreciate you taking the time to read this!