My Cloud Resume: Built on Azure

So a few weeks ago, I decided to take on the Cloud Resume Challenge that I saw online. I skimmed over the challenge and thought it would be interesting for a beginner like me to try out, and a lot better than just watching or reading tutorials.

Before going into how I completed the challenge, I will quickly summarize what it is all about:
1) Build a resume using HTML and CSS.
2) Host the static resume in cloud storage.
3) Make a visitor counter that tracks how many times my resume is visited, using JavaScript, Python, and a database.
4) Make a template to automatically deploy the resources (IaC).
5) Set up CI/CD pipelines for both frontend and backend with GitHub Actions.
6) Add a custom DNS name and serve the website over HTTPS.



Simple Architecture Diagram of the project

Part 1: Creating a static resume

The very first part was challenging for me, as I am not comfortable with frontend work. I learned basic HTML and CSS, then followed a YouTube tutorial and used AI tools to build my resume. The result was good enough for me.



Part 2: Making the resume a static website

After finishing my resume, I created a storage account in Azure. Azure Blob Storage can serve files directly over HTTP, which makes it a cheap and simple way to host a static website. First, I had to enable the Static website setting on the storage account. When I did that, a special container called $web was created; by convention, Azure serves the site only from that container.
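Just to make the hosting model concrete, here is a rough sketch of pushing a file into $web with the azure-storage-blob SDK. I did not upload this way at this stage, and the connection-string variable and file name are placeholders:

```python
# Illustrative sketch only: push the resume's index.html into the $web container.
# Assumes the azure-storage-blob package and a connection string in a made-up env var.
import os

from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string(os.environ["STORAGE_CONNECTION_STRING"])
web = service.get_container_client("$web")  # the static-website container

# Overwrite the existing blob and set the content type so browsers render it as HTML.
with open("index.html", "rb") as f:
    web.upload_blob(
        name="index.html",
        data=f,
        overwrite=True,
        content_settings=ContentSettings(content_type="text/html"),
    )
```

Anything that lands in $web is then reachable from the storage account's static website endpoint.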



Part 3: Frontend Visitor Counter

Before writing any JS, I went back to the static resume I had created, added an HTML element for displaying the visitor count, and linked its class to the JS. The job of the JS is simple: it calls an API and dynamically displays the number it gets back. Since I am not into frontend, I leveraged AI tools to write the code for me and made sure I understood the snippets it gave me.



Part 4: CosmosDB Database

For the database, the challenge suggests using the Azure Cosmos DB Table API in serverless mode. Let me briefly explain what that is. The Table API is a schema-less NoSQL option and the simplest of the many API flavors Cosmos DB offers (e.g. SQL, MongoDB, etc.). To add new data, I just throw in entities (rows) with whatever properties (columns) I want. Each entity has:
a) PartitionKey: groups related data
b) RowKey: uniquely identifies the entity within its partition
Serverless means I only pay for what I consume and am not reserving capacity; with the normal provisioned-throughput model, I would be billed for that capacity whether I use it or not. I created a new Cosmos DB resource and made a table for the visitor counter.
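To make that concrete, here is a minimal sketch of writing the counter entity with the azure-data-tables SDK (which also works against the Cosmos DB Table API). The table name, keys, and environment variable are placeholders I made up:

```python
# Illustrative sketch only: create the visitor-counter entity in the Cosmos DB table.
# Assumes the azure-data-tables package; table name, keys, and env var are placeholders.
import os

from azure.data.tables import TableServiceClient

service = TableServiceClient.from_connection_string(os.environ["COSMOS_TABLE_CONNECTION_STRING"])
table = service.get_table_client("VisitorCount")

# An entity is just a dict of properties: PartitionKey groups rows, RowKey identifies this one.
table.create_entity({
    "PartitionKey": "resume",
    "RowKey": "counter",
    "count": 0,
})
```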



Part 5: Backend Visitor Counter

For security, the browser should not talk to the database directly because any key/connection string shipped to the browser is public and users can read it. Just as the challenge suggests, I built a small API using Azure Functions with an HTTP trigger. When the JS makes a request, the function reads the current visitor count from CosmosDB, increments it, writes it back, and then returns the new counter to the browser.
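The sketch below shows roughly what such a function can look like using the Azure Functions Python v2 programming model. It is not my exact code, and the route, table name, and keys are placeholder assumptions:

```python
# Illustrative sketch only, not my exact code. Assumes the Azure Functions Python v2
# programming model and the azure-data-tables package; route, table name, and keys are placeholders.
import json
import os

import azure.functions as func
from azure.data.tables import TableServiceClient, UpdateMode

app = func.FunctionApp(http_auth_level=func.AuthLevel.ANONYMOUS)

@app.route(route="counter", methods=["GET"])
def counter(req: func.HttpRequest) -> func.HttpResponse:
    # The connection string lives in an app setting on the Function App, never in the browser.
    service = TableServiceClient.from_connection_string(os.environ["COSMOS_TABLE_CONNECTION_STRING"])
    table = service.get_table_client("VisitorCount")

    # Read the current count, increment it, and write it back.
    entity = table.get_entity(partition_key="resume", row_key="counter")
    entity["count"] = int(entity["count"]) + 1
    table.update_entity(entity, mode=UpdateMode.REPLACE)

    # Return the new value as JSON for the frontend to display.
    return func.HttpResponse(json.dumps({"count": entity["count"]}), mimetype="application/json")
```

Because the connection string stays in the Function App's application settings, nothing sensitive ever reaches the browser.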

The part that really had me banging my head was deploying the Azure Function written in Python. I was on Linux and started with Python 3.12 as the runtime stack. I wrote the code, debugged it, checked my host.json, even hit the forums and leaned on AI tools, but no matter what, I just couldn’t get the function to deploy to my Function App.

After some digging, I learned that Python 3.12 support for Azure Functions is still new and has some dependency conflicts. The build would show as successful in VS Code, but the function itself just wouldn’t show up after deployment. Switching back to Python 3.11 solved the issue immediately.

I finally had an HTTP trigger up and running that connected to my database, got the counter value, incremented it, and stored it back. That little victory felt huge.

Finally, all that was left for this step was to write some test cases. I just was not in the mood to write them on my own, so I skipped hand-writing them and used AI tools to write them for me.
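For what it's worth, a test for logic like this can stub out the table client so it never touches the real database. This is only a sketch of the idea, with the read-increment-write logic pulled into a hypothetical helper so the example stays self-contained:

```python
# Illustrative sketch only: unit-test the counter logic against a fake table client
# so the test never touches the real database. The helper below is hypothetical; in
# practice the read-increment-write code would be imported from the function app.
from unittest.mock import MagicMock

def increment_count(table):
    entity = table.get_entity(partition_key="resume", row_key="counter")
    entity["count"] = int(entity["count"]) + 1
    table.update_entity(entity)
    return entity["count"]

def test_increment_count():
    fake_table = MagicMock()
    fake_table.get_entity.return_value = {"PartitionKey": "resume", "RowKey": "counter", "count": 41}

    assert increment_count(fake_table) == 42
    fake_table.update_entity.assert_called_once()
```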



Part 6: Infrastructure as Code (IaC)

The normal way to create resources in Azure is to go to the Azure portal, click on the resource I want to create, fill out the information, and hit "Review + create". This is a manual process: if I need to recreate the same setup, I have to do it all over again, and if I forget a setting, good luck finding it.

IaC solves this by letting me describe my infrastructure in code and deploy it from that description. I used Bicep, which is just a way to declare "I need a Storage Account in this resource group", "I need a Function App with this plan and runtime", and so on. Working with Bicep generally has three parts:
a) Parameters
b) Resources
c) Deployment
The parameters define values that are reused throughout my Bicep file, and the resources declare the infrastructure itself. The deployment step is not part of the file; it pushes the template to Azure with a simple command:
az deployment group create --resource-group RESOURCE_GROUP_NAME --template-file BICEP_FILE

This part was not frustrating, as there are plenty of resources out there that do exactly this, so this step went smoothly.



Part 7: CI/CD for Backend

First, let us cover some theory. When I make changes to my code and push them, that's it: if I want the change reflected in my Azure resources, I have to go to each resource and upload the new version manually. With CI/CD, those changes can be deployed automatically instead.

CI stands for Continuous Integration. Every time I make a change and push it to GitHub, my code is automatically tested.

CD stands for Continuous Deployment. If the test passes, my code and infra changes are automatically deployed to Azure.

For this, I need to use GitHub Actions. It is a way to write automation scripts that run whenever certain events happen in my repository like a change is pushed to a repo.

I first attempted to deploy both the infrastructure and the trigger code from a single Bicep file, but that approach was wrong. Bicep is for infrastructure; its sole purpose is to provision and configure resources. Embedding the application code in the Bicep template would mean that every run of the template redeploys the code, which makes it harder to debug and manage.

So to deploy the code, I used GitHub Actions. This pipeline runs every time I push code to the repository. I created a single workflow file, .github/workflows/main.yml, and added two jobs:
a) build-and-test: This job is the CI part. It runs the tests I wrote earlier; if they pass, the pipeline moves on to the second job, otherwise the whole deployment fails.
b) deploy-to-azure: This job is the CD part. A dependency is declared so that it only runs after the first job succeeds. A secret is created in the repository settings so that sensitive information is not hardcoded in the workflow file; the credentials come from a service principal created like this:
az ad sp create-for-rbac --name "myGitHubActionsServicePrincipal" --role contributor --scopes /subscriptions/sub_id --sdk-auth
The JSON output of this command is stored as the secret and used to log in to Azure. After logging in, the Bicep template is deployed; once the Function App exists, the function code (the HTTP trigger) is zipped and deployed to it.

Like Part 5, this was one of the most frustrating parts of the project, and it took me a day to solve. No matter how many times I debugged the code, the Azure Function just wouldn’t deploy. I searched online forums, used AI tools to check my syntax, and tried different deployment methods, but the function just wouldn't appear. As a last resort, I installed the requirements like this:
pip install -r requirements.txt --target=".python_packages/lib/site-packages"
and the function finally deployed. Part of the problem was that, even though I explicitly specified Python 3.11 in my workflow, the GitHub Actions runner ignored it and defaulted to Python 3.12. The command above installs all the dependencies into .python_packages/lib/site-packages, the location Azure Functions actually expects, so the deployed package carries everything it needs. After this change, deployment finally worked.



Part 8: CI/CD for Frontend

Once the backend was running, the next step was to automatically push frontend updates to the storage account. I created a new GitHub repository just for the static website files and wrote a small workflow that runs whenever I push new code to it. The workflow takes the new files and uploads them to the Azure Storage $web container, replacing the old version. Finally, it purges the Azure CDN cache so that the domain shows the latest changes instead of an old cached version.



Part 9: HTTPS and DNS

This was the final step for me. The main goal of this step is to make my resume website publicly available at an address of my own. The very first thing I needed was a custom domain, which means registering one with a registrar for a small fee. After some searching, I chose the cheapest option, a .cloud domain from Hostinger. I then created an Azure Front Door and added my domain to it. Azure provides a TXT and a CNAME record that need to be added to the domain's DNS settings so that the custom domain points to Azure Front Door. I could make my domain point directly to the storage account, but then I would need to buy a TLS/SSL certificate for HTTPS from a third-party CA; Azure Front Door with an AFD-managed certificate handles that automatically.
Finally, I created a routing rule so that when someone visits my custom domain, they are forwarded to the storage account.


So this is the final result of the challenge.
My Resume Website



Conclusion

For a long time, I was caught in the cycle of "learning by reading", where I would read theories and concepts but never apply them. I had plenty of knowledge in my head, but no real experience to back it up.

This challenge forced me to break free from that cycle. I didn't start by reading about Azure Functions, CI/CD pipelines, or Infrastructure as Code; I went in head first, actually built them, and then forced myself to learn the basic theory. I admit the process was messy, full of debugging and unexpected errors, but it was in those moments that the theoretical concepts made sense to me.

This project also pushed me to explore a wide range of services, from Azure Storage to CosmosDB, without getting lost in a single one. While my knowledge lacks depth right now, I am starting to have a mental map of how all the pieces of a modern web application fit together.

I had a lot of fun (and frustration) and learned many things doing this project. I highly recommend that newcomers like me give it a try.

