The Cloud Resume Challenge presents developers with an exciting opportunity to delve into full stack static website development in the cloud. I stumbled upon this challenge while browsing through another developer's resume, and I immediately felt compelled to take it on.
The goal of the challenge is to provide developers with a comprehensive understanding of cloud applications by creating an online resume hosted in the cloud. While the concept may seem simple in theory, the practical implementation involves various aspects of web development, networking, database and API concepts, Infrastructure-as-Code (IaC), Source Control, and CI/CD.
In this blog post, I'll share my journey of creating my own cloud resume, hoping it serves as a guide and inspires others to embrace new challenges in cloud development.
Let's Get Started
To kickstart the project, it made perfect sense to begin with a functional front end. After experimenting with various designs, I opted to draw inspiration from existing resumes for the initial version of my website. For that, I owe thanks to Stacy Stipe, whose website served as the foundation for my own resume, albeit with some personalized CSS and responsiveness enhancements.
With a working front end in place, the next step was to find a hosting solution. The Cloud Resume Challenge provides multiple cloud provider options, and after careful consideration, I chose to leverage Microsoft Azure. I already use Azure for data storage and transformation at my company, and I was curious to learn more about its capabilities.
Thankfully, hosting a static website in Azure is cost-effective (more on costs later) and straightforward. Azure Storage Accounts can be configured with a $web container in blob storage to host your static content. Here's a sample configuration that can be deployed to Azure:
{
  "type": "Microsoft.Storage/storageAccounts/blobServices/containers",
  "apiVersion": "2022-09-01",
  "name": "[concat(parameters('storageAccounts_resumeaccount_name'), '/default/$web')]",
  "dependsOn": [
    "[resourceId('Microsoft.Storage/storageAccounts/blobServices', parameters('storageAccounts_resumeaccount_name'), 'default')]",
    "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageAccounts_resumeaccount_name'))]"
  ],
  "properties": {
    "immutableStorageWithVersioning": {
      "enabled": false
    },
    "defaultEncryptionScope": "$account-encryption-key",
    "denyEncryptionScopeOverride": false,
    "publicAccess": "None"
  }
},
This is where I stored my front-end files, including images and a copy of my paper resume for visitors to download if they wish.
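At this stage I was uploading files by hand, but a short script can do the same job. Here's a minimal sketch using the azure-storage-blob Python library; the connection string variable and the file list below are placeholders rather than my actual setup:

# Sketch: push static site files into the $web container.
# AZURE_STORAGE_CONNECTION_STRING and the file list are placeholders.
import os
from azure.storage.blob import BlobServiceClient, ContentSettings

service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
web_container = service.get_container_client("$web")

files = {
    "index.html": "text/html",
    "styles.css": "text/css",
    "resume.pdf": "application/pdf",
}

for name, content_type in files.items():
    with open(name, "rb") as data:
        web_container.upload_blob(
            name=name,
            data=data,
            overwrite=True,
            content_settings=ContentSettings(content_type=content_type),
        )

Setting the content type matters here; without it, browsers tend to download the HTML instead of rendering it.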
All done! (Not...)
At this point, you could say I was done: I had created a cloud-hosted resume website that was publicly accessible. But it wasn't one I would be comfortable sharing with any serious employers, nor did it satisfy all the requirements of the challenge.
The website wasn't secure and was only accessible via HTTP. I was manually uploading my website's index.html to Azure every time I made a change. Content was slow to update after the source was modified. I didn't have a custom domain to point my visitors to. There were a lot of problems to address.
Azure CDN can solve a few of these problems. First, I created an endpoint so my static content is served from locations around the globe (also known as content caching). This speeds up content delivery and, as a bonus, allows HTTPS connections to my website. Here's what that looks like using IaC:
{
  "type": "Microsoft.Cdn/profiles/endpoints",
  "apiVersion": "2022-11-01-preview",
  "name": "[concat(parameters('profiles_ResumeCDN_name'), '/ArmbristerCloudResume')]",
  "location": "Global",
  "dependsOn": [
    "[resourceId('Microsoft.Cdn/profiles', parameters('profiles_ResumeCDN_name'))]"
  ],
  "properties": {
    "originHostHeader": "resumeaccount.z13.web.core.windows.net",
    "isCompressionEnabled": true,
    "isHttpAllowed": true,
    "isHttpsAllowed": true,
    "queryStringCachingBehavior": "BypassCaching",
    "origins": [
      {
        "name": "resumeaccount-z13-web-core-windows-net",
        "properties": {
          "hostName": "resumeaccount.z13.web.core.windows.net",
          "priority": 1,
          "weight": 1000,
          "enabled": true
        }
      }
    ],
    "originGroups": [],
    "geoFilters": [],
    "deliveryPolicy": {
      "rules": []
    }
  }
},
Deploying this code to Azure creates a CDN endpoint with content caching and HTTPS capabilities.
Next, I needed to address networking, particularly obtaining a custom domain and managing domain routing. I chose GoDaddy as my DNS provider and registered my domain: bristercloud.com. Initially, I attempted to use the GoDaddy name servers to manage my DNS records, but after much trial and error, I opted to use Azure's name servers. This gave me greater control over domain routing within Azure's DNS zones. The new DNS records allowed the website to be accessed from other subdomains (e.g., www).
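If you want to sanity-check the routing once the records propagate, a throwaway snippet like this is enough (standard library only; swap in your own host names):

# Confirm the apex and www records resolve after the DNS changes propagate.
import socket

for host in ("bristercloud.com", "www.bristercloud.com"):
    print(host, "->", socket.gethostbyname(host))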
Python: Let's Write an API
The Challenge required the inclusion of a page view counter on the website, which had to be accomplished using Azure services and a Python API. In other words, it was time to develop the back end.
First, I needed a place to store the current page view count. Azure's Cosmos DB is an excellent non-relational database for this task. I created a table called pageviews within Cosmos DB to store the current count.
Whenever my page loads, my API should retrieve the current count, increment it by 1, update the table, and return the new value to the front end.
Using Python (and a healthy dose of Microsoft documentation), I was able to create an HTTP trigger which accepts GET requests and interfaces with Cosmos DB using the azure.data.tables Python library. This allowed my API to read and update the PageViewCount.
Next, I created a Function App on an App Service Plan to host the back end code in Azure. This function would be the endpoint called by my front end whenever I wanted to retrieve the new PageViewCount.
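A minimal sketch of that trigger, assuming the classic Azure Functions Python programming model (function.json omitted), a TABLES_CONNECTION_STRING app setting, and a single counter entity whose keys are placeholders rather than my exact values, looks something like this:

# HTTP-triggered page view counter (sketch).
# Assumes a "pageviews" table reached through the Cosmos DB Table API
# and a TABLES_CONNECTION_STRING application setting (placeholder name).
import os

import azure.functions as func
from azure.data.tables import TableClient, UpdateMode


def main(req: func.HttpRequest) -> func.HttpResponse:
    partition_key = req.params.get("PartitionKey", "pageviewcount")

    table = TableClient.from_connection_string(
        os.environ["TABLES_CONNECTION_STRING"], table_name="pageviews"
    )

    # Read the current count, increment it, and write it back.
    entity = table.get_entity(partition_key=partition_key, row_key="count")
    entity["PageViewCount"] = int(entity["PageViewCount"]) + 1
    table.update_entity(entity, mode=UpdateMode.MERGE)

    # Return the new value so the front end can display it.
    return func.HttpResponse(str(entity["PageViewCount"]), status_code=200)

The response is just the incremented number, which the JavaScript further down drops straight into the page.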
Important note if you're following along: Make sure to enable CORS (Cross-Origin Resource Sharing) in your Function App; otherwise, your website will have issues accessing your function.
To tie it all together, I wrote a bit of JavaScript to call the API and update the landing page:
// URL of the Azure Function endpoint that increments and returns the page view count
const url = "https://pageviewcounterfunction.azurewebsites.net/api/pageviews?PartitionKey=pageviewcount";
const xhr = new XMLHttpRequest();
xhr.open("GET", url);
xhr.responseType = "json"; // set before send() so the response is parsed for us
xhr.onload = () => {
  if (xhr.status === 200) {
    // Drop the new count into the landing page
    const pageViewCount = xhr.response;
    document.getElementById("guestNumber").innerHTML = pageViewCount;
  } else {
    console.log(`Error: ${xhr.status}`);
  }
};
xhr.send();
Wrap It Up!
To round out the project and make it truly feel like a modern cloud application, the Challenge calls for Source Control, IaC, and CI/CD.
I started by migrating the front end and back end content to GitHub. Then, I set up GitHub Actions to run my Python test scripts and deploy my code to Azure whenever code was pushed to my master branch. Additionally, I set up an ARM template for the configuration and deployment of new resources. Here's what my front end repo looked like when I was done:
Before I made these changes, code was deployed manually. Thankfully, VS Code's Azure Tools served as an interim solution for code deployments by letting me deploy directly from my IDE. However, without GitHub Actions, I had to manually purge the CDN cache so it would serve the latest content, and I had to run my tests by hand.
The Challenge lists these requirements last, but I would strongly recommend that anyone taking the challenge start here. Using GitHub Actions and ARM templates will save hours in the long run by eliminating the manual testing, deployment, and configuration of resources in your environment.
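For what it's worth, the tests don't need to be elaborate to be useful. Here's a rough sketch of the kind of check a GitHub Actions workflow can run on every push; the helper and the fake table are illustrative stand-ins, not my actual test suite:

# Sketch of a counter test that CI can run on every push.
# increment_page_views and FakeTable are illustrative, not production code.

def increment_page_views(table, partition_key, row_key):
    entity = table.get_entity(partition_key=partition_key, row_key=row_key)
    entity["PageViewCount"] += 1
    table.update_entity(entity)
    return entity["PageViewCount"]


class FakeTable:
    def __init__(self, count):
        self.entity = {"PartitionKey": "pageviewcount", "RowKey": "count", "PageViewCount": count}

    def get_entity(self, partition_key, row_key):
        return dict(self.entity)

    def update_entity(self, entity):
        self.entity = entity


def test_increment_page_views():
    table = FakeTable(count=41)
    assert increment_page_views(table, "pageviewcount", "count") == 42
    assert table.entity["PageViewCount"] == 42

Running something like this with pytest before every deployment catches counter regressions before they ever reach Azure.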
Costs and Conclusions
The Product Manager in me can't ignore costs, and the potential costs involved with cloud technologies may deter others from attempting projects like these. So, what is the cost of the Cloud Resume Challenge?
Surprisingly, the most expensive part of the entire project is the custom domain. Domain prices vary based on demand, but thanks to a GoDaddy promotion, mine cost only $3.17 for the first year and $11.99/year afterwards. It's worth noting that a custom domain isn't necessary to complete the project; however, if you plan to use this resume to apply for new positions, one is worth considering.
The cost of everything else? Free.
Microsoft is very generous with credits, giving developers like me $200 in free credits to use over the first few months after signing up. The cost of the Azure services themselves, even after hundreds of API calls and site visits, retrieving and modifying files in blob storage, DNS, CDN, Cosmos DB storage, and every other Azure service used to complete this project, amounted to less than $1/month.
If you're like me and the first thing you do when opening the Azure portal is set up budget alerts, you shouldn't have to worry. Even if there are some costs involved, the rewards are far more valuable.
Thanks to this challenge, I am now much more well-versed in Azure.
I appreciate the practicality of this project: it encompasses a wide variety of development principles and gives me a place to showcase my hard work to potential employers. This project is fantastic for anyone looking to expand their cloud skills, particularly students who are graduating soon and considering a career in cloud technologies.
I hope this inspires you to take your next step into the cloud! Feel free to check out my finished resume!