Dinku143

How I Built My Cloud Resume on Azure with Terraform & GitHub Actions

**Introduction**

The Cloud Resume Challenge is a hands-on project created by Forrest Brazeal that takes you through building a real-world cloud solution from scratch. It covers everything from hosting a static website to writing Python APIs, provisioning infrastructure as code, and setting up fully automated CI/CD pipelines.

I completed the Azure edition of the challenge, and in this post, I'll walk through what I built, the architecture decisions I made, the problems I ran into, and what I learned along the way.

Live site: https://www.dinkisaworku.com

Backend repo: https://github.com/Dinku143/cloud-resume-backend-infra

Frontend repo: https://github.com/Dinku143/cloud-resume-frontend

**The Architecture**

Here is the full architecture of what I built:
```text
Browser
  │
  ▼
www.dinkisaworku.com (Cloudflare DNS)
  │
  ▼
Azure Front Door (HTTPS, SSL certificate, global CDN)
  │
  ▼
Azure Blob Storage (static website — HTML + CSS)
  │
  ▼  (browser JS fetch())
Azure Function (Python HTTP trigger — serverless API)
  │
  ▼
Azure CosmosDB Table API (visitor counter storage)
```

```text
GitHub
├── cloud-resume-frontend
│   └── push → GitHub Actions → upload to Blob Storage → purge CDN cache
└── cloud-resume-backend-infra
    └── push → GitHub Actions → run pytest → deploy Azure Function
```
Everything except the domain name and DNS is provisioned using Terraform.

**Step by Step — What I Built**

Step 1 — Certification

I already held AZ-900, AZ-104, and AZ-305 before starting the challenge. These certifications gave me a solid foundation for understanding the Azure services I would be using throughout the project.

Steps 2 & 3 — HTML and CSS

I built my resume as a single-page website using HTML and CSS — no frameworks, no JavaScript build tools. I used Google Fonts (Bebas Neue, DM Mono, Syne) and CSS custom properties to create a dark, cyberpunk-inspired design featuring animated blobs, scroll-reveal effects, and a custom cursor.

Step 4 — Azure Storage Static Website with Terraform

Instead of clicking around in the Azure Portal, I used Terraform to provision everything from day one. I created:

  • A Resource Group
  • A Storage Account with Static Website enabled
  • The $web container serving my_resume.html and my_resume.css

The key concept here is Infrastructure as Code — your infrastructure is defined in version-controlled text files, making deployments reproducible and auditable.
```hcl
resource "azurerm_storage_account" "resume" {
  name                     = "dinkisacvstorage"
  # Required arguments, referencing a resource group assumed to be
  # defined elsewhere in the configuration:
  resource_group_name      = azurerm_resource_group.resume.name
  location                 = azurerm_resource_group.resume.location
  account_tier             = "Standard"
  account_replication_type = "LRS"

  static_website {
    index_document     = "my_resume.html"
    error_404_document = "my_resume.html"
  }
}
```

Step 5 — HTTPS with Azure Front Door

The classic Azure CDN was retired — new profiles can no longer be created. I switched to Azure Front Door Standard, which handles HTTPS, global caching, and HTTP-to-HTTPS redirects. I also hit a tricky bug where a replace() function call in my Terraform was missing a closing parenthesis — a good reminder to always run terraform validate before terraform apply.

Step 6 — Custom Domain

I registered dinkisaworku.com through Cloudflare (~$10/year). I added CNAME records pointing to my Front Door endpoint and TXT validation records for SSL certificate issuance. The trickiest part was understanding that the domain validation tokens change every time you run terraform destroy and recreate resources — something to keep in mind if you ever tear down and rebuild.

Steps 7, 8, 9, 10 — The Visitor Counter (Full Stack)

This is the most interesting part. I built a serverless visitor counter using:
JavaScript (frontend):

```javascript
async function updateVisitorCount() {
  const response = await fetch('https://func-cloud-resume-dinkisa.azurewebsites.net/api/visitor_counter');
  const data = await response.json();
  document.getElementById('visitor-count').innerText = data.count;
}

updateVisitorCount();
```
Python Azure Function (API):

```python
import json
import os

import azure.functions as func
from azure.core.exceptions import ResourceNotFoundError
from azure.data.tables import TableServiceClient


def main(req: func.HttpRequest) -> func.HttpResponse:
    connection_string = os.environ["COSMOS_CONNECTION_STRING"]
    service = TableServiceClient.from_connection_string(conn_str=connection_string)
    table = service.get_table_client("VisitorCounter")

    try:
        entity = table.get_entity(partition_key="counter", row_key="main")
        count = int(entity["count"]) + 1
        entity["count"] = count
        table.update_entity(entity)
    except ResourceNotFoundError:
        count = 1
        table.create_entity({"PartitionKey": "counter", "RowKey": "main", "count": count})

    return func.HttpResponse(
        body=json.dumps({"count": count}),
        mimetype="application/json",
        headers={"Access-Control-Allow-Origin": "https://www.dinkisaworku.com"}
    )
```

CosmosDB Table API stores the count as a simple key-value entity. I used the Table API connection string (not the SQL one) — an easy mistake to make since Azure gives you multiple connection strings.

One important lesson: never expose your database connection string in JavaScript. The Azure Function acts as a middleman — the browser only ever talks to the Function, which securely holds the CosmosDB credentials in environment variables.

Step 11 — Python Tests

I wrote four unit tests using unittest and unittest.mock:

  1. Counter increments correctly for existing entity
  2. Counter starts at 1 when no entity exists
  3. Response is valid JSON
  4. CORS header is present

The key was mocking both the TableServiceClient and the COSMOS_CONNECTION_STRING environment variable using setUp() — otherwise tests fail with a KeyError because the environment variable only exists inside local.settings.json, which isn't loaded by pytest.
```python
def setUp(self):
    os.environ["COSMOS_CONNECTION_STRING"] = "DefaultEndpointsProtocol=https;AccountName=fake;..."
```
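To make the mocking approach concrete, here is a self-contained sketch of the same pattern with the Azure SDK replaced by a MagicMock stub. The helper increment_counter and the KeyError standing in for ResourceNotFoundError are illustrative, not the actual test file from the repo:

```python
import os
import unittest
from unittest.mock import MagicMock


def increment_counter(table):
    """Simplified version of the Function's core counter logic."""
    try:
        entity = table.get_entity(partition_key="counter", row_key="main")
        count = int(entity["count"]) + 1
        entity["count"] = count
        table.update_entity(entity)
    except KeyError:  # stand-in for azure.core.exceptions.ResourceNotFoundError
        count = 1
        table.create_entity({"PartitionKey": "counter", "RowKey": "main", "count": count})
    return count


class TestCounter(unittest.TestCase):
    def setUp(self):
        # Fake the connection string so os.environ lookups don't raise KeyError
        os.environ["COSMOS_CONNECTION_STRING"] = "DefaultEndpointsProtocol=https;AccountName=fake;AccountKey=fake;"

    def test_increments_existing_entity(self):
        table = MagicMock()
        table.get_entity.return_value = {"count": 41}
        self.assertEqual(increment_counter(table), 42)
        table.update_entity.assert_called_once()

    def test_starts_at_one_when_missing(self):
        table = MagicMock()
        table.get_entity.side_effect = KeyError("not found")
        self.assertEqual(increment_counter(table), 1)
        table.create_entity.assert_called_once()


if __name__ == "__main__":
    unittest.main()
```

Because the table client is injected and mocked, these tests run anywhere — no Azure account, no network, no local.settings.json.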

Step 12 — Function App with Terraform

I provisioned the Azure Function App using Terraform with a Consumption (Y1) plan — serverless, pay per execution, first 1 million calls free. The CosmosDB connection string is stored as a Terraform variable marked sensitive = true and passed via terraform.tfvars, which is excluded from Git via .gitignore.

Step 13 — GitHub Source Control

I split the project into two separate repositories:

  • cloud-resume-frontend — HTML, CSS, GitHub Actions workflow
  • cloud-resume-backend-infra — Python function, tests, Terraform

Two repos mean two independent CI/CD pipelines. A CSS change doesn't trigger a backend deployment and vice versa.

Steps 14 & 15 — CI/CD with GitHub Actions

Backend pipeline (.github/workflows/deploy-backend.yml):

```yaml
- name: Run tests
  run: |
    cd backend
    python -m pytest tests/test_counter.py -v

- name: Deploy to Azure Functions
  uses: Azure/functions-action@v1
  with:
    app-name: func-cloud-resume-dinkisa
    package: backend
```

Tests act as a safety gate — if they fail, nothing gets deployed.

Frontend pipeline (.github/workflows/deploy-frontend.yml):

```yaml
- name: Upload to Azure Blob Storage
  uses: azure/CLI@v1
  with:
    inlineScript: |
      az storage blob upload-batch \
        --account-name dinkisacvstorage \
        --destination '$web' \
        --source frontend/ \
        --overwrite \
        --auth-mode key \
        --account-key ${{ secrets.STORAGE_ACCOUNT_KEY }}

- name: Purge Front Door cache
  uses: azure/CLI@v1
  with:
    inlineScript: |
      az afd endpoint purge \
        --resource-group rg-cloud-resume-challenge \
        --profile-name afd-cloud-resume \
        --endpoint-name dinkisa-resume \
        --content-paths '/*' \
        --no-wait
```

The --no-wait flag on the cache purge is important — without it the pipeline hangs for 7+ minutes waiting for the purge to complete globally.

**The Hardest Parts**

  1. Azure CDN retirement

When I first tried to create a CDN profile, Terraform failed with:

    Azure CDN from Microsoft (classic) no longer support new profile creation.

Classic Azure CDN was retired. I had to switch to azurerm_cdn_frontdoor_profile with the Standard_AzureFrontDoor SKU — a completely different resource type with a different structure.

  2. CosmosDB connection string confusion

Azure gives you 8 different connection strings for a CosmosDB account — SQL primary, SQL secondary, Table primary, Table secondary, and read-only variants of each. Using the wrong one (SQL instead of Table) causes a cryptic ValueError: Connection string missing required connection details error. Always use the Primary Table Connection String.
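A quick way to catch this mistake early is a sanity check before handing the string to the SDK. This sketch assumes the usual key=value; layout, where Table API strings include a TableEndpoint field while SQL API strings use AccountEndpoint instead; the helper name is mine, not part of the project:

```python
def looks_like_table_conn_str(conn_str: str) -> bool:
    """Heuristic: does this look like a Cosmos Table API connection string?"""
    # Split on ";" into key=value pairs; split each pair on the FIRST "="
    # only, because AccountKey values are base64 and may end in "=" padding.
    fields = dict(
        part.split("=", 1) for part in conn_str.split(";") if "=" in part
    )
    # Table API strings carry TableEndpoint; SQL API strings carry
    # AccountEndpoint, which the Table SDK cannot use.
    return "TableEndpoint" in fields and "AccountKey" in fields


table_str = ("DefaultEndpointsProtocol=https;AccountName=demo;"
             "AccountKey=abc==;TableEndpoint=https://demo.table.cosmos.azure.com:443/;")
sql_str = "AccountEndpoint=https://demo.documents.azure.com:443/;AccountKey=abc==;"

print(looks_like_table_conn_str(table_str))  # True
print(looks_like_table_conn_str(sql_str))    # False
```

Failing fast with a clear message here beats debugging the SDK's generic ValueError after deployment.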

  3. Domain validation tokens regenerate on destroy

Every time you run terraform destroy and recreate the Front Door custom domain resources, Azure generates new TXT validation tokens. You have to update your DNS records in Cloudflare with the new values and wait for Azure to re-validate, which can take 15+ minutes.

  4. Nested git repositories

Running terraform init inside infrastructure/ accidentally created a nested .git folder, making git add . fail with error: 'infrastructure/' does not have a commit checked out. Fixed by removing the nested .git folder with rm -rf infrastructure/.git.

**What I Learned**

Terraform is powerful but unforgiving — a single missing ) breaks everything. Always run terraform fmt and terraform validate before apply.

terraform destroy is expensive in a challenge context — resources like CosmosDB get soft-deleted and reserved for 7 days, causing conflicts when you try to recreate them. Either import existing resources or rename them.

Secrets management matters from day one — .gitignore your terraform.tfvars and local.settings.json before your first commit, not after.

CORS is not optional — if you forget the Access-Control-Allow-Origin header in your Azure Function response, the browser silently blocks the response, and your visitor counter shows nothing.

CI/CD changes your workflow completely — once GitHub Actions is set up, you stop thinking about deployment as a manual step. You just push code and trust the pipeline.

**Final Thoughts**

The Cloud Resume Challenge is genuinely one of the best ways to learn cloud engineering by doing. Every step teaches you something real — not just theory. By the end, you have a live project that demonstrates:

  1. Infrastructure as Code (Terraform)
  2. Serverless compute (Azure Functions)
  3. NoSQL databases (CosmosDB)
  4. CI/CD automation (GitHub Actions)
  5. DNS and HTTPS (Cloudflare + Azure Front Door)
  6. Python backend development
  7. Unit testing

If you're studying for Azure certifications, do this challenge alongside — it makes everything click.

Live site: https://www.dinkisaworku.com


GitHub: https://github.com/Dinku143
