DEV Community

Dev Verse - Code Everything
I Built a Production-Level AI Resume Analyzer Using 9 Azure Services (Free Tier)

Event-driven architecture · Real bugs · Full Azure pipeline explained


Here's the entire architecture, every line of thinking, every bug I hit, and exactly how it works — from Blazor WASM to Gemini AI to a live email in your inbox.

Pravin Kshirsagar · Azure & .NET Developer · 2026 · 15 min read


📺 This article is a companion to my YouTube video walkthrough of ResumePulse.
Everything in the video — the architecture, the code, the live demo — is written out here so you can read, reference, and build along at your own pace.

👉 Watch the full video here


Let me be honest with you first

I didn't build ResumePulse because I had some grand product vision. I built it because I was stuck in the same loop that most developers with one year of experience find themselves in — I could talk about Azure in interviews, mention Service Bus and Event Grid and Key Vault, but I had nothing to actually show. No deployed project. No real architecture. Just theory.

That's a genuinely uncomfortable place to be. Especially when you're competing with people who have three, five, seven years of experience and a GitHub full of projects.

So I asked myself a simple question: what if I just built something real? Not a tutorial copy-paste, not a hello-world with Azure branding slapped on it — something that actually uses cloud services the way cloud services are meant to be used, solving a problem that real people have.

ResumePulse is what came out of that question. An AI-powered resume analyzer, built entirely on Azure's free tier, deployed and running live. And this article is the full story of how it works — not the polished "here's the perfect architecture" version, but the actual version including the bugs, the dead ends, and the moments where things finally clicked.

💡 If you are a fresher or early-career .NET developer trying to build something that actually impresses interviewers — not just looks good on paper — read this start to finish. This is the guide I genuinely wish existed when I started.


The problem worth solving

Think about what happens when someone applies for a job online. They spend hours on their resume, hit submit, and then... silence. No feedback. No idea why they got rejected or what was missing. It's a black box.

On the recruiter side, it's the opposite problem. A single job posting gets hundreds of applications. Manually screening resumes to find the three genuinely strong candidates takes hours. Most companies are doing this with spreadsheets and gut feelings.

ResumePulse sits in the middle of this gap. You upload your resume, paste a job description, and within about 30 seconds you get back:

  • A score from 0 to 100
  • A list of skills you already have that match
  • A list of skills you are missing
  • A clear recommendation: Strong Match / Good Match / Partial Match / Not a Match
  • A detailed email report sent straight to your inbox

For a job seeker, that's actually useful — you know exactly what to add to your resume before applying. For a recruiter, you can rank 500 resumes automatically instead of reading them one by one.

The real-world value is obvious. The interesting part is how it's built.


The architecture — why it's event-driven and why that matters

Before touching any code, I want to explain the architectural decision that makes ResumePulse more than a simple CRUD app. The system is event-driven. That's a term that gets thrown around a lot, so let me explain what it actually means in practice here.

When you upload a resume, the API does not sit there waiting for Gemini AI to finish analyzing it. AI calls take time — sometimes 20, sometimes 40 seconds depending on the model load. If the API blocked and waited, your browser would just spin. If the connection dropped, your result would be lost. That's fragile.

Instead, the API does two quick things — stores your PDF and drops a message in a queue — then immediately returns a job ID to your browser. Your browser polls every 5 seconds asking "is it ready?" while in the background an Azure Function picks up that queue message, does the heavy AI processing, and saves the result when it's done. The browser finds it on the next poll.

This is the same asynchronous, queue-based pattern that large-scale systems at Netflix, Uber, and Amazon rely on. Not because those companies are obsessed with Azure architecture patterns, but because it's genuinely the right way to handle work that takes a variable amount of time.

The full flow, step by step

  1. User opens the Blazor WASM frontend — served from Azure Static Web Apps on a global CDN
  2. They enter their email, paste a job description, and upload a PDF resume
  3. Blazor calls POST /api/resume/upload on the ASP.NET Core API hosted on Azure App Service
  4. The API reads secrets from Azure Key Vault using Managed Identity — no passwords in code
  5. API uploads the PDF to Azure Blob Storage in the resumes container and gets back the blob URL
  6. API sends a message to Azure Service Bus queue: { JobId, BlobUrl, Email, JobDescription }
  7. API immediately returns { JobId, Status: 'Queued' } — user sees a progress tracker, not a spinner
  8. Azure Function wakes up automatically via ServiceBusTrigger — downloads the PDF from Blob Storage
  9. Function extracts text from the PDF and calls Google Gemini AI with resume + job description
  10. Gemini returns structured JSON: score, summary, matched skills, missing skills, recommendation
  11. Function saves report.json to Blob Storage in the reports container
  12. Function publishes a ResumeAnalysisCompleted event to Azure Event Grid Topic
  13. Event Grid routes the event to a Logic App — which sends a formatted email via SendGrid
  14. Blazor's polling loop finds the report on the next GET /api/resume/report/{jobId} call and displays results

The API never directly calls Gemini. The Function never directly sends emails. Each service does exactly one thing and communicates through events and queues. This is loose coupling — and it's what makes the system reliable, scalable, and testable.


All 9 Azure services — what they are and why we picked each one

| Azure Service | What It Does in ResumePulse | Free Tier |
| --- | --- | --- |
| Azure Blob Storage | Stores uploaded PDF resumes and generated JSON reports | 5 GB free (12 months) |
| Azure Service Bus | Queues analysis jobs — decouples API from AI processing | 10M messages/month free |
| Azure Key Vault | Stores ALL secrets — connection strings, API keys | 10K operations/month free |
| Azure App Service | Hosts the ASP.NET Core Web API with a public URL | F1 plan — always free |
| Azure App Service Plan | The compute resource the App Service runs on | F1 — zero cost forever |
| Azure Functions | Serverless processor — wakes on Service Bus trigger | 1M executions/month free |
| Azure Event Grid | Routes 'analysis complete' events to Logic App | 100K ops/month free |
| Azure Logic Apps | No-code email automation via SendGrid connector | 4K actions/month free |
| Azure Static Web Apps | Hosts the Blazor WASM frontend on global CDN | Always free |

I'm going to explain each one the way I wish someone had explained it to me — not with marketing language, but with the real answer to "okay but what does this actually do in my project?"


🗄️ Azure Blob Storage

What it is: Cloud file storage. Think of it as Google Drive for your application — it stores files like PDFs, images, and JSON documents cheaply and reliably.

Why we used it: We need to store two types of files — the uploaded resume PDFs and the generated analysis reports in JSON format. Blob Storage costs almost nothing for small amounts of data and is natively integrated with the rest of Azure.

How it fits: The API uploads resume PDFs to a container called resumes. The Azure Function saves report.json to a container called reports. The Blazor frontend reads the report via the API when polling.
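To make that concrete, here is a minimal sketch of what the upload path can look like with the Azure SDK. The class and method names mirror the ones the article mentions (`BlobStorageService`, `UploadResumeAsync`), but treat the exact shape as an assumption, not the project's literal code:

```csharp
using Azure.Storage.Blobs;

public class BlobStorageService
{
    private readonly BlobServiceClient _client;
    public BlobStorageService(BlobServiceClient client) => _client = client;

    public async Task<string> UploadResumeAsync(Stream pdf, string jobId)
    {
        // Container name matches the article's convention: "resumes"
        var container = _client.GetBlobContainerClient("resumes");
        await container.CreateIfNotExistsAsync();

        var blob = container.GetBlobClient($"{jobId}.pdf");
        await blob.UploadAsync(pdf, overwrite: true);

        // The blob URL goes into the Service Bus message later
        return blob.Uri.ToString();
    }
}
```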


📨 Azure Service Bus

What it is: A message queue service. Think of it like a post office — you drop a message in, and a worker picks it up and processes it when ready. Messages are guaranteed to be delivered, even if the worker crashes.

Why we used it: We don't want the API to block for 30+ seconds waiting for Gemini AI. Service Bus decouples the request from the processing entirely. If the Azure Function fails, the message stays in the queue and retries automatically.

How it fits: The API puts a JSON message { JobId, BlobUrl, Email, JobDescription } into the resume-analysis-queue. The Function has a [ServiceBusTrigger] attribute and wakes up automatically when a message arrives.
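A hedged sketch of the sending side, assuming a `ServiceBusClient` registered in DI — the service name and queue name follow the article, but the exact wrapper shape is illustrative:

```csharp
using System.Text.Json;
using Azure.Messaging.ServiceBus;

public class ServiceBusService
{
    private readonly ServiceBusSender _sender;

    public ServiceBusService(ServiceBusClient client) =>
        _sender = client.CreateSender("resume-analysis-queue");

    // Serializes the anonymous payload { JobId, BlobUrl, Email, JobDescription }
    // and drops it on the queue. Delivery is guaranteed by Service Bus.
    public Task SendMessageAsync(object payload) =>
        _sender.SendMessageAsync(
            new ServiceBusMessage(JsonSerializer.Serialize(payload)));
}
```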


🔑 Azure Key Vault

What it is: A secure digital safe in the cloud. It stores passwords, API keys, and connection strings. Nothing sensitive goes in your code or config files.

Why we used it: ResumePulse has four sensitive secrets: Storage connection string, Service Bus connection string, Gemini API key, and Event Grid key. Hardcoding any of these is a serious security mistake that would fail any code review.

How it fits: All four secrets live in Key Vault. App Service and Functions access them using Managed Identity — completely passwordless. In code, AddAzureKeyVault() pulls all secrets automatically at startup.


⚡ Azure Functions

What it is: Serverless compute. You write a function, Azure runs it when triggered, and you only pay per execution. One million executions per month are free forever.

Why we used it: The resume analysis needs to run in the background, triggered by a Service Bus message. Functions are exactly built for this — they wake up on demand, do their work, and sleep again. No idle server costs, no wasted resources.

How it fits: ResumeAnalysisFunction uses [ServiceBusTrigger] — Azure invokes it automatically when a message arrives. It downloads the PDF, extracts text, calls Gemini AI, saves the report, and fires an Event Grid event.
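The rough shape of that trigger in the isolated worker model (which is what the article deploys — .NET Isolated). The connection setting name and the `AnalysisJob` record are assumptions; the body is condensed to the high-level steps:

```csharp
using System.Text.Json;
using Microsoft.Azure.Functions.Worker;

public record AnalysisJob(
    string JobId, string BlobUrl, string Email, string JobDescription);

public class ResumeAnalysisFunction
{
    [Function("ResumeAnalysisFunction")]
    public async Task Run(
        [ServiceBusTrigger("resume-analysis-queue",
            Connection = "ServiceBusConnectionString")] string message)
    {
        var job = JsonSerializer.Deserialize<AnalysisJob>(message)!;

        // 1. Download the PDF from Blob Storage using job.BlobUrl
        // 2. Extract text and call Gemini with resume + job.JobDescription
        // 3. Save report.json to the "reports" container
        // 4. Publish ResumeAnalysisCompleted to Event Grid
    }
}
```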


📡 Azure Event Grid

What it is: A managed event routing service. It connects event publishers (your code) to event subscribers (Logic Apps, Functions, etc.) — like a notification system for your cloud infrastructure.

Why we used it: After analysis is complete, we need to trigger an email. Instead of calling SendGrid directly from the Function (which would tightly couple them), we publish an event. Event Grid routes it to whoever cares. Tomorrow, if you want to add a Slack notification, you just add another subscription — no code changes.

How it fits: The Function publishes ResumeAnalysisCompleted to the Event Grid Topic. Event Grid routes this to the Logic App subscription. Zero direct coupling between the Function and the email system.
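Publishing the event is a few lines with `EventGridPublisherClient`. A sketch, assuming the topic endpoint and the `EventGridTopicKey` secret come from configuration; the subject format and data payload are illustrative:

```csharp
using Azure;
using Azure.Messaging.EventGrid;

var client = new EventGridPublisherClient(
    new Uri(topicEndpoint),             // Event Grid Topic endpoint from config
    new AzureKeyCredential(topicKey));  // the EventGridTopicKey secret

await client.SendEventAsync(new EventGridEvent(
    subject: $"resume/{jobId}",
    eventType: "ResumeAnalysisCompleted",
    dataVersion: "1.0",
    data: new { JobId = jobId, Email = email, Score = score }));
```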


🔁 Azure Logic Apps

What it is: No-code workflow automation with 400+ built-in connectors. You build workflows visually — connecting services like LEGO blocks.

Why we used it: We need to send a formatted email when analysis completes. Instead of writing SMTP code and managing credentials, Logic Apps handles the entire SendGrid integration with zero code written.

How it fits: Logic App listens for Event Grid events of type ResumeAnalysisCompleted. When triggered, it parses the event data — score, email address, skills — and uses the SendGrid connector to send a professional result email.


🌍 Azure Static Web Apps

What it is: Hosting designed specifically for static frontend files — HTML, CSS, JavaScript, and WebAssembly. Includes free global CDN, HTTPS, and custom domains.

Why we used it: Blazor WASM compiles to static files. Azure Static Web Apps is built exactly for this — free tier, global CDN, instant deployment. It's a much better fit than App Service for a frontend-only project.

How it fits: The compiled Blazor WASM output (the wwwroot folder) deploys here. Users worldwide get fast load times because the files are served from the nearest CDN edge location — not from a single server.


The code — how the important pieces actually work

I'm not going to paste 500 lines of code here. What I will do is explain the four pieces of code that matter most — the ones that either confused me when I built them or that I know will confuse you when you try to replicate this.


1. Key Vault + Managed Identity in Program.cs

This is the most important pattern in the entire project. The entire point of Key Vault is that you never hardcode secrets. Here is how we wire it up in the API's Program.cs:

// Note: KeyVault__Url is the environment-variable spelling;
// in code the double underscore maps to a colon.
var kvUrl = builder.Configuration["KeyVault:Url"];

builder.Configuration.AddAzureKeyVault(
    new Uri(kvUrl!), new DefaultAzureCredential());

That's it. Three lines. After this, any secret stored in Key Vault is accessible via builder.Configuration["SecretName"] — exactly like reading from appsettings.json, except the values never touch your code or repository.

💡 DefaultAzureCredential is clever — it uses your local Azure CLI credentials when running on your laptop, and automatically switches to Managed Identity when running in Azure. The same code works in both environments without any changes.


2. The upload endpoint — why it returns immediately

This is the pattern that makes the architecture feel fast. The API does NOT wait for AI:

var jobId = Guid.NewGuid().ToString();

// Step 1: Store the PDF
var blobUrl = await _blob.UploadResumeAsync(request.ResumeFile, jobId);

// Step 2: Queue the job — fire and forget
await _serviceBus.SendMessageAsync(new {
    JobId = jobId,
    BlobUrl = blobUrl,
    Email = request.Email,
    JobDescription = request.JobDescription
});

// Step 3: Return immediately — no waiting
return Ok(new { JobId = jobId, Status = "Queued" });

The frontend gets this response in under a second. Then it polls every 5 seconds asking for the report. The Azure Function is doing the heavy lifting in parallel. When the report is ready, the next poll finds it.
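One way that polling loop can look on the Blazor side. The endpoint path matches the flow described earlier; the `AnalysisReport` DTO, the attempt cap, and the injected `Http` client are assumptions for the sketch:

```csharp
private AnalysisReport? _report;

private async Task PollForReportAsync(string jobId)
{
    for (var attempt = 0; attempt < 24; attempt++) // give up after ~2 minutes
    {
        var response = await Http.GetAsync($"api/resume/report/{jobId}");
        if (response.IsSuccessStatusCode)
        {
            _report = await response.Content
                .ReadFromJsonAsync<AnalysisReport>();
            StateHasChanged(); // render the results
            return;
        }

        await Task.Delay(TimeSpan.FromSeconds(5)); // not ready yet — retry
    }
}
```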


3. The Blazor file upload bug — and the fix

This one caught me off guard. In Blazor WASM, the file exposed by InputFileChangeEventArgs is only reliably readable inside the change event handler — if you await something else first, the file reference is effectively gone by the time you come back to it. It's a known Blazor gotcha, and the exception you get doesn't point at the real cause.

// ❌ WRONG — crashes silently after the first await
private async Task OnFileChanged(InputFileChangeEventArgs e) {
    await SomeOtherAsyncThing(); // file is NULL after this!
    var stream = e.File.OpenReadStream(); // NullReferenceException
}
// ✅ CORRECT — read file bytes immediately, before any await
private async Task OnFileChanged(InputFileChangeEventArgs e) {
    using var ms = new MemoryStream();
    await e.File.OpenReadStream(10 * 1024 * 1024).CopyToAsync(ms);
    _fileBytes = ms.ToArray(); // safe — stored in memory now
}

Read the file into a byte array immediately when the input changes. Then use those bytes in your upload logic. Simple fix once you know the cause — but it cost me about an hour to figure out.


4. The Gemini AI prompt — getting structured JSON reliably

Getting consistent, parseable output from any AI model requires a carefully written configuration:

var config = new GenerateContentConfig {
    ResponseMimeType = "application/json", // force JSON output — critical!
    Temperature = 0.1f  // low temperature = consistent, predictable output
};

The ResponseMimeType setting is crucial. Without it, Gemini sometimes wraps the JSON in markdown code blocks — which makes your JSON.Parse throw an exception. Setting it explicitly tells the model to return raw JSON only.

We also added retry logic for free tier quota limits — up to 3 attempts with a delay between each. And all JSON parsing uses TryGetProperty instead of GetProperty to avoid crashes when optional fields are missing.
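A condensed sketch of both defensive patterns together — bounded retries with backoff, then tolerant parsing. The retry count, delays, and the `CallGeminiAsync` helper are assumptions; the `TryGetProperty` usage is the real System.Text.Json API:

```csharp
using System.Text.Json;

string json = "{}";
for (var attempt = 1; attempt <= 3; attempt++)
{
    try { json = await CallGeminiAsync(prompt); break; }
    catch (Exception) when (attempt < 3)
    {
        // Free-tier quota errors (429) usually clear after a short wait
        await Task.Delay(TimeSpan.FromSeconds(10 * attempt));
    }
}

using var doc = JsonDocument.Parse(json);
var root = doc.RootElement;

// TryGetProperty never throws on a missing field — GetProperty would
var score   = root.TryGetProperty("score", out var s) ? s.GetInt32() : 0;
var summary = root.TryGetProperty("summary", out var sm)
    ? sm.GetString() : "";
```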


Setting up Azure — all 12 resources in the correct order

The order matters here. Some resources depend on others being created first.

| Step | What to Create | Name Used | Key Setting |
| --- | --- | --- | --- |
| 1 | Resource Group | rg-resumepulse-pk | Region: East US |
| 2 | Storage Account | stresumepulsepk | LRS redundancy — cheapest |
| 3 | Blob Containers | resumes + reports | Access level: Private |
| 4 | Service Bus Namespace | sb-resumepulse-pk | Basic tier — free |
| 5 | Service Bus Queue | resume-analysis-queue | Default settings |
| 6 | Key Vault | kv-resumepulse-pk | RBAC access config |
| 7 | App Service Plan | asp-resumepulse-pk | F1 Free tier — Dev/Test tab |
| 8 | App Service (API) | app-resumepulse-pk | .NET 9 runtime, Linux |
| 9 | Function App | func-resumepulse-pk | Consumption plan, .NET 9 Isolated |
| 10 | Event Grid Topic | egt-resumepulse-pk | Schema: Event Grid |
| 11 | Logic App | logic-resumepulse-pk | Consumption plan |
| 12 | Static Web App | swa-resumepulse-pk | Free plan |

The Key Vault secrets you need to add

After creating Key Vault, add these four secrets manually from their respective service pages:

  • StorageConnectionString — Storage Account → Access Keys → Connection String
  • ServiceBusConnectionString — Service Bus Namespace → Shared Access Policies → Primary
  • GeminiApiKey — aistudio.google.com → Get API Key → Create new key
  • EventGridTopicKey — Event Grid Topic → Access Keys → Key 1

⚠️ Critical step: After creating your App Service and Function App, go to each one's Identity tab and turn on System Assigned Managed Identity. Then go to Key Vault → Access Control (IAM) → Add Role Assignment → 'Key Vault Secrets User' → assign to both. Without this, your apps cannot read secrets and will fail silently with a vague 403 error.
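The same steps can be scripted with the Azure CLI if you prefer it to the portal — a sketch using the resource names from the table above:

```shell
# Turn on System Assigned Managed Identity for both apps.
# Each command prints the identity's principalId.
az webapp identity assign -g rg-resumepulse-pk -n app-resumepulse-pk
az functionapp identity assign -g rg-resumepulse-pk -n func-resumepulse-pk

# Grant each identity read access to secrets.
# Run once per principalId printed above.
az role assignment create \
  --assignee <principal-id> \
  --role "Key Vault Secrets User" \
  --scope $(az keyvault show -g rg-resumepulse-pk \
            -n kv-resumepulse-pk --query id -o tsv)
```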


Project structure and NuGet packages

The solution has three projects, each with a clear responsibility:

ResumePulse/
├── ResumePulse.sln
├── ResumePulse.Api/               ← ASP.NET Core Web API
│   ├── Controllers/
│   │   └── ResumeController.cs
│   ├── Services/
│   │   ├── BlobStorageService.cs
│   │   └── ServiceBusService.cs
│   └── Program.cs
├── ResumePulse.Client/            ← Blazor WASM Frontend
│   ├── Pages/
│   │   └── Home.razor
│   ├── Services/
│   │   └── ResumeApiService.cs
│   └── Program.cs
└── ResumePulse.Functions/         ← Azure Functions
    ├── Functions/
    │   └── ResumeAnalysisFunction.cs
    ├── Services/
    │   └── GeminiService.cs
    ├── host.json
    └── local.settings.json        ← ⚠️ NEVER commit this file

API NuGet packages

cd ResumePulse.Api
dotnet add package Azure.Storage.Blobs
dotnet add package Azure.Messaging.ServiceBus
dotnet add package Azure.Security.KeyVault.Secrets
dotnet add package Azure.Identity
dotnet add package Azure.Extensions.AspNetCore.Configuration.Secrets

Functions NuGet packages

cd ResumePulse.Functions
dotnet add package Microsoft.Azure.Functions.Worker
dotnet add package Microsoft.Azure.Functions.Worker.Extensions.ServiceBus
dotnet add package Azure.Storage.Blobs
dotnet add package Azure.Messaging.EventGrid
dotnet add package Google.Cloud.AIPlatform.V1
dotnet add package UglyToad.PdfPig

Deployment — getting everything live on Azure

Three separate deployments. Do them in this order: API first, Functions second, Blazor third. The Blazor frontend needs the real API URL before it can be published.

Deploy 1: API to App Service

cd ResumePulse.Api
dotnet publish -c Release -o ./publish
cd publish && tar -a -c -f ../api-deploy.zip *

az webapp deploy \
  --resource-group rg-resumepulse-pk \
  --name app-resumepulse-pk \
  --src-path ../api-deploy.zip \
  --type zip

Verify it worked:

curl https://app-resumepulse-pk.azurewebsites.net/api/resume/health
# Expected: {"status":"Healthy","timestamp":"2026-..."}

Deploy 2: Azure Functions

cd ../ResumePulse.Functions
rmdir /s /q bin && rmdir /s /q obj   # clean first — important!
dotnet publish -c Release -o ./publish
cd publish && tar -a -c -f ../func-deploy.zip *

az functionapp deployment source config-zip \
  --resource-group rg-resumepulse-pk \
  --name func-resumepulse-pk \
  --src ../func-deploy.zip

Deploy 3: Blazor WASM to Static Web Apps

Update Program.cs with your real Azure API URL first:

var apiBaseUrl = builder.HostEnvironment.IsProduction()
    ? "https://app-resumepulse-pk.azurewebsites.net/"
    : "http://localhost:5257/";

Then deploy:

cd ../ResumePulse.Client
dotnet publish -c Release -o ./publish
npm install -g @azure/static-web-apps-cli
swa deploy ./publish/wwwroot \
  --deployment-token YOUR-TOKEN-HERE \
  --env production

⚠️ After getting your Static Web App URL, update the CORS policy in your API's Program.cs to include that URL, then redeploy the API. Without this, the browser will block all API calls from the frontend — and the error message in the browser console will be confusing if you don't know what to look for.
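The CORS change itself is small. A sketch for Program.cs — the policy name is arbitrary and the Static Web App hostname is a placeholder you replace with your own:

```csharp
// Before builder.Build():
builder.Services.AddCors(options =>
    options.AddPolicy("Frontend", policy => policy
        .WithOrigins("https://<your-swa-name>.azurestaticapps.net")
        .AllowAnyHeader()
        .AllowAnyMethod()));

// After builder.Build(), before MapControllers():
app.UseCors("Frontend");
```

Note that `WithOrigins` must match the scheme and host exactly, with no trailing slash — a mismatch there is the usual cause of the confusing console error mentioned above.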


What this actually costs to run

The honest answer: almost nothing. Here's the breakdown for a real MVP with 100 test resumes analyzed:

| Service | Free Limit | Our Usage | Cost |
| --- | --- | --- | --- |
| Static Web Apps | 100 GB bandwidth/month | Under 1 GB for MVP | ₹0 |
| App Service F1 | 60 CPU minutes/day | Well within limit | ₹0 |
| Blob Storage | 5 GB (12 months) | Under 100 MB for MVP | ₹0 |
| Event Grid | 100,000 ops/month | 1 event per upload | ₹0 |
| Azure Functions | 1 million executions | 2 per upload | ₹0 |
| Service Bus | 10 million messages | 1 message per upload | ₹0 |
| Key Vault | 10,000 operations/month | 2 per upload | ₹0 |
| Logic Apps | 4,000 actions/month | 3 per upload | ₹0 |
| Gemini AI | Free credits on signup | ~$0.001 per analysis | ~₹0.08 per use |

Total infrastructure cost for 100 resume analyses: approximately ₹8. The Gemini cost is the only real variable, and it's tiny at this scale.

💡 One thing to watch: the App Service F1 plan puts your app to sleep after a period of inactivity. The first request after sleep takes about 5 seconds to wake up (cold start). For a portfolio project this is fine. For production you would upgrade to B1 (~$13/month) to keep Always-On enabled.


What building this actually taught me

There were parts of this project that were genuinely frustrating. Things that looked simple in the docs turned out to have hidden requirements. Here are the honest lessons:

Managed Identity is harder to get right than it looks

The concept is simple — your app gets an Azure AD identity and uses it to access other services without passwords. The reality is that you need to: enable it on the resource, copy the Object ID, go to each target service's IAM tab, add the right role, and sometimes wait several minutes for the assignment to propagate. Miss any step and you get a vague 403 error with no helpful message.

Event-driven architecture requires you to think differently

When you start with synchronous code, you think linearly. Request comes in, you process it, you return the result. Event-driven forces you to break that mental model. The API doesn't know what the Function is doing. The Function doesn't know the Logic App exists. They communicate through contracts — the queue message format and the event schema. Once you internalize this, it becomes natural. But it's a genuine mental shift.

Free tier limits require planning

The free tiers are genuinely generous for development and portfolio use. But you need to know the limits before you design your system. The App Service F1 plan has no Always-On (cold starts). The Logic Apps Consumption plan charges per action, not per workflow run. Know these before you design, not after.

Real bugs are your best content

The Blazor file upload null bug. The Managed Identity role assignment delay. The Gemini prompt that returned markdown-wrapped JSON. Every one of these frustrated me in the moment. Every one of them is now something I can explain clearly in an interview because I actually lived through it. That's the real value of building projects like this — not the finished product, but the things that broke along the way.


How this helps you in interviews

When an interviewer asks you about cloud architecture, there are two kinds of developers in the room. The first says "I know about Service Bus and Event Grid" — they've read the docs. The second says "I used Service Bus to decouple my API from my AI processing layer because I needed the API to return immediately without blocking on a 30-second Gemini call" — they've built something.

ResumePulse puts you firmly in the second category. Here is what you can talk about in depth after building this:

  • Event-driven architecture and why you chose it over a synchronous approach
  • The difference between App Service, App Service Plan, and how they relate to each other
  • How Managed Identity works and why it's better than connection strings in config files
  • Why Service Bus gives you reliable message delivery and what happens to a message if the Function crashes
  • The specific Blazor WASM gotcha with file reads after async calls — and the fix
  • How Azure Static Web Apps differs from App Service for frontend hosting
  • The retry logic pattern for AI API calls on free tier quota limits

You're not just listing services. You're explaining trade-offs and decisions. That's what senior engineers do. That's what interviewers are looking for.


Where to go from here

ResumePulse is a solid foundation. Here are the natural next steps if you want to extend it:

  • Azure AI Search — let users query their past analyses semantically ("show me my highest scored resumes for frontend roles")
  • Azure Communication Services — send the email through Azure's own email service instead of via Logic Apps + SendGrid
  • Job description scraper — paste a LinkedIn job URL and extract the description automatically
  • AZ-204 exam prep — this architecture already covers Azure Functions, Service Bus, Key Vault, Blob Storage, App Service, and Event Grid exam objectives directly

📌 The full source code and every CLI command used in this project is in the YouTube video description. The video timestamps match every section of this article — you can jump to exactly the part you need.


About the author

Pravin Kshirsagar — .NET developer at YASH Technologies, building Azure-native applications and working toward AZ-204/AZ-305 certifications. I write about real implementation experiences — the messy parts, the bugs, and the eventual fixes — so other developers can learn from what I went through instead of starting from scratch.

If this article saved you time, share it with one other developer. That's the only metric that matters.


Tags: #Azure #DotNET #Blazor #AzureFunctions #GeminiAI #CloudArchitecture #AZ204 #CSharp #EventDriven #Hashnode
