DEV Community

Brian Meinert


From LocalStorage to Zero-Trust: Architecting a Serverless Secret Management Platform

Tech Stack: JavaScript, Node.js, Azure Functions, Azure Key Vault, Table Storage, Bicep

Hero Image: Screenshot of the main Password Generator UI with a high-strength password generated and all UI elements visible

The Genesis: Breaking Out of the Browser

This project started the way many portfolio projects do for me: a Frontend Mentor challenge. The prompt was simple: build a password generator using vanilla JavaScript.

I built the core logic quickly: a sliding scale for length, character set toggles (uppercase, symbols, etc.), and a strength meter for real-time feedback. But as I tested it, I hit a wall: I wanted to save the passwords I generated.

I implemented localStorage, which worked fine until I cleared my cache or opened the site on another device. The data was trapped in the browser. I realized that if I wanted this to be a real application, I needed to break out of the browser, and that gave me the perfect opportunity to scale to the cloud.
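The localStorage version was only a thin serialization layer. A sketch of what it looked like (the storage key is illustrative; `storage` is `window.localStorage` in the browser, injected here so the logic is testable anywhere):

```javascript
const STORAGE_KEY = "savedPasswords"; // illustrative key name

// Persist the full list as JSON under one key.
function savePasswords(storage, passwords) {
  storage.setItem(STORAGE_KEY, JSON.stringify(passwords));
}

// Read it back, falling back to an empty list on first visit.
function loadPasswords(storage) {
  const raw = storage.getItem(STORAGE_KEY);
  return raw ? JSON.parse(raw) : [];
}
```

This is exactly the limitation described above: the data lives in one browser profile and dies with the cache.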

This is the story of how a static website evolved into a fully architected, "Zero-Trust" serverless application on Azure.

Phase 1: Designing the "Zero-Trust" Architecture

My goal wasn't just to "add a backend." I wanted to simulate a high-security environment. If I’m storing passwords and other secrets, I need to treat the application like a digital vault.

I established three core constraints for the architecture:

  1. Serverless First: I didn't want to manage VMs or OS updates. I chose Azure Functions (Node.js 18) for the API and Azure Static Web Apps for global hosting.
  2. Hardware-Backed Security: Storing a "Master PIN" in code is a vulnerability. I needed Azure Key Vault to hold the sensitive hash.
  3. Identity-Based Access: The application should authenticate against cloud resources using Managed Identities, not hardcoded connection strings.

Image: Screenshot of your Azure Resource Group in the Portal, showing the Key Vault, Function App, and Storage Account listed together
The Cloud Infrastructure: A fully serverless topology.

Phase 2: The Identity Challenge

The first major hurdle appeared when I tried to connect my API to the Key Vault.

I initially attempted to use the @azure/identity library to authenticate my function app against the vault. Locally, this worked seamlessly using my developer credentials. However, the moment I deployed to production, the backend crashed.

The Diagnosis:
It was a classic "it works on my machine" environment mismatch. The Node.js runtime inside Azure Static Web Apps handles identity tokens differently from a standard App Service, so the library failed to retrieve a valid token and the handshake broke.

The Solution: Key Vault References
Instead of fighting the library, I pivoted to a platform-native feature: Key Vault References. By configuring the Application Settings in Azure to point directly to the Vault URI (e.g., @Microsoft.KeyVault(...)), Azure handles the authentication at the platform level. It fetches the secret and injects it into the environment variable before my code even runs.

Phase 3: Persistence & The Build Pipeline Trap

With security solved, I needed a database. A full SQL server felt like overkill for simple key-value pairs, so I chose Azure Table Storage. It is fast, NoSQL, and incredibly cheap for this use case.
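Table Storage entities are flat objects keyed by a `partitionKey` and a `rowKey`, so the mapping from a saved password is tiny. A sketch of the mappers (names are illustrative; the actual wire calls come from the `@azure/data-tables` SDK):

```javascript
// Map a saved password to a Table Storage entity and back.
// All of one user's secrets share a partition; rowKey is unique per entry.
function toEntity(userId, record) {
  return {
    partitionKey: userId,
    rowKey: record.id,
    label: record.label,
    value: record.value, // in practice, encrypt before storing
  };
}

function fromEntity(entity) {
  return { id: entity.rowKey, label: entity.label, value: entity.value };
}

// With @azure/data-tables, the CRUD endpoints then reduce to calls like
// tableClient.createEntity(toEntity(userId, record)) and
// tableClient.deleteEntity(userId, record.id).
```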

Image: Screenshot of the
The Result: Secure, cloud-persisted storage replacing the temporary localStorage.

I built the CRUD endpoints locally, but when I pushed to GitHub, the build failed. The Azure build server relies on my package-lock.json file to decide what to install; I had installed the database SDK manually, but the lock file hadn't been updated correctly.

The Lesson:
I performed a "clean install" locally, deleting node_modules and the lock file and regenerating a fresh, accurate dependency tree. This reinforced a valuable lesson: the build pipeline is the ultimate source of truth, not your local machine.

Phase 4: UX Polish & The Full CRUD Lifecycle

A backend is useless if the user experience feels clunky. I spent the next phase refining the frontend to match the robustness of the backend.

Hybrid Input System:
Originally, the app only generated random passwords. I realized users often want to manage their own passwords, so I refactored the display component into a hybrid input/display field, with the strength meter updating in real time as the user types.
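The real-time meter just recomputes a score on every `input` event. A rough sketch of the kind of heuristic involved, scoring 0 to 4 from length and character variety (the thresholds are illustrative, not the project's exact rules):

```javascript
// Rough strength score (0–4): one point per character class used,
// capped low for short passwords and boosted for long ones.
function scoreStrength(password) {
  if (!password) return 0;
  let variety = 0;
  if (/[a-z]/.test(password)) variety++;
  if (/[A-Z]/.test(password)) variety++;
  if (/[0-9]/.test(password)) variety++;
  if (/[^A-Za-z0-9]/.test(password)) variety++;

  let score = variety;
  if (password.length < 8) score = Math.min(score, 1);
  if (password.length >= 16) score = Math.max(score, 3);
  return Math.min(score, 4);
}
```

Wiring it up is a one-liner: `input.addEventListener("input", e => meter.value = scoreStrength(e.target.value))`.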

The "Edit" Workflow:
I implemented an "Edit Mode" where clicking a pencil icon loads a saved password back into the generator context. When the user saves changes, the backend uses an upsert (update-or-insert) strategy, seamlessly overwriting the old entry.
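The upsert is what lets one save handler serve both "create" and "edit". An in-memory sketch of the semantics (with `@azure/data-tables`, the same behavior comes from `tableClient.upsertEntity`):

```javascript
// One save path for both flows: upsert inserts a new entry or
// overwrites the existing one, merging in the edited fields.
function savePassword(store, entry) {
  const existing = store.get(entry.id);
  store.set(entry.id, { ...existing, ...entry });
  return { created: !existing, entry: store.get(entry.id) };
}
```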

Image: Screenshot of the app in
State Management: Loading existing data back into the generator context.

Native-Feel Modals:
For deletions, I avoided the standard browser alert(). I built a custom modal with a "Don't show this again" preference that persists in the browser via localStorage, giving the app a polished, native feel.
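The "Don't show this again" checkbox is a small bit of preference state alongside the modal itself. A sketch of the gate logic (the key name is illustrative; `storage` is `window.localStorage`, injected for testability):

```javascript
const PREF_KEY = "skipDeleteConfirm"; // illustrative preference key

// Show the modal unless the user previously opted out.
function shouldConfirmDelete(storage) {
  return storage.getItem(PREF_KEY) !== "true";
}

// Called when the user ticks "Don't show this again".
function rememberSkipPreference(storage) {
  storage.setItem(PREF_KEY, "true");
}
```

Note the irony: localStorage was the wrong place for secrets, but it is exactly the right place for per-browser UI preferences.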

Image: Screenshot of your custom

Phase 5: From "Pets" to "Cattle" (Infrastructure as Code)

At this point, I had a fully functional app. But I had built it by clicking around in the Azure Portal. If I accidentally deleted the Resource Group, I would have to remember every setting to rebuild it.

I decided to migrate to Infrastructure as Code (IaC) using Azure Bicep.

The Goal: Define the entire environment, Storage, Vault, Web App, and Permissions, in a single file.

The Challenge: RBAC Orchestration
I was brand new to IaC, so this phase came with plenty of challenges, but the hardest part of the automation was permissions. My Static Web App needed permission to read the Key Vault. In code, this requires orchestrating a sequence:

  1. Create the Static Web App (to generate a Managed Identity).
  2. Dynamically retrieve that Identity's Principal ID.
  3. Inject that ID into a roleAssignment resource targeted at the Key Vault.
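That sequence maps onto a Bicep fragment roughly like the following. Treat it as a sketch: resource names and API versions are illustrative, and the role GUID is the built-in "Key Vault Secrets User" role:

```bicep
resource vault 'Microsoft.KeyVault/vaults@2022-07-01' existing = {
  name: 'my-vault'
}

// 1. The Static Web App, created with a system-assigned Managed Identity.
resource swa 'Microsoft.Web/staticSites@2022-09-01' = {
  name: 'my-static-app'
  location: resourceGroup().location
  sku: { name: 'Standard', tier: 'Standard' }
  identity: { type: 'SystemAssigned' }
  properties: {}
}

// 2 & 3. Pull the identity's Principal ID and grant it read access to the
// vault. Bicep sequences this automatically: the role assignment waits
// until the app (and its identity) exist.
resource kvSecretsUser 'Microsoft.Authorization/roleAssignments@2022-04-01' = {
  name: guid(vault.id, swa.id, 'kv-secrets-user')
  scope: vault
  properties: {
    principalId: swa.identity.principalId
    principalType: 'ServicePrincipal'
    roleDefinitionId: subscriptionResourceId('Microsoft.Authorization/roleDefinitions', '4633458b-17de-408a-b874-0445c86b69e6') // Key Vault Secrets User
  }
}
```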

The Result:
I wrote a main.bicep file that defined the entire stack. I successfully deployed a "Test Environment" parallel to my live app, verified it worked, and then destroyed it with a single CLI command.

Conclusion: What I Learned

This project started as a JavaScript exercise, but it became a much-needed crash course in Cloud Engineering.

Key Takeaways:

  • Security is an Architecture: Storing secrets requires thinking about Identity, not just hashing passwords.
  • The Cloud is Asynchronous: You have to wait for resources to exist before you can grant permissions to them—Bicep handles this dependency graph beautifully.
  • Serverless is powerful: I built a scalable, global, secure application for pennies a month.

I now have a Serverless Secret Management Platform that is secure, scalable, and fully reproducible, both to show off in my portfolio and for my own personal use.

Link to GitHub Repository
