Introduction
In Part 3 of this series, I continue building a fully functional Azure environment using only the Azure CLI, expanding on the resources created in earlier parts. This phase focuses on working with storage, securing sensitive data, and implementing operational best practices.
You'll see how to create and interact with a storage account, upload and manage files in Blob Storage, securely handle secrets using Azure Key Vault, and explore cost management strategies. Along the way, I also highlight real-world challenges like RBAC permission barriers and subscription limitations, and show how to navigate them effectively as a cloud engineer.
Create a Storage Account & Upload Files
Step 1: Create the Storage Account
This creates a storage account in Azure; the account name must be globally unique across all of Azure.
This is very important because storage accounts provide the scalable backend object storage required for storing logs, backups, container apps, and static assets.
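The lab below uses a fixed account name, but because storage account names are global, a collision is always possible. A quick way to generate a unique name (shown in bash / Azure Cloud Shell syntax; the `labstorage` prefix is my own choice):

```shell
# Storage account names must be globally unique, 3-24 chars, lowercase letters and digits only.
# Appending a Unix timestamp is a simple way to avoid collisions:
STORAGE_NAME="labstorage$(date +%s)"
echo "$STORAGE_NAME"   # e.g. labstorage1740556800 (20 chars, within the 24-char limit)
```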
Locally Redundant Storage (LRS)
Reliability - It keeps three synchronized copies of your data across multiple fault domains in a single data center.
Protection: Guards against hardware failures (e.g., disk crash).
Limitation: If the entire data center goes down, your data may be lost.
Use case: Cheapest option, good for non-critical data.
Zone-Redundant Storage (ZRS)
Reliability - It replicates your data across multiple availability zones in the same region.
Protection: Survives data center failure (since zones are separate).
Advantage: It has a higher availability than LRS.
Use case: Applications that need high availability within a region.
Geo-Redundant Storage (GRS)
Reliability - It copies your data to a secondary region far away from the primary region.
Protection: Survives regional outages (e.g., natural disasters).
Bonus: Some versions allow read access to the secondary region, that is, RA-GRS.
Use case: Critical data requiring disaster recovery.
Quick Comparison

| Type | Copies | Location | Protection level |
| --- | --- | --- | --- |
| LRS | 3 | Single data center | Low |
| ZRS | 3 | Multiple zones (same region) | Medium |
| GRS | 6 | Different regions | High |
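The tier you get is controlled entirely by the --sku flag on az storage account create; a minimal sketch (the account name here is a placeholder):

```shell
# --sku selects the redundancy tier:
#   Standard_LRS   -> locally redundant (single data center)
#   Standard_ZRS   -> zone redundant (multiple zones, same region)
#   Standard_GRS   -> geo redundant (secondary region); Standard_RAGRS adds read access to it
az storage account create --name labstoragezrsdemo --resource-group azurecli-lab-rg --sku Standard_ZRS
```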
Run the following command block to create a storage account:

```shell
$STORAGE_NAME="labstoragefeb26"
az storage account create `
  --name $STORAGE_NAME `
  --resource-group azurecli-lab-rg `
  --kind StorageV2 `
  --location koreacentral `
  --sku Standard_LRS
```
Step 2: Create a Blob Container
This creates a logical folder/bucket inside the storage account.
It's needed because you cannot upload blobs directly to the storage account root; they must live inside a container.
Security — containers provide access boundaries, allowing RBAC segmentation.
To create a container in the storage account, use the az storage container create command:
```shell
az storage container create `
  --name lab-files `
  --account-name $STORAGE_NAME `
  --auth-mode login
```
--auth-mode login: This tells Azure to use your current az login credentials rather than an access key.
Step 3: Upload a file
This creates a text file locally, then pushes it to Azure, where it is stored as a blob.
It's needed because Azure Blob Storage is the most common mechanism for storing files (like images, docs, and backups).
Pillar Connection
Operational Excellence — automated asset and artifact uploads.
Run:
```shell
az storage blob upload `
  --account-name $STORAGE_NAME `
  --container-name lab-files `
  --name sample.txt `
  --file "C:\Users\Admin\Documents\New folder\sample.txt" `
  --auth-mode login
```

NOTE: This error shows that I hit a permissions wall.
The error "You do not have the required permissions" happens because, in Azure, being the "Owner" of a subscription doesn't automatically give you the right to upload data inside a storage account when using auth-mode login. You need a specific Data Plane role.
The solution is to assign the "Storage Blob Data Contributor" Role
You need to give yourself permission to handle the actual data inside the blobs.
RBAC (Role-Based Access Control): Azure separates "Management" (creating the storage account) from "Data" (uploading files).
The Storage Blob Data Contributor role is exactly what the error message is asking for so you can upload, read, and delete blobs.
If you decide to go the Role Assignment route, note that propagation takes about 1–2 minutes after running the command. Role assignments in Azure can take a moment to "settle" across the global network.
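For reference, the role-assignment route would look roughly like this; the assignee is a placeholder for your own signed-in user (UPN or object ID):

```shell
# Grant your user the data-plane role, scoped to just this storage account:
az role assignment create `
  --role "Storage Blob Data Contributor" `
  --assignee "you@example.com" `
  --scope $(az storage account show --name $STORAGE_NAME --resource-group azurecli-lab-rg --query id --output tsv)
```

Scoping the assignment to the single account (rather than the whole subscription) follows the principle of least privilege.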
I will go with the alternative (The "Key" Method).
I don't want to deal with roles right now, so I can bypass this by using the storage account's Access Key instead of my login:
Run this command first to get the key:
```shell
$ACCOUNT_KEY=$(az storage account keys list --account-name $STORAGE_NAME --query "[0].value" -o tsv)
```

Then upload using the key:
```shell
az storage blob upload `
  --account-name $STORAGE_NAME `
  --account-key $ACCOUNT_KEY `
  --container-name lab-files `
  --name sample.txt `
  --file "C:\Users\Admin\Documents\New folder\sample.txt"
```

That success message is exactly what I was looking for! The 100.0000% progress bar and the JSON output confirm that the file, sample.txt, has been successfully uploaded to the lab-files container in Azure.
After the upload using the Access Key method, the next logical step and good practice is to confirm the file is visible in the cloud. This takes us to step 4.
Step 4: List blobs in the container
This queries the Azure storage API for the contents of the container.
It's needed as a verification step to ensure your push succeeded.
Pillar Connection
Operational Excellence — automated verification.
Run this command to list all blobs in your container:
```shell
az storage blob list `
  --account-name $STORAGE_NAME `
  --account-key $ACCOUNT_KEY `
  --container-name lab-files `
  --output table
```
If this was just a test, you should know how to remove the file to keep your storage environment clean. To achieve this, run:
```shell
az storage blob delete `
  --account-name $STORAGE_NAME `
  --account-key $ACCOUNT_KEY `
  --container-name lab-files `
  --name sample.txt
```

By using $ACCOUNT_KEY, I bypassed the complicated RBAC permissions (roles) that were blocking me earlier.
While roles are safer for large teams, using the key is the fastest way to get things done in a personal lab environment such as this one.
Store Secrets in Azure Key Vault
Step 1: Create a Key Vault
This provisions an Azure Key Vault instance.
It's needed because credentials, connection strings, certificates, and API keys must never be hard-coded. They belong in a Key Vault.
Pillar Connection
Security — secure secret storage.
Recall that a Key Vault stores certificates, keys and secrets.
Also note that key vault names must be globally unique, just like storage accounts.
Run this command block:
```shell
$KV_NAME="labkvrahfeb26"
az keyvault create `
  --name $KV_NAME `
  --resource-group azurecli-lab-rg `
  --location koreacentral `
  --enable-rbac-authorization false
```

Step 2: Store a secret
This stores the db-password secret securely in the vault.
It's needed because it provides safe, centralized storage instead of keeping cleartext credentials locally.
Security — ensuring robust secrets lifecycle.
Run this command:
```shell
az keyvault secret set `
  --vault-name $KV_NAME `
  --name db-password `
  --value 'SuperSecure@pass123'
```

Step 3: Retrieve the secret
This fetches the plaintext secret value securely using your currently authenticated user.
It proves the CLI can securely retrieve values from the Key Vault.
Pillar Connection
Security — programmatic retrieval over TLS.
Run the following command:
```shell
az keyvault secret show `
  --vault-name $KV_NAME `
  --name db-password `
  --query value `
  --output tsv
```

Step 4: Assign the VM a Managed Identity to access the vault
This step configures Azure AD to grant the VM identity-based permissions to extract secrets.
It's needed because it allows background services in the VM to get the secret later without logging in themselves.
Pillar Connection
Security — Zero-credential deployment utilizing Managed Identities.
Run this command block:

```shell
az vm identity assign `
  --resource-group azurecli-lab-rg `
  --name lab-vm

$PRINCIPAL_ID=$(az vm show `
  --resource-group azurecli-lab-rg `
  --name lab-vm `
  --query identity.principalId `
  --output tsv)

az role assignment create `
  --role 'Key Vault Secrets User' `
  --assignee $PRINCIPAL_ID `
  --scope $(az keyvault show --name $KV_NAME --query id --output tsv)
```

Note: the Key Vault Secrets User role only takes effect on vaults with RBAC authorization enabled. Since this vault was created with --enable-rbac-authorization false, you may also need an access policy (az keyvault set-policy with --secret-permissions get) for the identity.
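To see the payoff, here is a sketch of what a process on lab-vm could run once the assignment has propagated (it assumes the Azure CLI is installed on the VM and the identity has been granted secret access, whether via the role above or a vault access policy):

```shell
# From an SSH session on the VM: log in as the machine's managed identity...
az login --identity
# ...then read the secret with no stored credentials anywhere (use your own vault name):
az keyvault secret show --vault-name labkvrahfeb26 --name db-password --query value --output tsv
```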

Monitor Costs & Set a Budget Alert
Step 1: Get your subscription ID
This queries the active subscription ID programmatically.
Why It's Needed
Required when configuring budgets and alerts so they explicitly target the currently active account.
Pillar Connection
Cost Optimization — understanding which account you are billing towards.
Run:

```shell
$SUB_ID=$(az account show --query id --output tsv)
echo "Subscription: $SUB_ID"
```

Step 2: Create a $10 monthly budget with an alert at 80%
This sets a strict ceiling for consumption using native Azure limits.
Why It's Needed
Setting alerts prevents surprise billing caused by unmonitored, rogue, or misconfigured resources.
Pillar Connection
Cost Optimization — preventative guard-rails ensuring fiscal control.
Setting a budget is the "responsible" side of being a Cloud Engineer. It proves you aren't just building things; you're also managing costs, which is a major focus for businesses in 2026.
Run:
```shell
az consumption budget create `
  --budget-name lab-budget `
  --amount 10 `
  --category Cost `
  --time-grain Monthly `
  --start-date $(Get-Date -Format "yyyy-MM-01") `
  --end-date 2026-12-31 `
  --resource-group azurecli-lab-rg `
  --notifications '[{"enabled":true,"operator":"GreaterThan","threshold":80,"contactEmails":["you@example.com"]}]'
```
The output shows notification errors, so we'll run another command. This version avoids the JSON/notification issues that broke the earlier attempt:

```shell
$subId = az account show --query id -o tsv
az consumption budget create `
  --budget-name "lab-budget" `
  --amount 10 `
  --category Cost `
  --time-grain Monthly `
  --start-date "2026-03-01" `
  --end-date "2026-12-31" `
  --subscription $subId
```

This displays an RBACAccessDenied error, but this screenshot:
confirms ownership of the subscription.

The screenshot above shows an "Invalid budget configuration" error.
The CLI keeps failing with different error types.

I ran into another "Invalid budget configuration" error.
I confirmed the subscription is active and enabled:
I then ran another command, which finally confirmed the root cause.

From the output:

```json
"quotaId": "FreeTrial_2014-09-01",
"spendingLimit": "On"
```

This means I'm using a Free Trial subscription with the spending limit ON.
Why the budget creation keeps failing:
Azure does NOT allow budget creation via CLI for:
- Free Trial subscriptions
- Subscriptions with spending limit ON
That's why I keep getting "Invalid budget configuration" and RBACAccessDenied (a misleading error).
Important insight
Since the subscription already HAS a built-in spending cap, it automatically shuts down when the credits finish.
So Azure assumes “You don’t need a budget — we already limit your spending.”
These are the available options:
- OPTION 1 — Use the Azure Portal to create it manually (this sometimes works when the CLI/API is blocked).
Go to:
Cost Management → Budgets → + Add
- OPTION 2 — Upgrade the subscription (guaranteed solution). Click "Upgrade" at the top of the portal; this removes the spending limit and converts to Pay-As-You-Go, which allows budgets and alerts, with full CLI support.
The ONLY blocker is the Free Trial restriction.
Recommendation: for learning (especially Azure CLI labs), upgrade the subscription; otherwise, you'll keep hitting hidden limitations like this.
Step 3: Create a Log Analytics workspace
This creates a Log Analytics workspace to ingest usage metrics and performance logs later on.
Why It's Needed
It provides a unified overview of usage and costs, and is an essential monitoring dependency for true production readiness.
Pillar Connection
Operational Excellence — creating the hub for telemetry.
Run:
```shell
az monitor log-analytics workspace create `
  --resource-group azurecli-lab-rg `
  --workspace-name lab-logs `
  --location koreacentral
```
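To actually look at current spend from the CLI, the consumption data can also be queried directly; a sketch (on a Free Trial subscription this API may be restricted, just like budgets):

```shell
# List recent usage/cost records for the active subscription:
az consumption usage list --top 5 --output table
```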
Clean Up & Document Your Work
Step 1: Delete the resource group (and everything in it)
It removes the resource group and triggers a cascading delete of every resource inside it: network, compute, and data.
Why It's Needed
This is the ultimate power move of organized resource management. Cloud resources incur hourly charges; immediate deletion preserves your free-tier credits.
Pillar Connection
Cost Optimization — decommission what you no longer use.
Run:
```shell
az group delete --name azurecli-lab-rg --yes --no-wait
az group list --output table
```

The first command deletes ALL resources in the group — VM, VNet, Storage, Key Vault. The second verifies the deletion (wait a few minutes, then check).
Step 2: Create a project folder and write a README
It scaffolds standard markdown files documenting everything accomplished here.
Why It's Needed
It ensures recruiters see exactly what was executed instead of an empty claim regarding Cloud expertise.
Pillar Connection
Operational Excellence — automated documentation.
Run the following commands:
mkdir azure-cli-lab; cd azure-cli-lab
This creates a new directory (folder)
(The semicolon (;) is a valid statement separator in my PowerShell version. It does the same thing as running the commands separately: it tells the computer to finish the first task and then start the second one.)
git init
Initializes a new Git repository in the current directory.
To create the README.md file with its full content in one command, run this here-string block:
```shell
@'
# Azure CLI Cloud Lab

## What I Built
A complete Azure environment using only the Azure CLI — no portal.

## Resources Created
- Resource Group (azurecli-lab-rg in East US)
- Virtual Network (10.0.0.0/16) with Subnet (10.0.1.0/24)
- NSG with SSH (22) and HTTP (80) rules
- Ubuntu VM (Standard_B1s) with Nginx installed
- Storage Account with blob container
- Key Vault with secret & managed identity
- Cost Budget at $10/month with 80% alert

## Key Commands
az group create, az vm create, az network vnet create, az storage account create, az keyvault create

## What I Learned
- How to provision a full Azure environment from the CLI
- VNet + NSG = the network security foundation
- Key Vault + Managed Identity = zero-credential secret management
- Always delete resources after a lab to avoid charges
'@ | Set-Content -Path "README.md"
```
The error you see in my screenshot occurred because I used a bash command instead of PowerShell.
Now that I've "built" the file, let's "see" it. Run this command to read it back in your terminal:
Get-Content README.md

Notice the README.md is in the azure-cli-lab folder.
Step 3: Commit and push to GitHub
This pushes your locally created lab notes to an external hosted tracking service.
Why It's Needed
A standard CI workflow for real-world projects and portfolio sharing.
Pillar Connection
Operational Excellence — tracking version history in remote repos.
Run the following commands:
git add .
Stages all changed files for the next commit.
If you want to be specific, you can name the file: git add README.md
git commit -m 'docs: Azure CLI cloud lab — full environment from scratch'
Creates a new commit with all staged changes and the message after -m
git branch -M main
Renames the current branch to main.
git remote add origin https://github.com/YOUR_USERNAME/azure-cli-lab.git
Links your local repository to the remote GitHub repository under the name origin (replace YOUR_USERNAME with your GitHub username).
git push -u origin main
Uploads your local commits to the remote repository.
The goal is actually sending the "box" (your commit) to the cloud.
The -u "links" your local branch to the GitHub branch, so next time you only have to type git push.
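The local half of this workflow can be rehearsed safely in a throwaway directory before touching GitHub; a sketch (the identity values are placeholders):

```shell
# Rehearse the local Git steps with no remote involved:
cd "$(mktemp -d)"
git init -q
git config user.email "lab@example.com"   # placeholder identity for the demo
git config user.name "Lab User"
echo "# Azure CLI Cloud Lab" > README.md
git add README.md
git commit -q -m "docs: Azure CLI cloud lab"
git branch -M main
git branch --show-current   # prints: main
```

Only git push needs the remote; everything before it is purely local.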
We need to make sure there is a "landing pad" waiting for your code on the internet.
Think of it like this - your terminal knows what to send, but GitHub doesn't know where to put it yet.
Step 1: Create the "Landing Pad" (GitHub Website)
Before running the next command, you need to do this manually in your browser:
- Go to github.com.
- Click the + icon in the top right and select New repository. Name it exactly azure-cli-lab.
Important: Do not check "Initialize this repository with a README" (because we already created one in your terminal).
- Scroll to the bottom of the page and click Create repository.
Step 2: The "Git Log" Check (Terminal)
While setting that up, let's verify that the current branch 'main' already has commits, by running this command:
git log --oneline

Step 3: Connect and Push
Once the GitHub repository is created on the website, I'll run these two final commands to finish the lab:
Connect your computer to the web address (Replace YOUR_USERNAME)
git remote add origin https://github.com/rahimahisah17/azure-cli-lab.git
Upload the files
git push -u origin main

The latest screenshot shows total success. I've officially "pushed" the code from my local machine to the cloud. Seeing * [new branch] main -> main is the final green light for any developer.
Writing objects: 100% (3/3) means all three parts of the Git snapshot (the files, the folder info, and the message) were uploaded.
branch 'main' set up to track 'origin/main' means your computer and GitHub are now "synced." Next time you change your README, you only have to type git push; no extra settings needed.
Correction to README
I realized I used East US instead of Korea Central in the README. I also failed to create the budget and should state the reason why.
Step 1: Update the README.md File
This command uses a "Here-String" to overwrite your existing file with the new location (Korea Central) and the note about subscription limitations.
Run this block:
```shell
@'
# Azure CLI Cloud Lab

## What I Built
A complete Azure environment using only the Azure CLI — no portal.

## Resources Created
- Resource Group (azurecli-lab-rg in Korea Central)
- Virtual Network (10.0.0.0/16) with Subnet (10.0.1.0/24)
- NSG with SSH (22) and HTTP (80) rules
- Ubuntu VM (Standard_B1s) with Nginx installed
- Storage Account with blob container
- Key Vault with secret & managed identity

> [!IMPORTANT]
> Cost Budget Note: The $10 monthly budget failed to implement in this specific environment due to Azure subscription limitations (e.g., Free Tier or specific tenant restrictions).

## Key Commands
az group create, az vm create, az network vnet create, az storage account create, az keyvault create

## What I Learned
- Regional differences: Migrated deployment to Korea Central.
- API Constraints: Budgeting tools are restricted on certain subscription types.
- Always delete resources after a lab to avoid charges.
'@ | Set-Content -Path "README.md"
```
Step 2: Update Your Resource Group Location
Since I decided to change the location to Korea Central, I tried updating the Azure environment to match the new documentation:
az group update --name "azurecli-lab-rg" --set location="koreacentral"
Note: a resource group's location is fixed at creation and cannot be changed (it only determines where the group's metadata is stored), so Azure rejects this update. Because the resources themselves were deployed to Korea Central, updating the README in Step 1 is the real fix.
Step 3: Amend and Force Push to GitHub
Since I already pushed a version of this project, I will "amend" the previous commit so the history stays clean and professional.
git add README.md
Step 4: Overwrite the last commit message
git commit --amend -m "docs: update location to Korea Central and note budget limit"
Step 5: Force push to the cloud
git push --force origin main
(This is required because we are changing history that was already uploaded.)
Summary
In this part of the project, I successfully extended my Azure CLI lab by implementing storage, security, and operational workflows. I created a storage account and container, uploaded and verified files, and explored two authentication approaches: RBAC and access keys. I also set up Azure Key Vault to securely store and retrieve secrets, and configured a managed identity for secure, credential-free access.
While attempting to implement cost monitoring, I encountered Azure subscription limitations that prevented budget creation via the CLI; this is an important real-world insight into how Free Trial subscriptions behave.
Overall, this phase reinforced key cloud principles of secure data handling, identity-based access, cost awareness, and environment cleanup. It also demonstrated that beyond just running commands, understanding Azure’s underlying constraints and design decisions is critical for building reliable, production-ready solutions.























