Managing file uploads and post-processing workflows often requires custom scripts, manual oversight, and complex pipelines. With Microsoft Azure, however, you can build a fully automated, event-driven file processing system entirely through the Azure Portal: no command-line tools, and no coding beyond minimal setup.
This guide walks you step by step through creating a seamless workflow where uploading a file automatically triggers processing, archiving, and cleanup actions.
What You'll Build
Workflow Overview:
File Upload → Blob Storage → Event Grid → Logic App → App Service → Automation Account
Hereโs how it works in practice:
- A user uploads a file to Azure Blob Storage.
- Event Grid detects the upload and fires an event.
- Logic App orchestrates the workflow.
- App Service processes the file with your business logic.
- Automation Account archives and cleans up processed files.
This architecture scales automatically, has built-in monitoring, and charges only for what you use.
Prerequisites
Before you start, ensure you have:
- An active Azure subscription
- Contributor access to create resources
- A modern browser (Edge, Chrome, or Firefox)
- A test file (any text, image, or document)
Part 1: Foundation Setup
1. Create a Resource Group
In the Azure Portal, create a new Resource Group:
- Name: `rg-workflow-demo`
- Region: your closest Azure region
This keeps all components organized.
2. Create a Storage Account
The storage account stores uploaded files and emits events when new files arrive.
- Name: `stworkflowdemo123456` (use random numbers for uniqueness)
- Redundancy: LRS (Locally Redundant Storage)
- Secure transfer: Enabled
- Public access: Private
Then, create a container inside it:
- Name: `input-files`
- Public access: Private
Part 2: Build the Processing Engine
1. App Service Plan
Create an App Service Plan to host your file processing logic.
- Name: `asp-workflow-demo`
- OS: Linux
- Pricing Tier: B1 Basic
2. Web App
Create a Web App on that plan.
- Name: `app-workflow-processor-123456`
- Runtime: Node.js 18 LTS
3. Deploy Processing Logic
Through Kudu (Advanced Tools) or App Service Editor, create two files:
package.json

```json
{
  "name": "azure-file-processor",
  "version": "1.0.0",
  "main": "app.js",
  "scripts": { "start": "node app.js" },
  "dependencies": {
    "express": "^4.18.2",
    "@azure/storage-blob": "^12.17.0"
  }
}
```
app.js contains your processing logic (for example, reading blob metadata). Then run `npm install` in the Kudu console to install the dependencies.
Add an App Setting for your storage connection string:
- Key: `STORAGE_CONNECTION_STRING`
- Value: [Your storage connection string]
Part 3: Orchestrate with Logic Apps
The Logic App coordinates the workflow.
- Create a Logic App (Consumption).
- Add a trigger: When an HTTP request is received.
- Parse the Event Grid JSON payload.
- Extract file details with a Compose action.
- Call the App Service API using an HTTP action.
- Add a Condition to check if processing succeeded.
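The parsing the Logic App performs in steps 3-4 can be sketched in plain JavaScript. Event Grid delivers an array of events; each Blob Created event carries the blob URL in `data.url` and a `subject` of the form `/blobServices/default/containers/<container>/blobs/<name>`:

```javascript
// Sketch of the Parse JSON / Compose logic: pick out Blob Created events
// and extract the container name, file name, and blob URL from each.
function extractFileInfo(events) {
  return events
    .filter((e) => e.eventType === 'Microsoft.Storage.BlobCreated')
    .map((e) => {
      const match = e.subject.match(/\/containers\/([^/]+)\/blobs\/(.+)$/);
      return {
        container: match ? match[1] : null,
        fileName: match ? match[2] : null,
        url: e.data.url,
      };
    });
}

// Example payload, trimmed to the fields used above:
const sample = [{
  eventType: 'Microsoft.Storage.BlobCreated',
  subject: '/blobServices/default/containers/input-files/blobs/report.txt',
  data: { url: 'https://stworkflowdemo123456.blob.core.windows.net/input-files/report.txt' },
}];
console.log(extractFileInfo(sample));
// → [{ container: 'input-files', fileName: 'report.txt', url: '...' }]
```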
Part 4: Post-Processing Automation
Create an Automation Account for archiving and cleanup.
Runbook: Archive Files
Create a PowerShell Runbook that:
- Connects via Managed Identity
- Creates a `processed-files` container if it does not already exist
- Copies the file into it with a timestamped name
Example archived file: `20250911_213000_myfile.txt`
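The runbook's naming scheme is `yyyyMMdd_HHmmss_<original name>`. Sketched here in JavaScript for consistency with the rest of this guide (the runbook itself is PowerShell):

```javascript
// Build a timestamped archive name, e.g. 20250911_213000_myfile.txt.
// UTC is used here for determinism; the runbook could use local time instead.
function archiveName(fileName, now = new Date()) {
  const pad = (n) => String(n).padStart(2, '0');
  const stamp =
    `${now.getUTCFullYear()}${pad(now.getUTCMonth() + 1)}${pad(now.getUTCDate())}` +
    `_${pad(now.getUTCHours())}${pad(now.getUTCMinutes())}${pad(now.getUTCSeconds())}`;
  return `${stamp}_${fileName}`;
}

console.log(archiveName('myfile.txt', new Date(Date.UTC(2025, 8, 11, 21, 30, 0))));
// → 20250911_213000_myfile.txt
```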
Part 5: Security & Permissions
- Logic App → Storage Account: Assign Storage Blob Data Reader
- Automation Account → Storage Account: Assign Storage Blob Data Contributor
- Logic App → Automation Account: Assign Automation Contributor
Enable System-assigned Managed Identity on both Logic App and Automation Account.
Part 6: Event Grid Integration
Finally, wire up the trigger:
- Event Source: Blob Created in `input-files`
- Event Schema: Event Grid Schema
- Endpoint: Logic App HTTP POST URL
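One handshake worth knowing about, assuming the Logic App uses the plain HTTP request trigger: when the event subscription is created, Event Grid first sends a `Microsoft.EventGrid.SubscriptionValidationEvent`, and the endpoint must echo back the `validationCode` before any blob events are delivered. (The dedicated Event Grid trigger handles this automatically.) Sketched as a handler:

```javascript
// Handle an incoming Event Grid POST: answer the subscription validation
// handshake if present, otherwise treat it as a normal notification batch.
function handleEventGridPost(events) {
  const validation = events.find(
    (e) => e.eventType === 'Microsoft.EventGrid.SubscriptionValidationEvent'
  );
  if (validation) {
    // Echo the code so Event Grid marks the endpoint as valid.
    return {
      status: 200,
      body: { validationResponse: validation.data.validationCode },
    };
  }
  // Normal delivery, e.g. a batch of BlobCreated events.
  return { status: 200, body: { received: events.length } };
}
```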
Part 7: Test the Workflow
- Upload a test file to `input-files`.
- Watch Event Grid trigger the Logic App.
- Check the App Service Log Stream for processing logs.
- Verify the Automation Runbook created an archived copy in `processed-files`.
Expected Outcome:
- File uploaded → automatically processed → archived with timestamp.
Troubleshooting Tips
- Logic App not triggering? Check event subscription status.
- App Service errors? Review Log Stream and verify connection strings.
- Automation failures? Confirm role assignments & module imports.
- Permission errors? Wait 5โ10 minutes after role assignment.
Benefits of This Architecture
- Minimal Coding: built almost entirely in the Azure Portal
- Fully Automated: zero manual intervention
- Scalable: handles single or bulk uploads
- Secure: managed identities, no stored secrets
- Cost-Effective: pay-per-use pricing
- Monitored: full visibility in the Azure Portal
Cost Considerations (approx. 100 files/month)
- Logic Apps: ~$5–15
- App Service B1: ~$13
- Storage: ~$2–10
- Automation: first 500 minutes free
- Event Grid: first 100K operations free
Tip: Use the Free App Service tier during development.
Next Steps: Go Beyond the Portal
Once comfortable with the portal, explore Infrastructure as Code with:
- Azure CLI: automate deployments
- ARM/Bicep: template-driven provisioning
- Terraform: cloud-agnostic automation
- CI/CD Pipelines: continuous deployment
Example command:

```shell
az storage blob upload --account-name stworkflowdemo123456 \
  --container-name input-files \
  --name test.txt --file ./test.txt
```
Conclusion
With just the Azure Portal, you've built a production-ready, event-driven file processing pipeline. From file upload to automated archiving, every step happens without manual effort.
This foundation is perfect for learning, prototyping, and scaling into enterprise-grade solutions with Infrastructure as Code.
Your Azure workflow is now ready to handle real-world file processing scenariosโsecurely, efficiently, and automatically.