Ezekiel Umesi

Building an Automated Event-Driven File Processing System in Azure (No Code Required)

Managing file uploads and post-processing workflows often requires custom scripts, manual oversight, and complex pipelines. But with Microsoft Azure, you can build a fully automated, event-driven file processing system, all through the Azure Portal. No command-line tools, no coding beyond minimal setup.

This guide walks you step by step through creating a seamless workflow where uploading a file automatically triggers processing, archiving, and cleanup actions.


🚀 What You'll Build

Workflow Overview:

File Upload → Blob Storage → Event Grid → Logic App → App Service → Automation Account

Here's how it works in practice:

  1. A user uploads a file to Azure Blob Storage.
  2. Event Grid detects the upload and fires an event.
  3. Logic App orchestrates the workflow.
  4. App Service processes the file with your business logic.
  5. Automation Account archives and cleans up processed files.

This architecture scales automatically, has built-in monitoring, and charges only for what you use.


✅ Prerequisites

Before you start, ensure you have:

  • An active Azure subscription
  • Contributor access to create resources
  • A modern browser (Edge, Chrome, or Firefox)
  • A test file (any text, image, or document)

Part 1: Foundation Setup

1. Create a Resource Group

  • In the Azure Portal, create a new Resource Group:

    • Name: rg-workflow-demo
    • Region: Your closest Azure region

This keeps all components organized.

2. Create a Storage Account

The storage account stores uploaded files and emits events when new files arrive.

  • Name: stworkflowdemo123456 (use random numbers for uniqueness)
  • Redundancy: LRS
  • Secure transfer: Enabled
  • Public access: Private

Then, create a container inside it:

  • Name: input-files
  • Public Access: Private

Part 2: Build the Processing Engine

1. App Service Plan

Create an App Service Plan to host your file processing logic.

  • Name: asp-workflow-demo
  • OS: Linux
  • Pricing Tier: B1 Basic

2. Web App

Create a Web App on that plan.

  • Name: app-workflow-processor-123456
  • Runtime: Node.js 18 LTS

3. Deploy Processing Logic

Through Kudu (Advanced Tools) or App Service Editor, create two files:

package.json

{
  "name": "azure-file-processor",
  "version": "1.0.0",
  "main": "app.js",
  "scripts": { "start": "node app.js" },
  "dependencies": {
    "express": "^4.18.2",
    "@azure/storage-blob": "^12.17.0"
  }
}

app.js → contains your processing logic (e.g., reading blob metadata).
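
app.js is left to you, so here is a minimal sketch of what it could contain. The /api/process route and its { containerName, blobName } request shape are assumptions; adapt them to whatever your Logic App sends in Part 3.

// Minimal file processor; reads blob metadata as a stand-in for real business logic.
const express = require('express');
const { BlobServiceClient } = require('@azure/storage-blob');

const app = express();
app.use(express.json());

// The Logic App (Part 3) calls this endpoint after Event Grid fires.
app.post('/api/process', async (req, res) => {
  try {
    const { containerName, blobName } = req.body; // assumed payload shape
    const serviceClient = BlobServiceClient.fromConnectionString(
      process.env.STORAGE_CONNECTION_STRING // App Setting added below
    );
    const blobClient = serviceClient
      .getContainerClient(containerName)
      .getBlobClient(blobName);

    // "Processing" here is just reading properties; swap in your own logic.
    const props = await blobClient.getProperties();
    console.log(`Processed ${blobName}: ${props.contentLength} bytes`);
    res.json({ status: 'processed', file: blobName, size: props.contentLength });
  } catch (err) {
    console.error(err);
    res.status(500).json({ status: 'error', message: err.message });
  }
});

// App Service injects PORT at runtime.
app.listen(process.env.PORT || 8080, () => console.log('Processor listening'));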

Run npm install in the console.

Add an App Setting for your storage connection string:

  • Key: STORAGE_CONNECTION_STRING
  • Value: [Your storage connection string]

Part 3: Orchestrate with Logic Apps

The Logic App coordinates the workflow.

  1. Create a Logic App (Consumption).
  2. Add a trigger: When an HTTP request is received.
  3. Parse the Event Grid JSON payload (a sample payload appears after this list).
  4. Extract file details with a Compose action.
  5. Call the App Service API using an HTTP action.
  6. Add a Condition to check if processing succeeded.
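
For reference, a Blob Created event in the Event Grid schema looks roughly like this (IDs and values are illustrative). Note that Event Grid delivers events to webhook endpoints as an array:

[
  {
    "topic": "/subscriptions/<sub-id>/resourceGroups/rg-workflow-demo/providers/Microsoft.Storage/storageAccounts/stworkflowdemo123456",
    "subject": "/blobServices/default/containers/input-files/blobs/test.txt",
    "eventType": "Microsoft.Storage.BlobCreated",
    "eventTime": "2025-09-11T21:30:00Z",
    "id": "<event-id>",
    "data": {
      "api": "PutBlob",
      "contentType": "text/plain",
      "contentLength": 42,
      "blobType": "BlockBlob",
      "url": "https://stworkflowdemo123456.blob.core.windows.net/input-files/test.txt"
    },
    "dataVersion": "",
    "metadataVersion": "1"
  }
]

Because the blob name is the last segment of subject, one way to pull it out in the Compose action is the expression last(split(triggerBody()[0]['subject'], '/')).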

Part 4: Post-Processing Automation

Create an Automation Account for archiving and cleanup.

Runbook: Archive Files

Create a PowerShell Runbook that:

  • Connects via Managed Identity
  • Creates a processed-files container if missing
  • Copies the file into it with a timestamped name

Example archive file: 20250911_213000_myfile.txt
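
The runbook body itself is up to you; below is a minimal sketch using the Az PowerShell modules. Taking the blob name as a runbook parameter is an assumption about how your Logic App invokes it:

param(
    [Parameter(Mandatory = $true)]
    [string] $BlobName   # e.g. "myfile.txt", passed in by the Logic App
)

# Authenticate as the Automation Account's system-assigned managed identity
Connect-AzAccount -Identity | Out-Null

# Data-plane access via the identity (requires Storage Blob Data Contributor, see Part 5)
$ctx = New-AzStorageContext -StorageAccountName "stworkflowdemo123456" -UseConnectedAccount

# Create the processed-files container if it does not exist yet
if (-not (Get-AzStorageContainer -Name "processed-files" -Context $ctx -ErrorAction SilentlyContinue)) {
    New-AzStorageContainer -Name "processed-files" -Context $ctx | Out-Null
}

# Copy the blob to a timestamped archive name, e.g. 20250911_213000_myfile.txt
$timestamp = Get-Date -Format "yyyyMMdd_HHmmss"
Start-AzStorageBlobCopy -SrcContainer "input-files" -SrcBlob $BlobName `
    -DestContainer "processed-files" -DestBlob "${timestamp}_${BlobName}" -Context $ctx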


Part 5: Security & Permissions

  • Logic App → Storage Account: Assign Storage Blob Data Reader
  • Automation Account → Storage Account: Assign Storage Blob Data Contributor
  • Logic App → Automation Account: Assign Automation Contributor

Enable System-assigned Managed Identity on both Logic App and Automation Account.
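
If you later script these assignments (see Next Steps at the end), each one maps to a single Azure CLI call; the principal ID and subscription ID below are placeholders:

az role assignment create \
  --assignee <logic-app-principal-id> \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/rg-workflow-demo/providers/Microsoft.Storage/storageAccounts/stworkflowdemo123456"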


Part 6: Event Grid Integration

Finally, wire up the trigger:

  • Event Source: Blob Created in input-files
  • Event Schema: Event Grid Schema
  • Endpoint: Logic App HTTP POST URL

Part 7: Test the Workflow

  1. Upload a test file to input-files.
  2. Watch Event Grid trigger the Logic App.
  3. Check App Service Log Stream for processing logs.
  4. Verify the Automation Runbook created an archived copy in processed-files.

Expected Outcome:

  • File uploaded → automatically processed → archived with timestamp.

🔎 Troubleshooting Tips

  • Logic App not triggering? Check event subscription status.
  • App Service errors? Review Log Stream and verify connection strings.
  • Automation failures? Confirm role assignments & module imports.
  • Permission errors? Wait 5–10 minutes after role assignment.

💡 Benefits of This Architecture

  • No Coding Required → All via Azure Portal
  • Fully Automated → Zero manual intervention
  • Scalable → Handles single or bulk uploads
  • Secure → Managed identities, no secrets
  • Cost-Effective → Pay-per-use pricing
  • Monitored → Full visibility in Azure Portal

💰 Cost Considerations (100 files/month)

  • Logic Apps: ~$5–15
  • App Service B1: ~$13
  • Storage: ~$2–10
  • Automation: First 500 min free
  • Event Grid: First 100K ops free

👉 Use the Free App Service tier during development.


Next Steps: Go Beyond the Portal

Once comfortable with the portal, explore Infrastructure as Code with:

  • Azure CLI → Automate deployments
  • ARM/Bicep → Template-driven provisioning
  • Terraform → Cloud-agnostic automation
  • CI/CD Pipelines → Continuous deployment

Example command:

az storage blob upload --account-name stworkflowdemo123456 \
--container-name input-files \
--name test.txt --file ./test.txt
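
The same applies to provisioning. For instance, the resource group and storage account from Part 1 could be created like this (the region is illustrative):

az group create --name rg-workflow-demo --location eastus

az storage account create --name stworkflowdemo123456 \
  --resource-group rg-workflow-demo \
  --sku Standard_LRS --https-only true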

🎯 Conclusion

With just the Azure Portal, you've built a production-ready, event-driven file processing pipeline. From file upload to automated archiving, every step happens seamlessly without manual effort.

This foundation is perfect for learning, prototyping, and scaling into enterprise-grade solutions with Infrastructure as Code.

Your Azure workflow is now ready to handle real-world file processing scenarios: securely, efficiently, and automatically.
