Mahmud Seidu Babatunde

Why Website Images Sometimes Fail to Load — And How I Built a Highly Available Azure Storage Backend to Fix It

A few months ago, I visited an online store to check the specifications of a device I wanted to buy.

The page loaded.

The navigation bar appeared.
The product description was visible.
But the product image never showed up.

Instead, there was a small broken-image icon sitting where the picture should have been.

If you’ve ever experienced that moment, you know exactly what happens next.

You refresh the page.

Nothing changes.

And suddenly you begin to wonder:

  • Is the website broken?
  • Is the product unavailable?
  • Is the platform unreliable?

What most users never realize is that a broken image on a website usually isn’t about the website itself.

It’s almost always about where the file is stored and how the storage system is designed.

Behind every image, downloadable document, and product asset is a storage system responsible for delivering those files instantly.

If that storage system is poorly configured, small things begin to fail:

  • images disappear
  • downloads stop working
  • documentation links break
  • websites feel unreliable

Modern cloud platforms solve this using distributed storage infrastructure.

In Microsoft Azure, one of the core services used for this purpose is Azure Blob Storage.

But simply creating a storage account isn’t enough.

Engineers must think about deeper questions:

  • What happens if an entire Azure region fails?
  • How can users access files without authentication?
  • What if someone accidentally deletes important files?
  • How do we keep track of document updates?

To explore these questions, I built a high-availability storage architecture for a public website using Azure Storage.

The goal was simple:

Create a system that can:

  • survive regional outages
  • serve public website assets
  • allow anonymous file access
  • recover deleted files
  • maintain file version history

Let’s walk through how that system was designed.

The Problem Engineers Face

Imagine running an e-commerce website where thousands of customers view product images every day.

Everything works perfectly until one day an Azure region experiences an outage.

Suddenly:

  • product images stop loading
  • downloadable files fail
  • customer experience breaks

The website itself may still be online.

But the storage system serving those files is unavailable.

This is why cloud engineers design storage systems not just to store files, but to ensure they remain available, recoverable, and resilient.

Azure Storage provides several mechanisms to solve these challenges.

Before implementing the solution, it helps to understand a few important concepts.

Key Concepts Behind Azure Storage

Azure Storage Account

An Azure Storage Account is the top-level container that holds all storage services.

Inside a storage account you can store:

  • blobs (files, images, documents)
  • file shares
  • tables
  • queues

For public websites, Blob Storage is typically used to store static assets like images and downloadable files.

High Availability

High availability means designing systems that continue operating even when failures occur.

Failures in cloud infrastructure are expected.

Servers fail.
Networks fail.
Entire regions can fail.

High availability ensures systems remain accessible despite those failures.

Geo-Redundant Storage

Azure provides multiple redundancy options.

One of the most resilient is Read-Access Geo-Redundant Storage (RA-GRS).

This configuration:

  • replicates data to another Azure region
  • stores multiple copies of data
  • allows read access from the secondary region

If the primary region fails, the secondary region can still serve requests.
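With RA-GRS, Azure exposes the secondary region through a read-only endpoint whose hostname is the account name with a `-secondary` suffix. The sketch below shows how a client could derive both endpoints and fall back to the secondary when the primary is unreachable. The account name and the `fetch` callable are illustrative; this is a minimal pattern, not the Azure SDK's built-in retry logic.

```python
def blob_endpoints(account_name: str) -> tuple[str, str]:
    """Return the primary and RA-GRS secondary blob endpoints for an account.

    The secondary endpoint is the account name plus a "-secondary" suffix.
    """
    primary = f"https://{account_name}.blob.core.windows.net"
    secondary = f"https://{account_name}-secondary.blob.core.windows.net"
    return primary, secondary


def read_with_fallback(fetch, account_name: str, path: str) -> bytes:
    """Try the primary endpoint first; fall back to the secondary on failure.

    `fetch` is any callable that takes a URL and returns the response bytes
    (or raises OSError when the endpoint is unreachable).
    """
    primary, secondary = blob_endpoints(account_name)
    try:
        return fetch(f"{primary}/{path}")
    except OSError:
        # Primary region is down -- serve the read from the secondary copy.
        return fetch(f"{secondary}/{path}")
```

In practice the official SDKs offer a similar geo-redundant read option, but the fallback idea is the same: reads survive a primary-region outage because a second, read-accessible copy already exists.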

Thinking Like a Cloud Architect

When designing storage for a public website, engineers typically ask four important questions:

  1. What happens if an entire region fails?
  2. Can users access files without authentication?
  3. What if someone accidentally deletes files?
  4. How can we track document updates over time?

Azure Storage provides solutions for each of these challenges:

  • Geo-redundancy protects against regional outages
  • Anonymous blob access allows public content delivery
  • Soft delete enables file recovery
  • Blob versioning keeps historical file versions

The following implementation combines these capabilities to build a resilient storage backend.

Create a Storage Account With High Availability

Create a storage account to support the public website

In the Azure portal, search for and select Storage accounts.

Azure portal search results showing Storage Accounts

Select + Create.

Azure portal create storage account button

For resource group select New, give your resource group a name, and select OK.

Azure portal create resource group screen

Set the storage account name to publicwebsite and add a unique identifier.

Azure storage account naming screen

Take the default settings for the remaining configuration.

Azure storage account configuration defaults

Select Review + Create, then Create.

Azure review and create storage account screen

Wait for deployment and select Go to resource.

Azure storage account deployment complete screen

Configure High Availability

This storage must remain accessible if a regional outage occurs.

Navigate to Data management → Redundancy.

Azure storage redundancy settings page

Select Read-access Geo-redundant storage (RA-GRS).

Azure RA-GRS redundancy option

Review the primary and secondary region information.

Azure storage primary and secondary region locations

This ensures your website assets remain accessible even if the primary region experiences downtime.

Allow Anonymous Access for Public Files

Public website content should be accessible without requiring users to log in.

Navigate to Settings → Configuration.

Azure storage configuration settings page

Enable Allow blob anonymous access.

Azure enable blob anonymous access setting

Save the configuration.

Azure save configuration settings

Create a Container for Website Files

Navigate to Data storage → Containers.

Azure storage containers page

Select + Container.

Azure create container button

Name the container public, then select Create.

Azure container naming screen

Configure Anonymous Read Access

Select your container.

Azure public container overview

Change the access level to:

Blob (anonymous read access for blobs only).

Azure container access level configuration

Select OK.

Azure confirm container access level change

Upload and Test Files

Select Upload.

Azure blob container upload button

Choose a file.

Azure browse file for upload

Upload the file.

Azure upload confirmation window

Refresh the container to confirm the file appears.

Azure container file listing

Copy the file URL and test it in a browser.

Azure blob file URL view

Example URL:

https://publicwebsiteproject.blob.core.windows.net/public/image.png
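Once the container's access level is set to Blob, every file in it is addressable by a predictable URL: account endpoint, container name, blob name. A small helper like the one below can build those links for a website template (the account and container names here are just the ones used in this walkthrough):

```python
from urllib.parse import quote


def public_blob_url(account: str, container: str, blob_name: str) -> str:
    """Build the anonymous-access URL for a blob in a public container.

    Percent-encodes the blob name so names with spaces or special
    characters still produce a valid URL.
    """
    return f"https://{account}.blob.core.windows.net/{container}/{quote(blob_name)}"
```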

Configure Blob Soft Delete

Navigate to the Overview page.

Azure storage overview page

Locate the Blob service section.

Azure blob service configuration section

Select Blob soft delete.

Azure blob soft delete settings page

Enable soft delete.

Azure enable blob soft delete option

Set retention to 21 days.

Azure soft delete retention configuration

Save the changes.

Azure save soft delete configuration
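With a 21-day retention window, a deleted blob stays recoverable for 21 days after deletion and is then permanently purged. A quick bit of date math makes the window concrete (assuming the retention clock starts at the moment of deletion):

```python
from datetime import date, timedelta


def last_recovery_date(deleted_on: date, retention_days: int = 21) -> date:
    """Last date a soft-deleted blob can still be undeleted.

    Assumes the retention window starts when the blob is deleted;
    after this date the blob is permanently removed.
    """
    return deleted_on + timedelta(days=retention_days)
```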

Restore Deleted Files

Delete a file.

Azure delete blob file confirmation

Confirm deletion.

Azure confirm blob deletion dialog

Enable Show deleted blobs.

Azure show deleted blobs toggle

Restore using Undelete.

Azure blob undelete option
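The behavior you just saw in the portal can be summarized with a toy in-memory model (this is an illustration of the semantics, not the Azure SDK): deleting a blob only marks it as deleted, deleted blobs are hidden from normal listings but visible when you ask for them, and undelete clears the mark.

```python
class SoftDeleteContainer:
    """Toy model of blob soft delete: delete marks, undelete restores."""

    def __init__(self):
        self._blobs = {}  # name -> (data, deleted flag)

    def upload(self, name: str, data: bytes) -> None:
        self._blobs[name] = (data, False)

    def delete(self, name: str) -> None:
        # Soft delete: keep the data, just flag the blob as deleted.
        data, _ = self._blobs[name]
        self._blobs[name] = (data, True)

    def list_blobs(self, show_deleted: bool = False) -> list[str]:
        # Deleted blobs appear only when explicitly requested,
        # mirroring the "Show deleted blobs" toggle in the portal.
        return [name for name, (_, deleted) in self._blobs.items()
                if show_deleted or not deleted]

    def undelete(self, name: str) -> None:
        data, _ = self._blobs[name]
        self._blobs[name] = (data, False)
```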

Enable Blob Versioning

Navigate to Blob service → Versioning.

Azure blob versioning settings

Enable versioning.

Azure enable blob versioning option

Save the configuration.

Azure save blob versioning configuration
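Versioning changes what an overwrite means: instead of replacing the file, each upload keeps the prior content as an earlier version. The toy model below illustrates that idea (again, a sketch of the semantics rather than the Azure SDK):

```python
class VersionedBlob:
    """Toy model of blob versioning: every overwrite keeps the prior content."""

    def __init__(self):
        self._versions: list[bytes] = []  # oldest first; last entry is current

    def upload(self, data: bytes) -> None:
        # An overwrite appends a new version instead of discarding the old one.
        self._versions.append(data)

    def current(self) -> bytes:
        return self._versions[-1]

    def versions(self) -> list[bytes]:
        return list(self._versions)
```

This is why versioning pairs well with soft delete: soft delete protects against accidental deletion, while versioning protects against accidental overwrites.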

Lessons From This Implementation

Several key lessons stood out while building this storage architecture.

  • High availability must be intentional.
  • Public access should be carefully controlled.
  • Data protection features like soft delete and versioning are essential.
  • Testing configurations ensures your infrastructure behaves as expected.

These small architectural decisions are what transform basic cloud setups into production-ready systems.

Final Thoughts

Creating cloud resources is easy.

Designing them responsibly is what defines engineering maturity.

In this implementation we built a storage architecture capable of:

  • serving public website assets
  • remaining available during regional outages
  • allowing controlled anonymous access
  • protecting files from accidental deletion
  • maintaining historical file versions

Behind every reliable digital experience is infrastructure configured with intention.

And mastering these fundamentals is how engineers move from using cloud tools to designing resilient cloud systems.
