Kazuya.Y
How a Misconfigured CloudFront Cache Can Lead to Personal Data Leaks - Understanding and Securing API Caching

Introduction

When using CloudFront, many developers tend to choose the default cache policy Managed-CachingOptimized.

However, applying this policy to APIs without fully understanding how it works can lead to serious personal data leaks and other security incidents.

What is a Cache Key

By default, CloudFront creates caches based on the request path.

In other words, the request path acts as the cache key.

Example: Image Requests

User A accesses /images/icon_1.png
→ CloudFront retrieves the object from the origin (e.g., S3) and caches it.
User B accesses /images/icon_1.png
→ CloudFront returns the cached content (without accessing the origin).
User C accesses /images/icon_2.png
→ CloudFront fetches the new object from the origin and creates a new cache entry.

In short, CloudFront treats the request path as the cache key.
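This path-as-key behavior can be sketched as a tiny in-memory cache. This is a simplified model for illustration, not CloudFront's actual implementation:

```python
# Minimal model of a path-keyed CDN cache (illustrative only).
class PathKeyedCache:
    def __init__(self, origin):
        self.origin = origin   # callable: path -> response body
        self.store = {}        # cache key (the request path) -> cached body

    def get(self, path):
        if path not in self.store:          # cache miss: fetch from origin
            self.store[path] = self.origin(path)
        return self.store[path]             # cache hit: origin is skipped

origin_calls = []
def origin(path):
    origin_calls.append(path)
    return f"object at {path}"

cdn = PathKeyedCache(origin)
cdn.get("/images/icon_1.png")   # User A: miss, fetched from origin
cdn.get("/images/icon_1.png")   # User B: hit, served from cache
cdn.get("/images/icon_2.png")   # User C: new path, new cache entry
print(origin_calls)             # only two origin fetches for three requests
```

Note that nothing other than the path goes into the key: who made the request is invisible to the cache, which is exactly the property the incident below exploits.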

Example of the Incident

In 2021, a serious incident occurred at Klarna, a payment service provider based in Sweden.

Reference: Klarna Detailed Incident Report – Incorrect Cache Configuration

Here is what happened:

The CDN cached API responses intended for authenticated users; as a result, personal data was displayed to other users.

In other words, the response meant for user A was served to user B.

Why It Happened

The root cause was that CloudFront's cache key strategy was based solely on the request path.

Even if an API had an endpoint like /profile and returned a different response for each logged-in user, CloudFront would treat all those requests as the same path.

User A → /profile → Response A (cached)  
User B → /profile → CloudFront: “Same path!” → Response A returned

As a result, all users received the personal data of the first user who accessed the endpoint, which led to a serious information leak.

Countermeasures

Countermeasure 1. Disable Caching on the CloudFront Side

You can prevent API responses from being cached by setting the CloudFront cache policy to CachingDisabled.
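As a sketch of what that change looks like, the snippet below points a distribution's default behavior at AWS's managed CachingDisabled policy. The dict mirrors the shape returned by CloudFront's GetDistributionConfig API, the helper name is mine, and the policy ID is the managed CachingDisabled ID at the time of writing; in practice you would apply this via the CloudFront console, API, or infrastructure-as-code:

```python
# AWS managed "CachingDisabled" cache policy (ID at the time of writing).
CACHING_DISABLED_POLICY_ID = "4135ea2d-6df8-44a3-9df3-4b5a84be39ad"

def disable_caching(distribution_config):
    """Hypothetical helper: switch the default behavior to CachingDisabled."""
    behavior = distribution_config["DefaultCacheBehavior"]
    behavior["CachePolicyId"] = CACHING_DISABLED_POLICY_ID
    return distribution_config

# Sample config currently using the managed "CachingOptimized" policy.
config = {
    "DefaultCacheBehavior": {
        "CachePolicyId": "658327ea-f89d-4fab-a63d-7e88639e58f6"
    }
}
disable_caching(config)
print(config["DefaultCacheBehavior"]["CachePolicyId"])
```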

Countermeasure 2. Set Cache-Control Headers on the Backend

To add an additional layer of protection at the application level, include the following headers in your API responses:

Cache-Control: private, no-cache, no-store, must-revalidate  
Pragma: no-cache  
Expires: 0

These headers ensure that:

  • CDNs and browsers do not cache the responses
  • The origin server is always revalidated before reuse

As a result, CloudFront and other intermediaries are forced to fetch a fresh response each time.
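At the application level this is just a set of response headers, so it can be applied framework-agnostically. A minimal helper (names are mine) might look like:

```python
# Headers that tell CDNs, proxies, and browsers not to store or reuse
# this response (Pragma and Expires cover older HTTP/1.0 intermediaries).
NO_STORE_HEADERS = {
    "Cache-Control": "private, no-cache, no-store, must-revalidate",
    "Pragma": "no-cache",
    "Expires": "0",
}

def with_no_store(headers=None):
    """Merge the no-store headers into an existing header dict."""
    merged = dict(headers or {})
    merged.update(NO_STORE_HEADERS)
    return merged

resp_headers = with_no_store({"Content-Type": "application/json"})
print(resp_headers["Cache-Control"])
```

You would call a helper like this (or your framework's equivalent) on every response that returns user-specific data.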

Countermeasure 3. Completely Separate Static Content and APIs

By hosting static content and APIs on different domains, you can prevent cache configurations from interfering with each other.

# For static content  
static.example.com → CloudFront → S3 (caching enabled)  

# For APIs  
api.example.com → ALB / API Gateway → Backend (caching disabled)

In this setup, the communication becomes cross-origin, so you need to configure CORS on the API side.
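Concretely, the API would need to return CORS headers that allow the static site's origin. The origin value and helper below are illustrative, not part of the original setup:

```python
# Illustrative CORS headers for api.example.com, allowing only the
# static site's origin to call it with credentials (cookies/Authorization).
ALLOWED_ORIGIN = "https://static.example.com"

def cors_headers(request_origin):
    # Echo the origin only if it is on the allow list; never combine
    # "*" with Access-Control-Allow-Credentials.
    if request_origin != ALLOWED_ORIGIN:
        return {}
    return {
        "Access-Control-Allow-Origin": request_origin,
        "Access-Control-Allow-Credentials": "true",
        "Vary": "Origin",
    }

print(cors_headers("https://static.example.com"))
print(cors_headers("https://evil.example"))  # {} -> browser blocks the response
```

The `Vary: Origin` header matters here for the same reason this article exists: it tells any cache that the response differs per origin, so one origin's CORS response is not replayed to another.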

Conclusion

Caching on the API side can be useful, but it is also extremely dangerous.

For APIs that return data for authenticated users, you should completely disable caching.

If you need to use caching, leverage Redis or Memcached inside your application, and make sure no personal data is stored in the CDN edge cache.
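The key difference from an edge cache is that an in-application cache can include the user's identity in the cache key. A minimal sketch, using a plain dict as a stand-in for Redis or Memcached:

```python
# Stand-in for Redis/Memcached: a dict keyed by (user_id, resource),
# so one user's cached profile can never be served to another user.
app_cache = {}

def get_profile(user_id, fetch_from_db):
    key = (user_id, "/profile")          # identity is part of the cache key
    if key not in app_cache:
        app_cache[key] = fetch_from_db(user_id)
    return app_cache[key]

# Hypothetical backing data and fetch function for the demo.
profiles = {"alice": {"name": "Alice"}, "bob": {"name": "Bob"}}
fetch = lambda uid: profiles[uid]

print(get_profile("alice", fetch))  # Alice's cached entry
print(get_profile("bob", fetch))    # Bob gets his own entry, not Alice's
```

With a real Redis client the idea is the same: build the key from the user ID (e.g., `f"profile:{user_id}"`) and set a short TTL, keeping personal data out of any shared edge cache.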
