DEV Community

丁久

Posted on • Originally published at dingjiu1989-hue.github.io

Redis Caching Patterns

This article was originally published on AI Study Room. For the full version with working code examples and related articles, visit the original post.

Redis as a Cache

Redis is an in-memory data structure store that excels as a cache due to its sub-millisecond latency, rich data types, and built-in expiration. When used correctly, Redis can reduce database load by 90% or more while dramatically improving application response times.
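The built-in expiration mentioned above can be sketched in a couple of helpers (the key names and TTL are illustrative; `r` is any redis-py-compatible client):

```python
import json

def cache_with_ttl(r, key, value, ttl_seconds=300):
    """Store a JSON-serialized value that Redis evicts automatically
    once ttl_seconds elapse (SET with the EX option)."""
    r.set(key, json.dumps(value), ex=ttl_seconds)

def get_cached(r, key):
    """Return the cached value, or None once the key has expired."""
    raw = r.get(key)
    return json.loads(raw) if raw is not None else None
```

Passing `ex=` to `SET` stores the value and its expiry in one atomic command, which avoids the race you'd get from a separate `SET` followed by `EXPIRE`.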

Data Structures Overview

| Structure | Use Case | Example |
|-----------|----------|---------|
| String | Simple values, counters | User session, page views |
| Hash | Object fields | User profile fields |
| List | Ordered collection | Message queue, timeline |
| Set | Unique values | Tags, followers |
| Sorted Set | Ranked data | Leaderboard, rate limiting |
| HyperLogLog | Cardinality estimation | Unique visitors |
| Bitmap | Boolean flags | Daily active users |
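A few of the structures above in redis-py form, one command per line (the key names, event, and `r` client are illustrative, not from the original article):

```python
def record_activity(r, user_id, name, score):
    """Touch several Redis structures for one hypothetical event."""
    r.incr("page:views")                               # String as a counter (INCR)
    r.hset(f"user:{user_id}", mapping={"name": name})  # Hash of object fields (HSET)
    r.sadd("users:active", user_id)                    # Set of unique members (SADD)
    r.zadd("leaderboard", {str(user_id): score})       # Sorted Set ranked by score (ZADD)
```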

Caching Patterns

Cache-Aside (Lazy Loading)

The most common caching pattern. The application checks the cache first; on a miss, it loads data from the database and populates the cache.

```python
import redis
import json

r = redis.Redis(host='localhost', port=6379, decode_responses=True)

def get_user(user_id):
    cache_key = f"user:{user_id}"

    # Try cache first
    cached = r.get(cache_key)
    if cached:
        return json.loads(cached)

    # Cache miss: load from database
    # (db is an application-provided database handle)
    user = db.query("SELECT * FROM users WHERE id = %s", [user_id])
    if user:
        # Populate cache with TTL
        r.setex(cache_key, 3600, json.dumps(user))
    return user
```

Pros: only caches data that is actually requested, and the application keeps working (just slower) if the cache fails. Cons: every cache miss costs three round trips (cache check, database read, cache write), and data can be stale until the TTL expires.
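One common way to limit that staleness is to delete the cached entry whenever the underlying row changes, so the next cache-aside read repopulates it. A minimal sketch, assuming the same `user:{id}` key scheme and a generic `db` handle:

```python
def update_user_and_invalidate(r, db, user_id, data):
    """Persist the change, then drop the cached copy so the next
    cache-aside read reloads fresh data from the database."""
    db.execute(
        "UPDATE users SET name = %s, email = %s WHERE id = %s",
        [data["name"], data["email"], user_id],
    )
    r.delete(f"user:{user_id}")
```

Deleting rather than overwriting the entry keeps the invalidation cheap and lets the read path remain the single place that serializes data into the cache.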

Write-Through

Data is written to the cache first, then to the database. Reads always hit the cache.

```python
def update_user(user_id, data):
    cache_key = f"user:{user_id}"

    # Write to cache first
    r.setex(cache_key, 3600, json.dumps(data))

    # Then write to database
    db.execute(
        "UPDATE users SET name = %s, email = %s WHERE id = %s",
        [data['name'], data['email'], user_id]
    )
```

Pros: Cache is always consistent with database writes. Cons: Write latency increases, cache stores data that may never be read.

Write-Behind (Write-Back)

Data is written to cache and asynchronously written to the database later.

```python
def write_behind(user_id, data):
    cache_key = f"user:{user_id}"

    # Write to cache immediately
    r.setex(cache_key, 3600, json.dumps(data))
```
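The asynchronous half of the pattern can be sketched with a Redis list as the pending-write queue (the queue key, payload shape, and worker function below are illustrative, not from the original article):

```python
import json

DIRTY_QUEUE = "writes:pending"  # hypothetical queue key

def enqueue_write(r, user_id, data):
    """Record a pending database write alongside the cache update."""
    r.rpush(DIRTY_QUEUE, json.dumps({"id": user_id, **data}))

def flush_one(r, db):
    """One worker iteration: pop a pending write and persist it.
    Returns False when the queue is empty."""
    raw = r.lpop(DIRTY_QUEUE)
    if raw is None:
        return False
    row = json.loads(raw)
    db.execute(
        "UPDATE users SET name = %s, email = %s WHERE id = %s",
        [row["name"], row["email"], row["id"]],
    )
    return True
```

A real deployment would run `flush_one` in a loop inside a background worker and handle failed writes (retry, dead-letter queue). The trade-off is durability: any writes still queued in Redis are lost if it crashes before the flush.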


Read the full article on AI Study Room for complete code examples, comparison tables, and related resources.

