💾 Storing 50MB in Redux Persist with IndexedDB & Compression
In modern React applications, state persistence is often essential. Whether you're caching page data, user sessions, or complex app state, you'll hit the limitations of `localStorage` and `sessionStorage` pretty quickly.
I recently had to implement a solution for saving cached page data in a React + Redux application. The kicker? I needed to store up to 50MB of page data persistently without slowing down the app or crashing the browser. Here's the journey that led me to combine Redux, redux-persist, IndexedDB, and compression via fflate, plus how I handled the dark side of storing too much.
💡 The Problem: Memory vs Persistence
Most apps use `localStorage` because it's easy, synchronous, and simple. But it has hard limits (around 5MB) and doesn't work well for large datasets like deeply nested Redux state or offline-heavy apps.
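To make that ceiling concrete, here's a minimal sketch (not from the project, just an illustration) of how the limit surfaces:

```ts
// A single multi-megabyte write is usually enough to blow the
// localStorage quota and surface a DOMException.
try {
  const bigPayload = "x".repeat(10 * 1024 * 1024); // ~10M characters
  localStorage.setItem("cachedPages", bigPayload);
} catch (err) {
  if (err instanceof DOMException && err.name === "QuotaExceededError") {
    console.warn("localStorage is full; time for a bigger backend");
  }
}
```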
For this use case, I wanted to:
- Persist Redux state across sessions.
- Store up to 50MB of data (especially cached pages).
- Optimize payload size using compression.
- Avoid the dreaded `QuotaExceededError` in the browser.
✅ The Solution: Redux Persist + IndexedDB + Compression
Using `redux-persist` with `redux-persist-indexeddb-storage` gave me the ability to use IndexedDB as the backend for persistence.
But to squeeze the payload size down further, I added `fflate`, a blazing-fast compression library. With it, I compressed the Redux state before storing it and decompressed it when rehydrating.
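Before committing to the full setup, it's worth measuring what compression actually buys on your data. A rough, hypothetical check (assuming a `cachedPages` slice like mine):

```ts
import { compressSync, strToU8 } from "fflate";
import { store } from "./store";

// Rough ratio check: serialize the slice, compress it, compare sizes.
// json.length counts UTF-16 code units, so this is approximate for non-ASCII data.
const json = JSON.stringify(store.getState().cachedPages);
const compressed = compressSync(strToU8(json));
console.log(
  `raw: ${(json.length / 1024).toFixed(1)} KB, compressed: ${(compressed.length / 1024).toFixed(1)} KB`
);
```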
🔧 Setting Up the Store with Compression
Here's how I configured the Redux store with IndexedDB + fflate compression:
```ts
// store.ts
import { configureStore } from "@reduxjs/toolkit";
import { persistStore, persistReducer } from "redux-persist";
import createIdbStorage from "redux-persist-indexeddb-storage";
import rootReducer from "./rootReducer";
import fflateTransform from "./persistTransform"; // Compression transform
import { DEVELOPMENT } from "utils/constants/generic.constants";
import { REACT_INDEXED_DB, REACT_REDUX_PERSIST } from "utils/constants/auth.constants";

// Set up IndexedDB storage
const idbStorage = createIdbStorage(REACT_INDEXED_DB, {
  version: 1,
  storeName: REACT_INDEXED_DB,
  description: "Redux Persist Store"
});

const persistConfig = {
  key: REACT_REDUX_PERSIST,
  storage: idbStorage,
  transforms: [fflateTransform], // Apply compression
  whitelist: ["cachedPages"] // Only persist the cached pages slice
};

const persistedReducer = persistReducer(persistConfig, rootReducer);

const store = configureStore({
  reducer: persistedReducer,
  devTools: process.env.REACT_APP_NODE_ENV === DEVELOPMENT,
  middleware: (getDefaultMiddleware) =>
    getDefaultMiddleware({
      // redux-persist dispatches non-serializable actions (PERSIST, REHYDRATE)
      serializableCheck: false
    })
});

const persistor = persistStore(store);

export { store, persistor };
```
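For completeness, the persistor then gets wired into the React tree the standard redux-persist way; nothing here is IndexedDB-specific, but `PersistGate` is what holds rendering until the compressed state has been rehydrated:

```tsx
// index.tsx: minimal sketch of the app entry point
import React from "react";
import { createRoot } from "react-dom/client";
import { Provider } from "react-redux";
import { PersistGate } from "redux-persist/integration/react";
import App from "./App";
import { store, persistor } from "./store";

// PersistGate delays rendering children until rehydration from IndexedDB completes
createRoot(document.getElementById("root")!).render(
  <Provider store={store}>
    <PersistGate loading={null} persistor={persistor}>
      <App />
    </PersistGate>
  </Provider>
);
```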
🧬 persistTransform with fflate
I used a custom transform to compress and decompress the state using `fflate`. Here's what it looks like:
```ts
// persistTransform.ts
import { createTransform } from "redux-persist";
import { strToU8, strFromU8, compressSync, decompressSync } from "fflate";

const fflateTransform = createTransform(
  // Compress state on save
  (inboundState) => {
    const json = JSON.stringify(inboundState);
    const compressed = compressSync(strToU8(json));
    return Array.from(compressed); // Store as a plain array for IndexedDB
  },
  // Decompress state on rehydration
  (outboundState) => {
    const compressed = new Uint8Array(outboundState);
    const json = strFromU8(decompressSync(compressed));
    return JSON.parse(json);
  }
);

export default fflateTransform;
```
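One detail worth knowing: redux-persist runs every transform against every persisted slice. The store-level whitelist above already narrows persistence to `cachedPages`, but if you persist more slices later, `createTransform` also accepts its own config object to scope the compression. A sketch of that variant:

```ts
// Variant: compress only the cachedPages slice; other slices pass through untouched
const scopedFflateTransform = createTransform(
  (inboundState) =>
    Array.from(compressSync(strToU8(JSON.stringify(inboundState)))),
  (outboundState) =>
    JSON.parse(strFromU8(decompressSync(new Uint8Array(outboundState as number[])))),
  { whitelist: ["cachedPages"] } // transform-level whitelist
);
```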
🔥 This reduced my persisted state size by 40–60%, depending on the complexity of the data.
🧪 Testing Limits (and Hitting Them)
Even with compression, browsers behave unpredictably when IndexedDB crosses ~50MB. Sometimes you'll hit a `QuotaExceededError`, sometimes hydration just fails silently.
So, I wrote a recursive eviction function to remove the oldest cached data when space runs out.
π§Ή Auto-Clearing the Oldest Cached Data
```ts
// Returns true once the persisted blob crosses the 50MB threshold.
// openDatabase and getData are small IndexedDB helpers (sketched below).
async function getIndexedDB() {
  try {
    const db = await openDatabase();
    const data = await getData(db, REACT_INDEXED_DB_STORE_KEY);
    const size = new Blob([data], { type: "text/plain" }).size;
    return size > 52428800; // 50MB in bytes
  } catch (error) {
    console.error("Error accessing IndexedDB:", error);
  }
  return false;
}
```
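`openDatabase` and `getData` are thin wrappers I haven't shown in full; a hypothetical version using the raw IndexedDB API could look like this (assuming the database and object store share the `REACT_INDEXED_DB` name from the persist config):

```ts
// Hypothetical helpers; adjust names/versions to match your actual schema
function openDatabase(): Promise<IDBDatabase> {
  return new Promise((resolve, reject) => {
    const request = indexedDB.open(REACT_INDEXED_DB);
    request.onsuccess = () => resolve(request.result);
    request.onerror = () => reject(request.error);
  });
}

function getData(db: IDBDatabase, key: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const request = db
      .transaction(REACT_INDEXED_DB, "readonly")
      .objectStore(REACT_INDEXED_DB)
      .get(key);
    request.onsuccess = () => resolve(String(request.result ?? ""));
    request.onerror = () => reject(request.error);
  });
}
```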
```ts
const clearOverFlowingData = async () => {
  const isIndexedDBFull = await getIndexedDB();
  if (isIndexedDBFull) {
    const cachedPages = store.getState().cachedPages;
    // Find the least recently cached page
    const { oldestPageID } = Object.entries(cachedPages).reduce(
      (acc, [pageID, { date }]) => {
        const currentDate = new Date(date);
        if (!acc.oldestDate || currentDate < acc.oldestDate) {
          return { oldestPageID: pageID, oldestDate: currentDate };
        }
        return acc;
      },
      { oldestPageID: null, oldestDate: null }
    );
    if (!oldestPageID) return; // Nothing left to evict
    store.dispatch(deleteCachedPage({ pageID: oldestPageID }));
    await clearOverFlowingData(); // Recurse until size is under the threshold
  }
};
```
And I triggered this with:
```ts
useEffect(() => {
  clearOverFlowingData();
}, [pageID]); // Re-check whenever the current page changes
```
This kept the IndexedDB size under control, safely below 45MB.
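As an aside, modern browsers also expose `navigator.storage.estimate()`, which could stand in for the Blob-size check without reading the whole payload back. It reports estimates rather than exact numbers, so treat the threshold loosely; a sketch:

```ts
// Alternative size check via the StorageManager API (where supported)
async function isStorageNearQuota(thresholdBytes = 52428800): Promise<boolean> {
  if (!navigator.storage?.estimate) return false; // API unavailable: skip the check
  const { usage = 0 } = await navigator.storage.estimate();
  return usage > thresholdBytes;
}
```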
⚖️ Tradeoffs: IndexedDB vs localStorage (with Compression)

| Feature | IndexedDB (w/ fflate) | localStorage |
| --- | --- | --- |
| Storage limit | ~50MB | ~5MB |
| Compression support | ✅ Manual (fflate) | ✅ Manual |
| Speed of hydration | 🐢 Slower (compressed) | ⚡ Fast |
| Data structure support | ✅ Complex objects | ❌ Strings only |
| Ideal for | Big, offline state | Small, fast config |
🧩 Conclusion
If your Redux state is outgrowing `localStorage`, using IndexedDB with compression is a powerful pattern. With `redux-persist`, `redux-persist-indexeddb-storage`, and `fflate`, I was able to build a robust, scalable, and space-efficient caching mechanism.
✅ Store big data
✅ Shrink payload size
✅ Keep browsers happy
This architecture is reusable across any React project dealing with offline data, caching, or large persisted state.
📌 TL;DR
- Use `redux-persist-indexeddb-storage` for large Redux state.
- Compress data with `fflate` to reduce size by ~50%.
- Add a recursive cleanup function to stay under quota.
- Expect slightly slower hydration, but much larger capacity.
🙏 Thanks for Reading
If you're building apps with large Redux states, give this setup a try and see the impact on performance and reliability.
Happy coding! 💻🚀
📝 Disclaimer:
This blog post was written with the assistance of AI (ChatGPT by OpenAI) to help structure the narrative, clarify technical concepts, and polish the content. The code examples and architectural decisions are based on real-world implementation.