I spent the last several months building Storve — a React state management library. Not because the ecosystem needed another one, but because I wanted to understand how state management actually works under the hood.
Today I'm shipping it — but first, I built a real-time stock market simulator to stress-test it. It found 3 bugs. This is the full story.
## Why I built it
Every React project I worked on ended up with two state libraries — Zustand for client state and TanStack Query for server state. Two mental models, two sets of docs, two devtools panels.
The question I kept asking: why are these separate? Loading state, error state, cached data — it's all just state. It should live in the same place as the rest of your app state.
That was the idea behind Storve.
## What Storve is
Storve is a React state management library where async state is first-class. No separate server state library. No provider wrapping your app. One createStore call covers everything.
```ts
const userStore = createStore({
  // async state — loading, error, cache, SWR all built in
  user: createAsync(fetchUser, { ttl: 60_000, staleWhileRevalidate: true }),

  // regular state — lives right next to it
  theme: 'light',
  sidebarOpen: false,
})
```
In your component:
```tsx
function UserProfile({ id }: { id: string }) {
  const { data, loading, error } = useStore(userStore, s => s.user)
  const theme = useStore(userStore, s => s.theme)

  useEffect(() => {
    userStore.fetch('user', id)
  }, [id])

  if (loading) return <Spinner />
  if (error) return <div>Failed to load user</div>
  if (!data) return null
  return <div className={theme}>{data.name}</div>
}
```
No actions. No reducers. No provider. No query client.
## How it works internally

### The core store
The store is built on a Proxy that intercepts every read and write. When you call getState(), it returns a shallow copy — mutations to the returned object never affect the store.
```ts
const store = createStore({ count: 0, name: 'Alice' })

const before = store.getState()
store.setState({ count: 99 })

before.count            // still 0 — it's a snapshot
store.getState().count  // 99
```
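To make the mechanics concrete, here is a minimal sketch of a Proxy-backed store with snapshot semantics. This is an illustrative simplification, not Storve's actual implementation — the names `createTinyStore` and `Listener` are mine:

```typescript
type Listener = () => void;

function createTinyStore<T extends object>(initial: T) {
  const state = { ...initial };
  const listeners = new Set<Listener>();

  // The Proxy observes every write and notifies subscribers.
  const proxied = new Proxy(state, {
    set(target, key, value) {
      (target as any)[key] = value;
      listeners.forEach(l => l());
      return true;
    },
  });

  return {
    // Shallow copy: mutating the returned object never touches the store.
    getState: () => ({ ...state }),
    setState: (partial: Partial<T>) => { Object.assign(proxied, partial); },
    subscribe: (l: Listener) => {
      listeners.add(l);
      return () => listeners.delete(l);
    },
  };
}

const tiny = createTinyStore({ count: 0, name: 'Alice' });
const snap = tiny.getState();
tiny.setState({ count: 99 });
// snap.count is still 0; tiny.getState().count is 99
```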
### Auto-tracking
The Proxy tracks which keys were read during a useStore selector. When those keys change, the component re-renders. When unrelated keys change, it doesn't.
```ts
// this component only re-renders when count changes,
// not when name or theme changes
const count = useStore(store, s => s.count)
```
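The tracking itself can be sketched with a second Proxy that records which keys the selector touches while it runs. Again a simplification of the technique, not Storve's code:

```typescript
// Run the selector against a tracking Proxy, record every key it reads,
// then re-render only when one of those keys changes.
function trackReads<T extends object, R>(
  state: T,
  selector: (s: T) => R
): { result: R; deps: Set<keyof T> } {
  const deps = new Set<keyof T>();
  const tracker = new Proxy(state, {
    get(target, key) {
      deps.add(key as keyof T); // record the read
      return (target as any)[key];
    },
  });
  return { result: selector(tracker), deps };
}

const state = { count: 1, name: 'Alice', theme: 'light' };
const { result, deps } = trackReads(state, s => s.count);
// result === 1, and deps contains only 'count' — a change to
// 'name' or 'theme' would not trigger a re-render
```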
This is what makes Signals possible. A signal subscribes to exactly one key — the component re-renders only when that specific value changes.
```ts
// 20 stocks ticking every 500ms —
// each row subscribes to one signal, so there are zero cross-row re-renders
export const priceSignals: Record<string, Signal<Stock>> =
  Object.fromEntries(
    SYMBOLS.map(sym => [
      sym,
      signal(marketStore, 'stocks', stocks => stocks[sym]),
    ])
  )
```
### The async engine
createAsync returns a descriptor that createStore picks up during initialization. The store creates an async engine per key that manages its own state machine:
```
idle → loading → success
             ↘ error
```
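The state diagram can be encoded as a small transition table. This is a sketch; the refetch/retry edges (`success → loading`, `error → loading`) are my assumption based on the refetch API, not documented Storve behavior:

```typescript
type Status = 'idle' | 'loading' | 'success' | 'error';

// Allowed transitions of the per-key async state machine
const transitions: Record<Status, Status[]> = {
  idle: ['loading'],
  loading: ['success', 'error'],
  success: ['loading'], // assumed: refetch / SWR revalidation
  error: ['loading'],   // assumed: retry
};

function canTransition(from: Status, to: Status): boolean {
  return transitions[from].includes(to);
}
```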
Every async key automatically has this shape:
```ts
store.getState().user
// {
//   data: null,
//   loading: false,
//   error: null,
//   status: 'idle',
//   refetch: () => void
// }
```
Race condition protection is built in — if you call fetch three times rapidly, only the last response wins. Previous responses are silently discarded.
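A last-write-wins guard can be sketched like this — each call bumps a request id, and a resolved response is applied only if its id is still the latest one issued. Names are hypothetical; Storve's internals may differ:

```typescript
function createRaceGuard<A extends unknown[], R>(
  fn: (...args: A) => Promise<R>,
  onResult: (result: R) => void
) {
  let latest = 0;
  return async (...args: A) => {
    const id = ++latest;
    const result = await fn(...args);
    if (id === latest) onResult(result); // stale responses are silently dropped
  };
}

// Simulate two rapid fetches with different latencies:
let applied: string | undefined;
const guardedFetch = createRaceGuard(
  (value: string, ms: number) =>
    new Promise<string>(resolve => setTimeout(() => resolve(value), ms)),
  result => { applied = result; }
);
guardedFetch('slow', 30); // issued first, resolves last — discarded
guardedFetch('fast', 5);  // issued last — only this one is applied
```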
### The registry pattern
Features like withDevtools, withPersist, and withSync use a registry pattern. Instead of wrapping the store after creation, they annotate the definition object before createStore runs. The registry then picks up the metadata and extends the store during initialization.
```ts
// the registry pattern — annotate before createStore
const definition = withDevtools(
  { trades: [] as Trade[] },
  { name: 'TradeStore', maxHistory: 100 }
)

const tradeStore = createStore(definition)
// tradeStore.undo() and tradeStore.canUndo now exist
```
This is what enables compose — piping a store through multiple enhancers cleanly:
```ts
const store = compose(
  createStore({ count: 0, theme: 'light' }),
  s => withPersist(s, { key: 'app', adapter: localStorageAdapter() }),
  s => withDevtools(s, { name: 'My Store' }),
  s => withSync(s, { channel: 'my-app', keys: ['theme'] })
)
```
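Under the hood, a `compose` like this can be as simple as a left-to-right reduce over enhancer functions. Here is a sketch of the pattern with plain objects standing in for stores — not Storve's exact types:

```typescript
// Each enhancer receives the store produced by the previous one.
function compose<T>(store: T, ...enhancers: Array<(s: T) => T>): T {
  return enhancers.reduce((acc, enhance) => enhance(acc), store);
}

// Plain objects stand in for a store and its enhanced versions:
interface FakeStore { name: string; persisted?: boolean; devtools?: boolean }

const result = compose<FakeStore>(
  { name: 'base' },
  s => ({ ...s, persisted: true }),
  s => ({ ...s, devtools: true })
);
// result: { name: 'base', persisted: true, devtools: true }
```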
## Comparison with Zustand and TanStack Query

### vs Zustand
Zustand is great for client state but has no async story. You either roll your own loading/error/cache pattern or reach for TanStack Query.
Zustand:
```ts
const useStore = create((set) => ({
  count: 0,
  increment: () => set(s => ({ count: s.count + 1 })),
}))

// you still need TanStack Query for this
const { data, isLoading } = useQuery({ queryKey: ['user'], queryFn: fetchUser })
```
Storve:
```ts
const store = createStore({
  count: 0,
  user: createAsync(fetchUser, { ttl: 60_000 }),
  actions: {
    increment() { store.setState(s => ({ count: s.count + 1 })) },
  },
})
```
One store. One mental model.
### vs TanStack Query
TanStack Query is excellent at server state but has no client state story. You still need Zustand or Context for UI state.
Storve handles both in the same store. The tradeoff: TanStack Query has a richer ecosystem (devtools, React Native, SSR adapters). Storve is simpler and lighter — ~4KB total vs TanStack Query's ~13KB.
## Bundle size
Tree-shakable — you only pay for what you import:
| Import | Size (gzipped) |
|---|---|
| `@storve/core` | ~1.4KB |
| `@storve/core/async` | +1.1KB |
| `@storve/core/computed` | +0.8KB |
| `@storve/core/persist` | +1.2KB |
| `@storve/core/signals` | +0.4KB |
| `@storve/core/devtools` | +0.8KB |
| `@storve/core/sync` | +0.6KB |
A typical app using core + async + persistence = ~3.7KB.
## The stress test — StockSim
Before publishing this post, I built StockSim — a real-time stock market simulator running entirely on Storve. No paid APIs; all data is generated client-side.
What it does:
- 20 stocks ticking live every 500ms (signals, batch)
- OHLCV charts with 5 time ranges including 1Y = 100,800 candles (createAsync, TTL, SWR)
- Portfolio P&L updating on every tick (computed)
- Full trade history with undo/redo and time-travel slider (withDevtools)
- Watchlist that syncs across browser tabs (withSync)
- IndexedDB persistence (withPersist)
- Floating DevTools panel showing all 5 stores live
Every Storve subpath exercised under real load.
### Benchmark numbers

| Benchmark | Throughput | Detail | Result |
|---|---|---|---|
| signal isolation | 96K ops/sec | 0 foreign re-renders | ✅ |
| computed P&L | 100K ops/sec | 0.010ms per recompute | ✅ |
| tick engine | 9.4K ops/sec | 0.100ms per 20-stock batch | ✅ |
| SWR stale delay | — | 0ms | ✅ |
| TTL cache hit | — | 10x faster than cold fetch | ✅ |
| undo speed | 91K ops/sec | 0.011ms | ✅ |
| undo atomicity | 29K ops/sec | 1000/1000 correct | ✅ |
### Stress test — max level (120 seconds, 10x speed)
```
pnpm stress:max

tick engine        avg: 0.519ms   max: 87ms   dropped: 3
computed drift     0 ✅
undo/redo cycles   1000/1000 correct
setState total     57,060 calls
ticks              2,353

verdict: STABLE ✅
```
57,060 setState calls. 1,000 undo/redo cycles. Zero computed drift across 2,353 ticks.
The 3 dropped ticks are GC pauses from holding 500K candles in async cache — not a Storve bug.
## Bugs found during building
Building StockSim found 3 real bugs:
### Bug 1 — withDevtools wrong API (fixed in v1.1.2)
The README showed:
```ts
// what the README said
const store = withDevtools(createStore({ count: 0 }), { name: 'test' })
store.undo() // undefined 💥
```
But the registry pattern requires annotating the definition before createStore:
```ts
// what actually works
const store = createStore(withDevtools({ count: 0 }, { name: 'test' }))
store.undo() // works ✅
```
Fixed in v1.1.2 with dual-signature support — both patterns now work.
### Bug 2 — burst write debounce (fixed in v1.1.2)
withPersist had a trailing-only debounce. When the tick engine fired 20 setState calls inside a batch(), all 20 triggered IndexedDB writes instead of 1. This caused 3.8ms tick engine spikes.
Fixed with leading+trailing debounce — a burst of 20 calls now produces at most 2 writes (one leading, one trailing).
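The leading+trailing shape can be sketched like this — a hypothetical simplification of the fix, not `withPersist`'s actual code:

```typescript
// The first call in a burst fires immediately (leading edge); if more calls
// arrive before waitMs of quiet, exactly one more fires afterwards (trailing
// edge). So N rapid calls collapse to at most 2 invocations.
function debounceLeadingTrailing(fn: () => void, waitMs: number) {
  let timer: ReturnType<typeof setTimeout> | null = null;
  let pendingTrailing = false;

  return () => {
    if (timer === null) {
      fn(); // leading edge: first call in the burst runs immediately
    } else {
      pendingTrailing = true; // more calls arrived mid-burst
      clearTimeout(timer);
    }
    timer = setTimeout(() => {
      if (pendingTrailing) fn(); // trailing edge: one run after the burst
      pendingTrailing = false;
      timer = null;
    }, waitMs);
  };
}

// 20 rapid calls → 1 leading write now, 1 trailing write after the burst:
let writes = 0;
const persist = debounceLeadingTrailing(() => { writes++; }, 20);
for (let i = 0; i < 20; i++) persist();
// writes === 1 immediately; becomes 2 once the burst settles
```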
### Bug 3 — async cache had no size limit (fixed in v1.1.3)
createAsync had no eviction policy. Loading all 5 chart ranges simultaneously accumulated +67MB in the TTL cache, causing 70ms GC pauses.
Fixed with maxCacheSize + LRU eviction:
```ts
candles: createAsync(fetchCandles, {
  ttl: 30_000,
  staleWhileRevalidate: true,
  maxCacheSize: 3, // keep only the 3 most recent entries
})
```
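LRU eviction is compact over a `Map`, because `Map` preserves insertion order: re-inserting a key on every read keeps the least-recently-used entry first, so eviction is just deleting the first key. A sketch of the policy — not Storve's cache code:

```typescript
class LruCache<K, V> {
  private map = new Map<K, V>();
  constructor(private maxSize: number) {}

  get(key: K): V | undefined {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key)!;
    this.map.delete(key);   // move to the most-recent position
    this.map.set(key, value);
    return value;
  }

  set(key: K, value: V): void {
    this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.maxSize) {
      // evict the least-recently-used entry (first key in iteration order)
      const oldest = this.map.keys().next().value as K;
      this.map.delete(oldest);
    }
  }

  has(key: K): boolean {
    return this.map.has(key);
  }
}

const cache = new LruCache<string, number[]>(3);
['1D', '1W', '1M'].forEach((range, i) => cache.set(range, [i]));
cache.get('1D');       // touch 1D so it becomes most-recent
cache.set('1Y', [99]); // exceeds maxSize → evicts 1W, the LRU entry
```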
## What's next
- StackBlitz demo (StockSim)
- SSR support
- React Native adapter
- eslint-plugin-storve
## Try it
```sh
npm install @storve/core @storve/react
```
GitHub: https://github.com/Nam1001/storve
npm: https://npmjs.com/package/@storve/core
This is the first open source library I've ever shipped. If something seems wrong, over-engineered, or could be done better — I genuinely want to know.