When 4.4MB Is Too Much: Solving the "Send Everything to Frontend" Anti-Pattern in React
How to stop loading megabytes of data on page load and start building responsive React applications that scale
The Problem: When "Dump It All" Meets Reality
A developer on Reddit recently shared a nightmare scenario: their inventory management system was transferring 4.4MB of JSON data to the frontend on initial load. The result? 3-5 second delays before users could interact with the UI. Buttons wouldn't respond. Dropdowns wouldn't open. The app felt broken.
This is the "send everything to frontend and filter there" anti-pattern in action. It happens like this:
- Developer builds an API that returns all records (or a massive subset)
- Developer thinks "React will handle it" and loads everything into state
- Developer adds `.filter()` and `.map()` calls to show only what's needed
- Users on slower connections or less powerful devices suffer
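The steps above can be sketched in a few lines of plain JavaScript. The data shape here is hypothetical, just to illustrate the cost: every filter change re-scans the entire dataset in the browser.

```javascript
// Anti-pattern: fetch everything once, then filter in the client.
// Simulate a large inventory payload (hypothetical shape).
const allItems = Array.from({ length: 50000 }, (_, i) => ({
  id: i,
  name: `Item ${i}`,
  category: i % 2 === 0 ? 'electronics' : 'furniture',
  quantity: i % 100,
}));

// Every keystroke or filter change walks all 50,000 records on the main thread.
function filterClientSide(items, { category, minQuantity }) {
  return items.filter(
    item =>
      (!category || item.category === category) &&
      (minQuantity == null || item.quantity >= minQuantity)
  );
}

const visible = filterClientSide(allItems, { category: 'electronics', minQuantity: 50 });
console.log(visible.length); // a fraction of the data, but the full array was scanned
```

The scan itself is fast in isolation; the real cost is shipping the 50,000 records over the network in the first place, holding them in memory, and re-rendering against them.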
The core issue isn't React—it's architecture. Client-side filtering of server-sized datasets is a trap that catches many teams, and escaping it requires rethinking how data flows through your application.
The Solution: Moving Intelligence to the Right Layer
The fix isn't about using a faster library or adding more memoization (though those help). It's about moving filtering, pagination, and data transformation to where they belong: the server. Here's how to do it properly.
1. Server-Side Pagination with TanStack Query
The most impactful change you can make: only fetch what you need. TanStack Query (formerly React Query) makes server-state management elegant and handles pagination out of the box.
```jsx
import { useQuery } from '@tanstack/react-query';
import { useState } from 'react';

const ITEMS_PER_PAGE = 50;

function InventoryList() {
  const [page, setPage] = useState(1);

  const { data, isLoading, isFetching } = useQuery({
    queryKey: ['inventory', page],
    queryFn: () => fetch(`/api/inventory?page=${page}&limit=${ITEMS_PER_PAGE}`)
      .then(res => res.json()),
    staleTime: 30000, // Cache for 30 seconds
  });

  if (isLoading) return <SkeletonLoader />;

  return (
    <div>
      {isFetching && <LoadingOverlay />}
      <table>
        <tbody>
          {data.items.map(item => (
            <InventoryRow key={item.id} item={item} />
          ))}
        </tbody>
      </table>
      <Pagination
        currentPage={page}
        totalPages={data.totalPages}
        onPageChange={setPage}
      />
    </div>
  );
}
```
What changed:
- Instead of 4.4MB, we load ~50 items (~15KB)
- TanStack Query handles caching, background refetches, and deduping
- The UI remains responsive because we're not processing thousands of records
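The inline fetch URL can be factored into a small helper, which keeps the query function readable and makes the URL construction easy to test in isolation. The helper name is hypothetical; it just mirrors the parameters the component sends:

```javascript
// Build the paginated inventory URL that the query function fetches.
const ITEMS_PER_PAGE = 50;

function inventoryUrl(page, limit = ITEMS_PER_PAGE) {
  const params = new URLSearchParams({
    page: String(page),
    limit: String(limit),
  });
  return `/api/inventory?${params.toString()}`;
}

console.log(inventoryUrl(2)); // "/api/inventory?page=2&limit=50"
```

Using `URLSearchParams` instead of string concatenation also handles escaping for free once filter values get involved.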
2. Server-Side Filtering
The second piece: push filter logic to the API, not the client.
```jsx
import { useState } from 'react';
import { useQuery } from '@tanstack/react-query';
import { useDebounce } from 'use-debounce'; // returns [debouncedValue]

function InventoryFilters() {
  const [filters, setFilters] = useState({
    category: '',
    minQuantity: '',
    search: '',
  });

  // Debounce search input to avoid API spam
  const [debouncedSearch] = useDebounce(filters.search, 300);

  const { data } = useQuery({
    // Key uses the debounced value, so typing doesn't trigger a refetch per keystroke
    queryKey: ['inventory', { ...filters, search: debouncedSearch }],
    queryFn: () => {
      const params = new URLSearchParams();
      if (filters.category) params.set('category', filters.category);
      if (filters.minQuantity) params.set('minQuantity', filters.minQuantity);
      if (debouncedSearch) params.set('search', debouncedSearch);
      params.set('page', '1');
      params.set('limit', '50');
      return fetch(`/api/inventory?${params}`).then(r => r.json());
    },
  });

  return (
    <div className="filters">
      <select
        value={filters.category}
        onChange={e => setFilters(f => ({ ...f, category: e.target.value }))}
      >
        <option value="">All Categories</option>
        <option value="electronics">Electronics</option>
        <option value="furniture">Furniture</option>
      </select>
      <input
        type="number"
        placeholder="Min quantity"
        value={filters.minQuantity}
        onChange={e => setFilters(f => ({ ...f, minQuantity: e.target.value }))}
      />
      <input
        type="text"
        placeholder="Search products..."
        value={filters.search}
        onChange={e => setFilters(f => ({ ...f, search: e.target.value }))}
      />
    </div>
  );
}
```
The backend handles this:
```javascript
// Express.js + Mongoose example
app.get('/api/inventory', async (req, res) => {
  const { page = 1, limit = 50, category, minQuantity, search } = req.query;

  const query = {};
  if (category) query.category = category;
  if (minQuantity) query.quantity = { $gte: parseInt(minQuantity, 10) };
  if (search) {
    query.$or = [
      { name: { $regex: search, $options: 'i' } },
      { sku: { $regex: search, $options: 'i' } },
    ];
  }

  // Query-string values arrive as strings; parse them once
  const pageNum = parseInt(page, 10);
  const limitNum = parseInt(limit, 10);

  const items = await Inventory.find(query)
    .skip((pageNum - 1) * limitNum)
    .limit(limitNum)
    .lean();

  const total = await Inventory.countDocuments(query);

  res.json({
    items,
    totalPages: Math.ceil(total / limitNum),
    total,
  });
});
```
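The filter-to-query mapping in the handler can be pulled out into pure functions, which are trivial to unit-test without a database. These helper names are hypothetical; they mirror the MongoDB-style operators and pagination math used above:

```javascript
// Translate query-string filters into a MongoDB filter object.
function buildInventoryQuery({ category, minQuantity, search } = {}) {
  const query = {};
  if (category) query.category = category;
  if (minQuantity) query.quantity = { $gte: parseInt(minQuantity, 10) };
  if (search) {
    query.$or = [
      { name: { $regex: search, $options: 'i' } },
      { sku: { $regex: search, $options: 'i' } },
    ];
  }
  return query;
}

// Pagination math used by the handler; page is 1-based.
function toSkipLimit(page = 1, limit = 50) {
  const p = Math.max(1, parseInt(page, 10) || 1);
  const l = Math.min(200, parseInt(limit, 10) || 50); // cap page size defensively
  return { skip: (p - 1) * l, limit: l };
}

console.log(buildInventoryQuery({ category: 'electronics', minQuantity: '5' }));
// { category: 'electronics', quantity: { $gte: 5 } }
console.log(toSkipLimit('3', '50')); // { skip: 100, limit: 50 }
```

Capping `limit` server-side is worth the extra line: it stops a client from requesting `?limit=100000` and recreating the original 4.4MB problem.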
3. Virtualization for Large Lists
Sometimes you do need to display many rows—think analytics dashboards or data grids. In those cases, don't render what users can't see. Virtualization (or "windowing") renders only the visible viewport plus a small buffer.
```jsx
import { FixedSizeList as List } from 'react-window';
import AutoSizer from 'react-virtualized-auto-sizer';

function VirtualizedInventoryList({ items }) {
  const Row = ({ index, style }) => (
    <div style={style} className="inventory-row">
      <span>{items[index].name}</span>
      <span>{items[index].quantity}</span>
      <span>{items[index].category}</span>
    </div>
  );

  return (
    <AutoSizer>
      {({ height, width }) => (
        <List
          height={height}
          itemCount={items.length}
          itemSize={50}
          width={width}
          overscanCount={5} // Render 5 extra items above/below viewport
        >
          {Row}
        </List>
      )}
    </AutoSizer>
  );
}
```
This approach:
- Renders only ~20 items at a time, regardless of list size
- Maintains 60fps scrolling through 100,000+ items
- Keeps the DOM lightweight
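The windowing idea itself is simple arithmetic. A sketch of how a fixed-size virtualizer picks the visible slice, under the assumptions of a fixed row height and a small overscan buffer (in the spirit of react-window's `FixedSizeList`, not its actual internals):

```javascript
// Given scroll position and viewport size, compute which rows to render.
function visibleRange({ scrollTop, viewportHeight, itemSize, itemCount, overscan = 5 }) {
  const first = Math.floor(scrollTop / itemSize);
  const last = Math.ceil((scrollTop + viewportHeight) / itemSize) - 1;
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(itemCount - 1, last + overscan),
  };
}

// 100,000 rows, 50px each, 600px viewport, scrolled to 10,000px:
const range = visibleRange({ scrollTop: 10000, viewportHeight: 600, itemSize: 50, itemCount: 100000 });
console.log(range); // { start: 195, end: 216 }
```

Only `end - start + 1` rows (here, 22) ever exist in the DOM, no matter how long the list is; an outer spacer element sized to `itemCount * itemSize` keeps the scrollbar honest.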
4. Memoization: The Final Layer
Even with server-side pagination, memoization prevents unnecessary re-renders when parent components update:
```jsx
import { memo, useMemo } from 'react';

// Only re-render when the item (or the onEdit reference) actually changes
const InventoryRow = memo(function InventoryRow({ item, onEdit }) {
  return (
    <tr>
      <td>{item.name}</td>
      <td>{item.quantity}</td>
      <td>
        <button onClick={() => onEdit(item.id)}>Edit</button>
      </td>
    </tr>
  );
});

// Optimize the list rendering. Note: onEdit must be referentially stable
// (wrapped in useCallback by the parent), or memo() is defeated every render.
function InventoryTable({ items, sortBy, onEdit }) {
  const sortedItems = useMemo(() => {
    return [...items].sort((a, b) => {
      if (sortBy === 'name') return a.name.localeCompare(b.name);
      return a.quantity - b.quantity;
    });
  }, [items, sortBy]);

  return (
    <table>
      <tbody>
        {sortedItems.map(item => (
          <InventoryRow key={item.id} item={item} onEdit={onEdit} />
        ))}
      </tbody>
    </table>
  );
}
```
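The sort inside `useMemo` can also be extracted into a pure comparator factory, which keeps the memoized computation minimal and lets you unit-test the ordering logic without rendering anything. The helper name is hypothetical; it mirrors the `sortBy` logic above:

```javascript
// Returns a comparator matching the sortBy logic in InventoryTable.
function bySortKey(sortBy) {
  if (sortBy === 'name') return (a, b) => a.name.localeCompare(b.name);
  return (a, b) => a.quantity - b.quantity;
}

const items = [
  { name: 'Widget', quantity: 3 },
  { name: 'Anvil', quantity: 7 },
];
console.log([...items].sort(bySortKey('name')).map(i => i.name));       // ['Anvil', 'Widget']
console.log([...items].sort(bySortKey('quantity')).map(i => i.quantity)); // [3, 7]
```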
Results: What You Can Expect
Applying these patterns transforms performance:
| Metric | Before (4.4MB) | After (Server-Side) |
|---|---|---|
| Initial load | 4.4MB | ~15KB |
| Time to interactive | 3-5 seconds | <500ms |
| API calls | 1 (massive) | On-demand |
| Memory usage | High | Minimal |
| UX | Stuttering UI | Smooth pagination |
The key insight: your database is optimized for filtering and pagination. Use it.
Conclusion
The "send everything to frontend" anti-pattern is seductive because it works in development. Your laptop has plenty of RAM. Your fast connection masks the latency. But your users—on mobile devices, slow connections, or older hardware—will feel every megabyte.
The solution isn't a single trick; it's a stack:
- Server-side pagination — fetch only what you show
- Server-side filtering — let the database do the work
- Virtualization — when you must show many items, render wisely
- Memoization — prevent unnecessary React re-renders
Start with pagination. It's the highest-impact change and the easiest to implement. Your users will feel the difference immediately—and so will your server costs.
Have you battled the "dump everything" anti-pattern? Drop your war stories in the comments—better solutions often come from shared pain.