DEV Community

ANKUSH CHOUDHARY JOHAL

Posted on • Originally published at johal.in

We Cut API Call Time by 40% Using React Query 5.0, Next.js 15, and Redis 8.0

API performance is a make-or-break factor for modern web apps. Slow API calls lead to poor user experience, higher bounce rates, and lost revenue. For our team maintaining a high-traffic e-commerce dashboard, average API call times had crept up to 800ms, accounting for 60% of total page load time. After a 3-month optimization effort using React Query 5.0, Next.js 15, and Redis 8.0, we slashed average API call time by 40% to 480ms, with a 95th percentile drop from 1.2s to 700ms.

The Problem: Unoptimized API Workflows

Before our optimization, our stack relied on custom fetch hooks with minimal caching, client-side only data fetching for most endpoints, and no server-side caching layer. Common pain points included:

  • Redundant API calls for the same data across multiple components
  • Stale data displayed to users due to no revalidation strategy
  • Long-running database queries for frequent product and user data requests
  • Unnecessary client-server round trips for simple mutations

Our Tool Stack: Why These 3?

We selected three tools that work across the full stack to address caching, data fetching, and performance:

React Query 5.0

React Query 5.0 shipped with major improvements including a 30% smaller bundle size, first-class TypeScript support, and revamped caching logic. Key features we leveraged:

  • Configurable stale-while-revalidate caching out of the box
  • Built-in deduplication of in-flight requests
  • Improved infinite query support for paginated data
  • Customizable cache and garbage collection timers
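The interplay between the cache timers is worth internalizing before tuning them. As a rough sketch (an illustrative model, not React Query's actual internals), a cached entry can be classified by its age and how long it has sat without subscribers:

```javascript
// Rough model of how a query cache entry ages (illustrative only, not
// React Query's real implementation). `ageMs` is time since the last
// successful fetch; `idleMs` is time since the last subscriber unmounted.
function classifyEntry(ageMs, idleMs, { staleTime, gcTime }) {
  if (idleMs >= gcTime) return 'garbage-collected'; // entry is dropped entirely
  if (ageMs < staleTime) return 'fresh'; // served from cache, no refetch
  return 'stale'; // served from cache instantly, refetched in the background
}
```

With a 5-minute staleTime and 10-minute gcTime, a query re-mounted after 6 minutes still renders cached data instantly while a background refetch runs; only after 10 idle minutes does the next mount pay for a full cold fetch.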

Next.js 15

Next.js 15's App Router maturity, Server Components, and Server Actions were game-changers. We used:

  • Server Components for static data fetching to avoid client-side waterfalls
  • Server Actions to handle form mutations without separate API endpoints
  • Next.js built-in cache with revalidate tags for granular cache invalidation
  • Edge middleware for geolocation-based caching rules

Redis 8.0

Redis 8.0 folded the Redis Query Engine (secondary indexing and search) into core Redis, alongside improved JSON support and faster cluster operations. We used it as a server-side cache for:

  • Frequently accessed product catalog and user session data
  • Caching third-party API responses to avoid rate limits
  • Rate limiting and request deduplication at the edge
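For rate limiting, the classic Redis recipe is a fixed-window counter: INCR a per-client key and set a TTL on the first increment of each window. Here is a sketch of that logic with an in-memory Map standing in for Redis (the key format and limits are our illustrative choices, not from the article):

```javascript
// Fixed-window rate limiter: allow at most `limit` requests per `windowMs`.
// The Map stands in for Redis INCR + EXPIRE; swap in a Redis client for
// production so the counters are shared across instances.
const windows = new Map(); // key -> { count, resetAt }

function allowRequest(clientId, now, { limit = 100, windowMs = 60_000 } = {}) {
  const key = `ratelimit:${clientId}`;
  const win = windows.get(key);
  if (!win || now >= win.resetAt) {
    // First request of a new window: start the counter (EXPIRE equivalent).
    windows.set(key, { count: 1, resetAt: now + windowMs });
    return true;
  }
  win.count += 1; // INCR equivalent
  return win.count <= limit;
}
```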

Implementation Step-by-Step

Step 1: Audit Existing API Calls

We started by auditing all API endpoints using Chrome DevTools and the Next.js Performance Profiler. We identified 12 redundant endpoints, 3 long-running database queries, and 20% of calls that returned stale data. We prioritized the top 5 highest-traffic endpoints for optimization first.

Step 2: Migrate to React Query 5.0

We replaced all custom fetch hooks with React Query's useQuery and useInfiniteQuery. We set global default options to reduce boilerplate:

const queryClient = new QueryClient({
  defaultOptions: {
    queries: {
      staleTime: 5 * 60 * 1000, // data considered fresh for 5 minutes
      gcTime: 10 * 60 * 1000, // unused entries garbage-collected after 10 minutes (renamed from cacheTime in v5)
      refetchOnWindowFocus: false,
      retry: 2,
    },
  },
});

This reduced redundant in-flight requests by 30% immediately, as React Query automatically deduplicates requests for the same query key.
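Deduplication is a simple idea with outsized payoff: while a request for a given key is in flight, every new caller gets the same pending promise instead of firing a second request. A stripped-down sketch of the mechanism (React Query keys on the serialized query key; this toy version keys on a plain string):

```javascript
// In-flight request deduplication: concurrent callers with the same key
// share one promise; the entry is cleared once the request settles.
const inFlight = new Map(); // key -> pending Promise

function dedupe(key, fetcher) {
  if (inFlight.has(key)) return inFlight.get(key); // join the in-flight request
  const promise = fetcher().finally(() => inFlight.delete(key));
  inFlight.set(key, promise);
  return promise;
}
```

Two components mounting at the same time with the same query key therefore cost one network request, not two.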

Step 3: Optimize with Next.js 15

We migrated 70% of data fetching to Server Components, which render on the server and send fully populated HTML to the client. For dynamic data, we used Next.js's cache with revalidate tags:

// Server Component — data is fetched and rendered on the server
async function ProductList() {
  const res = await fetch('https://api.example.com/products', {
    next: { revalidate: 600 }, // re-fetch at most every 10 minutes
  });
  const products = await res.json();
  return <ul>{products.map((p) => <li key={p.id}>{p.name}</li>)}</ul>;
}

We also replaced 15 custom mutation endpoints with Server Actions, cutting client-server round trips for form submissions by 50%.

Step 4: Add Redis 8.0 Server-Side Caching

We set up a Redis 8.0 cluster using ioredis as our client. We cached frequent API responses with a 10-minute TTL, and implemented cache invalidation when data was updated via Server Actions:

// Express-style cache middleware using ioredis
async function cacheMiddleware(req, res, next) {
  const cacheKey = `api:${req.originalUrl}`;
  const cached = await redis.call('JSON.GET', cacheKey);
  if (cached) return res.json(JSON.parse(cached));

  res.sendResponse = res.json.bind(res);
  res.json = async (data) => {
    // JSON.SET has no TTL argument, so set the expiry in a second step
    await redis.call('JSON.SET', cacheKey, '$', JSON.stringify(data));
    await redis.expire(cacheKey, 600); // 10 minute TTL
    res.sendResponse(data);
  };
  next();
}
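One subtlety with URL-derived cache keys: `/products?page=1&sort=price` and `/products?sort=price&page=1` are the same request but produce different keys, silently lowering the hit rate for those URLs. Sorting the query parameters before building the key avoids that; a sketch (the helper name is ours, not from our codebase):

```javascript
// Build a cache key with query parameters sorted so that equivalent
// URLs map to the same Redis key.
function cacheKeyFor(url) {
  const { pathname, searchParams } = new URL(url, 'http://internal');
  const params = [...searchParams.entries()]
    .sort(([a], [b]) => a.localeCompare(b))
    .map(([k, v]) => `${k}=${v}`)
    .join('&');
  return params ? `api:${pathname}?${params}` : `api:${pathname}`;
}
```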

Performance Results

We ran benchmarks over 2 weeks post-implementation, comparing against 4 weeks of pre-optimization data:

  • Average API call time: 800ms → 480ms (40% reduction)
  • 95th percentile API call time: 1.2s → 700ms (41% reduction)
  • Total page load time: 1.8s → 1.3s (28% reduction)
  • TTFB (Time to First Byte): 450ms → 290ms (35% reduction)
  • Redundant API calls: Reduced by 30%
  • Redis cache hit rate: 82% for top 10 endpoints

Lessons Learned

Our optimization effort taught us several key lessons:

  • Don't over-cache: Set appropriate stale times to avoid serving stale data indefinitely
  • Always invalidate cache on mutations: Use React Query's invalidateQueries and Next.js revalidate tags
  • Use Server Components for static data: Avoid client-side waterfalls for data that doesn't change often
  • Monitor cache hit rates: We used Redis's built-in metrics and React Query DevTools to tune cache settings

Conclusion

Combining React Query 5.0 for client-side caching, Next.js 15 for server-side rendering and Server Actions, and Redis 8.0 for server-side caching gave us a 40% reduction in API call time with minimal code changes. This stack is now our default for all new Next.js projects, and we've seen similar results across 3 other production apps. If you're struggling with API performance, we highly recommend auditing your current workflow and adopting this stack incrementally.
