How I optimized Server-Side Rendering (SSR), hydration, and caching in a large Next.js 14 project — cutting load times from 3s to 400ms using ISR, caching, and Server Components.
Introduction
Performance issues often appear when a Next.js app grows — especially when rendering large datasets server-side.
In this post, I’ll walk through how I debugged and solved a serious SSR bottleneck that slowed my app from 400ms to 3 seconds per request.
The Problem
I had a Next.js 14 app with dynamic routes like /product/[id], fetching product data from PostgreSQL using Prisma ORM:
export async function getServerSideProps({ params }) {
  const product = await db.products.findUnique({
    where: { id: params.id },
    include: { category: true, reviews: true },
  });
  return { props: { product } };
}
Despite fast database queries (~100ms), page load times were over 2 seconds.
Profiling revealed three culprits:
- Large prop serialization
- Heavy client hydration
- Vercel cold starts
Understanding the Bottleneck
When Next.js renders a page on the server, it also embeds your props as JSON in the HTML so the client can hydrate.
If you send large objects, the combined cost of serializing, downloading, and hydrating them becomes significant.
That’s why your database may be fine, yet pages still feel sluggish.
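One quick way to confirm this is to measure the JSON that Next.js embeds in the page for hydration. A rough sketch reusing the same Prisma query (the byte count will obviously vary with your data):
const product = await db.products.findUnique({
  where: { id },
  include: { category: true, reviews: true },
});

// Roughly the hydration payload this page ships to the browser,
// on top of the rendered HTML itself.
const bytes = Buffer.byteLength(JSON.stringify({ product }), "utf8");
console.log(`serialized props: ~${(bytes / 1024).toFixed(1)} kB`);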
Step 1: Switch to ISR
I replaced SSR with Incremental Static Regeneration (ISR) to serve prebuilt HTML and revalidate it periodically.
export async function getStaticProps({ params }) {
  const product = await db.products.findUnique({ where: { id: params.id } });
  return { props: { product }, revalidate: 60 };
}
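Since /product/[id] is a dynamic route, getStaticProps also needs a getStaticPaths export. A minimal sketch that prebuilds nothing and renders each product on its first request:
export async function getStaticPaths() {
  // With tens of thousands of products, prebuilding every page is impractical;
  // fallback: "blocking" renders each page on demand and caches the HTML.
  return { paths: [], fallback: "blocking" };
}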
Benefit: pages are static, cacheable, and revalidate automatically.
For large datasets (50k+ products), I used on-demand revalidation instead of full rebuilds.
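On-demand revalidation in the Pages Router is just an API route that calls res.revalidate when a product changes. A sketch, where the secret token and the id query parameter are my own assumptions about the webhook shape:
// pages/api/revalidate.js
export default async function handler(req, res) {
  if (req.query.secret !== process.env.REVALIDATE_SECRET) {
    return res.status(401).json({ message: "Invalid token" });
  }
  try {
    // Regenerate only the page whose data changed instead of rebuilding the site.
    await res.revalidate(`/product/${req.query.id}`);
    return res.json({ revalidated: true });
  } catch (err) {
    return res.status(500).send("Error revalidating");
  }
}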
Step 2: Reduce Over-Fetching
Fetching related entities (like categories and reviews) added unnecessary payload weight.
So I fetched minimal data server-side and loaded reviews client-side with SWR:
const { data: reviews } = useSWR(`/api/reviews/${id}`);
Result: smaller payloads, faster SSR, and smoother hydration.
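The endpoint behind that hook is a thin API route; a sketch, assuming a reviews model keyed by productId (your schema may differ):
// pages/api/reviews/[id].js — db is the same Prisma client used above.
export default async function handler(req, res) {
  const reviews = await db.reviews.findMany({
    where: { productId: req.query.id },
    orderBy: { createdAt: "desc" },
    take: 20,
  });
  res.status(200).json(reviews);
}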
Step 3: Server Components FTW
With Next.js 14’s App Router, I moved the route to React Server Components, so the heavy parts render on the server without shipping that data (or the component code for it) to the client.
export default async function ProductPage({ params }) {
  const product = await getProduct(params.id);
  return (
    <div>
      <ServerProductInfo product={product} />
      <ClientReviews productId={params.id} />
    </div>
  );
}
Keeps the client bundle small, improving hydration time.
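ClientReviews is the only piece that ships JavaScript to the browser; it is just the SWR call from Step 2 wrapped in a small client component (the fetcher and the markup here are placeholders):
// components/ClientReviews.js
"use client";

import useSWR from "swr";

const fetcher = (url) => fetch(url).then((res) => res.json());

export default function ClientReviews({ productId }) {
  const { data: reviews } = useSWR(`/api/reviews/${productId}`, fetcher);

  if (!reviews) return <p>Loading reviews…</p>;
  return (
    <ul>
      {reviews.map((review) => (
        <li key={review.id}>{review.body}</li>
      ))}
    </ul>
  );
}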
Step 4: Caching and Edge Optimization
I added caching at the data layer with unstable_cache, wrapping the product lookup so repeated requests skip Prisma entirely:
import { unstable_cache } from "next/cache";

const getCachedProduct = unstable_cache(
  // Cached per id; entries revalidate every 60 seconds.
  async (id) => db.products.findUnique({ where: { id } }),
  ["product"],
  { revalidate: 60 }
);
Cached results reduced cold starts and database load.
Step 5: Measure Everything
Use:
- React Profiler for hydration
- Lighthouse for JS execution
- Vercel Analytics for TTFB and cold starts
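For hydration and TTFB numbers from real users, Next.js also exposes a Web Vitals hook; a minimal sketch that just logs metrics (in production you would forward them to your analytics endpoint):
// app/web-vitals.js
"use client";

import { useReportWebVitals } from "next/web-vitals";

export function WebVitals() {
  useReportWebVitals((metric) => {
    // metric.name is e.g. "TTFB", "FCP", or "LCP"; value is in milliseconds.
    console.log(metric.name, Math.round(metric.value));
  });
  return null;
}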
The Results
Load times on the heaviest product pages dropped from roughly 3 seconds to around 400ms, an improvement of over 85%.
Conclusion
Performance optimization in Next.js is about architecture, not just faster queries.
By combining ISR, caching, and Server Components, I reduced load times by over 85% — and made large-scale pages feel instant.
Discussion
Have you hit similar SSR bottlenecks in your Next.js app?
Drop your insights below — I’d love to compare approaches and caching strategies.
