A Complete Next.js Streaming Guide: loading.tsx, Suspense, and Performance

When building modern web apps, speed isn't just about fast loading — it's about feeling fast. Users expect immediate feedback, even when data takes time to load.

Next.js supports streaming out of the box with the App Router, helping you deliver pages progressively, piece by piece, instead of making users wait for everything.

In this article, I'll explain what streaming means in practice, how it works in Next.js, and why loading.tsx makes it even easier to implement.

See Demos in Action

Want to explore similar examples? Check out the nextjs-perf-showcase GitHub repo, which contains the demo projects.

Table of Contents

  1. What is Streaming?
  2. Why Does It Matter?
  3. How Streaming Works in Next.js
  4. Under the Hood: How Next.js Streaming Works
  5. Real-World Example: E-commerce Dashboard
  6. Error Boundaries with Streaming
  7. Performance Monitoring
  8. When to Use Each Loading State
  9. Best Practices
  10. Useful Links
  11. Closing Thoughts

1. What is Streaming?

Instead of waiting for the entire server-rendered page to be ready, streaming lets the server send chunks of HTML progressively as they become available. The browser can start rendering immediately, showing users content as soon as possible.

Think of it like watching a video online — you don't wait for the entire file to download before playback starts. Same concept for web pages!
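
To see what progressive HTML delivery looks like without any framework, here's a minimal sketch using Node's built-in http module (purely illustrative, not how Next.js works internally): the server flushes the page shell right away and streams the slow part a couple of seconds later.

// stream-demo.ts (run with `npx tsx stream-demo.ts`, then open http://localhost:3000)
import { createServer } from "node:http";

createServer((_req, res) => {
  res.writeHead(200, { "Content-Type": "text/html" });

  // First chunk: the page shell, sent immediately
  res.write("<html><body><h1>Dashboard</h1><p>Loading analytics...</p>");

  // Second chunk: the "slow" content, streamed in once it's ready
  setTimeout(() => {
    res.write("<p>Analytics loaded!</p></body></html>");
    res.end();
  }, 2000);
}).listen(3000);

Open the page and you'll see the heading immediately, with the second paragraph appearing two seconds later, no client-side JavaScript required.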


2. Why Does It Matter?

Without streaming, users stare at a blank screen while the server prepares the full response. This creates a poor experience, especially on slower connections.

With streaming:

  • Faster perceived performance — Users see content immediately
  • 🧩 Load critical content first — Navigation, headers appear instantly
  • Non-blocking data fetches — Slow API calls don't freeze the entire page
  • 📱 Better mobile experience — Progressive loading on slower networks

3. How Streaming Works in Next.js

Next.js provides two complementary approaches for implementing streaming:

Page-Level Streaming with loading.tsx

Use a loading.tsx file to handle the loading state for an entire route segment. Next.js automatically wraps your page in a Suspense boundary, enabling the server to stream HTML progressively while showing your loading UI immediately.

Here's how it works in practice:

// app/dashboard/layout.tsx
// Sidebar and Header are regular, fast-rendering components defined elsewhere in your app
export default function Layout({ children }: { children: React.ReactNode }) {
  return (
    <div className="dashboard-layout">
      <Sidebar />
      <div className="main-content">
        <Header />
        {children}
      </div>
    </div>
  );
}
// app/dashboard/loading.tsx
export default function Loading() {
  return (
    <div className="page-skeleton">
      <div className="skeleton-cards" />
      <div className="skeleton-chart" />
    </div>
  );
}
// app/dashboard/page.tsx
export default async function Dashboard() {
  // This entire page can take time to load
  const data = await fetchDashboardData();

  return (
    <div className="dashboard-content">
      <div className="metrics-cards">
        <MetricCard title="Revenue" value={data.revenue} />
        <MetricCard title="Users" value={data.users} />
        <MetricCard title="Orders" value={data.orders} />
      </div>
      <div className="analytics-chart">
        <Chart data={data.chartData} />
      </div>
    </div>
  );
}

Benefits of loading.tsx:

  • 🔄 Automatically applies to the route segment
  • 🎯 No need to wrap components in Suspense
  • 📁 Works with nested routes
  • 🎨 Keeps loading states organized
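
On the nested-routes point: each route segment can define its own loading.tsx, and each one wraps its segment in its own Suspense boundary, so nested segments can show more specific fallbacks. A hypothetical file layout:

app/
└── dashboard/
    ├── layout.tsx      // shared shell (sidebar, header)
    ├── loading.tsx     // fallback for the /dashboard segment
    ├── page.tsx
    └── settings/
        ├── loading.tsx // more specific fallback for /dashboard/settings
        └── page.tsx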

Component-Level Streaming with <Suspense>

For more granular control, use React's <Suspense> component to stream specific parts of your UI independently.

Next.js enables streaming by default with the App Router. Every Server Component can stream its output when wrapped with Suspense:

import { Suspense } from "react";

// This component loads fast
function Header() {
  return <h1>Dashboard</h1>;
}

// This component needs time to fetch data
async function Analytics() {
  const data = await fetchAnalytics(); // Slow API call
  return <AnalyticsChart data={data} />;
}

export default function DashboardPage() {
  return (
    <>
      <Header /> {/* Shows immediately */}
      <Suspense fallback={<div>Loading analytics...</div>}>
        <Analytics /> {/* Streams in when ready */}
      </Suspense>
    </>
  );
}

The page renders in this order:

  1. Header appears instantly
  2. Loading fallback shows for Analytics
  3. Analytics component streams in when data is ready

4. Under the Hood: How Next.js Streaming Works

When you use streaming in Next.js, here's what happens:

  1. Server-Side: Next.js starts rendering your page and immediately sends the static structure (layout, headers, navigation)
  2. Suspense Detection: When React encounters a Suspense boundary with pending data, it pauses that component
  3. Progressive Rendering: The server continues rendering other parts of the page while data fetches happen in parallel
  4. Chunk Streaming: As data becomes available, Next.js streams HTML chunks to the browser
  5. Client Hydration: React hydrates components as they arrive, making them interactive

This process uses HTTP streaming and React's concurrent features to deliver content as fast as possible.
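
You can watch these chunks arrive yourself with nothing but the standard Fetch API. Here's a minimal sketch that logs each chunk of a streamed response as it arrives (the URL is a placeholder for whichever route you want to inspect):

// observe-stream.ts: log each chunk of a streamed response as it arrives
async function observeStream(url: string) {
  const response = await fetch(url);
  if (!response.body) return;

  const reader = response.body.getReader();
  const decoder = new TextDecoder();
  let chunkIndex = 0;

  while (true) {
    const { done, value } = await reader.read();
    if (done) break;
    const html = decoder.decode(value, { stream: true });
    console.log(`Chunk ${++chunkIndex}: ${html.length} characters`);
  }
}

// Placeholder URL: point it at a streamed route in your app
observeStream("http://localhost:3000/dashboard");

On a streamed page you should see extra chunks arrive later, corresponding to Suspense content resolving on the server.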


5. Real-World Example: E-commerce Dashboard

Let's build a complex dashboard that demonstrates multiple streaming patterns:

// app/dashboard/page.tsx
import { Suspense } from "react";
import { ErrorBoundary } from "react-error-boundary";

export default function Dashboard() {
  return (
    <div className="dashboard">
      {/* Critical content loads first */}
      <Header />
      <Navigation />

      {/* Stream multiple components in parallel */}
      <div className="dashboard-grid">
        <ErrorBoundary fallback={<ErrorCard />}>
          <Suspense fallback={<MetricsSkeleton />}>
            <RealtimeMetrics />
          </Suspense>
        </ErrorBoundary>

        <ErrorBoundary fallback={<ErrorCard />}>
          <Suspense fallback={<ChartSkeleton />}>
            <SalesChart />
          </Suspense>
        </ErrorBoundary>

        <ErrorBoundary fallback={<ErrorCard />}>
          <Suspense fallback={<TableSkeleton />}>
            <RecentOrders />
          </Suspense>
        </ErrorBoundary>

        {/* Nested streaming for complex components */}
        <div className="inventory-section">
          <h2>Inventory</h2>
          <Suspense fallback={<InventorySkeleton />}>
            <InventoryOverview />
          </Suspense>
        </div>
      </div>
    </div>
  );
}

// Each component handles its own data fetching
async function RealtimeMetrics() {
  const metrics = await fetchMetrics(); // 500ms
  return <MetricsCards data={metrics} />;
}

async function SalesChart() {
  const sales = await fetchSalesData(); // 1.2s
  return <Chart data={sales} />;
}

async function RecentOrders() {
  const orders = await fetchRecentOrders(); // 800ms
  return <OrdersTable orders={orders} />;
}

async function InventoryOverview() {
  // Parent component fetches its own data first
  const inventory = await fetchInventoryData(); // 600ms

  return (
    <div>
      <h3>{inventory.title}</h3>
      <p>Total Items: {inventory.totalItems}</p>

      {/* Nested streaming within inventory */}
      <Suspense fallback={<div>Loading stock levels...</div>}>
        <StockLevels warehouseId={inventory.warehouseId} />
      </Suspense>

      <Suspense fallback={<div>Loading alerts...</div>}>
        <LowStockAlerts threshold={inventory.alertThreshold} />
      </Suspense>
    </div>
  );
}

async function StockLevels({ warehouseId }) {
  const levels = await fetchStockLevels(warehouseId); // 400ms
  return <StockChart data={levels} />;
}

async function LowStockAlerts({ threshold }) {
  const alerts = await fetchLowStockAlerts(threshold); // 300ms
  return <AlertsList alerts={alerts} />;
}

Loading sequence:

  1. Header & Navigation appear instantly
  2. All skeleton states show simultaneously
  3. RealtimeMetrics loads first (500ms)
  4. InventoryOverview loads next (600ms) → shows inventory summary + child loading states
  5. RecentOrders loads next (800ms)
  6. SalesChart loads last (1.2s)
  7. StockLevels and LowStockAlerts stream independently (300-400ms after inventory loads)

6. Error Boundaries with Streaming

Combine error boundaries with Suspense when your components perform fallible operations (API calls, external dependencies):

// app/dashboard/error.tsx (Next.js built-in approach)
"use client";

export default function Error({
  error,
  reset,
}: {
  error: Error & { digest?: string };
  reset: () => void;
}) {
  return (
    <div className="error-card">
      <h3>Failed to load dashboard data</h3>
      <p>{error.message}</p>
      <button onClick={reset}>Retry</button>
    </div>
  );
}

For granular error handling:

// components/SafeStreamingComponent.tsx
"use client";

import { Suspense, type ReactNode } from "react";
import { ErrorBoundary } from "react-error-boundary";

function DataErrorFallback({
  error,
  resetErrorBoundary,
}: {
  error: Error;
  resetErrorBoundary: () => void;
}) {
  return (
    <div className="error-card">
      <h4>Failed to load data</h4>
      <p>{error.message}</p>
      <button onClick={resetErrorBoundary}>Try again</button>
    </div>
  );
}

export function SafeStreamingComponent({
  children,
  fallback,
}: {
  children: ReactNode;
  fallback: ReactNode;
}) {
  return (
    <ErrorBoundary FallbackComponent={DataErrorFallback}>
      <Suspense fallback={fallback}>{children}</Suspense>
    </ErrorBoundary>
  );
}

Usage:

// Example async component that could fail
async function UserProfile({ userId }) {
  const user = await fetchUser(userId); // Could throw an error
  return <div>Welcome, {user.name}!</div>;
}

// Instead of writing this every time:
<ErrorBoundary FallbackComponent={DataErrorFallback}>
  <Suspense fallback={<div>Loading user data...</div>}>
    <UserProfile userId={123} />
  </Suspense>
</ErrorBoundary>

// You can just use:
<SafeStreamingComponent fallback={<div>Loading user data...</div>}>
  <UserProfile userId={123} />
</SafeStreamingComponent>

When to use each approach:

  • Next.js error.tsx: Route-level errors (entire page fails)
  • Custom ErrorBoundary: Component-level errors (specific parts fail)

Use error boundaries when components:

  • Fetch data from APIs
  • Use external libraries
  • Process user input
  • Could reasonably fail at runtime

Skip for simple code splitting:

// No error boundary needed - just lazy loading
<Suspense fallback={<div>Loading...</div>}>
  <LazyModalComponent />
</Suspense>
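
For context, a component like LazyModalComponent is typically created with React.lazy (or next/dynamic in Next.js). A minimal sketch, assuming the modal lives at ./ModalComponent and has a default export:

// components/ModalLauncher.tsx
"use client";

import { lazy, Suspense } from "react";

// Hypothetical path: adjust to wherever your modal actually lives
const LazyModalComponent = lazy(() => import("./ModalComponent"));

export function ModalLauncher() {
  return (
    <Suspense fallback={<div>Loading...</div>}>
      <LazyModalComponent />
    </Suspense>
  );
}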

7. Performance Monitoring

Track streaming performance using the browser's built-in Web Performance API:

// utils/performance.ts
export function measureStreamingPerformance() {
  // Only run in the browser: these APIs aren't available during SSR
  if (typeof window === "undefined") return;

  const [navigation] = performance.getEntriesByType(
    "navigation"
  ) as PerformanceNavigationTiming[];
  const paintEntries = performance.getEntriesByType("paint");

  if (!navigation) return;

  // Time to First Byte (TTFB): when the first byte of the response arrived
  const ttfb = navigation.responseStart;

  // First Contentful Paint (FCP): when the first content was painted
  const fcp = paintEntries.find(
    (entry) => entry.name === "first-contentful-paint"
  )?.startTime;

  console.log("TTFB:", Math.round(ttfb) + "ms");
  console.log("FCP:", Math.round(fcp || 0) + "ms");

  // LCP requires a PerformanceObserver for accuracy
  if ("PerformanceObserver" in window) {
    new PerformanceObserver((list) => {
      const lcp = list.getEntries().at(-1)?.startTime;
      console.log("LCP:", Math.round(lcp || 0) + "ms");
    }).observe({ type: "largest-contentful-paint", buffered: true });
  }
}
// hooks/useStreamingPerformance.ts
"use client";

import { useEffect } from "react";
import { measureStreamingPerformance } from "@/utils/performance";

export function useStreamingPerformance() {
  useEffect(() => {
    measureStreamingPerformance();
  }, []);
}

Usage in components:

// components/PerformanceTracker.tsx
"use client";

import { useStreamingPerformance } from "@/hooks/useStreamingPerformance";

export function PerformanceTracker() {
  useStreamingPerformance();
  return null; // This component only tracks performance
}

// app/dashboard/page.tsx
import { Suspense } from "react";
import { PerformanceTracker } from "@/components/PerformanceTracker";

export default function Dashboard() {
  return (
    <div>
      <PerformanceTracker />
      <Header />
      <Suspense fallback={<Loading />}>
        <StreamingContent />
      </Suspense>
    </div>
  );
}

What these metrics mean:

  • TTFB (Time to First Byte): How fast your server responds - measures streaming start
  • FCP (First Contentful Paint): When users first see content - critical for perceived performance
  • LCP (Largest Contentful Paint): When main content loads - Core Web Vital for SEO

Good streaming targets:

  • TTFB < 200ms (fast server response)
  • FCP - TTFB < 300ms (content streams quickly after server response)
  • Progressive LCP improvements (content loads in chunks, not all at once)
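
If you'd rather not hand-roll these measurements, the same metrics can be collected with the web-vitals package. A minimal sketch, assuming you've installed web-vitals and call this from a client component (the /api/metrics endpoint is a placeholder):

// utils/report-web-vitals.ts
import { onTTFB, onFCP, onLCP } from "web-vitals";

export function reportWebVitals() {
  const report = (metric: { name: string; value: number }) => {
    console.log(`${metric.name}: ${Math.round(metric.value)}ms`);
    // Hypothetical endpoint: swap in your own analytics sink
    // navigator.sendBeacon("/api/metrics", JSON.stringify(metric));
  };

  onTTFB(report);
  onFCP(report);
  onLCP(report);
}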

Chrome DevTools tips:

  1. Network throttling: In the Network panel, open the throttling dropdown (it reads "No throttling" by default) → select "Slow 3G" to simulate slow connections and watch streaming in action
  2. Performance tab: Record a page load to see when HTML chunks arrive:
    • Click the "Record and reload" button
    • Look for multiple "Parse HTML" entries showing streaming chunks
  3. Lighthouse: Run an audit to measure streaming effectiveness:
    • Go to the Lighthouse tab → click "Analyze page load"
    • Check metrics like First Contentful Paint and Time to Interactive

8. When to Use Each Loading State

🦴 Skeleton Screens

Best for structured content like cards, tables, or lists (maintains layout and prevents content shift):

<div className="skeleton-card animate-pulse">
  <div className="h-4 bg-gray-200 rounded w-3/4" />
  <div className="h-4 bg-gray-200 rounded w-1/2 mt-2" />
</div>

🌀 Spinners

Good for quick actions or unknown content structure (when you can't predict the layout):

<div className="flex justify-center">
  <Spinner />
</div>

📝 Content Placeholders

Ideal for text-heavy areas (simple and lightweight for fast display):

<p className="text-gray-400">Loading article content...</p>

9. Best Practices

✅ DO:

  • Stream critical content first — Headers, navigation, above-the-fold content
  • Use skeleton screens — They maintain layout structure and prevent content shift
  • Keep fallbacks lightweight — They should load instantly
  • Test on slow connections — Use Chrome DevTools network throttling
  • Progressive enhancement — Show basic content first, enhance with streaming
  • Test with real data volumes — Large datasets affect streaming differently

❌ DON'T:

  • Over-stream — Too many loading states can be jarring
  • Block the entire page — Stream heavy components individually
  • Forget error boundaries — Always wrap streamed components in error boundaries
  • Skip performance monitoring — Measure streaming impact with real metrics
  • Use generic spinners everywhere — Be specific with loading states

10. Useful Links


11. Closing Thoughts

Streaming in Next.js isn't just a performance optimization — it's a UX philosophy. By showing users content progressively, you create experiences that feel instant and responsive.

Start simple with loading.tsx for route transitions, then add component-level streaming where it makes the most impact. Your users will notice the difference!

💡 Quick tip: Use the React DevTools Profiler to identify which components take longest to render — those are perfect candidates for streaming!


⚠️ Just a note
This article doesn’t claim to be exhaustive or perfect — if something seems off or outdated, feel free to dig deeper or let me know. Happy to hear from you :)

🙏 Thanks for reading!

If you found this article useful, feel free to share it or leave a like. And of course, if you have feedback or questions, don’t hesitate to reach out. Always happy to learn and connect! 🚀
