Purneswar Prasad

A story on Frontend Architectures - Back to the Future

So far, we have talked a lot about how architectures evolved from magazine-like webpages to animation- and transition-heavy websites, with different architectural ideas for shipping faster, better and more reliably.

A lot of these ideas focused on how to use the backend more efficiently to serve the frontend better. And as it happened, with SPAs we developers gradually started shipping a truckload of JavaScript to the browser.
For sure, it made websites more interactive, but on the other hand, users were left watching sites load very slowly under all that JS.

Think of a webpage as a canvas. HTML forms the borders, CSS adds some colors and shapes, and JS is the artist that brings in the details.

In an SPA-heavy world, we started handing over too much of this painting work to JS. Refer to the image below:

The browser may have already received the first byte and even painted something on the screen (First Contentful Paint), but there's no actual use of it yet.
The main content arrives late (Largest Contentful Paint, LCP), clicks still feel unresponsive because the JS thread is busy (First Input Delay), and the layout keeps jumping as scripts keep loading (Cumulative Layout Shift).

So while the page looks like it’s loading, the user experience is stuck in limbo...Is it happening? Is it useful? Is it usable?
This gap between “something is visible” and “I can actually interact” became the biggest pain of SPAs, and it’s precisely this problem that pushed the ecosystem to rethink rendering strategies beyond shipping a truckload of JavaScript to the client.
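These milestones are exactly the metrics you would measure in the field. As a minimal sketch, assuming Google's web-vitals npm package is installed in the app, logging them from the client looks like this:

```ts
// Log the loading metrics discussed above using the `web-vitals` package
// (an assumption: install it with `npm i web-vitals`).
import { onFCP, onLCP, onCLS, onINP } from 'web-vitals';

// Each callback fires once the browser has a stable value for that metric.
onFCP((metric) => console.log('FCP (ms):', metric.value));    // something painted
onLCP((metric) => console.log('LCP (ms):', metric.value));    // main content visible
onCLS((metric) => console.log('CLS (score):', metric.value)); // layout jumps
onINP((metric) => console.log('INP (ms):', metric.value));    // responsiveness; newer versions report INP in place of FID
```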

In this story, we discuss how the future asked us to go back to the server era, and the different rendering strategies that made the web faster and brought in the "next" innovative architectural pattern, along with their advantages, disadvantages and suitable use cases.

1) CSR - Client Side Rendering


This is the baseline rendering idea where the client does almost all of the heavy lifting: it fetches data from the server and renders the UI at runtime. Rendering work is minimal at build time, though bundling and optimization may still be complex, since an entire app bundle is produced in the build stage.
Here the user experience is powerful and gives rich interactivity after load, but the initial experience is often poor:

  • blank screen
  • loading spinners
  • JS parse + execution delays

This is the classic slow FCP / LCP problem.

This type of rendering is suitable for highly interactive apps where SEO and a fast first render matter less, e.g. internal dashboards and other SPAs.
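To make the trade-off concrete, here is a minimal CSR sketch: a React component that shows a spinner first and only fetches its data in the browser. The /api/products endpoint and the Product shape are made-up placeholders.

```tsx
// Minimal CSR sketch: nothing useful is on screen until the JS bundle has
// downloaded, parsed and executed, and the client-side fetch has returned.
import { useEffect, useState } from 'react';

type Product = { id: string; name: string };

export function ProductList() {
  const [products, setProducts] = useState<Product[] | null>(null);

  useEffect(() => {
    // Runs only after the bundle is loaded: the classic slow FCP/LCP path.
    fetch('/api/products')
      .then((res) => res.json())
      .then(setProducts);
  }, []);

  if (!products) return <p>Loading…</p>; // blank screen / spinner phase

  return (
    <ul>
      {products.map((p) => (
        <li key={p.id}>{p.name}</li>
      ))}
    </ul>
  );
}
```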

2) SSG - Static Site Generation


When speed is in question, frontend teams needed something else. That's where pre-rendering came to the rescue.
Pre-rendering means building the HTML ahead of time and serving it from somewhere like a CDN. Instead of fetching data per request on the server, data is fetched at build time. So static content can be served instantly, giving an ultra-low TTFB (Time to First Byte).
The build process can be long here, since pages are fully rendered at build time and the server just serves this pre-built HTML. The client ships small to moderate JS for interactivity, and some hydration may be needed for interactive widgets.
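As a rough sketch of what this looks like in the Next.js Pages Router, getStaticProps runs once at build time and the resulting HTML is then served as-is from the CDN; the CMS URL here is just a placeholder.

```tsx
// Minimal SSG sketch: data is fetched at build time, not per request.
import type { GetStaticProps } from 'next';

type Props = { title: string; body: string };

export const getStaticProps: GetStaticProps<Props> = async () => {
  // Runs at build time only; the placeholder URL stands in for your content source.
  const post = await fetch('https://cms.example.com/posts/hello-world').then((res) => res.json());

  return { props: { title: post.title, body: post.body } }; // baked into static HTML
};

export default function Post({ title, body }: Props) {
  return (
    <article>
      <h1>{title}</h1>
      <p>{body}</p>
    </article>
  );
}
```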

This type of rendering strategy works best for sites that need good SEO, e.g. docs, marketing pages, blogs etc.

On the flip side, if content on the site changes frequently, SSG can be a problem, since the content it serves might be stale until the next build.

3) SSR - Server Side Rendering

This is an improvement on the SSG strategy where content freshness matters, at some cost to speed and scale.
Here, instead of building entire pages at the build stage, only the app bundles are built.
On every request from the client, pages are rendered on the server and served fresh. So there is always fresh data, plus better personalisation and SEO for dynamic content. The client runs the interactive parts and hydrates the server HTML.
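A minimal SSR sketch, again using the Next.js Pages Router: getServerSideProps runs on every request, so the HTML can be fresh and personalised. The cookie name is an illustrative assumption.

```tsx
// Minimal SSR sketch: the page is rendered on the server for every request.
import type { GetServerSideProps } from 'next';

type Props = { greeting: string };

export const getServerSideProps: GetServerSideProps<Props> = async ({ req }) => {
  // Per-request work: read the (assumed) session cookie and personalise the page.
  const user = req.cookies['session-user'] ?? 'guest';
  return { props: { greeting: `Welcome back, ${user}` } };
};

export default function Dashboard({ greeting }: Props) {
  return <h1>{greeting}</h1>; // arrives fully rendered, then hydrates on the client
}
```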

Highly dynamic/personalised pages and real-time dashboards that must be up-to-the-second on every request (Don't Confuse: SSR only helps with initial HTML, not real-time updates) use this kind of rendering.

Since the server renders and sends the page as a response to each client request, there is higher per-request compute and latency than with cached static HTML.
It is also harder to scale without caching strategies, given the heavy load on the server.

4) ISR - Incremental Static Regeneration

This is a rendering strategy pioneered by the framework Next.js. It was invented to get the best of SSG and SSR, i.e. serve static cached pages from a CDN for speed, but also allow selected pages to be regenerated (on demand or after a revalidate period) without a full rebuild.

Let's understand why ISR came into the picture. Before:

1) SSG was fast but couldn't update without a fresh rebuild of the entire site.
2) SSR was fresh, but slower and more expensive.

ISR brought in both:

  • Static performance, serving cached pages instantly
  • Dynamic freshness, regenerating only pages that need updating
  • Incremental updates, so build time stays low even for large sites

To see ISR in action, which is very important for the discussion that follows, please follow this video, which walks through a very clear implementation of the concept.

To implement ISR, there needs to be a specific key called revalidate. It holds a value, in seconds, that tells the server when the page is due for regeneration.
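As a minimal sketch in the Next.js App Router, a route segment can export revalidate; any request arriving after that window triggers a background regeneration. The API URL is a placeholder.

```tsx
// Minimal ISR sketch (Next.js App Router): statically cached, regenerated
// in the background once the 60-second window has passed.
export const revalidate = 60; // seconds before the cached page counts as stale

export default async function PricesPage() {
  // Placeholder endpoint: fetched at build/regeneration time, not per request.
  const prices: { symbol: string; value: number }[] = await fetch(
    'https://api.example.com/prices'
  ).then((res) => res.json());

  return (
    <ul>
      {prices.map((p) => (
        <li key={p.symbol}>
          {p.symbol}: {p.value}
        </li>
      ))}
    </ul>
  );
}
```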

But there is a small twist here, which might make you question how this is a good mechanism for keeping the site fresh.

For example, take revalidate: 60 and let's go through the timeline.

Page generated at t = 0

  • Cached globally

t = 1 → 60

  • All requests get cached HTML
  • No regeneration allowed

t = 61 (first request after expiry)

  • Old HTML is served instantly
  • Background regeneration starts
  • User never waits

t = 62+

  • Regeneration finishes
  • Cache is replaced
  • All future users get fresh content

Now you might wonder: at t=61, when a request is sent, why is the old HTML served instantly rather than the fresh one?

This is a deliberate design decision. Some reasons:

1) Speed > Freshness - Users prefer instant pages over perfectly fresh but slow ones.

Users care far more about:
“Did the page load instantly?”
than
“Is this data 10–60 seconds old?”

2) Only ONE request ever sees stale-after-expiry

  • The first request after expiry triggers regeneration
  • Everyone after that gets fresh content
  • There is no cascade of stale responses

This is called request collapsing

3) No blocking - Regeneration never delays a response

4) Most users never refresh immediately, and by the time they do, the cache is already updated

There is also the option of on-demand revalidation, using webhooks from the database or CMS when content is updated. With this, no user sees any stale content and the site still remains static-fast.
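A minimal sketch of such a webhook in the Next.js App Router: the CMS or database calls this route whenever content changes, and only the affected path is regenerated. The secret, request body shape and paths are assumptions for illustration.

```tsx
// app/api/revalidate/route.ts (assumed location): on-demand revalidation endpoint.
import { revalidatePath } from 'next/cache';
import { NextResponse } from 'next/server';

export async function POST(request: Request) {
  const { secret, path } = await request.json();

  // Reject calls that don't carry the shared secret (assumed env variable).
  if (secret !== process.env.REVALIDATE_SECRET) {
    return NextResponse.json({ message: 'Invalid secret' }, { status: 401 });
  }

  revalidatePath(path); // e.g. '/blog/my-post': purge and rebuild just this page
  return NextResponse.json({ revalidated: true });
}
```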

This type of rendering strategy is helpful where the content is mostly static but updates periodically, good SEO is needed and there are a lot of pages, e.g. blogs, documentation, product detail pages, category pages etc.

Do NOT use ISR for:

  • Real-time data (stocks, live scores)
  • User-specific dashboards
  • Financially critical per-request accuracy

Basically, wherever the need is real-time rendering, user-specific rendering, background crawling of all pages and so on, ISR should not be used.

5) RSC - React Server Components

Let's run back the clock a bit to see how servers and clients interacted to show sites on the browser.

In CSR, for applications written in React, when a request goes from the client to the server, the server sends a minimal HTML shell that references the JS bundle, and the browser then downloads this bundle.
The shell then makes API calls to the backend, which may query the database, and only at this point does the user start to see something on the screen, even if it is not yet interactive.
This often results in poor or unreliable SEO due to delayed content visibility.

To solve this issue of a very late first paint, as well as SEO, frameworks like Next.js popularised SSR for React apps. Here, the rendering shell along with its queries to the DB runs on the server, which returns a fully rendered HTML document (with CSS links), allowing content to appear immediately and thus reducing the first-paint time by a good margin.
Then the JS bundle starts to download in the background on the client and a process called hydration occurs, where event listeners and other interactive behaviour get attached to the static page, making it interactive.
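For a rough idea of what that hydration step looks like on the client, here is a minimal sketch, assuming App is the same root component the server rendered into a #root element:

```tsx
// Minimal hydration sketch: attach React (event listeners, state) to the
// HTML the server already sent, instead of rendering it again from scratch.
import { hydrateRoot } from 'react-dom/client';
import { App } from './App'; // assumed root component

hydrateRoot(document.getElementById('root')!, <App />);
```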

But even here, if you noticed, there is a problem: the client still has to download a big JS bundle, which slows down the time to an interactive screen.

Then comes the eureka moment: server components!

Think of it like this:
if you have an application with 20 components, and 15 of them just show an image, some text or, in general, non-interactive content, why do you need to render them on the client and fetch their data from the server?
You can render them on the server itself and leave them out of the JS bundle sent to the client, which in turn reduces client-side JavaScript, memory usage and hydration cost.
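Here is a sketch of that split in the Next.js App Router, where components are server components by default and only the interactive widget opts into the client bundle; getPostFromDb and the file paths are hypothetical.

```tsx
// app/post/page.tsx: a Server Component. It can read data directly and
// contributes no JS to the client bundle.
import { LikeButton } from './LikeButton';
import { getPostFromDb } from './data'; // hypothetical data helper

export default async function PostPage() {
  const post = await getPostFromDb('hello-world'); // runs only on the server
  return (
    <article>
      <h1>{post.title}</h1>
      <p>{post.body}</p>
      <LikeButton /> {/* the only part that needs client-side JS */}
    </article>
  );
}
```

```tsx
// app/post/LikeButton.tsx: a Client Component, shipped and hydrated in the browser.
'use client';
import { useState } from 'react';

export function LikeButton() {
  const [likes, setLikes] = useState(0);
  return <button onClick={() => setLikes(likes + 1)}>Likes: {likes}</button>;
}
```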

You might think, "but this is just SSR again, right?". Well, not exactly. This is where RSC differs from SSR.

What SSR sends:

  • HTML
  • Then ships JS
  • Then hydrates

What RSC sends:

  • A serialized React component tree
  • Often called the “React Flight” payload

This payload is:

  • Not HTML
  • Not JS source code
  • A structured description of the UI

The client:

  • Reconstructs the React tree
  • Inserts it into the DOM
  • Without executing server component JS

So from your 20 components, only the 5 interactive ones go through that client process: they are shipped, hydrated and executed on the client, reducing your JS bundle size significantly!

RSC is designed to work with React Suspense, allowing the server to stream chunks of UI and render the page progressively.
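As a small sketch of that, assuming an App Router page with a slow, illustrative SlowComments server component, the static shell streams first and the comments arrive as a later chunk:

```tsx
// Minimal RSC + Suspense streaming sketch: the heading renders immediately,
// the comments stream in once their (placeholder) data fetch resolves.
import { Suspense } from 'react';

async function SlowComments() {
  const comments: { id: string; text: string }[] = await fetch(
    'https://api.example.com/comments'
  ).then((res) => res.json());

  return (
    <ul>
      {comments.map((c) => (
        <li key={c.id}>{c.text}</li>
      ))}
    </ul>
  );
}

export default function Page() {
  return (
    <article>
      <h1>Post title</h1>
      {/* The fallback shows instantly; the list is streamed when ready */}
      <Suspense fallback={<p>Loading comments…</p>}>
        <SlowComments />
      </Suspense>
    </article>
  );
}
```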

All of this needs a framework, because it requires:
1) A bundler that understands server/client boundaries
2) A server runtime
3) Streaming support
4) Routing integration

And that's why the Next.js 13 App Router became the first mainstream RSC implementation; RSC is not usable in plain CRA/Vite alone.

The evolution from CSR to SSG, SSR, ISR and finally React Server Components shows that frontend architecture has never been about choosing one “best” rendering strategy, but about placing work where it makes the most sense.

As we pushed more logic to the client, performance suffered; as we brought it back to the server, the web became faster and more usable again.
Modern frameworks now let us mix these strategies thoughtfully—balancing speed, freshness, interactivity and scale—rather than overloading the browser with work it was never meant to do. The future of the web isn’t client-first or server-first, but responsibility-first.
