DEV Community

Akshat Ramanathan
The Resolve

This article is a continuation of the first part.

Part 3 - The bridge

This is where the server came in. Over the last two decades, we have pioneered architectural systems that maintain a separation of concerns while preserving modularity.
Databases, although initially created to store data, proved to be effective tools for processing data efficiently. We developed systems to communicate with such databases in a consistent, isolated and durable manner (the ACID properties), giving us atomic transactional capabilities.
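To make the atomicity guarantee concrete, here is a minimal sketch (not a real database, just the contract ACID promises): a transfer either applies all of its writes or none of them. The account names and the in-memory `Map` are illustrative assumptions.

```typescript
// Illustrative sketch of atomicity: either every operation in the
// unit commits, or none do and the original state is untouched.
type Accounts = Map<string, number>;

function transfer(accounts: Accounts, from: string, to: string, amount: number): void {
  // Work on a copy so a failure leaves the original untouched.
  const draft = new Map(accounts);
  const fromBalance = draft.get(from) ?? 0;
  if (fromBalance < amount) {
    // "Rollback": the original map was never modified.
    throw new Error("insufficient funds");
  }
  draft.set(from, fromBalance - amount);
  draft.set(to, (draft.get(to) ?? 0) + amount);
  // "Commit": publish all changes at once.
  for (const [key, value] of draft) accounts.set(key, value);
}
```

A real database enforces this (plus consistency, isolation and durability) across concurrent clients and crashes, which the copy-then-commit trick above does not.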
Once we knew our data was secure, we moved on to the processes. Our business logic required processing that data to produce meaningful information for our use cases. Service-oriented architecture was born, letting us group semantically meaningful operations on data into concise, reusable entities. These entities were then managed by the application itself: they surrendered control of their own creation and were asked for as and when needed. Dependency Injection and Inversion of Control are core primitives in large-scale, enterprise-level application management.
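A minimal sketch of what that inversion looks like in practice (the interface and class names here are hypothetical, not from any particular framework): the service declares what it depends on, and the application supplies the implementation from outside.

```typescript
// The service depends on an abstraction, not a concrete class.
interface UserRepository {
  findName(id: number): string | undefined;
}

// One possible implementation, swappable without touching the service.
class InMemoryUserRepository implements UserRepository {
  private users = new Map<number, string>([[1, "Ada"]]);
  findName(id: number): string | undefined {
    return this.users.get(id);
  }
}

class GreetingService {
  // Inversion of Control: the repository is injected, not constructed here.
  constructor(private repo: UserRepository) {}
  greet(id: number): string {
    return `Hello, ${this.repo.findName(id) ?? "stranger"}!`;
  }
}

const service = new GreetingService(new InMemoryUserRepository());
// service.greet(1) → "Hello, Ada!"
```

A DI container (as in Spring or NestJS) automates that last wiring step, but the principle is the same constructor injection shown here.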
This also led to new patterns like MVC, popularized by Ruby on Rails, for effective application building. Convention over configuration was sought after in large-scale systems to ensure consistency and reliability.
Having such transactional services, which further solidified our atomic resiliency, we finally began our task: building the bridge.
Like other things that change with time, so did this bridge.
What began in the early days simply as templating engines that wrapped data from services into our interface (HTML) came to be called Multi Page Applications (MPAs). The drawback of this approach was the need for full template generation on every request triggered by user interactivity.
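A toy version of such a templating engine makes the cost visible: every single request re-runs this whole substitution to produce a fresh page. The placeholder syntax and page data are invented for illustration.

```typescript
// Toy server-side templating: each request renders the full page
// from a template plus data -- the MPA cost described above.
function renderTemplate(template: string, data: Record<string, string>): string {
  // Replace every {{key}} placeholder with the matching data value.
  return template.replace(/\{\{(\w+)\}\}/g, (_match, key: string) => data[key] ?? "");
}

const page = renderTemplate(
  "<h1>{{title}}</h1><p>Hello, {{user}}!</p>",
  { title: "Dashboard", user: "Ada" }
);
// page === "<h1>Dashboard</h1><p>Hello, Ada!</p>"
```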

Over time, the client grew stronger, and this led to an interesting paradigm shift.
Initial web interactivity was limited, with JS not yet standardized or powerful. Before those advancements, others took up the job of providing powerful interactions. Java applets, Microsoft Silverlight, and Adobe Flash were revolutionary, allowing complex animations and highly interactive experiences at the cost of non-web-native technology. Each injected custom, non-standard elements into the DOM that were read and processed on the client only if it supported that technology locally in the form of a plugin. That was the major drawback of such experiences; still, enterprise-level requirements for niche use cases were now being met effectively with their help. Due to limited support on mobile platforms and other non-standard clients, however, these technologies met their demise over the last decade.
The good news, however, was the growth of native web technologies. With HTML5, CSS3 and ES3/5/6 (the ECMAScript standard) for JS, many complex features could now be implemented natively. The available runtimes - SpiderMonkey (Firefox), JavaScriptCore (Safari) and V8 (Chrome) - adhered to the ECMAScript standards and provided native support for modern features, including HTML5 and CSS3 compatibility. Complex use cases could now be built with native JS web interactions (PEMPA - Progressively Enhanced Multi Page Apps, as coined by Kent C. Dodds), which built on the MPA pattern with a small amount of template generation/manipulation on the client. The drawback of this approach was the need to manage two bundles, one for the client and one for the server, with a lot of duplicated code on both ends for similar templating tasks. Around this time we also gained better dev tools and a growing ecosystem thanks to the Node.js runtime, which could run JS on the server without a browser, enabling native development of I/O-heavy tasks like file-system access via asynchronous processing.

This later led to the modern revolution of SPAs - Single Page Apps.
Over the last decade, SPAs and the growing mobile development community created the need for a client-agnostic format for data transmission from server to client at the bridge layer. SOAP, which previously filled this need, was bloated, not performant, and inflexible. This led to the REST API revolution, supported by AJAX, where the server's responsibility was now to handle API requests and return JSON responses of the required data instead of HTML templates. This greatly simplified server development, replacing complex flows (like Post-Redirect-Get on form submissions) with direct data responses. The templating logic for DOM manipulation, however, moved to the client side, which gave better UX at the cost of heavier client computation requirements. The other main benefit was that server data was now client-agnostic and could be consumed by web apps and mobile apps alike, greatly increasing shareability. Authentication also matured: with the OAuth and OIDC standards, PKCE became easy to implement, and session management got easier on the server thanks to JWTs and other modern solutions.
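The shift is easiest to see in a handler: instead of rendering an HTML template, the server returns a status code and a JSON body that any client can consume. This is a framework-free sketch; the `User` shape and handler name are invented for illustration.

```typescript
// Sketch of the REST shift: the server returns client-agnostic JSON
// instead of an HTML page. A web or mobile app renders it however it likes.
type User = { id: number; name: string };

const users: User[] = [{ id: 1, name: "Ada" }];

function handleGetUser(id: number): { status: number; body: string } {
  const user = users.find(u => u.id === id);
  return user
    ? { status: 200, body: JSON.stringify(user) }
    : { status: 404, body: JSON.stringify({ error: "not found" }) };
}
```

In a real server this function would sit behind a route like `GET /users/:id`, with the framework handling parsing and serialization around it.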
The drawbacks, however, were mainly JS-oriented. Applications were no longer progressively enhanced, meaning the experience in low-bandwidth areas was rough, and client-heavy paradigms grew (state management, client-server waterfalls, GraphQL, etc.), adding complexity overhead.

Part 4 - The modern era

So, back to the drawing board. Client-heavy SPAs are super useful for highly interactive, real-time applications, but they come with their own set of challenges. With a client-server network boundary, the "native" experience on the web will almost always be affected by network latency. There is a workaround, though. Most sites (as opposed to applications) are static in nature - marketing, branding, and so on - where the use case is not that of an interactive application. This is also why content-heavy CMSes like Wix or WordPress exist, letting non-developers build such sites with low-code tools. A similar alternative is the Jamstack. Tools like Eleventy, Hugo or Jekyll let us render HTML templates at build time and serve them as static files, with minimal API requests for interactivity. The statically built JS takes care of the UX and of API communication with the backend server. Jamstack is thus an architectural paradigm that integrates well with modern frontend tools and frameworks like React and Vue, as well as the build tools that power them. This includes SSGs (static site generators), a fundamental part of the Jamstack ecosystem: they generate static HTML+JS bundles ahead of time, giving great UX without requiring us to maintain a server for HTML generation. A backend is still required, however, to expose APIs for data interchange so the application can display accurate information. The drawback remains the same as for SPAs, since in many cases the SSG framework builds and runs the site as an SPA to maintain a smooth, app-like UX.
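The build-time half of that idea can be sketched in a few lines: render every page once, up front, into static HTML that a CDN can serve as-is. The page list and markup here are hypothetical stand-ins for what a real SSG derives from your content directory.

```typescript
// Toy build step in the Jamstack spirit: render each page once at
// build time into a static HTML string, keyed by output filename.
const pages = [
  { slug: "index", title: "Home" },
  { slug: "about", title: "About" },
];

function buildSite(): Map<string, string> {
  const out = new Map<string, string>();
  for (const page of pages) {
    out.set(`${page.slug}.html`, `<h1>${page.title}</h1>`);
  }
  // A real SSG would write these to disk; a CDN then serves them
  // with no per-request HTML generation at all.
  return out;
}
```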

We finally reached a point where we wanted the positives of both MPAs and SPAs while mitigating their drawbacks. This basically meant a progressively enhanced multi-page app rendered by the server (at build time or run time) that still maintained a smooth, SPA-like UX on the client. With the advent of client-side frameworks and libraries, this became possible through the concept of Hydration. Hydration allows an HTML template generated on the server to be rejuvenated and linked up on the client (hooking up the event listeners the framework needs) after it is received. This unlocked a new paradigm of rendering strategies and new patterns of data flow. We could now mitigate the network waterfalls caused by the fetch-on-render pattern (a component renders first, then requests its data while showing loading spinners) by switching to render-as-you-fetch using loaders. A page request fetches its data on the server, which is passed to the client for the initial render and later hydrated, avoiding waterfalls. We can also stream pending data (async fetches) as part of the initial render, showing loading spinners, with the loaded data streaming down later and being handled by the client once ready. This was the initial Server-Side Rendering (SSR) paradigm. Its main requirement was Node.js or a similar runtime (Bun, Deno, etc.) on the server to handle async data fetching as well as framework-based rendering. These SPA-influenced isomorphic apps had a few drawbacks stemming from their isomorphic nature: we now had to write code with the network boundary in mind, as loaders and actions only ran server-side, and clientLoaders/clientActions were needed for additional client-side processing (akin to the Remix / Next.js Pages Router way of doing things).
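The waterfall difference between the two patterns can be sketched with simulated requests (the delays, data and function names are invented): fetch-on-render awaits one request before starting the next, while a loader kicks both off up front.

```typescript
// Simulated I/O: each "request" takes some time to resolve.
const delay = (ms: number) => new Promise<void>(r => setTimeout(r, ms));

async function fetchUser(): Promise<string> {
  await delay(50);
  return "Ada";
}
async function fetchPosts(): Promise<string[]> {
  await delay(50);
  return ["Hello world"];
}

// Fetch-on-render: the second request only starts after the first
// finishes -- a client-server waterfall (~100ms total here).
async function fetchOnRender() {
  const user = await fetchUser();
  const posts = await fetchPosts();
  return { user, posts };
}

// Render-as-you-fetch: a route loader starts both requests up front
// and awaits them together (~50ms total here).
async function loader() {
  const [user, posts] = await Promise.all([fetchUser(), fetchPosts()]);
  return { user, posts };
}
```

Both return the same data; the loader version just removes the sequential dependency, which is exactly what route loaders on the server buy us.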
This also called for multi-environment builds, where bundlers had to understand the code and split it to cleanly separate the server and client bundles. Hybrid rendering strategies also became possible on a per-route basis, where static content could be pregenerated (SSG) while other pages stayed dynamic (SSR).
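Per-route hybrid rendering boils down to a small decision table like the following sketch (the routes and the default are hypothetical; real frameworks express this via per-page config or exports):

```typescript
// Hybrid rendering decided per route: static pages are prerendered
// at build time (SSG); server pages are rendered per request (SSR).
type RenderMode = "static" | "server";

const routeModes: Record<string, RenderMode> = {
  "/": "static",          // marketing page, same for everyone
  "/pricing": "static",
  "/dashboard": "server", // personalized, must render per request
};

function renderModeFor(path: string): RenderMode {
  // Unknown routes fall back to dynamic rendering, a conservative default.
  return routeModes[path] ?? "server";
}
```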

Part 5 - The Utopia

Due to the increase in complexity and cognitive load of application development, there was a need to simplify the process. Although SSR solves 99.99% of problems, it does so with an added layer of paradigms and patterns. Astro (a content-focused framework that began as an SSG) led to the new approach of interactive islands. Rather than having the server render the entire app and the client hydrate it all over again, the idea was to maximize statically generated content with small islands of interactivity. This shift led to a rendering-agnostic architecture that let developers combine multiple frontend solutions like React, Vue, Svelte, etc. on the same page at the same time while maintaining a slim client bundle. The minimalistic, static-heavy approach gave the benefits of an SPA in the interactive areas while preserving an SSG-driven MPA feel.
This pattern migrated through the industry in many ways. React, a widely used lib-framework, started aligning its ideology with this approach. What began as a class-based, component-driven pattern moved to a functional + hooks approach that closely aligned with the declarative nature of the web while keeping effective escape hatches for imperative logic via side-effect management. This has now evolved into a server-generated paradigm hydrated on the client using a custom representation called the RSC payload. The Server Components model closely aligns with Astro's islands architecture, weaving client and server components together via this RSC representation, which is hydrated by the client. Powered by React Suspense, this also enables deferring server-heavy tasks by displaying loading skeletons and streaming in the responses, which get slotted in later. Similar to the Suspense paradigm, Astro implemented its Server Islands architecture this year.
I believe that with more advancements in compiler-based approaches, a lot of the heavy lifting done by bundlers and dev tools like Vite can be taken over, giving us a smoother DX. Frameworks like Svelte and SvelteKit, recently with async capabilities, allow us to further blur the network boundary and advance the way we build full-stack applications.

So that’s been my rant. If you’ve reached this far, you’re awesome! Thank you for taking the time to read my mind barf. I hope you look forward to more content like this - from Vue and React rendering to Suspense inner workings to dev tools deep dives. I’m also into Java Spring and PHP Laravel backend stuff a bit. Hope to catch you in the next one!
