Ben Halpern

The Fat Client - Thin Client Debate

The debate over the merits of a "fat client" oriented architecture vs. a "thin client" architecture has been one of the great canonical programming arguments. A fat (or thick, heavy, etc.) client is understood to be an architecture pattern in which the client provides rich functionality independently of the program's central server. A thin client paired with a fat server is an architecture style in which the server handles most of the intensive computation.
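
To make the split concrete, here is a minimal sketch of the same hypothetical feature built both ways. The names (`Row`, `thinClientResponse`, `fatClientRender`, and so on) are illustrative inventions of mine, not anything from a real codebase: in the thin-client version the server sorts and renders the data, while in the fat-client version the server hands over raw JSON and the client device does the work.

```typescript
// One feature, two architectures. The work (sorting + rendering)
// moves between server and client depending on the style.

interface Row { label: string; value: number; }

const rows: Row[] = [
  { label: "a", value: 3 },
  { label: "b", value: 1 },
  { label: "c", value: 2 },
];

// Thin client / fat server: the server does the computation and the
// rendering; the client only displays the finished HTML string.
function thinClientResponse(data: Row[]): string {
  const sorted = [...data].sort((x, y) => y.value - x.value); // server-side work
  return `<ul>${sorted.map(r => `<li>${r.label}: ${r.value}</li>`).join("")}</ul>`;
}

// Fat client / thin server: the server just serializes raw data...
function fatServerResponse(data: Row[]): string {
  return JSON.stringify(data);
}

// ...and the client device sorts and renders it itself.
function fatClientRender(json: string): string {
  const data: Row[] = JSON.parse(json);
  const sorted = [...data].sort((x, y) => y.value - x.value); // client-side work
  return `<ul>${sorted.map(r => `<li>${r.label}: ${r.value}</li>`).join("")}</ul>`;
}

console.log(thinClientResponse(rows));                  // server did everything
console.log(fatClientRender(fatServerResponse(rows)));  // client did the work
```

Both produce the same output; what changes is which machine pays the computational cost, which is exactly the tension the rest of this debate turns on.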

Each individual architecture decision should be made with the specific problem in mind, not based on the greater trends, but as an industry, trends have always emerged. At any moment in history, it could be perceived that “the future of programming” lay in one of these styles or the other. Many have argued about what the future will be like, only to be sideswiped by the emergence of unforeseen technology innovations. New social circumstances and hardware evolutions have meant a tug of war between the two styles, with one becoming dominant for a period only to have the trend swing back in the other direction. These changes in hardware, along with consumer expectations, business needs, networking advancements, and a bevy of other factors, have driven this fundamental architectural ebb and flow.

Despite changes in the technology being referenced, the principles of the architectural debate have remained relevant. Though the debate entirely predates most of the technologies and interfaces we use today to serve computer programs, the ideas have remained. What was once an argument over the best ways to display text on a number of computer terminals has evolved into a similar debate over different ways to download and deliver websites and applications to desktop and mobile devices. It will continue to change as more connected devices emerge, virtual and augmented reality mature, and computer intelligence allows for more complex message-based applications, not to mention changes that will occur on account of inventions that have yet to present themselves on the horizon.

Advantages of a fat client/skinny server

  • Lower server requirements. A thick client architecture does not demand as powerful a server as a thin client architecture does, since the individual client devices handle the brunt of the work.
  • Working offline. A constant connection is not required; a thick client can store data locally and connect to the server only when network conditions permit (see the sketch after this list).
  • Better multimedia performance. Thick clients have advantages in media-rich applications that would be bandwidth-intensive if served entirely from the server. Thick clients are well suited for gaming, for example.
  • More flexibility. On some operating systems, software products are designed for personal computers with their own local resources, and running such software in a thin client environment can be difficult.
  • Using existing infrastructure. Because many people already have fast desktop and mobile computing devices, running fat clients often requires no extra computing resources.
  • Higher server capacity. The more work that is carried out by the client, the less the server needs to do, increasing the number of users each server can support. This is of tremendous importance for some organizations' scaling efforts.
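
As a concrete illustration of the offline point above, here is a minimal sketch of a thick client queuing work locally and syncing only when a connection is available. The `OfflineQueue` class and the `sendToServer` callback are hypothetical names of my own; a real client would also persist the queue to disk or local storage across restarts.

```typescript
// A thick client can keep working offline by recording actions locally
// and flushing them to the server when connectivity returns.

type Action = { kind: string; payload: unknown };

class OfflineQueue {
  private pending: Action[] = [];

  // Record work locally regardless of connectivity.
  record(action: Action): void {
    this.pending.push(action);
  }

  // When the network permits, flush everything to the server.
  async sync(online: boolean, sendToServer: (a: Action) => Promise<void>): Promise<void> {
    if (!online) return;              // stay local until connected
    while (this.pending.length > 0) {
      const next = this.pending[0];
      await sendToServer(next);       // hypothetical upload call
      this.pending.shift();           // drop only after a successful send
    }
  }
}

// Usage: work continues offline, then syncs in one batch.
const queue = new OfflineQueue();
queue.record({ kind: "edit", payload: { id: 1, text: "draft" } });
queue.record({ kind: "edit", payload: { id: 2, text: "more" } });
void queue.sync(true, async (a) => { console.log("uploaded", a.kind); });
```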

Advantages of a thin client/fat server

  • Long-term stability and reliability. There is less need to keep up with browser and device-specific technologies, whose environments are in much greater flux than server environments, which evolve on the maker's own schedule.
  • Less intensive front-end computing is needed, meaning more devices in more markets can run the program more easily.
  • Bugs are easier to track and fix, with cost savings on support for such issues.
  • Write less code. As more of the functionality exists in a central location, there is less work required to write and change the codebase.
  • Lower security risks. With less exposed surface area and a more stable and reliable infrastructure, there are fewer scenarios in which security flaws can be exploited.

Where are we now?

I believe that, in general, the programming industry has been in a thick client era recently. More powerful devices and advancements in native and client-side web technologies have made rich experiences the norm. The scaling benefits of client-side architectures have made them appealing to large organizations, and users with more powerful devices have come to expect rich experiences. Some organizations are taking advantage of the rich-client ecosystem to build great experiences. Others are needlessly forcing users to download massive amounts of code in order to provide simple information. News organizations, in my opinion, are the worst offenders in the category of needlessly heavy clients that hurt the user experience. I will point to Mic.com as an offender which, despite serving simple textual information, forces me to download unnecessarily heavy amounts of JavaScript and CSS, is often janky when scrolling, and has a home page that does not even render fully without JavaScript turned on.

Where are we headed?

A number of trends point to a return to thin clients. The “internet of things” and the re-emergence of messaging as a platform will motivate moving more of the computing to the server. As the biggest Silicon Valley companies fight hard to gain market share in emerging internet markets like India and parts of Africa, supporting a variety of devices with greatly varying specifications and computing power will push these companies to focus more of their energy on technologies that keep most of the computation on the server. However, in the more technically advanced markets, smarter, more powerful clients will pull technologists toward creating the richer interfaces that users have come to expect, along with gaming and media applications. Several of the Silicon Valley giants have their own solutions for addressing the news website bloat issue; Facebook's Instant Articles and the Google-led open Accelerated Mobile Pages protocol are the most notable. The merits of these different methods are debatable.

The history of computing, as I see it, has been fairly homogeneous, from the early academic and military uses of networked computing to the rise of the personal computer oligopoly, where the major players worked hard to mimic each other's most important features. But with the emergence of more and more computing devices and the fractured computing needs that come with a global market of varying technical privilege, I foresee this debate moving away from uniform waves and toward fractured best practices that vary greatly based on need. Because of the herd nature of humanity and the benefits of a certain amount of convention, there will always be trends, but the near future of computing will differ from its past because of this uneven globalization, and I see this argument becoming less dichotomous over time.
