
re: Stop Using React

re: I think this is not the problem with React (well, apart from point 5), but with modern webdev in general. React is a good tool for single-page appl...
 

Why not use something like Next.js? That'll create a server-side rendered page using React. No SPA here.

 

SSR introduces component rehydration overhead on the client (compared to plain server rendering) which increases Time to Interactive (TTI).

Hydration

Even static websites generated with Gatsby rehydrate (by default).

Doesn't matter. With Next or Nuxt you get a fast first render with SSR, and then, if you use their Link component, it prefetches the linked pages in the background and caches them in the client-side router, so that when you click something else navigation is instant. And anyway, you can't get away from the client-server model and network lag, so why is everyone here expecting INSTANT f'ing load times? That's just ludicrous!
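The prefetch-and-cache behaviour described above can be sketched in a few lines of plain JavaScript. This is a hypothetical illustration, not Next's or Nuxt's actual API; `fetchPage` stands in for whatever fetches a page payload:

```javascript
// Sketch of a client-side router cache: Link components call prefetch()
// when a link scrolls into view, so by the time the user clicks,
// navigate() usually resolves from cache and feels instant.
const pageCache = new Map();

// fetchPage is injected so the sketch has no network dependency.
function prefetch(url, fetchPage) {
  if (!pageCache.has(url)) {
    pageCache.set(url, fetchPage(url)); // cache the promise, not the result
  }
  return pageCache.get(url);
}

function navigate(url, fetchPage) {
  // On click: reuse the already-resolved (or in-flight) prefetch.
  return prefetch(url, fetchPage);
}
```

Caching the promise rather than the resolved value means a click during an in-flight prefetch still reuses the same request instead of issuing a second one.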

so why is everyone here expecting INSTANT f'ing load times? That's just ludicrous!

Because the gold standard is set by the speed of static pages.

Starting with The Cost of JavaScript 2018, Addy Osmani identified "hydration" as a potential cause for

An experience that can’t deliver on this will frustrate your users.

in reference to

For a page to be interactive, it must be capable of responding quickly to user input. A small JavaScript payload can ensure this happens fast.

2013: Progressive Enhancement: Zed’s Dead, Baby

Despite progressive enhancement repeatedly being declared dead, in 2016 Jake Archibald stated

So you should think of the network as a piece of progressive enhancement

This illustrates that PWAs got their name from progressive enhancement:

  1. Content is the foundation
  2. Markup is an enhancement
  3. Visual Design is an enhancement
  4. Interaction is an enhancement
  5. PLUS: The Network is an enhancement

Jeremy Keith outlines that hydration is no substitute for progressive enhancement.

The goal is to keep the amount of JavaScript that is critical for interactivity to an absolute minimum (and load anything else separately, preferably on demand, or cache it for later use).
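That guideline can be sketched with a tiny on-demand loader. The names here are hypothetical; in a real app `loader` would typically be a dynamic `import()`:

```javascript
// Sketch: keep the critical path minimal and defer everything else.
// loadOnce wraps a loader so repeated calls reuse the same promise,
// i.e. the non-critical code is fetched at most once, on first demand.
function loadOnce(loader) {
  let cached = null;
  return function () {
    if (cached === null) cached = loader(); // first call triggers the load
    return cached;                          // later calls reuse it
  };
}

// Hypothetical browser usage:
// const loadChart = loadOnce(() => import("./chart.js"));
// button.addEventListener("click", () => loadChart().then((m) => m.draw()));
```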

Matching the performance of static pages with dynamic pages is unrealistic, but web site/app owners have a wide range of approaches available to them for improving performance on the server side. But they have no control over:

  • the quality of connection to the client device
  • the compute power on the client device

In light of these limitations the prudent defensive strategy is to deliver payloads that are small (the minimum for the necessary capabilities) and require relatively little compute.

Modern web development within the current JavaScript ecosystem doesn't follow these kinds of practices.

The approach to move as much work away from client run time to site build time (e.g. Sapper/Svelte) is a step in the right direction.

... or not? To me, it looks like we're trying to build around JavaScript's shortcomings instead of focusing on the real problem: the inefficient runtime (and arguably the programming language as well). And then there's one "progressive" feature that almost no one talks about: JavaScript's backwards compatibility. It's a clunky language, with so many user agents, so many code paths and so many different optimizations that have to be built into so many different runtimes...

I don't think the current expectations match the state, power and limitations of the tools currently used to build web applications. Arguing about the right way to do this or that is ... almost pointless because it's fruitless, unless something is done about the language and the runtimes, which can't keep up with the pace of innovation and at this point are only being added to, continuously, making them heavier and slower.

Why should the platform dictate how much code I can ship? Why do I have to target sub-standard UI+UX just so my app can load in a small enough frigging time that it doesn't look like it's dead to my users? I shouldn't have to worry about this. The platform should take care of this for me. Stream my code... do something!

Sorry for ranting. I didn't mean to. Imagine I was ironic and sarcastic the entire time. :)

the inefficient runtime

At the core of the problem is that React doesn't play to the strengths of the platforms on the web.

Something like Sapper/Svelte does - though there still is room for improvement.

Why should the platform dictate how much code I can ship?

There is no one platform and there never will be - that is simply the nature of the web.

Tools don’t solve the web’s problems, they ARE the problem (2015):

there is no web platform, there’s an immensely varied collection of web platforms

Front end and back end (2015)

Back-enders work with only one environment.

Front-enders work with an unknown number of radically different environments.

Familiar approaches for the backend or desktop simply don't apply to a situation where you have no control over the capabilities of the client device and the quality of the connection to it.

Well, one of the things I wanted to say is that this is the issue with the web platform: it lacks proper standardization and leadership. OK, so it's a varied collection of web platforms... does it have to be!? Especially if they are all doing the same thing basically?

This article needs more traffic: Where Web UI Libraries are Heading

it lacks proper standardization and leadership.

Not really. Web standards define how browsers ought to behave on any platform, and in recent years Google and Mozilla have been driving the standards process forward (along with anyone else willing to donate their effort).

For the time being Mozilla's future role is in question, but "survival of the Web" is in Google's best interest, which is why they keep pushing Chromium/Chrome forward (and are snarky about Apple/Safari, especially iOS).
Granted, it's not an act of selflessness, but for the time being the side product of the browser as an "ad projector and analytics gatherer" still enables the open and wide reach of the web (e.g. IndieWeb). (It's been my observation that Google's browser people on average tend to be more pragmatic than the framework people.)

The post-jQuery era gave rise to opinionated frameworks which use a boatload of JavaScript to promote their own design-time vision (standard?) without too much regard for the (client) runtime consequences. That was OK in the days of fat-core desktops (the framework is restricted to a single thread) with solid, wired connections, but is a bit presumptuous with small-core smart devices over flaky data networks. JavaScript was largely designed to script the browser (Python's success is partially based on popular libraries being implemented in C/C++, and those don't have to squeeze through the network connection all of the time). The browser is the runtime, not the framework. Work should always be offloaded to the browser whenever possible: it's written in C/C++/Rust and the browser can offload some (non-JS) work to other threads.

Only recently have some frameworks started to align more with the way browsers work in order to get away with less JavaScript, i.e. ship exactly as much JavaScript as is needed to accomplish the task: no more, no less.

... does it have to be!?

Using the web is about its reach. The reason it has that reach is because it makes minimal assumptions about connection quality and client capability (single thread performance, available memory, screen size (responsive/mobile first design), etc.). Making minimal assumptions means anticipating a range of capabilities. Unpredictable connection quality requires that the web is resilient - and it's that particular flavour of fault tolerance that defines the architecture of browsers and how the web works.

Many people see UI frameworks as being at the core of "web apps". Frankly I think that Service Worker is much more important. Right now they are relegated to cache managers. However they can play a much larger role in making a site more "app-like". For a more involved example see:
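As a minimal, hypothetical sketch (not the involved example referenced above) of the "cache manager" role: a cache-first strategy is just a small async function. It is written here with an injectable cache and fetch function so it stays self-contained; the commented wiring at the bottom is the standard Service Worker API, while the helper name is mine:

```javascript
// Cache-first: serve from cache when possible, otherwise fetch and
// store the response for next time.
async function cacheFirst(cache, fetchFn, request) {
  const hit = await cache.match(request);
  if (hit) return hit;                 // cache hit: no network round trip
  const response = await fetchFn(request);
  await cache.put(request, response);  // remember it for later
  return response;
}

// In a real service worker (browser-only API):
// self.addEventListener("fetch", (event) => {
//   event.respondWith(
//     caches.open("v1").then((cache) => cacheFirst(cache, fetch, event.request))
//   );
// });
```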

Also current frameworks are still single/fat core centric. The UI framework belongs on the main thread but in order to leverage today's multi (small) core devices (e.g. Android) web apps need to use Web Workers more for non-UI work so that the UI can stay responsive. Then there is at least a chance that the web app's load can be distributed over multiple cores (The Free Lunch is Over (2004)).
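A minimal sketch of that division of labour: the worker wiring is the standard browser Worker API (shown in comments since it only runs in a browser), and `heavyWork` is a hypothetical stand-in for real CPU-bound work:

```javascript
// main.js (browser-only wiring, kept as comments):
// const worker = new Worker("worker.js");
// worker.postMessage({ items: bigArray });   // hand off CPU-heavy work
// worker.onmessage = (e) => render(e.data.result); // UI thread stays free

// worker.js: the heavy function itself is plain JavaScript; here it just
// sums squares as a stand-in for real number crunching.
function heavyWork(items) {
  return items.reduce((acc, n) => acc + n * n, 0);
}
// self.onmessage = (e) => self.postMessage({ result: heavyWork(e.data.items) });
```

The main thread only posts a message and renders the result, so the UI can keep responding to input while another core does the crunching.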

Especially if they are all doing the same thing basically?

They aren't.

Just look at Rendering on the Web - i.e. the range of rendering options. There is a continuous spectrum of architectures between a static web site and a web app. Web development covers a wide range of solutions. The issue is that people are always gravitating towards their golden hammer. They stick to the tools they are familiar with often applying them where they are less than ideal.

Thank you for your very informative answer and for shedding light on a complex topic.

Maybe I should reformulate my opinion. It's not that there's no standards or body to control the evolution of the web... I guess that's not what I meant. I know there are standards and the W3C, etc.

It's just that there's no clear solution to everyone's vision of what the web should be and should be able to do, and the sad part is that the only way the problem can be fixed is if the web can find the perfect solution in order to become this one and only universal platform everyone wants it to be.

Otherwise we will always continue to hear these complaints about the tools and technologies we use and how they suck, when in fact they are NOT the problem, but only solutions to the real problems!

Maybe the web needs re-architecting from the ground up?

I believe the main issue is with the way we write and run code on the web. For example, having just one programming language and one code runtime is a huge detriment to the web platform. Let's hope WASM and WASI can fix that. And CSS... can't we do better than that? CSS shouldn't be a separate language IMO.

It's just that there's no clear solution to everyone's vision of what the web should be and should be able to do

That is one of the problems.

There are lots of opinions of what people want the web to be for their own personal benefit. At the same time there seems to be very little willingness to accept what the web actually is and to understand why it got to be that way.

if the web can find the perfect solution

Perfect with reference to what?

If anything the web is an exercise of "the best we can do with the constraints we are currently operating under". That's what made it successful in the past and that is why it is still around after 31 years (1989).

one and only universal platform everyone wants it to be.

Universal doesn't imply one or even a unified platform. The web succeeded so far because it was willing to compromise in the face of constraints set by technology and physics.

the real problems!

Real solutions accommodate all existing constraints. Solutions that ignore real constraints create problems.

Maybe the web needs re-architecting from the ground up?

Let's say that it was possible to come up with "the perfect" alternative (which is unlikely) - how would it be adopted in the current climate?

The web browser was brought to smart/mobile devices because most customers want access to the web. If Apple removed Safari from iOS today, a lot of people would likely be upset, perhaps enough to switch to a platform with web support.

However, a new architecture would initially have very little content to generate consumer demand, and if there is no opportunity for platform holders to directly profit from it, it's in their interest not to support it.

The web established itself during a unique window of opportunity. Something like it would have a lot more difficulties today. If the web made itself irrelevant we'd likely be back to multiple disparate platforms and walled gardens. The problem is that with mobile the web has to operate in an environment that is even more unreliable and hostile than the one it established itself in. The influx of less expensive devices into the market means that the average device is becoming less performant - even as flagship devices are still becoming more performant.

The web could quickly become irrelevant if the majority of web sites (and apps) indulge in technologies that focus on serving flagship devices with fast and clean network connections while neglecting service/user experience for mid to low-spec devices with less than perfect connections. If the majority of users have to rely on native apps even for casual engagement the reach and therefore appeal of the web is lost. And having to go through the app store for an app download is already beyond "casual".

Also in the past there have been efforts to "improve the web":

  • Java Applets
  • ActiveX
  • Flash
  • Silverlight

HTML, CSS and JavaScript on the other hand are still around today.

I believe the main issue is with the way we write and run code on the web.

I think this has more to do with the expectations that are grounded in the experience of developing applications for desktop, device native and the back end which aren't constrained in the same way that developing for the web is. I think this is similar to the mindset that is addressed in Convenience over Correctness (2008) - developers wanting to treat a remote procedure call no different than a local function call - it's an unrealistic expectation given that there is so much more that can go wrong with a remote procedure call.

For example, having just one programming language and one code runtime is a huge detriment to the web platform.

Earlier you wanted standardization. Now that you don't like the standard, you want choice (i.e. variation which can lead to fragmentation)?

Let's hope WASM and WASI can fix that.

While WASM is an important development I think people are overestimating its impact - again because they hope that it will bring web development closer to "traditional software development" - but WASM doesn't change the constraints that the web has to operate under. The most likely languages for WASM development will be C/C++/Rust, i.e. systems languages because they only need a skimpy "runtime".

Anything higher level than that will require a significantly larger runtime for download - e.g. Blazor has the .NET CLR (or a variation thereof) albatross around its neck. And multi-language development could require multiple runtimes. I suppose each execution context (page, worker) would require its own copy of the runtime, further increasing memory requirements for a client device. I'm not saying that this won't have interesting niche applications but I don't see these type of solutions having the same kind of reach as lean HTML/CSS/JS solutions with less demanding runtime requirements.

And CSS... can't we do better than that? CSS shouldn't be a separate language IMO.

Why not?
HTML initially included visual design. But in the end it was decided that HTML's job was to structure content - not layout and styling. Windowing systems use programmatic layout managers. For browsers it was decided to go with a declarative language instead.

Granted experience with imperative programming doesn't prepare you for the demands of CSS. It was never meant to make sense to programmers but it seems to make sense to designers (You'd be better at css if you knew how it worked (2019), Don’t QWOP Your Way Through CSS (2017)).

And CSS can be managed on its own terms (ITCSS, BEMIT, CUBE CSS), it just requires a different approach than what most programmers are accustomed to.

Interesting arguments. I'll have to sit on them for a while. Thanks.

 

Because that means JavaScript on the backend.

NodeJS is amazing, why not?

NodeJS isn't actually all that great. It's certainly a step up from PHP/Ruby/Java/Python, but nowadays there are better options. Rust is like a million times better.

What makes Rust so much better?

Many reasons:

  • The safety guarantees are absolutely astonishing; this not only gives you confidence in the correctness of your work, but also enables rapid iteration where you can use the compiler as a pair programmer (for example, revealing everything affected by a breaking change).
  • From a performance / resource-utilization perspective it's actually capable of besting C/C++.
  • The use of zero-cost abstractions makes code very succinct and makes powerful behavior very accessible (for example, see rayon's parallel iterators).
  • The trait system allows for combining behaviors in a really succinct fashion, and implementing generically over traits is extremely powerful.
  • It's superbly expressive, and the build system and formatting just work.
  • It can be used just about anywhere, and it makes multithreaded concurrent programming easy.
  • It's fun to work with, and the language teaches you to be a better engineer.
  • The abstractions make things vastly more readable; once you understand the language it's fairly easy to read code and know what's going on. Rust doesn't have spaghetti code.
  • Rust is so freaking reliable that runtime issues are basically nonexistent.
  • Rust has libraries like Diesel that make it impossible to write invalid SQL queries and that guide you towards optimal design patterns.
  • GraphQL API development in Rust is way easier than in other languages thanks to the trait system (see async-graphql).
  • Rust enables APIs that cannot be implemented incorrectly by design.
  • Rust makes system-level programming accessible and removes the need for garbage collection.
  • Rust can already do 100% of the JS web APIs and can generate Node modules via wasm-bindgen.
  • Rust WASM doesn't need to be interpreted and hence runs instantly in a web environment, vastly reducing time to first render.
  • Rust is enabling a new generation of decentralized technologies....

I’m not really sure there is an end to this list. Rust is the most significant leap forward in language design ever. It is truly mind blowing.

Does not necessarily mean "JavaScript on the backend". It means you'll have NodeJS serving your pages, but it could be as lightweight as just accessing your APIs, pulling the data and serving it, in some cases not even at runtime.

Your more heavyweight backend (REST API, microservices, etc.) could still be developed as separate services, with Rust or anything else you like. If you are already looking at something like Rust for performance, you are probably already heading towards that kind of architecture anyway.

 

Well, you still have to use React, which is Facebook porting the PHP bad practices into JavaScript.

I don't like to say "X is better than Y" because it's usually very subjective, but let's put it this way: the only people I've ever seen like React are comparing it to jQuery in their heads, while the people I see using Vue/Nuxt often had severe JS fatigue and are taking pleasure in front-end work again.
