
Arjen de Ruiter for Sendcloud


Building for the (next) Web

On January 13, the second Sendcloud Engineering meetup took place. We noticed we have a lot of knowledge and experience internally that we would like to share with a broader audience. At the same time, we want to learn from others in our industry. That is why we started this meetup series.

Here's a summary of last week's meetup, which focused on (the workings of) the web.

In The next web, Hidde de Vries (@hdv on Twitter), a front-end web developer and accessibility specialist formerly at W3C/WAI and Mozilla, talked about how the web has developed and how it can evolve further still.

Hidde takes us back to the year 1989, when a proposal by Tim Berners-Lee at CERN is deemed "vague, but exciting...". It is, of course, the foundation of the World Wide Web. In 1990 the pillars that underpin the Web were formulated: HTML, URI (URL), and HTTP. It was decided that the Web should be free (of fees and permissions). Tim Berners-Lee later said that had the internet been proprietary, it would never have taken off like it did.

In 1994 the W3C was founded with the aim to "realize the full potential of the Web". But have we realized its full potential? The W3C is the authority on Web standards, which ensure that tools using the Web work in the same or a similar way: that they are interoperable.

Web 2.0 was coined in 1999 by a UX consultant to describe a web that was no longer "screenfuls of text and graphics" but "the ether of interaction, where everything is connected". Essentially, Web 2.0 was the same tech (HTTP), used in more places and in more exciting ways. It was further popularized in 2004 as the "Web as a platform". The new buzzwords became user-generated content, software as a service, folksonomy, open APIs, widgets... What did Tim Berners-Lee think of this "new" Web? "Web 2.0 is, of course, a piece of jargon, nobody even knows what it means."

The concepts evangelists believe make up Web3 (a term coined in 2014 by the inventor of a blockchain technology) are decentralization, transparency, blockchain, immutability, creators, and scarcity. But is Web3 really decentralized? A lot of "Web3" APIs, wallets, and exchanges are centralized entities and/or run on centralized services like Google or AWS. Is it transparent to folks who lack deep technical knowledge? Is it immutable when exchanges are often able to freeze assets or do "hard resets" when they are hacked?

Is it really going to protect creators when anyone can create NFTs, even of what's not theirs? Royalties are not defined in ERC-721. And when you buy an NFT, the unique hash is the only thing that's scarce...

But the question Hidde really urges us to ask ourselves is: is Web3 hyped by anyone who does not have a vested interest in it? Can Web3 benefit society?
Some of us just want to access information securely, accessibly and with minimal privacy impact.

Hidde sees plenty more exciting futures for the web: one with better styles, easier payments (like W3C's proposed recommendation), and easy and safe authentication. A web with more art and more personal sites. In any case, the next Web is not set in stone. There are many challenges and plans beyond those posed by the Web3 crowd.

In UX and DX in 2022, Rhian van Esch, Senior Frontend developer & architect at Sendcloud, explores how to achieve the best user experience without sacrificing developer experience.

What then makes a good user experience, and which factors can be affected by the development tools we choose? Google uses Web Vitals as "quality signals that are essential to delivering a great user experience on the web". Rhian focuses on two of these metrics: Largest Contentful Paint, which measures the loading performance of a web page, and First Input Delay, which measures how quickly the page responds to the user's first interaction.
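These metrics can also be measured in the field. As a minimal sketch (not part of the talk), Google's web-vitals npm package reports them via callbacks:

```typescript
// A minimal sketch, assuming the web-vitals package (v3) is installed.
import { onLCP, onFID } from 'web-vitals';

// Each callback fires once the metric is known for the current page load.
onLCP((metric) => {
  // e.g. send metric.value (in milliseconds) to your analytics endpoint
  console.log('Largest Contentful Paint:', metric.value);
});

onFID((metric) => {
  console.log('First Input Delay:', metric.value);
});
```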

If your page loads a large JavaScript bundle, the browser needs to download it, parse it, and execute it. If you've built a single-page application, it's common that you need the bundle - or possibly multiple bundles - to be downloaded, parsed and executed before anything appears on the page. This impacts your Largest Contentful Paint metric.

For the First Input Delay metric, JavaScript again plays a role: if your JavaScript takes a long time to execute, you can end up locking up the main thread of the browser while waiting for execution to complete. Users can click on things all they like, but nothing will happen.
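As a rough, hypothetical illustration: a long synchronous loop keeps the main thread busy, and one common mitigation is to split the work into chunks and yield back to the browser in between.

```typescript
// expensiveWork is a hypothetical stand-in for whatever your app computes.
function expensiveWork(item: number) { /* ... */ }

// Blocking version: clicks queue up until the whole loop finishes,
// which inflates First Input Delay.
function processAllItems(items: number[]) {
  for (const item of items) {
    expensiveWork(item);
  }
}

// Chunked version: yield to the main thread between chunks so that
// input events can be handled in between.
async function processAllItemsInChunks(items: number[], chunkSize = 100) {
  for (let i = 0; i < items.length; i += chunkSize) {
    items.slice(i, i + chunkSize).forEach(expensiveWork);
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
}
```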

A good frontend developer experience means being able to build a UI that is highly interactive, where data is loaded without having to navigate away from the current page, and where any navigation is extremely fast. If you think that sounds like a mobile application... well, you're not wrong.

What else do we want? Component-driven development and state management. Enter the JavaScript frameworks. React, Vue.js, and Angular allow you to build app-like experiences. Although these popular, well-supported frameworks allow us to work faster and more effectively, they come at the aforementioned cost.

If you're hell-bent on using JavaScript, modern frameworks encourage code-splitting, where you don't send the user all the JavaScript for the entire application, but maybe only the initial view. We can also ask the browser to cache these files aggressively.
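As a minimal sketch (the module name and element IDs below are hypothetical), a dynamic import() is the usual building block: bundlers such as webpack, Rollup, and Vite emit a separate chunk for the imported module, so it is only downloaded when the code actually runs.

```typescript
// The chart code lives in its own chunk and is only fetched on demand.
async function showChart(container: HTMLElement, data: number[]) {
  const { renderChart } = await import('./heavy-chart');
  renderChart(container, data);
}

// Only users who actually open the chart pay the download/parse/execute cost.
document.querySelector('#open-chart')?.addEventListener('click', () => {
  showChart(document.querySelector<HTMLElement>('#chart')!, [1, 2, 3]);
});
```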

Server-side rendering (SSR) renders the entire page on the server initially (rather than rendering it on the client using JavaScript) and then uses a technique called "hydration" to add back the interactivity using JavaScript. This leads to a very fast initial render that isn't blocked by JavaScript. The downside is that, all things considered, you are sending the user everything twice.
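As a rough illustration with React, one of the frameworks mentioned above (App is a hypothetical stand-in for your application): the server renders the markup to a string, and the client "hydrates" that markup instead of rendering it again from scratch.

```tsx
// server.tsx (e.g. inside a Node request handler)
import { renderToString } from 'react-dom/server';
import App from './App'; // hypothetical application component

const html = renderToString(<App />);
// ...embed `html` in the HTML document sent to the browser...
```

```tsx
// client.tsx: runs in the browser and attaches event handlers
// to the markup that the server already rendered.
import { hydrateRoot } from 'react-dom/client';
import App from './App';

hydrateRoot(document.getElementById('root')!, <App />);
```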

Of course, not everything needs to be a SPA. Ask yourself if you genuinely expect deep engagement. You'll still want to build a fast site - because who wants a slow website? - and use components. What are your options?

Rhian suggests checking out Astro. By default, everything compiles to plain HTML with no interactivity; you then specify which components are interactive. You can have them become interactive when the page loads, when the browser viewport is a certain size, or only when they are scrolled into view.
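A minimal sketch of what this looks like in an Astro page (Counter is a hypothetical interactive component, which would also require the matching framework integration):

```astro
---
import Counter from '../components/Counter.jsx'; // hypothetical component
---
<!-- Rendered to static HTML only; no JavaScript is shipped for it. -->
<Counter />

<!-- Hydrated as soon as the page loads. -->
<Counter client:load />

<!-- Hydrated once this media query matches. -->
<Counter client:media="(max-width: 768px)" />

<!-- Hydrated only once the component scrolls into view. -->
<Counter client:visible />
```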

Qwik is another tool that aims to solve the performance problem. Like Astro, Qwik only hydrates the components that need it. It can even hydrate these components out of order and asynchronously: it gives very fine-grained control over this hydration.
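A rough sketch of what a Qwik component looks like; the $ suffixes mark boundaries that Qwik can split out and load lazily, independently of the rest of the page:

```tsx
import { component$, useSignal } from '@builder.io/qwik';

export const Counter = component$(() => {
  const count = useSignal(0);

  // The click handler is only downloaded when the user actually clicks.
  return (
    <button onClick$={() => count.value++}>
      Clicked {count.value} times
    </button>
  );
});
```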

Partial hydration has been around for longer than you might think: Marko, a tool built by eBay, had the concept of partially hydrating the page as early as 2014.

Keeping in mind that performance is a vital part of user experience, Rhian encourages everyone to do their due diligence on their developer tooling.

Watch the recording for the full-length talks and the Q&A.

Our next meetup is planned for March 9th. RSVP on meetup.com.
