swyx • Originally published at swyx.io

Why do Webdevs keep trying to kill REST?

Translations: Russian


Watching recent trends in client-server paradigms, from Apollo GraphQL to React Server Components to Rails Hotwire, I've had a revelation that helped me make sense of it all: They're all abstractions over REST!

There are two schools of thought:

  • Smart Client: State updates are rendered clientside first, then sent back to the server.
    • You can roll your own: Use a state management solution like Redux or Svelte Stores and handwrite every piece of the client-server coordination logic.
    • You can use libraries that combine state and data fetching: Apollo Client, React Query, RxDB, GunDB, WatermelonDB and Absurd-SQL all do dual jobs of fetching data and storing related state. (you can see parallel implementations here if evaluating)
    • You can use frameworks that abstract it away for you: Blitz.js and Next.js
    • Or you can take it off the shelf: Google's Firebase and AWS' Amplify/AppSync are fully vendor provided and vertically integrated with backend resources like auth, database, and storage (arguably MongoDB Realm and Meteor's minimongo before it)
  • Smart Server: State updates are sent to the server first, which then sends rerenders to the client (whether in HTML chunks, serialized React components, or XML).
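To make the Smart Server flow concrete, here is a minimal sketch of the client half: it forwards user intent to the server and applies whatever HTML the server renders back, in the spirit of Hotwire/LiveView. This is not any framework's actual API; the socket URL, message shape, and element ids are invented:

```typescript
// Sketch of a "smart server" client: send events up, patch HTML down.
// The endpoint, message shape, and element ids below are hypothetical.
const socket = new WebSocket("wss://example.com/live");

// Forward user interactions to the server instead of mutating state locally.
document.getElementById("like-button")?.addEventListener("click", () => {
  socket.send(JSON.stringify({ event: "like", postId: "123" }));
});

// The server re-renders and sends back an HTML chunk plus a target selector.
socket.addEventListener("message", (msg) => {
  const { selector, html } = JSON.parse(msg.data) as {
    selector: string;
    html: string;
  };
  const target = document.querySelector(selector);
  if (target) target.innerHTML = html; // patch the DOM in place
});
```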

Of course the "Smart Server" paradigm isn't wholly new. It has a historical predecessor — let's call it the "Traditional Server" paradigm. The Wordpress, Django, Laravel type frameworks would fill out HTML templates and the browser's only job is to render them and send the next requests. We gradually left that behind for more persistent interactive experiences with client-side JS (nee AJAX). For a long time we were happy with just pinging REST endpoints from the client, ensuring a clean separation of concerns between frontend and backend.

So why are we tearing up the old client-server paradigm? And which side will win?

It's about User Experience

Ironically, the two sides have very different goals in UX and would probably argue that the other is less performant.

  • Smart clients enable offline-first apps and optimistic updates so your app can keep working without internet and feels instant because you are doing CRUD against a local cache of remote data (I wrote about this in Optimistic, Offline-First Apps and RxDB has a good writeup here).
    • This improves perceived performance for apps.
    • However, their downside is that they tend to ship large JS bundles upfront: Firebase adds as much as 1MB to your bundle, Amplify got it down to 230KB after a lot of modularization effort, and Realm stands at 42KB.
  • Smart servers directly cut JS weight by doing work serverside rather than clientside, yet seamlessly patch in updates as though they were done clientside. Facebook has reported bundle reductions as high as 29%.
    • This improves first-load performance for sites and reduces total JavaScript sent throughout the session.
    • However, their downside is that every single one of your users is doing their rendering on your server, not in their browser. This is bound to be more resource intensive, and it inserts a full network roundtrip for every user interaction. The problem is mitigated if you can autoscale compute AND storage at the edge (e.g. with serverless rendering on Cloudflare Workers or AWS Lambda). There are also real security concerns that should get ironed out over time.

The "winner" here, if there is such, will depend on usecase - if you are writing a web app where any delay in response will be felt by users, then you want the smart client approach, but if you are writing an ecommerce site, then your need for speed will favor smart servers.

It's about Developer Experience

  • Platform SDKs. For the Frontend-Platform-as-a-Service vendors like Firebase and AWS Amplify, their clients are transparently just platform SDKs — since they have total knowledge of your backend, they can offer you a better DX on the frontend with idiomatic language SDKs.
  • Reducing Boilerplate. Instead of a two-stage process of writing a backend handler/resolver and then the corresponding frontend API call/optimistic update, you can write the backend once and codegen a custom client, or offer what feels like direct database manipulation on the frontend (with authorization and syncing rules).

    • The Smart Server boilerplate reduction is extreme, since the syncing protocol eliminates all need to coordinate client-server interactions. Quote from a LiveView user:

    "LiveView is absurd. It's the biggest change I've experienced in web development since Rails v1. I've been able to build rich, interactive, games without a single line of Javascript. It takes complicated server / api / front-end build projects and results in literally 1/10th the amount of code for the same result."

  • Offline. Both Firebase Firestore and Amplify AppSync also support offline persistence. Since they know your database schema, it's easy to offer a local replica and conflict resolution. There are vendor-agnostic alternatives like RxDB or Redux Offline that require more glue work.

    • Being Offline-first requires you to have a local replica of your data, which means that doing CRUD against your local replica can be much simpler (see below).
  • Reducing Boilerplate for Optimistic Updates.

    • When you do normal optimistic updates, you have to do 4 things:
      1. send update to server,
      2. optimistically update local state,
      3. complete the optimistic update on server success,
      4. undo the optimistic update on server fail
    • With a local database replica, you do one thing: write your update to the local DB and wait for it to sync up. The local DB should expose the status of the update (which you can reflect in the UI) as well as let you centrally handle failures. (See the sketch below.)
  • People. This is an organizational, rather than a technological, argument. How many times have your frontend developers been "blocked by backend" on something and then had to wait 2-3 sprints for someone else to deliver what they need? It is hugely disruptive to workflow. Give developers full-stack access to whatever they need to ship features, whether that is serverless functions, database access, or something else. Smart Clients/Servers can solve people problems as much as UX problems.

    • This is why I am a big champion of shifting the industry divide from "frontend vs backend" to "product vs platform". Chris Coyier's term for this is The All-Powerful Frontend Developer.
    • GraphQL is also secretly a "people technology" because it decouples frontend data requirements from a finite set of backend endpoints.

Both smart clients and smart servers greatly improve the DX on all these fronts.
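To make the boilerplate comparison concrete, here is a minimal sketch of both optimistic-update styles. `api`, `store`, and `localDB` are hypothetical stand-ins for your API client, state store, and syncing local database (the role RxDB or WatermelonDB plays), not any specific library's API:

```typescript
// Hypothetical stand-ins; in a real app these are your API client,
// state store, and a syncing local database.
declare const api: { likePost(id: string): Promise<void> };
declare const store: {
  patch(id: string, data: { liked: boolean; pending?: boolean }): void;
};
declare const localDB: {
  posts: { update(id: string, data: { liked: boolean }): Promise<void> };
};

// Hand-rolled optimistic update: the four steps listed above.
async function likePost(postId: string) {
  store.patch(postId, { liked: true, pending: true }); // 2. optimistically update local state
  try {
    await api.likePost(postId); // 1. send update to server
    store.patch(postId, { liked: true, pending: false }); // 3. complete on success
  } catch {
    store.patch(postId, { liked: false, pending: false }); // 4. undo on failure
  }
}

// With a local database replica: one step.
async function likePostViaReplica(postId: string) {
  await localDB.posts.update(postId, { liked: true });
  // The replication layer syncs the write upstream and exposes its status
  // (in-flight / synced / failed) for the UI and central error handling.
}
```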

It's about Protocols

Better protocols lead to improved UX (eliminating user-facing errors and offering faster updates) and DX (shifting errors left), and they're so relevant to the "why are you avoiding REST" debate that I split them out into their own category. Technically, of course, whatever protocol you use may be a layer atop REST: if you have a separate layer (like CRDTs) that handles syncing/conflict resolution, then that is the protocol you are really using.

A lot of these comments will feature GraphQL, because it is the non-REST protocol I have the most familiarity with; but please feel free to tell me where other protocols may fit in or differ.

  • Type Safety: GraphQL validates every request at runtime. tRPC does it at compile time.
    • Increased type annotation offers better codegen of client SDKs that you would otherwise have to hand-write. This is a much more established norm in gRPC than in GraphQL, and I'm not sure why.
  • Bandwidth: Sending less data (or data in a format that improves UX) over the wire
    • GraphQL helps solve the overfetching problem. In practice, I think the importance of this is overhyped unless you are Facebook or Airbnb. However, the usefulness of persisted queries for solving upload bandwidth problems is underrated (see the sketch after this list).
    • Hotwire sends literal HTML Over The Wire (hence the name)
    • React Server Components send serialized component data over the wire; it is more compact because it can assume React, and it coordinates smoothly with on-screen loading states
  • Real-time: offering "live" and "collaborative" experiences on the web
    • This is doable with periodic polling and long-polling, but more native protocols like UDP, WebRTC and WebSockets are probably a better solution
    • Replicache (used for Next.js Live) and Croquet look interesting here
    • UDP itself seems like a foundation that is ripe for much more protocol innovation; even HTTP/3 will be built atop it
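Two of the bandwidth points above, sketched against a hypothetical endpoint and schema (the persisted-query shape follows Apollo's automatic persisted queries; the hash is a placeholder):

```typescript
// Overfetching, addressed: the client names exactly the fields this view needs,
// instead of receiving whatever GET /users/123 happens to return.
const query = /* GraphQL */ `
  query UserCard($id: ID!) {
    user(id: $id) {
      name
      avatarUrl
    }
  }
`;

const res = await fetch("https://example.com/graphql", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query, variables: { id: "123" } }),
});
const { data } = await res.json();

// Persisted queries attack *upload* bandwidth: send a pre-registered hash
// instead of the full query text.
const extensions = {
  persistedQuery: { version: 1, sha256Hash: "<sha256-of-query-text>" },
};
const persisted = await fetch(
  "https://example.com/graphql" +
    "?variables=" + encodeURIComponent(JSON.stringify({ id: "123" })) +
    "&extensions=" + encodeURIComponent(JSON.stringify(extensions)),
);
```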

There remain some areas for growth that I don't think are adequately answered yet:

  • Performance: One nightmare of every backend developer is unwittingly letting a given user kick off an expensive query that could choke up system resources. Complexity budgets are not a solved problem in GraphQL. It's a touchy subject, but new protocols can at least open up a more interesting dance between performance and flexibility.
  • Security: allowing frontend developers direct database access requires many more guardrails around security. Vendors with integrated auth solutions can help somewhat, but the evangelists for a new protocol need to be as loud about their security requirements as they are about the developer experience upsides.

Not Everyone is Anti-REST

Yes of course my title is a little clickbaity; REST is perfectly fine for the vast majority of webdevs. There are even people pushing boundaries within the REST paradigm.

  • Remix, the soon-to-be-launched React metaframework from the creators of React Router, embraces native browser standards so you get progressive enhancement "for free", for example by requiring that you POST from an HTML form (they have clarified that anything but GET is fine, and that they are pro-HTTP and neutral on REST)
  • Supabase (where I am an investor) is a "smart client" solution that works equally well on the server, and it invests heavily in the open source PostgREST project.
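As a taste of how far "plain" REST can be pushed, here is roughly what querying a PostgREST-style endpoint looks like: field selection, relation embedding, filtering, and pagination all in URL parameters. The base URL, table, and column names are invented for illustration:

```typescript
// PostgREST exposes a Postgres schema as a REST API with a rich query grammar.
// Table and column names here are hypothetical.
const res = await fetch(
  "https://example.com/rest/films" +
    "?select=title,year,directors(first_name,last_name)" + // pick fields, embed a related table
    "&rating=gte.8" + // filter: rating >= 8
    "&order=year.desc" + // sort newest first
    "&limit=10", // paginate
  { headers: { Accept: "application/json" } },
);
const films = await res.json();
```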

Followups

Transitional Apps

Rich Harris recently gave a keynote at Jamstack Conf that framed his take on this issue (TLDR here).

Reader Feedback

  • Jonathan W: "The framing of the issue got my brain percolating a bit. The entire situation feels very similar to the first time a developer recognizes object-relational impedance mismatch—all the subtle differences that start to crop up as you layer an Application Framework on top of an ORM on top of an RDBMS on top of your business domain (you know, that kind of important topic). Each layer of abstraction is acceptable by itself, but the effects compound at each level and over time."
  • @thxdr: Other format efforts worth exploring: JSONAPI is a JSON REST spec, and the Relay spec is essentially a GraphQL superset spec

Comments (17)

Tanner Linsley

Similar to JSONAPI from @thxdr's comment, OpenAPI (formerly Swagger) is another superset REST spec that offers a unique perspective on REST API usage.

Thanks for this article Shawn!

Yoav Ganbar

Nice write up Shawn!

REST is fine, but the amount of work you can save with GQL and codegen is massive.

Gerard Klijs

You can also do codegen with REST if it's OpenAPI-specced, although, to be fair, I've generally had more problems with OpenAPI codegen than with GraphQL codegen.

Max Ong Zong Bao

Just reminds me of Progressive Web Apps (PWA) for smart clients. But yes, it's a different form of abstraction with its own pros and cons.

James Friedman

Yes, a bit clickbaity ;).

I can distill it down to "backend devs love REST and frontend devs love GQL". The reason is simple: you have to pay the complexity cost somewhere. From multiple experiences I can point to GQL pushing complexity to the backend and REST pushing complexity to the frontend.

For me, the transport mechanism matters less these days. I do prefer GQL because it forces you to think about and model your data as an infinitely extendable graph. On the last project where I used REST, we jumped through hoops trying to find a way to fetch related models and select specific fields…

In the end, whatever lets you build and maintain something of value wins, regardless of the tech or framework.

And a nice review of the options!

Franciszek Krasnowski

you have to pay the complexity cost somewhere

I'm not sure GQL results in bigger complexity on the backend, if you consider the schema-first approach. When designing an API it forces a consistent schema on the backend, and that's something not only frontend devs will appreciate. An even more interesting approach would be to not write your resolvers at all and use the GQL schema to construct a graph DB; take a look at Dgraph. Doesn't that cut the complexity of managing a separate DB, a REST API, and keeping documentation up to date?

Gerard Klijs

I'm afraid that with Dgraph I might run into things that need to be changed later. Not sure if that's a valid fear. There are similar solutions, like how Hasura can generate the schema based on your PostgreSQL schemas.

River

Is it true that FE devs love GQL? To us it's just another string to send to the BE; it could've been a SQL string for all I care.

swyx

it shifts a lot of work from FE to BE; of course FE devs love it 🙃

Jeldrik Hanschke

Last project I used rest on we went through the hoops of trying to find a way to fetch related models, and select specific fields…

Did you consider the JSON:API specification? It supports both features within a REST architecture.

In my experience many issues with REST are caused by using an ad-hoc architecture, which can't support the features needed.

kinghat

ran across this: github.com/dunglas/vulcain

Rohan Salunke

Well written! This sorta also serves as a guide for choosing your client-server model.
Personally I have been a fan of GraphQL because of its type safety, auto caching, and codegen. As a middle ground, I'd recommend using the OpenAPI spec (swagger.io/specification/) for REST and a smart GQL-ish client like React Query (react-query.tanstack.com) along with Orval (orval.dev) for client codegen.
Cheers!

swyx

thank you for the personal recs!

BK Lau

My own take on REST and GraphQL, with analogues from the SQL database world:

REST <---> a direct client SELECT query against the database
GraphQL <---> a call to a stored procedure on the database

There are no free lunches. REST is more flexible and simpler, but you have to do more work on the client side to merge data queried from diverse sources.
GraphQL basically delegates the merging work to the server side, but some handler has to do the job. And you need some query/merging schema know-how to do it.

TheWix

I believe what you are referring to as "Smart Server" is what we used to call the "Postback" model, because you post a form back to the server, which returns HTML to the client and forces a refresh of the page.

I feel like the post didn't do much to directly compare REST and GraphQL. What most people do with REST today isn't really RESTful; there is usually no HATEOAS. So we don't get the benefits we should get from REST. Honestly, that is something GraphQL managed to pull off in a different way.

BK Lau

Have you heard of database stored procedures?
If not, read on:

  1. Traditionally, clients would send one or more SQL queries to the database (backend) to fetch the data they need. Often they just fetch data from the server and merge/aggregate it on the client side. In short, the smart logic lies in the clients knowing what they want and how they want it.
    This is analogous to calling REST API backends.
    However...
    Client-side queries are expensive, slow, and tedious to do over data that spans several tables with complex joins... and you need to know SQL really well.

  2. This is where database stored procedures come in.
    You predefine a "function" on the database backend that takes in several parameters. Every programmer knows what a function is.
    This database "function" is called a stored procedure. Look it up.
    So it's like a GraphQL endpoint.
    You just send parameters to the backend database instead of sending raw SQL queries. You don't even need to know SQL!
    The data you need from several queries is fetched and merged by the database stored procedure automatically. Isn't that neat?

I see GraphQL as a reinvention of the backend database stored procedure, web-style.
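To ground BK Lau's analogy in code, here is a minimal sketch using node-postgres (the table and procedure names are invented):

```typescript
import { Client } from "pg"; // node-postgres

const db = new Client(); // connection settings come from PG* env vars
await db.connect();

// "REST style": the client knows the schema and composes the joins itself.
const orders = await db.query(
  `SELECT o.id, o.total
     FROM orders o
     JOIN users u ON u.id = o.user_id
    WHERE u.id = $1`,
  [123],
);

// "GraphQL style": the client calls a predefined server-side function
// (a stored procedure) and lets the database do the joining and merging.
const report = await db.query("SELECT * FROM user_order_report($1)", [123]);

await db.end();
```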

Michael Harrington

I wish people would stop saying "REST" when they mean "HTTP server that maps HTTP verbs to CRUD operations and uses JSON for input/output".

You can't "kill REST".

It is a set of architectural elements and constraints that apply to a distributed cross-organizational anarchic client-server messaging system like the web. It is how you design something like HTTP.

REST does not automatically mean CRUD verbs, nor does it mean JSON, nor does it even necessarily mean using HTTP as your messaging mechanism.

You can have a REST-adhering system that runs on HTTP, MQTT, or smoke signals. You can describe interactions in JSON, GraphQL, or protobuf.

Looking at it through that lens, it really matters more whether you're able to categorize a client's requests as cacheable, destructive, and/or idempotent and send back a response that represents an atomic state with hyperlinks to other valid states, so that the whole system can be scaled by infrastructure that's oblivious to the application-specific meaning of a particular message.

Many so-called "REST APIs" do not meet those criteria. Not even close. They implement a thin veneer on top of DB operations, maybe with some validation if you're lucky. It's then up to the client to have an intimate understanding of valid state transitions and the characteristics of each specific request.

Moreover.... Most systems that claim to be "RESTful" shouldn't even strive to follow REST! The architectural constraints just don't make sense for (or are fundamentally incompatible with) the problem they're trying to solve.

I think where we screwed up with SPAs, as an industry, is that we took inspiration from the document-based web instead of from realtime video games.

Imagine trying to represent Counterstrike as a series of hyperlinks. I mean, you certainly could -- Twitch Plays Pokemon is something akin to that. But it's not a practical implementation for that system.

The real problem here is that even when devs understand REST deeper than "JSON CRUD over HTTP", they get the order of operations wrong. They think "Well, HTTP is the most readily-available mechanism I have in this component... and REST seems like a natural fit for HTTP..." instead of "Is REST the right set of constraints for this component? If not, can I find a mechanism other than HTTP, that makes more sense for the right set of constraints?"