Dean Radcliffe

How Promises May Be Slowing Your App

Hello Dev.to community, this is my first post! (Edit: And here's the second, which elaborates on this one)

usedToLovePromises.then(() => console.log('đŸ˜ĄđŸ˜€'))

As a solution to Callback Hell, Promises were my new hotness for a while. I gave talks on them - while juggling, even, because concurrency đŸ€Ł (the talk is on YouTube).

But I realized that like every technology, they bring with them a few gotchas. So, before I tell you what I'm currently using (RxJS Observables with polyrhythm), let's look at the top 5 ways Promises may be slowing down your application today:

  • Await in a loop destroys parallelism.
  • Await turns rejected Promises into exceptions.
  • Sync values inside of Promises are only available later.
  • The single-value limitation negates streaming benefits.
  • Inability to be canceled means resources are consumed longer than their consumers need them.

The first two you can work around; the rest require bigger changes, but are still doable. Let's dive in!

Await in a Loop Destroys Parallelism

(async function(usernames) {
  let users = []

  for (let name of usernames) {
    // Each request waits for the previous one to finish - no parallelism
    users.push(await fetch(`http://server/users/${name}`))
  }

  return users;
})(['bob', 'angie'])

What's wrong with this code? It aims to transform an array of usernames into the full user objects returned by some remote service, using the common practice of awaiting a Promise. But what happens when you await one user at a time? That's right - you can no longer retrieve users in parallel. Say your responses take 1 and 1.5 seconds: instead of having all your users back in 1.5 seconds, you'll have them back in 2.5 seconds. Instead of the overall duration being max(times), it will be sum(times).

Workaround: Use Promise.all
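
A minimal sketch of the parallel version, reusing the hypothetical endpoint from above:

// All fetches start immediately; Promise.all waits once, for the slowest one
async function getUsers(usernames) {
  const requests = usernames.map(name => fetch(`http://server/users/${name}`))
  return Promise.all(requests)
}

getUsers(['bob', 'angie']) // finishes in max(times), not sum(times)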

Using Await turns rejected Promises into Program-Crashing Exceptions

While await is nice on the happy path, its unhappy path can be downright ugly.

Depending on your Promise-returning AJAX library, sometimes a 500 error code received from a server causes the returned Promise to be rejected. If you use await on a rejected Promise you get a synchronous exception!

Synchronous exceptions 'rip the call stack', which is a fancy way of saying 'take a long time to do what they do'. More alarmingly, they threaten to bring down your entire application! We don't want our applications to be brittle. Sometimes stateless services can be restarted, but that takes time too.

When an HTTP 200 and an HTTP 500 response may differ by only a single byte, when both represent a successful transfer of content from a server to your application, and when it's entirely anticipatable that servers will sometimes be slow or unavailable, await starts to look like a real foot-gun.

Workaround: Use await in your app only when you have top-level exception handling. Don't use await at all for operations which have a reasonable chance of failing, and for which terminating a running application would not be an appropriate response.
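
For example, a sketch that keeps a rejection contained, using the same hypothetical endpoint as above:

async function safeGetUser(name) {
  try {
    const response = await fetch(`http://server/users/${name}`)
    return await response.json()
  } catch (err) {
    // The rejection becomes a local, handled exception - not an app crash
    console.error(`Could not load ${name}:`, err)
    return null
  }
}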

By Design: Sync Values inside of Promises are only available Later

let status = 'not loaded'
Promise.resolve('loaded').then(newStatus => {
  status = newStatus
})
console.log(status)

What is the value of status above? Promise fans know that it will be "not loaded". This makes sense, right? Because async. And as Jake Archibald mentions, the resolve happens as quickly after the present moment as possible. But why delay at all?

The answer is that, by design, even if a promise is resolved now, its callbacks get called later - always on a future microtask - so that callers never have to wonder whether a callback will run synchronously or asynchronously. Which is a bit like dentists preemptively removing all your teeth, but hey, at least it doesn't unleash Zalgo!
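
For contrast, here's a contrived sketch of the alternative - a 'Zalgo' function that calls back synchronously on a cache hit and asynchronously otherwise, so callers can never know when their callback fires:

const cache = {}

function getUser(name, callback) {
  if (cache[name]) {
    callback(cache[name]) // fires before getUser even returns
  } else {
    fetch(`http://server/users/${name}`)
      .then(res => res.json())
      .then(user => {
        cache[name] = user
        callback(user) // fires on a later turn of the event loop
      })
  }
}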

Challenge: You end up using Promises for async values and entirely different techniques for synchronously available values, and integrating the two is awkward.

By Design: Single-value limitation negates Streaming benefits

If your server streams a large JSON document in response to /users, flushing its response stream after every user, and your client application receives the first 50 users but uses a Promise for the resulting Array of users, then your user is prevented from seeing those first 50 users until all users have been returned! And if the server fails to return the final ] of the JSON, the user's browser (or your Node process) will have all the data in memory, but it will be unusable! (Ask me how I know!)

This might seem a little pedantic, but as Juan Caicedo mentions, not all users are on fast or reliable connections.

Why are Promises to blame here? Because until the browser has an entire Array of users, it can't fulfill the Promise.

I have a little demo of how to query the same endpoint with an Observable vs a Promise, and it's no contest - the Observable, since it represents a stream of users, delivers them one at a time, while the Promise variant delivers nothing to the UI until it's all done. Performance is only one reason to favor many-valued Observables over single-valued Promises, but it's a valid one.
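
To sketch the Observable side (this isn't the demo's exact code): wrap fetch's ReadableStream in an RxJS Observable so chunks surface as they arrive. Parsing chunks into user records is omitted for brevity:

import { Observable } from 'rxjs'

function streamBody(url) {
  return new Observable(subscriber => {
    const controller = new AbortController()
    fetch(url, { signal: controller.signal })
      .then(async response => {
        const reader = response.body.getReader()
        const decoder = new TextDecoder()
        while (true) {
          const { done, value } = await reader.read()
          if (done) break
          // Emit each chunk as soon as it arrives - no waiting for the final ]
          subscriber.next(decoder.decode(value, { stream: true }))
        }
        subscriber.complete()
      })
      .catch(err => subscriber.error(err))
    // Teardown runs on unsubscribe, so abandoning the stream aborts the request
    return () => controller.abort()
  })
}

streamBody('http://server/users').subscribe(chunk => console.log(chunk))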

Challenge: Juggling multiple promises and relating them to each other is hard.

Inability to be canceled means resources are tied up for longer

This, for anyone working on memory- or network-constrained devices, is perhaps the biggest deal-breaker. It's why Brian Holt gave a talk Promise Not To Use Promises that, although old, was way ahead of its time. If you start an AJAX request on a route that represents a chat room, and the user navigates to another room, Promises do not let you cancel that first request! So in the best case, the bandwidth available for the new room's messages is cut in half, and in the worst case, the browser has used up its last available connection, and the new room's messages are blocked!
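
These days the platform offers a manual cancelation token, AbortController, which can abort the underlying request even though the Promise itself still can't be canceled. A sketch of the chat-room scenario (renderMessages is a hypothetical UI handler):

const controller = new AbortController()

fetch('/rooms/1/messages', { signal: controller.signal })
  .then(res => res.json())
  .then(renderMessages) // hypothetical UI handler
  .catch(err => {
    if (err.name === 'AbortError') return // expected when the user switches rooms
    throw err
  })

// When the user navigates to another room:
controller.abort() // frees the connection for the new room's messages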

This is not a happy place. The moment an app knows it doesn't need the result of some process, it needs to be able to shut down that process and free its resources immediately. As a friend of mine said "Don't start something you don't know how to end". True for computing as well as children's games. :)

Challenge: Not canceling is wasteful, yet using cancelation tokens manually is tricky. (That's why the TC39 proposal to add cancelability to Promises fizzled!)

Conclusion

I'll write up soon how I think the Observable data type solves the above issues (and the new challenges it brings), but that's for another time. For now, just know that these are possible Promise gotchas to be mindful of. Promises are still a huge improvement over callbacks, and a go-to tool, until you realize Observables are a Superset of Promises.

Observe

Top comments (1)

Slick Kilmister

This really doesn’t give the promise its due. While the point “Sync Values inside of Promises are only available Later” is probably something many new devs fall victim to, the final two points are simply trying to make a promise solve problems it isn’t meant for.

A promise doesn’t try to be a stream. It’s just responsible for a single value. But multiple promises can easily create a stream. The example used is comparable to writing a function that returns an array and complaining that it doesn’t return an iterator.
Async generators have been a builtin for years, together with the for await...of syntax addition, and a promise-based version of the Iterator/Iteration protocol stopped being a new idea long before that.
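
For instance, the article's per-user streaming could be sketched with nothing but built-ins (reusing its hypothetical endpoint):

async function* users(names) {
  for (const name of names) {
    const res = await fetch(`http://server/users/${name}`)
    yield res.json() // each user is delivered as soon as it arrives
  }
}

// Inside an async function (or a module with top-level await):
for await (const user of users(['bob', 'angie'])) {
  console.log(user)
}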

Promise cancellation is also something that, in my opinion, doesn’t actually add much to its capabilities. The majority of the time, a promise will represent something happening outside the control of the receiver (e.g. a server request). A server processing what the promise represents will never know that the promise has been declared ‘cancelled’. Similarly, some JS module would still have to actually implement the specific cancellation logic, or it would run to the end regardless, making any kind of cancellation effectively a no-op in most situations.
In addition to that, implementing a simple but fully functional cancelable promise is easy enough. Speculation is an npm package that was published over 4 years ago. It’s 50 lines of code (around 30 if comments and empty lines are removed), and that includes code for older browsers and edge-case handling.
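
A generic sketch of the idea (not Speculation’s actual API) is to race the work against a rejection the consumer controls:

function cancelable(promise) {
  let cancel
  const canceled = new Promise((_, reject) => {
    cancel = () => reject(new Error('Canceled'))
  })
  // cancel() stops the waiting, not the underlying work - as noted above
  return { promise: Promise.race([promise, canceled]), cancel }
}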

But the best way to conserve processing resources when using promises is still:

“don’t start something unless you’re going to finish it”

The source of this problem usually isn’t that a promise can’t be canceled, but that promise-returning functions are called too eagerly (this is a very smart joke, because promises are eager). Returning early when some condition is met and being a bit more conservative in dispatching requests will conserve resources much more effectively than boilerplating more complex solutions (like you mentioned: that’s what killed off the proposal).

Finally, while Observers/EventEmitters are amazing tools for a wide range of cases, the Promise is a perfect example of the unix principle: its purpose is to do one thing and do it well. It never attempted to be a solution for a wide range of problems or to facilitate complex exchanges, but it was always great as a small part of a larger solution to a problem.
Most observer implementations (including NodeJS Event emitters) are arguably the complete opposite of that (regarding the “one thing” part, of course).
While NodeJS ‘events’ source code is amazingly pragmatic and contains very little, well-designed code (the core of it is just shy of 800LoC), a library like rxjs contains many thousands of lines of code to implement its bulk of features (they have spent effort on optimizing that tho).
Removing excess bulk is a surefire way to get a performance increase. And coding in a pragmatic fashion is key to that.