Google Web Dev

Cancellable async functions in JavaScript

Sam Thorogood · 7 min read

(This post explains how to use generators to wrangle duplicate calls to async functions. Check out this gist for the final approach or read on to learn more! 🎓)

JavaScript is a twisty maze of horrible asynchronous calls, all alike. We've all written code like this—but in this post, I'll talk about async and await. These are keywords that are widely supported and help you migrate that code to something much more readable. 📖👀

And most importantly, I'll cover a key pitfall: how to deal with an asynchronous method being run more than once, so that it doesn't clobber other work. 🏑💥

Let's start with the example. This function will fetch some content, display it to the screen and wait a few seconds before drawing attention to it:

function fetchAndFlash(page) {
  const jsonPromise = fetch('/api/info?p=' + page)
      .then((response) => response.json());
  jsonPromise.then((json) => {
    infoNode.innerHTML = json.html;

    setTimeout(() => {
      flashNode(infoNode);  // draw the user's attention to the node
    }, 5000);
  });
}

Now we can rewrite this with async and await like this, with no callbacks:

async function fetchAndFlash(page) {
  const response = await fetch('/api/info?p=' + page);
  const json = await response.json();
  infoNode.innerHTML = json.html;

  // a bit awkward, but you can make this a helper method
  await new Promise((resolve) => setTimeout(resolve, 5000));

  flashNode(infoNode);  // draw the user's attention to the node
}

Isn't that nicer? It no longer jumps around, and it's easy to see the steps top-to-bottom: fetch a resource, convert it to JSON, write to the page, wait five seconds and call another method. 🔜
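That slightly awkward Promise-wrapped setTimeout can live in a tiny helper. Here's a sketch; the name sleep is my choice, not something from the original code:

```javascript
// A small promisified setTimeout, so the wait reads naturally
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// inside fetchAndFlash(), the wait then becomes just:
//   await sleep(5000);
```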

It's A Trap!

But there's something here which can confuse readers. This isn't a regular function that executes "all at once": every time we use await, we basically defer to the browser's event loop so it can keep working. ⚡🤖

To put it another way: let's say you're reading code that uses fetchAndFlash(). If you hadn't read the title of this post, what might you expect to happen if you run this code?

fetchAndFlash('page1');
fetchAndFlash('page2');


You might expect that one will happen after the other, or that one will cancel the other. That's not the case—both will run more or less in parallel (because JavaScript can't block while we wait), finish in either order, and it's not clear what HTML will end up on your page. ⚠️

(Diagram: how two tasks can run in parallel and overwrite one another)

To be clear, the callback-based version of this method had exactly the same problem, but it was more apparent—in a very disgusting kind of way. In modernizing our code to use async and await, we make it more ambiguous. 😕
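To make the clobbering concrete, here's a self-contained sketch: the network is mocked with a timeout, and the function and variable names are made up for the demo. The older, slower call finishes last and overwrites the newer result:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

let html = '';  // stands in for infoNode.innerHTML

async function fakeFetchAndFlash(page, delayMs) {
  await sleep(delayMs);          // stands in for the network round-trip
  html = 'content for ' + page;  // last writer wins, whoever that is
}

async function demoRace() {
  const older = fakeFetchAndFlash('page1', 50);  // slow network
  const newer = fakeFetchAndFlash('page2', 10);  // fast network
  await Promise.all([older, newer]);
  return html;  // 'content for page1', even though page2 was requested last
}
```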

Let's cover a few different approaches to solving this problem. Strap in! 🎢

Approach #1: The Chain

Depending on how and why you're calling an async method, you might be able to 'chain' calls one after another. Let's say you are handling a click event:

let p = Promise.resolve(true);
loadButton.onclick = () => {
  const pageToLoad = pageToLoadInput.value;
  // wait for previous task to finish before doing more work
  p = p.then(() => fetchAndFlash(pageToLoad));
};

Every time you click, you add another task to the chain. We could also generalize this with a helper function:

// makes any function a chainable function
function makeChainable(fn) {
  let p = Promise.resolve(true);
  return (...args) => {
    p = p.then(() => fn(...args));
    return p;
  };
}

const fetchAndFlashChain = makeChainable(fetchAndFlash);

Now, you can just call fetchAndFlashChain() and it'll happen in-order after any other call to fetchAndFlashChain(). 🔗
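Here's a runnable sketch of the chain in action, with makeChainable repeated so it stands on its own and a mocked task in place of fetchAndFlash:

```javascript
// makeChainable repeated from above so this sketch runs on its own
function makeChainable(fn) {
  let p = Promise.resolve(true);
  return (...args) => {
    p = p.then(() => fn(...args));
    return p;
  };
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const order = [];

// a mocked task in place of fetchAndFlash
async function task(name, delayMs) {
  await sleep(delayMs);
  order.push(name);
}

const chainedTask = makeChainable(task);

async function demoChain() {
  chainedTask('slow', 30);         // starts first, takes longest
  chainedTask('fast', 5);          // still waits for 'slow' to finish
  await chainedTask('faster', 1);  // awaiting the last link waits for all
  return order;  // strictly in call order: ['slow', 'fast', 'faster']
}
```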

But that's not the proposal in this blog post—what if we want to cancel the previous operation? Your user has just clicked on a different load button, so they probably don't care about the previous thing. 🙅

Approach #2: Barrier Checks

Inside our modernized fetchAndFlash(), we use the await keyword three times, and only really for two different reasons:

  1. to do the network fetch
  2. to flash after waiting 5 seconds

After both these points, we could stop and ask—"hey, are we still the most active task? The thing the user most recently wanted to do?" 🤔💭

We can do this by marking each distinct operation with a nonce. This means creating a unique object, storing it both locally and globally, and checking whether the global version has diverged from the local one, which happens whenever another operation starts.

Here's our updated fetchAndFlash() method:

let globalFetchAndFlashNonce;
async function fetchAndFlash(page) {
  const localNonce = globalFetchAndFlashNonce = new Object();

  const response = await fetch('/api/info?p=' + page);
  const json = await response.json();
  // IMMEDIATELY check
  if (localNonce !== globalFetchAndFlashNonce) { return; }

  infoNode.innerHTML = json.html;

  await new Promise((resolve) => setTimeout(resolve, 5000));
  // IMMEDIATELY check
  if (localNonce !== globalFetchAndFlashNonce) { return; }

  flashNode(infoNode);
}

This works fine, but is a bit of a mouthful. It's also not easy to generalize and you have to remember to add checks everywhere it matters!
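Stripped to its essentials, the nonce pattern looks like this. The async work is mocked with a timeout, and the names are made up for the demo:

```javascript
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

let globalNonce;
const applied = [];

async function update(value) {
  const localNonce = globalNonce = new Object();
  await sleep(10);  // stands in for the network fetch
  // IMMEDIATELY check: did a newer call start while we were waiting?
  if (localNonce !== globalNonce) { return; }
  applied.push(value);
}

async function demoNonce() {
  await Promise.all([update('stale'), update('fresh')]);
  return applied;  // only ['fresh']: the older call bailed at the check
}
```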

There is one way, though—using generators to generalize for us.

Background: Generators

While await defers execution until the thing it's waiting for finishes—in our case, either a network request or just waiting for a timeout—a generator function basically does the opposite, moving execution back to where it was being called from.

Confused? It's worth a quick rehash:

function* myGenerator() {
  const finalOut = 300;
  yield 1;
  yield 20;
  yield finalOut;
}

for (const x of myGenerator()) {
  console.log(x);
}

// or, slightly longer (but exactly the same output)
const iterator = myGenerator();
for (;;) {
  const next = iterator.next();
  if (next.done) {
    break;
  }
  console.log(next.value);
}

Both versions of this program will print 1, 20 and 300. What's interesting is that I can do whatever else I like inside either for loop, including break early, and all the state inside myGenerator stays the same: any variable I declare, and where I'm up to.

It's not visible here, but the code calling the generator (and specifically the .next() function of the iterator it returns) can also resume it with a value. We'll see how soon.

We can use these parts together to just not continue working on some task if we decide to stop, and also to resume execution with some output. Hmm—sounds perfect for our problem! ✅
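Before we apply it, here's a tiny standalone example of resuming a generator with a value via .next():

```javascript
// The caller feeds values back into the generator via .next(value)
function* adder() {
  const a = yield 'need a';  // receives the argument of the 2nd .next()
  const b = yield 'need b';  // receives the argument of the 3rd .next()
  return a + b;
}

const it = adder();
it.next();              // run to the first yield (this call's arg is unused)
it.next(10);            // resume with a = 10, run to the second yield
const r = it.next(32);  // resume with b = 32; the generator returns
// r.done === true and r.value === 42
```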

The Solution 🎉

Let's rewrite fetchAndFlash() for the last time. We literally just change the function type and swap await for yield; the caller will do the waiting for us, as we'll see next:

function* fetchAndFlash(page) {
  const response = yield fetch('/api/info?p=' + page);
  const json = yield response.json();

  infoNode.innerHTML = json.html;

  yield new Promise((resolve) => setTimeout(resolve, 5000));

  flashNode(infoNode);  // draw the user's attention to the node
}

This code doesn't make much sense on its own, and it'll crash if we try to use it directly. The point of yielding each Promise is that some function that calls this generator can now do the await for us, including checking a nonce. You no longer have to remember to insert those checks everywhere they matter: wherever you want to wait for something, you just yield it.

And most importantly, because this method is now a generator, not an async function, the await keyword is actually an error. This is the absolute best way to ensure you write correct code! 🚨

What is that function we need? Well, here it is—the real magic of this post:

function makeSingle(generator) {
  let globalNonce;
  return async function(...args) {
    const localNonce = globalNonce = new Object();

    const iter = generator(...args);
    let resumeValue;
    for (;;) {
      const n = iter.next(resumeValue);
      if (n.done) {
        return n.value;  // final return value of passed generator
      }

      // whatever the generator yielded, _now_ run await on it
      resumeValue = await n.value;
      if (localNonce !== globalNonce) {
        return;  // a new call was made
      }
      // next loop, we give resumeValue back to the generator
    }
  };
}

It's magic, but hopefully it also makes sense. We call the passed generator and get an iterator. We then await on every value it yields, resuming with the resulting value, like a network response—until the generator is done. Importantly, this lets us generalize our ability to check a global vs local nonce after each async operation.

An extension: return a special value if a new call was made, as it's useful to know if individual calls were cancelled. In the sample gist I return a Symbol, a unique object that you can compare to.
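Here's one way that extension might look. This is my variant of the makeSingle above, not the exact code from the gist:

```javascript
// A shared Symbol marks runs that were cancelled by a newer call
const cancelled = Symbol('cancelled');

function makeSingle(generator) {
  let globalNonce;
  return async function(...args) {
    const localNonce = globalNonce = new Object();
    const iter = generator(...args);
    let resumeValue;
    for (;;) {
      const n = iter.next(resumeValue);
      if (n.done) {
        return n.value;
      }
      resumeValue = await n.value;
      if (localNonce !== globalNonce) {
        return cancelled;  // tell this caller it was superseded
      }
    }
  };
}

// a caller can then distinguish: (await wrapped(...)) === cancelled
```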

Finally, we actually use makeSingle and wrap up our generator for others to use, so now it works just like a regular async method:

// replaces fetchAndFlash so all callers use it as an async method
fetchAndFlash = makeSingle(fetchAndFlash);

// ... later, call it
loadButton.onclick = () => {
  const pageToLoad = pageToLoadInput.value;
  fetchAndFlash(pageToLoad);  // will cancel previous work
};

Hooray! Now, you can call fetchAndFlash() from wherever you like, and know that any previous calls will cancel as soon as possible.
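Putting it all together, here's a self-contained sketch: makeSingle is repeated from above, and the fetch and flash steps are mocked with timeouts and a log:

```javascript
// makeSingle repeated from above so this sketch runs on its own
function makeSingle(generator) {
  let globalNonce;
  return async function(...args) {
    const localNonce = globalNonce = new Object();
    const iter = generator(...args);
    let resumeValue;
    for (;;) {
      const n = iter.next(resumeValue);
      if (n.done) {
        return n.value;
      }
      resumeValue = await n.value;
      if (localNonce !== globalNonce) {
        return;  // a new call was made
      }
    }
  };
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));
const log = [];

// a mocked two-step worker in place of the real fetch-and-flash generator
function* worker(name) {
  yield sleep(10);        // stands in for the fetch
  log.push(name + ':a');  // only runs if we're still the latest call
  yield sleep(10);        // stands in for the five-second wait
  log.push(name + ':b');
}

const single = makeSingle(worker);

async function demoSingle() {
  const first = single('one');   // superseded almost immediately…
  const second = single('two');  // …by this newer call
  await Promise.all([first, second]);
  return log;  // only ['two:a', 'two:b']; 'one' bailed after its first yield
}
```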

Aside: Abortable Fetch

Keen folks might note that what I've covered above just cancels a method, but doesn't abort any in-flight work. I'm talking about fetch, which has a somewhat-supported way to abort the network request. This might save your users bandwidth if the async function is, say, downloading a really large file; nothing we've done would stop that download, we'd just cancel once the file has already eaten up precious bytes.
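For completeness, here's a minimal sketch of that idea: each new call aborts the previous in-flight controller. The fetch itself is omitted so the sketch runs anywhere; in a browser you'd pass the signal to fetch():

```javascript
// Each new call aborts the previous in-flight AbortController.
// In real code: fetch(url, { signal: controller.signal })
let currentController = null;

function startWork() {
  if (currentController) {
    currentController.abort();  // cancel the previous request, if any
  }
  const controller = new AbortController();
  currentController = controller;
  return controller.signal;
}

const firstSignal = startWork();
const secondSignal = startWork();  // this call aborts the first
// firstSignal.aborted === true, secondSignal.aborted === false
```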


If you've read this far, you've hopefully thought a bit more about the way JavaScript works.

JS can't block when you need to do asynchronous work, multiple calls to your methods can happen, and you can have strategies to deal with that—either chaining, or as the whole thesis of the post goes, cancelling previous calls.

Thanks for reading! 👋


The solution presented here isn't really "cancellation", because (as the article acknowledges), the async..await function (aka, the generator) isn't stopped right away, but only silently exits later after it eventually resumes (if ever).

A better approach, IMO, is to proactively stop the generator (using its return() or throw() methods) right away.

CAF is a library I wrote to make such truly-cancelable async functions easy to write and work with: github.com/getify/CAF


Great post!! For my 10 line function approach #2 suffices, but the final solution will sure come in handy some day!

By the way: in my case the barrier check is in a deeper nested function. I throw an error instead of a simple return. That way a single barrier check suffices instead of 5.


When I read nonce, my first thought was a library for that, but no, just new Object() 😄


Ever-increasing numbers also works but I think an arbitrary Object makes the most sense :)