Jennie

Combining async requests in the background

When I was at Shopee, there was an interesting case: a React component triggered some API requests, and a long list of this component often appeared on a page. As a result, the client sent many requests in a short time, which not only slowed down the interface but also consumed significant server resources.

To improve this, my smart teammate created a "batching" solution that I liked so much that I reused the idea in an interview later.

How did we achieve this?

Let me recap the situation with some detailed setups first:

  1. There is a <ListItem/> component that sends a GET /item request on a certain event (e.g. mouse enter) with required parameters such as the item ID, like this:
import React, { useCallback, useState } from 'react';

// Proper error handling is ignored here...
const fetchItem = id => fetch(`/item?id=${id}`).then(res => res.json());

export function ListItem({ id }) {
  const [item, setItem] = useState({});
  const onMouseEnter = useCallback(() => {
    fetchItem(id).then(setItem);
  }, [id]);
  return <div onMouseEnter={onMouseEnter}>{item.name}</div>;
}

  2. The <ListItem/> component is used to display a long list, like this:
import React from 'react';
import { ListItem } from './ListItem';

export function List({ ids }) {
  return (
    <div>
      {/* A key is required when rendering a list of components. */}
      {ids.map(id => (
        <ListItem key={id} id={id} />
      ))}
    </div>
  );
}

Now you may ask: isn't the solution simply to send a list request like GET /items?ids=1,2,3 from the <List/> component instead? Well, that is not the case here, as the request is only sent when the user triggers a specific event on an item. But you are getting closer to the point - instead of sending GET /item?id=xxx repeatedly with different sets of parameters, it is better to find a way to send a single GET /items?ids=1,2,3.

As these requests are sent at different times, there is an important decision to make: under what condition should we combine the requests?

Here are 2 ideas:

  1. Combine the requests triggered within a certain period;
  2. Combine the requests when the request queue reaches a certain length.

Timing is the first factor to consider: if the interface is waiting for the response data, we should not keep the user waiting for too long.

The queue length, however, depends more on the server side. How many items can it process at one time (considering each item may require extra information to process)? Is there a request URL length limit (for GET requests) or a body size limit (load balancers like Nginx have default settings, and sometimes the back-end framework sets limits as well)?

Combining by time

Speaking of time and requests, there are 2 well-known existing strategies: throttle and debounce.

Debounce delays the function invocation by a defined period of time to avoid unnecessary invocations. It works well with button clicks and key events.

Throttle invokes the callback function at regular intervals as long as the event trigger is active. It suits continuous events better.
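
For reference, here are minimal sketches of the two strategies (simplified, not production-grade; libraries such as lodash provide more complete versions):

// Debounce: invoke fn only after `wait` ms have passed without another call.
const debounce = (fn, wait) => {
  let timer;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), wait);
  };
};

// Throttle: invoke fn at most once every `wait` ms while calls keep coming.
const throttle = (fn, wait) => {
  let last = 0;
  return (...args) => {
    const now = Date.now();
    if (now - last >= wait) {
      last = now;
      fn(...args);
    }
  };
};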

Here I will demonstrate a strategy inspired by throttle.

First of all, let's add a fetchItems() helper:

const fetchItems = ids => fetch(`/items?ids=${ids.join(',')}`).then(res => res.json());

Then tweak the original fetchItem() to store each set of parameters in a queue:

const queue = [];
const fetchItem = id => {
  queue.push(id);
  // TODO: flush the queue
};

And hand the parameters over to the fetchItems() helper with a timer that fires after 200ms:

const queue = [];
const fetchItem = id => {
  queue.push(id);
  if (queue.length === 1) {
    setTimeout(() => {
      fetchItems(queue);
      queue.length = 0;
    }, 200);
  }
};

You may notice that I only create a timer when the parameters start to queue up, and clear the queue once the batch request fetchItems() is sent.
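
For example, if three list items are hovered within the same 200ms window, only one batch request goes out:

fetchItem(1);
fetchItem(2);
fetchItem(3);
// After ~200ms the queue is flushed once: GET /items?ids=1,2,3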

Now it should work for requests that do not care about the response (e.g. recording user behaviours). If we would like to process the responses, we may create a Promise and cache it for later use, like this:

const queue = [];
let task;
const fetchItem = id => {
  queue.push(id);
  if (queue.length === 1) {
    task = new Promise((resolve) => {
      setTimeout(() => {
        const ids = queue.slice();
        queue.length = 0;
        resolve(fetchItems(ids));
      }, 200);
    });
  }
  return task;
};

The promise wraps the timer, and every single fetchItem() "request" returns this same promise. As a result, you get all the items in the same batch once the promise resolves:

fetchItem(1).then(items => items[0]);
fetchItem(2).then(items => items[1]);
fetchItem(3).then(items => items[2]);
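
Note that indexing by position assumes the server returns the items in exactly the same order as the requested ids. If that ordering is not guaranteed (it depends on your API), each caller can look up its own item by id instead:

// Assumes each returned item carries its id, e.g. { id: 2, name: '...' }.
fetchItem(2).then(items => items.find(item => item.id === 2));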

Furthermore, we may abstract this logic for future usage:

function batchRequests(request, paramMerger, period = 200) {
  const queue = [];
  let task;

  return (params) => {
    queue.push(params);
    if (queue.length === 1) {
      task = new Promise((resolve) => {
        setTimeout(() => {
          const params = paramMerger(queue.slice());
          queue.length = 0;
          resolve(request(params));
        }, period);
      });
    }
    return task;
  };
}
const fetchItem = batchRequests(
  fetchItems,
  ids => ids
);
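
Note that batchRequests() is called once, so every fetchItem() call shares the same queue and the same pending promise. The same helper can wrap other endpoints too; for instance (fetchUsers and the /users endpoint below are hypothetical):

// A hypothetical list endpoint, shaped like fetchItems() above.
const fetchUsers = ids => fetch(`/users?ids=${ids.join(',')}`).then(res => res.json());

// One shared batcher per endpoint.
const fetchUser = batchRequests(fetchUsers, ids => ids);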

Here is the CodeSandbox to try:

Combining by count

Combining requests by count alone is quite straightforward. Let's reuse the fetchItems() created above:

const MAX_QUEUE = 5;
const queue = [];
const fetchItem = id => {
  queue.push(id);
  if (queue.length === MAX_QUEUE) {
    fetchItems(queue);
    queue.length = 0;
  }
};
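
With MAX_QUEUE set to 5, the first four calls only queue their ids, and the fifth call flushes the batch:

fetchItem(1); // queued
fetchItem(2); // queued
fetchItem(3); // queued
fetchItem(4); // queued
fetchItem(5); // queue reaches 5, GET /items?ids=1,2,3,4,5 is sent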

However, considering only the count is a bad idea, as fetchItem() may never send the request until it has collected enough parameters. Therefore, we should consider time and count together, like this:

const MAX_QUEUE = 5;
const queue = [];
let timer;
const flushQueue = () => {
  clearTimeout(timer);
  fetchItems(queue);
  queue.length = 0;
};
const fetchItem = id => {
  queue.push(id);
  if (queue.length === 1) {
    timer = setTimeout(flushQueue, 200);
  } else if (queue.length === MAX_QUEUE) {
    flushQueue();
  }
};

And oops, it gets tricky if we expect the response data from the function:

const MAX_QUEUE = 5;
const queue = [];
let task;
let flushQueue = () => null;
const fetchItem = id => {
  queue.push(id);
  if (queue.length === 1) {
    const timer = setTimeout(() => flushQueue(), 200);
    task = new Promise((resolve) => {
      flushQueue = () => {
        const ids = queue.slice();
        queue.length = 0;
        clearTimeout(timer);
        resolve(fetchItems(ids));
      };
    });
  } else if (queue.length === MAX_QUEUE) {
    flushQueue();
  }
  return task;
};

This is because the promise has to be resolvable from both conditions, while we only create the timer and the promise once per batch, which is why flushQueue has to be reassigned. It looks quite messy and risky. We will discuss alternatives later.

Similar to the above, we may extract the logic for future usage:

function batchRequests(request, paramMerger, period = 200, maxQueue = 5) {
  const queue = [];
  let task;
  let flush = () => null;
  if (maxQueue < 2) {
    throw new Error("Max queueable requests must be more than 1!");
  }

  return (params) => {
    queue.push(params);
    if (queue.length === 1) {
      const timer = setTimeout(() => flush(), period);
      task = new Promise((resolve) => {
        flush = () => {
          const params = paramMerger(queue.slice());
          queue.length = 0;
          clearTimeout(timer);
          resolve(request(params));
        };
      });
    } else if (queue.length === maxQueue) {
      flush();
    }
    return task;
  };
}
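
A minimal usage sketch, reusing the fetchItems() helper from earlier:

// Flush after 200ms, or as soon as 5 ids have queued up, whichever comes first.
const fetchItem = batchRequests(fetchItems, ids => ids, 200, 5);

fetchItem(1).then(items => console.log(items)); // every caller receives the whole batch
fetchItem(2).then(items => console.log(items));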

Here is the CodeSandbox to try:

The alternative

When a piece of logic gets tricky, it is highly likely that we are doing things wrong or that a simpler alternative exists.

Here, for consuming the fetchItems() response, I believe a state manager such as Redux is a better alternative. With a state manager, we just need to update the items state in our modified fetchItem(). The "reactive" framework will then pick up the change and render the latest data, and fetchItem() no longer needs to resolve a promise.
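
Here is a minimal sketch of that idea, assuming a Redux store whose reducer keeps items keyed by id (the store module, action type, and state shape below are illustrative, not from the original code):

import React from 'react';
import { useSelector } from 'react-redux';
import { store } from './store'; // your configured Redux store (assumed)

// Illustrative action creator; a reducer that stores items by id is assumed.
const itemsReceived = items => ({ type: 'items/received', payload: items });

// fetchItems() is the batch helper defined earlier in this post.
const queue = [];
const fetchItem = id => {
  queue.push(id);
  if (queue.length === 1) {
    setTimeout(() => {
      const ids = queue.slice();
      queue.length = 0;
      // Push the whole batch into the store instead of resolving a promise.
      fetchItems(ids).then(items => store.dispatch(itemsReceived(items)));
    }, 200);
  }
};

export function ListItem({ id }) {
  // The component simply reads its item from the store.
  const item = useSelector(state => state.items[id]) || {};
  return <div onMouseEnter={() => fetchItem(id)}>{item.name}</div>;
}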

The risk

Everything looks pretty nice until the user closes the tab before the request is sent, and your important data (e.g. data related to advertisement charges) gets lost! It is worth considering moving this part of the logic into a Service Worker to avoid such losses.
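
A lighter-weight safeguard (a different technique from the Service Worker approach) is to flush whatever is still queued with navigator.sendBeacon when the page is being hidden; a rough sketch, assuming the server also accepts a POST to /items with a JSON body of ids:

// Flush any queued ids before the tab is hidden or closed.
document.addEventListener('visibilitychange', () => {
  if (document.visibilityState === 'hidden' && queue.length > 0) {
    navigator.sendBeacon('/items', JSON.stringify(queue.splice(0)));
  }
});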

Words at last

To wrap up, handling async requests is an essential front-end skill nowadays, and sometimes it takes more than just applying async/await and the Promise APIs.
