🚀 Learn how to debounce and batch asynchronous server requests dynamically — without touching the form component’s internal logic.
In this article, I’ll share a pattern we used to optimize multiple asynchronous requests to our backend. Debounce is most often applied to user keystrokes, such as search inputs, but our scenario is quite different:
- We’re batching default data fetched from the server (initial form values), rather than debouncing user-entered input.
- Our form must load these default filters on mount, causing dozens of individual API calls.
- We wanted to introduce batching and debounce at the request layer, without altering the form component’s internal logic.
🥇 Step 1: The Simplest Form
In Sandbox 1, we start with a minimal form component that renders two selects per item and fetches options on each label change. Everything works perfectly when you have only a handful of items.
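The sandbox itself isn’t reproduced here, but the naive data flow can be sketched roughly as follows. `fetchOptions`, the stubbed responses, and the item shape are illustrative assumptions, not the sandbox’s actual code:

```typescript
// Hypothetical per-label endpoint: one network round trip per call.
let networkCalls = 0;
const fetchOptions = async (labelName: string): Promise<string[]> => {
  networkCalls += 1;
  return [`${labelName}-a`, `${labelName}-b`]; // stubbed server response
};

// Default items as they would arrive from the server (see Step 2).
const defaultItems = [{ labelName: 'status' }, { labelName: 'owner' }];

// What the form effectively does on mount: one request per item.
const loadAllOptions = () =>
  Promise.all(defaultItems.map((item) => fetchOptions(item.labelName)));
```

With two items this is harmless; the problem only surfaces once the item count grows.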
🥈 Step 2: Adding Default Values
Next, we updated the form to accept defaultItems as props so that initial values come from the server. Only a small code change was needed:
--- App.tsx (before)
- export const App = () => {
- return <CustomFrom />;
- }
--- App.tsx (after)
+ const defaultItems = [
+ { labelName: 'status', labelValue: '' },
+ { labelName: 'owner', labelValue: '' }
+ ]
+ export const App = () => {
+ return <CustomFrom defaultItems={defaultItems} />;
+ }
This adaptation appears in Sandbox 2.
🥉 Step 3: Suddenly, 100+ Items
In Sandbox 2, we load two default items. Then, in Sandbox 3, we scale the items to 100+. Our naive approach now fires dozens of near‑simultaneous requests on mount.
😱 We hadn’t anticipated this scale, and performance suffers immediately.
But we didn’t want to refactor the form component itself — it’s used everywhere. Instead, we chose to optimize at the request layer, treating the component as a black box.
🚀 Concept: Batch Debounced Requests
We need a batch method that:
- Collects label names as they come in.
- Debounces for a short interval.
- Sends one batched API call with all pending names.
- Resolves each request promise with its specific data.
This lets us keep the form component unchanged while dramatically reducing network chatter.
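Before the full class, the requirements above can be sketched as a minimal standalone function. `batchFetch` here is a stand-in for a hypothetical endpoint that answers many keys in one call; the real implementation follows in the next section:

```typescript
// Minimal sketch of the strategy: collect keys, wait briefly, send one batch.
const createBatcher = (
  batchFetch: (keys: string[]) => Promise<Record<string, string>>,
  debounceMs = 50
) => {
  let pending: {
    key: string;
    resolve: (v: string) => void;
    reject: (e: unknown) => void;
  }[] = [];
  let timer: ReturnType<typeof setTimeout> | null = null;

  const flush = async () => {
    timer = null;
    const batch = pending; // take ownership of the current batch
    pending = [];
    try {
      const results = await batchFetch(batch.map((p) => p.key));
      batch.forEach((p) => p.resolve(results[p.key]));
    } catch (err) {
      batch.forEach((p) => p.reject(err));
    }
  };

  // Each caller gets its own promise; all of them share one network call.
  return (key: string) =>
    new Promise<string>((resolve, reject) => {
      pending.push({ key, resolve, reject });
      if (!timer) timer = setTimeout(flush, debounceMs);
    });
};
```

Every caller within the debounce window shares a single `batchFetch` invocation, yet each still receives only its own slice of the result.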
Implementation (Promise-based Queue)
Our implementation uses a generic AbstractUsageQueue class with a request(key: string): Promise<…> API.
Sandbox 4 (Promise version)
Understanding the Problem
Imagine a form component that, upon mounting, fetches a list of options from the server. If the backend provides default values for these options, the component might trigger multiple requests to fetch each option individually. This can result in unnecessary network traffic and potential performance bottlenecks.
Traditional debouncing techniques focus on limiting the rate at which user inputs trigger actions. However, in our case, the challenge lies in managing multiple asynchronous requests triggered by the server, not the user. Therefore, we need a strategy that allows us to batch these requests and debounce their execution without modifying the component's code.
Introducing the AbstractUsageQueue
To address this, we can implement a generic queue system that handles the batching and debouncing of server requests. This system will manage a queue of request identifiers and ensure that requests are sent only when necessary, either after a specified debounce period or when the queue reaches a certain length.
Here's an implementation of the AbstractUsageQueue class:
// Signature of the batched endpoint: takes the built params and
// resolves to a record mapping each key to its result.
type BatchRequestFn<I extends string, A> = (
  ...params: any[]
) => Promise<Record<I, A>>;
export class AbstractUsageQueue<I extends string, A> {
  private queue = new Set<I>();
  private resolvers = new Map<
    I,
    { resolve: (value: A) => void; reject: (reason?: any) => void }
  >();
  private timer: ReturnType<typeof setTimeout> | null = null;
  constructor(
    private batchRequest: BatchRequestFn<I, A>,
    private buildParams: (ids: I[]) => any[] = (ids) => [ids],
    private debounceMs = 300,
    private maxQueueSize = 100
  ) {}
  public request(key: I): Promise<A> {
    // Register the resolver before any flush can run,
    // so a size-triggered flush always finds it.
    const promise = new Promise<A>((resolve, reject) => {
      this.resolvers.set(key, { resolve, reject });
    });
    this.queue.add(key);
    // If queue hits max size, flush immediately
    if (this.queue.size >= this.maxQueueSize) {
      this.flush();
    } else if (!this.timer) {
      // Otherwise debounce the flush
      this.timer = setTimeout(() => this.flush(), this.debounceMs);
    }
    return promise;
  }
  private async flush(): Promise<void> {
    if (this.timer) {
      clearTimeout(this.timer);
      this.timer = null;
    }
    const keys = Array.from(this.queue);
    this.queue.clear();
    try {
      const params = this.buildParams(keys);
      const result = await this.batchRequest(...params);
      keys.forEach((key) => {
        const { resolve, reject } = this.resolvers.get(key)!;
        if (result[key] !== undefined) {
          resolve(result[key]);
        } else {
          reject(new Error(`No result for key: ${key}`));
        }
        this.resolvers.delete(key);
      });
    } catch (error) {
      // Reject all pending promises on error
      keys.forEach((key) => {
        const { reject } = this.resolvers.get(key)!;
        reject(error);
        this.resolvers.delete(key);
      });
    }
  }
}
In this implementation:
- batchRequest: A function that accepts the built parameters and returns a promise resolving to a record of results keyed by identifier.
- buildParams: Maps the collected ids into the argument list for batchRequest (by default, a single array of ids).
- debounceMs: The debounce delay in milliseconds.
- maxQueueSize: The maximum number of items in the queue before forcing an immediate flush.
- flush: A method that processes the queue, sending the batched request and resolving or rejecting the corresponding promises.
Integrating with Components
To use this queue system within a component, you can instantiate the AbstractUsageQueue class and call its request method whenever a request is needed. Here's an example:
const queue = new AbstractUsageQueue(fetchLabels);
const handleRequest = async (key: string) => {
try {
const result = await queue.request(key);
console.log(result);
} catch (error) {
console.error(error);
}
};
In this example, fetchLabels is a batch function that fetches results for an array of keys in a single server call. The handleRequest function routes each lookup through the queue, ensuring that requests are batched and debounced appropriately.
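The size-cap behavior is worth seeing in isolation: once the queue reaches `maxQueueSize`, the batch goes out immediately rather than waiting out the debounce interval. The following is a simplified standalone sketch of just that mechanism, with `sendBatch` as a stand-in for the real endpoint:

```typescript
// Record of every batch that was "sent", for inspection.
const flushedBatches: string[][] = [];
const sendBatch = (keys: string[]) => {
  flushedBatches.push(keys);
};

const maxQueueSize = 3;
const debounceMs = 1000; // deliberately long: the size cap should win
let queue: string[] = [];
let timer: ReturnType<typeof setTimeout> | null = null;

const flush = () => {
  if (timer) clearTimeout(timer);
  timer = null;
  sendBatch(queue);
  queue = [];
};

const enqueue = (key: string) => {
  queue.push(key);
  // Size cap reached: flush now instead of waiting for the debounce timer.
  if (queue.length >= maxQueueSize) flush();
  else if (!timer) timer = setTimeout(flush, debounceMs);
};

// Six keys with a 1s debounce still go out immediately, in two batches of 3.
['a', 'b', 'c', 'd', 'e', 'f'].forEach(enqueue);
```

This is why a burst of 100+ mount-time requests resolves in a handful of batches rather than after a long chain of debounce delays.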
🎉 Conclusion
By moving debouncing and batching out of the component and into a standalone request layer, we:
- Avoided refactoring dozens of components.
- Reduced network calls from 100+ down to a handful of batched requests.
- Kept the form component untouched and maintainable.
This pattern can be applied wherever you need to debounce or batch parameterized API calls at scale. A generic queue like AbstractUsageQueue lets us batch and debounce server requests without modifying the component’s internal logic, improving both performance and maintainability.
For a live demonstration and interactive examples, you can explore the following CodeSandbox:
👉 Interactive Example on CodeSandbox
This sandbox provides a hands-on experience with the concepts discussed and allows you to experiment with the implementation in a controlled environment.