You know that one JavaScript meme? Yeah, this one:
That’s exactly how a queue feels the longer you code, especially when you go low-level.
The craziest part? It’s the simplest data structure ever.
You can simulate one with an array as easily as:
// NOTE: not recommended, shift() is O(n)
const queue = []
queue.push(42)  // enqueue
queue.shift()   // dequeue -> 42
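That shift() is the catch: it re-indexes the whole array, which is why it's O(n). If that ever matters, a head index gives you amortized O(1) dequeues. A minimal sketch (my own, not any library's API):

class Queue {
  constructor() {
    this.items = []
    this.head = 0
  }
  enqueue(value) {
    this.items.push(value)
  }
  dequeue() {
    if (this.head === this.items.length) return undefined // empty
    const value = this.items[this.head++]
    // Occasionally drop the consumed prefix so memory doesn't grow forever
    if (this.head > 64 && this.head * 2 >= this.items.length) {
      this.items = this.items.slice(this.head)
      this.head = 0
    }
    return value
  }
  get size() {
    return this.items.length - this.head
  }
}

const q = new Queue()
q.enqueue(42)
q.dequeue() // 42, no re-indexing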
And here’s why this is wild: if you’re like me and started coding back in the ancient year of 2018, you remember how terrifying algorithms seemed.
I can’t count how many interviews I failed because I had no idea what “find the maximum-sum contiguous subarray (Kadane’s algorithm)” meant.
Fast forward a few years, and now?
A queue is what I throw at everything.
Not just me; literally the whole industry runs on queues.
Node.js? Queue.
MySQL driver? Queue.
RabbitMQ? I’ll stop before this turns into a sermon.
Point is: you’ll use queues. A lot. Even in production.
Yes, you will.
Things I’ve Used a Queue For
1. Buffering Real-Time Data
When working with real-time data, there’s always a producer and a consumer:
producer (usually server) ---> consumer (frontend)
Classic problem? Back pressure.
The producer sends data too fast, the consumer can’t keep up, and everything lags.
You can’t pause real-time data, so what do you do?
Yep: queue the data on the frontend. Classic move. I’ve used it tons of times.
producer (server) ---> queue ---> consumer (frontend)
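Here’s a minimal sketch of that frontend buffer, assuming a WebSocket producer and a requestAnimationFrame-driven consumer (the URL and render() are placeholders, not a real API):

const buffer = []

const socket = new WebSocket('wss://example.com/feed') // placeholder URL
socket.onmessage = (event) => {
  buffer.push(JSON.parse(event.data)) // enqueue as fast as the producer sends
}

function drain() {
  // Dequeue a bounded batch per frame so the UI never freezes
  let budget = 100
  while (budget-- > 0 && buffer.length > 0) {
    render(buffer.shift()) // render() = whatever updates your UI
  }
  requestAnimationFrame(drain)
}
requestAnimationFrame(drain)

The consumer drains at its own pace; if the producer bursts, the extra messages just wait their turn in the queue.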
2. Synchronizing Threads
Once, I built an immediate mode GUI (imgui) for Node.js.
Quick context: Node.js is single-threaded (one lane), so I had to spawn a separate libuv thread to simulate and render the GUI.
If you’ve worked with ImGui, you know it’s like a game loop. The GUI runs separately; you inject data, it draws each frame:
for (imgui()) {
  inject_state()
  render_gui()
  repeat()
} // not real code, just for illustration
Meaning: state and GUI live in separate worlds.
Some of my state lived in the Node.js thread, while the GUI rendered in the libuv thread.
I couldn’t just stop the running loop to sync them, so when an image element was created, it got buffered in a queue.
On the next tick, the loop pulled it and uploaded it to the GPU, caching it in a sparse set resource manager.
That part looked like this:
{
  // Runs each tick: drain the texture-load queue under a lock
  std::lock_guard<std::mutex> lock(globalIPCData.textureLoadQueueMutex);
  while (!globalIPCData.textureLoadQueue.empty()) {
    auto &[key, path] = globalIPCData.textureLoadQueue.front(); // peek
    gTextures.LoadTexture(key, path); // upload to the GPU and cache it
    globalIPCData.textureLoadQueue.pop(); // dequeue
  }
}
The wild part? It was blazingly fast.
3. BunniMQ Driver - Pure JavaScript Message Broker
Here’s another one.
Think of a driver as a wrapper over a low-level I/O system, like a MySQL driver.
I/O is async by nature. MySQL makes requests over the network; you never know when (or if) you’ll get a response.
Drivers solve that by buffering operations in a queue, making async code feel sync.
That’s why you can write:
const db = mysql.connect(creds)
db.insert(doc)
db.update(id, doc)
Behind the scenes, each operation is queued. When one finishes, the next runs.
Without that queue, your code would look like this mess:
mysql.connect(creds).then((db) => {
  db.insert(doc).then(() => {
    db.update(id, doc)
  })
})
See? The queue saves your sanity.
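Under the hood, a driver can pull that off with nothing fancier than a promise chain acting as the queue. A minimal sketch of the idea (the class and sendOverNetwork are illustrative, not bunniMQ’s or any real driver’s API):

// Stand-in for the real network call (illustrative only)
function sendOverNetwork(message) {
  return new Promise((resolve) => setTimeout(() => resolve(message), 10))
}

class Driver {
  constructor() {
    this.tail = Promise.resolve() // the queue: a chain of pending operations
  }
  enqueue(operation) {
    // Each new operation starts only after the previous one settles -> FIFO order
    const result = this.tail.then(operation)
    this.tail = result.catch(() => {}) // keep the chain alive if one op fails
    return result
  }
  insert(doc) {
    return this.enqueue(() => sendOverNetwork({ type: 'insert', doc }))
  }
  update(id, doc) {
    return this.enqueue(() => sendOverNetwork({ type: 'update', id, doc }))
  }
}

const db = new Driver()
db.insert({ name: 'bunni' })       // queued first
db.update(1, { name: 'bunni v2' }) // runs only after the insert settles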
That’s exactly how I built bunniMQ and its driver, a pure JavaScript message broker built entirely on queues.
You can check it out if you’re curious.
And honestly, that’s not even half the times I’ve reached for a queue.
Node.js cluster syncing? Queue.
Breadth-first file system traversal? Queue (sketch below).
Real-time audio ring buffers? Definitely a queue.
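That breadth-first traversal, for instance, really is just a queue of directories. A quick sketch using Node’s fs module:

const fs = require('fs')
const path = require('path')

function bfsWalk(root) {
  const queue = [root] // enqueue the starting directory
  while (queue.length > 0) {
    const dir = queue.shift() // dequeue
    for (const entry of fs.readdirSync(dir, { withFileTypes: true })) {
      const full = path.join(dir, entry.name)
      if (entry.isDirectory()) queue.push(full) // enqueue subdirectories
      else console.log(full) // files get visited level by level
    }
  }
}

bfsWalk('.')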
So yeah, if you’re not convinced yet, just remember: Node.js itself is one giant queue for I/O and async.
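Don’t take my word for it; you can watch Node drain its queues in priority order with three lines:

console.log('sync')                                    // runs immediately
setTimeout(() => console.log('timer'), 0)              // queued for the timers phase
Promise.resolve().then(() => console.log('microtask')) // queued as a microtask
// Prints: sync, microtask, timer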
Queue in C++
The Standard Template Library (STL) makes queues just as simple:
#include <iostream>
#include <queue> // <- here

using namespace std;

int main() {
  // Declare a queue that holds integers
  queue<int> my_queue;

  // enqueue
  my_queue.push(101); // Packet #1 arrives
  my_queue.push(202); // Packet #2 arrives
  my_queue.push(303); // Packet #3 arrives
  cout << "Three packets added to the queue." << endl;

  // peek
  cout << "Next packet to process: " << my_queue.front() << endl; // 101

  // dequeue
  my_queue.pop();
  cout << "Processed the front packet." << endl;

  // now who’s next?
  cout << "Next packet is now: " << my_queue.front() << endl; // 202
}
Dead simple.
When in doubt, pull out the old trusty queue.
After all, most of the world already does.