
Using Bull.js to manage job queues in a Node.js micro-service stack

Alec Brunelle on February 02, 2019

Jeremy Forsythe

We're doing something similar with microservices and workers processing jobs coming out of the queue. Our current architecture has one "producer" worker adding jobs to the queue and a "consumer" worker using the .process() method to grab each job when it's ready. Both of these workers are horizontally scalable. The "consumer" worker determines what type of job it is and hands it off to one of several workers (by calling a gRPC service) based on the job data.

Is there something in particular you ran into that made you decide to manually move jobs through the queue instead of letting Bull handle it? So far we haven't run into any issues, and I hadn't anticipated any since we can easily add more consumer workers to handle increased load, but I'd be interested to know what problems led you to poll for jobs manually instead of using the built-in mechanisms.
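For reference, the producer/consumer split described above can be sketched roughly like this. The queue name, job types, and handlers are invented for illustration, and the gRPC hand-off is reduced to a plain dispatch table; the Bull wiring itself is left as comments since it needs a running Redis instance.

```javascript
// Dispatch table standing in for the gRPC services mentioned above.
// Job names and handler behavior are hypothetical.
const handlers = {
  'send-email': async (data) => ({ sent: true, to: data.to }),
  'resize-image': async (data) => ({ resized: true, src: data.src }),
};

// Pure routing step: the consumer inspects the job name and picks a handler.
function routeJob(name) {
  const handler = handlers[name];
  if (!handler) throw new Error(`No handler for job type: ${name}`);
  return handler;
}

// Bull wiring (not run here; needs the `bull` package and a Redis server):
// const Bull = require('bull');
// const jobs = new Bull('jobs', 'redis://127.0.0.1:6379');
// jobs.add('send-email', { to: 'user@example.com' });       // producer
// jobs.process('*', (job) => routeJob(job.name)(job.data)); // consumer
```

Keeping the routing logic separate from the Bull wiring makes it easy to test without Redis running.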

Alec Brunelle

"Is there something in particular you ran into that made you decide to manually move jobs through the queue instead of letting Bull handle it?"

For us, workers (consumers in your case) needed to be writable in any language or framework. That, plus the need for horizontal scalability like you mentioned.

Producers for us could be anyone, as the Bull queue service was hooked up to Kafka topics for different job types. As messages came into the topics, the queue service pulled them and inserted the jobs into the corresponding queues.
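A minimal sketch of that Kafka-to-Bull hand-off might look like the following. The topic names, payload shape, and `makeKafkaToBullBridge` helper are assumptions, not the original code; the Bull queues are injected so the bridging logic stays testable without Kafka or Redis running.

```javascript
// Map of Kafka topic name -> Bull queue instance (or a stub in tests).
function makeKafkaToBullBridge(queuesByTopic) {
  // Shaped like a kafkajs `eachMessage` handler.
  return async function eachMessage({ topic, message }) {
    const queue = queuesByTopic[topic];
    if (!queue) return null; // topic we don't bridge: ignore
    const payload = JSON.parse(message.value.toString());
    // Bull: queue.add(name, data) inserts a named job into the queue.
    return queue.add(payload.jobType, payload.data);
  };
}

// Rough wiring with kafkajs (not run here):
// const { Kafka } = require('kafkajs');
// const kafka = new Kafka({ brokers: ['kafka:9092'] });
// const consumer = kafka.consumer({ groupId: 'bull-bridge' });
// await consumer.subscribe({ topic: 'jobs' });
// await consumer.run({ eachMessage: makeKafkaToBullBridge({ jobs: bullQueue }) });
```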

Consumers pulled jobs from the Bull queue through an HTTP REST API that lived in the Bull queue service.
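That REST layer could be sketched roughly as below. The route shapes and the `makeJobApi` helper are hypothetical; the sketch assumes Bull's manual-fetching API (`queue.getNextJob`, `queue.getJob`, `job.moveToCompleted`), which lets consumers in any language poll over HTTP instead of calling `.process()`.

```javascript
// Handlers an Express app could mount; `queue` is injected so the logic
// is testable with a stub.
function makeJobApi(queue) {
  return {
    // e.g. GET /jobs/next -> fetch-and-lock the next waiting job, or null
    async next() {
      const job = await queue.getNextJob();
      if (!job) return null;
      return { id: job.id, name: job.name, data: job.data };
    },
    // e.g. POST /jobs/:id/complete -> report the consumer's result
    async complete(id, result) {
      const job = await queue.getJob(id);
      if (!job) throw new Error(`Unknown job: ${id}`);
      // Store the return value and release the job's lock.
      return job.moveToCompleted(result, true);
    },
  };
}
```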

This was a while ago, at a different company than the one I'm at currently, but I wish we had made the Kafka and HTTP REST code we wrote around Bull open-source.

Mister Singh

Hi @aleccool213, I'm new to Bull and was hoping you could help me understand this.

Couldn't the worker just be run in a separate process on a different server? As long as it can connect to the same Redis instance and knows the name of the queue, it will receive jobs and can process them.

// worker.js

const Bull = require('bull') // package name is lowercase

// Bull accepts a Redis connection string as the second argument;
// a password-only URL needs a colon before the password.
const myFirstQueue = new Bull('my-first-queue', 'redis://:mypassword@myredis.server.com:1234')

// '*' matches any named job added to this queue
myFirstQueue.process('*', (job, done) => {
  done()
})

// node worker.js

Is this not what's being shown in diagram 2, or are you accomplishing something different that I'm not following?

Thanks

Alec Brunelle

Good question. "Micro-services" is a bloated term, so I did assume a bit. When I talk about services, I mean they are completely separated, with nothing shared, including Redis instances. The pro of this approach is that any of these services (except the job queue, of course) can be written in any programming language. There are more pros to this approach, but to get those I'd recommend you research micro-service architecture yourself 👍

Dr. Pepper

Then you can easily wrap bull in an API for use with external systems.

It would be cool for people who are just starting out with microservices to see an example of this.

Alec Brunelle

I agree. Unfortunately, it would take some effort on my end to reproduce this. I implemented it at a company I used to work at, and we didn't make that code open-source (it totally should have been).

Harsh Makwana

How can I remove or update delayed jobs in BullMQ? There's the drain option, but it deletes all the delayed jobs the queue contains. Is there any way to remove or update individual jobs in the delayed set?
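One possible approach (a sketch, assuming BullMQ's `queue.getDelayed()`, `job.remove()`, and `job.changeDelay()` methods): instead of `drain()`, which discards every waiting and delayed job, list the delayed jobs and remove or reschedule only the ones that match. The `matches` predicate and function name here are hypothetical.

```javascript
// Remove only the delayed jobs matching a predicate, leaving the rest.
// `queue` is a BullMQ Queue (or a stub); `matches` is a caller-supplied
// predicate over jobs.
async function removeMatchingDelayed(queue, matches) {
  const removedIds = [];
  // BullMQ: queue.getDelayed() lists jobs currently in the delayed set.
  for (const job of await queue.getDelayed()) {
    if (matches(job)) {
      await job.remove(); // delete just this job
      removedIds.push(job.id);
    }
    // To update instead of delete: await job.changeDelay(newDelayMs);
  }
  return removedIds;
}
```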