
Discussion on: Using Bull.js to manage job queues in a Node.js micro-service stack

Mister Singh (@mistersingh179)

Hi @aleccool213. I am new to Bull and was hoping you could help me understand this.

Couldn't the worker just be run in a separate process on a different server? As long as it can connect to the same Redis instance and knows the name of the queue, it will receive the jobs and can process them.

// worker.js

const Bull = require('bull')
const myFirstQueue = new Bull('my-first-queue', 'redis://mypassword@myredis.server.com:1234')
// '*' processes every named job added to this queue
myFirstQueue.process('*', (job, done) => { done() })

// run with: node worker.js
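
For reference, the producer on the other server would only need the same queue name and Redis URL. A minimal sketch (the 'email' job name and payload are made up for illustration):

// producer.js

const Bull = require('bull')
const myFirstQueue = new Bull('my-first-queue', 'redis://mypassword@myredis.server.com:1234')
// the '*' processor in worker.js picks this up regardless of the job name
myFirstQueue.add('email', { to: 'someone@example.com' })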

Is this not what is being shown in diagram 2, or are you accomplishing something different that I am not following?

Thanks

Alec Brunelle (@aleccool213)

Good question. "Micro-services" is a bloated term, so I did assume a bit. When I talk about services, I mean they are completely separate and share nothing, including Redis instances. The pro of this approach is that any of these services (except the job queue, of course) can be written in any programming language. There are more pros to this approach, but to get those I recommend researching micro-service architecture yourself 👍
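
For example, to keep Redis private to the job-queue service, that service could expose a small HTTP API that the other services (written in any language) call to enqueue work. A rough sketch, assuming Express; the /jobs route and payload shape are my own illustration, not taken from the article:

// job-queue-service.js

const express = require('express')
const Bull = require('bull')

const app = express()
app.use(express.json())

// Redis is owned by this service only; nothing else connects to it
const myFirstQueue = new Bull('my-first-queue', 'redis://mypassword@myredis.server.com:1234')

// Other services POST { name, data } here instead of talking to Redis directly
app.post('/jobs', async (req, res) => {
  const job = await myFirstQueue.add(req.body.name, req.body.data)
  res.status(201).json({ id: job.id })
})

// The workers also live inside this service
myFirstQueue.process('*', async (job) => {
  // ...do the work described by job.data
})

app.listen(3000)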