Vimalraj Selvam

Asynchronous processing of data in Expressjs

I have an Express route that receives some data, processes it, and then inserts it into Mongo (using Mongoose).

This works well when I return the response only after the following steps are done (a rough sketch follows the list):

  • Receive request
  • Process the request data
  • Insert the processed data into Mongo
  • Return 204 response
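
Roughly, the blocking version of the controller looks like this (enqueue is the process-and-insert step shown in the sample further down):

async function expressController(request, response) {
    try {
        // Process and insert first, only then respond
        await enqueue(request.body)
        response.status(204).send()
    } catch (err) {
        response.status(500).send()
    }
}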

But the client will be calling this API concurrently for millions of records, so the requirement is not to block the client while the data is being processed. Hence I made a small change in the code:

  • Receive request
  • Return response immediately with 204
  • Process the requested data
  • Insert the processed data into Mongo

The above works fine for the first few requests (say a few thousand); after that the client gets a socket exception: connection reset by peer. I guess it is because the server is blocking the connection as no port is free, and at some point I notice my Node.js process throwing an out-of-memory error.

Sample code is as follows:

async function enqueue(data) {
    // 1. Process the data
    // 2. Insert the data in mongo
}

async function expressController(request, response) {
    logger.info('received request')

    // Respond immediately so the client is not blocked
    response.status(204).send()

    // Processing happens after the response has already been sent
    try {
        await enqueue(request.body)
    } catch (err) {
        throw new Error(err)
    }
}


Am I doing something wrong here?

Top comments (2)

Márton Papp

For this use case I suggest using a message broker like RabbitMQ or Google Pub/Sub. In the controller you only have to push the request body onto a queue, and do the processing and inserting into Mongo in a completely separate and scalable worker process that subscribes to that topic.
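
Roughly, the producer and the worker could look like this with amqplib (the queue name and connection URL are just placeholders):

const amqp = require('amqplib')

const QUEUE = 'enqueue-requests'
let channel

async function getChannel() {
    if (!channel) {
        const connection = await amqp.connect('amqp://localhost')
        channel = await connection.createChannel()
        await channel.assertQueue(QUEUE, { durable: true })
    }
    return channel
}

// Controller: only publish the raw body and respond right away
async function expressController(request, response) {
    const ch = await getChannel()
    ch.sendToQueue(QUEUE, Buffer.from(JSON.stringify(request.body)), { persistent: true })
    response.status(204).send()
}

// Separate worker process: consume, process, insert into Mongo
async function startWorker() {
    const ch = await getChannel()
    ch.prefetch(10) // bound how many messages are processed at once
    ch.consume(QUEUE, async (msg) => {
        try {
            const data = JSON.parse(msg.content.toString())
            // 1. Process the data
            // 2. Insert the processed data into Mongo
            ch.ack(msg)
        } catch (err) {
            ch.nack(msg, false, false) // drop (or dead-letter) on failure
        }
    })
}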

Also, if you need something really fast, check out Fastify and its benchmarks.

Vimalraj Selvam

Yeah, I did think about having a queue to take care of this. I'll see if I can put it in a message queue.

Regarding Fastify, I'll definitely give it a try.