
This is why your Node.js application is slow

Michael Owolabi on February 08, 2022

Many performance-related issues in Node.js applications have to do with how promises are implemented. Yes, you read that right. How you implemented...
𒎏Wii 🏳️‍⚧️ (@darkwiiplayer)

The simple problem with it is that it blocks the event loop when not correctly used.

How does this block the event loop though? Just because one thread of execution is waiting for some data (or just a timeout) doesn't mean the rest of the application can't still be doing stuff.

What you're describing is simply a sequential fetching of data that will slow down a single thread of execution, but not the rest of the application.
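A quick sketch of what I mean, using setTimeout as a stand-in for any async call (the names are just made up for the example):

```js
// A repeating timer keeps firing while the "slow" sequential awaits are pending,
// which shows the event loop itself never stops serving other work.
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

const tick = setInterval(() => console.log('event loop still ticking'), 100);

(async () => {
  console.log('fetching first thing...');
  await sleep(500); // pretend this is a slow database or API call
  console.log('fetching second thing...');
  await sleep(500);
  console.log('done; only this async function was waiting');
  clearInterval(tick);
})();
```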

Michael Owolabi (@imichaelowolabi)

Thank you @darkwiiplayer for your comment. Yes, I agree that there's a way to use async/await so that other parts of the program keep executing while the result of the async operation is pending. However, as shown in the article, there's also a way to use it wrongly so that the event loop is blocked and no other part of the program gets executed until the result of the asynchronous operation becomes available.

𒎏Wii 🏳️‍⚧️ (@darkwiiplayer)

Again, that's not what you show in the article. You're not blocking the event loop; you're just waiting for a timer twice in a row.

Joshua Dale (@dununubatman)

DarkWiiPlayer is correct. In your first example you changed the behavior so that your promises are effectively running in parallel, instead of running sequentially as in your async/await example. The event loop isn't being blocked; you just told it to wait for the first promise to complete before executing the second.
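Roughly the difference in question (sleep just simulates any async call):

```js
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Sequential: the second operation only starts after the first resolves (~2s total).
async function sequential() {
  await sleep(1000);
  await sleep(1000);
}

// Concurrent: both operations start immediately and we wait for both (~1s total).
// Neither version blocks the event loop; they just take different amounts of time.
async function concurrent() {
  await Promise.all([sleep(1000), sleep(1000)]);
}
```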

Michael Owolabi (@imichaelowolabi) • Edited

Thank you so much @darkwiiplayer for your follow-up response, and thank you @dununubatman for the further explanation. Yes, you're correct, and I agree that I did not show that in the article; I am going to update that part as pointed out.
I really appreciate it 🙏🏻

Rahul kumar (@ats1999)

no other part of the program will get executed until the result

This is not correct. await will not block execution of the whole program; it only pauses execution of the async function it is in.
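A tiny example of what I mean:

```js
const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function slowWork() {
  await sleep(1000); // pauses only slowWork(), not the program
  console.log('slowWork finished');
}

slowWork();
console.log('this line runs immediately, before slowWork finishes');
```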

Sanna Jammeh (@sannajammeh)

The event loop is NOT blocked by any async function unless you’re executing blocking code. Await is not blocking code.

Léo Martin (@leomartindev)

Yes, I think there is a HUGE confusion with how the event loop works.
This article is misleading.

Victor Jonah (@vectormike40)

Great article! But I'm still a bit confused: I think you meant to say 'slow down the event loop' and not block it, because long response times alone cannot block the event loop. Things that block the event loop are bad recursion (no termination condition) and sync operations like reading a file synchronously (which blocks for a while).
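For example, the kind of thing that really does block it (the file name is just illustrative):

```js
const fs = require('fs');

// Blocks the event loop: nothing else runs until the whole file is read.
const data = fs.readFileSync('./big-file.json', 'utf8');
console.log(data.length);

// Does not block: the read happens in the background and the callback runs later,
// so the event loop keeps handling other requests in the meantime.
fs.readFile('./big-file.json', 'utf8', (err, contents) => {
  if (err) throw err;
  console.log(contents.length);
});
```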

I totally get your tips!

Michael Owolabi (@imichaelowolabi)

Thank you @vectormike40 for your comment. Yes, while every application that blocks the event loop will have slow response times, the converse is not always true, and that is where the confusion lies. The article doesn't sufficiently show where or how the event loop is blocked, and I'm going to update that part. Thanks once again.

Oyinlola Olasunkanmi (@olasunkanmi)

It is a rule of thumb not to use async/await in a for loop. Also, a try/catch block should be used alongside async/await so as to catch any error that may occur.
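For instance, something along these lines (fetchById is just a placeholder for the real call):

```js
// Start the calls together instead of awaiting inside a for loop,
// and wrap the await in try/catch to handle any rejection.
async function loadAll(ids) {
  try {
    return await Promise.all(ids.map((id) => fetchById(id)));
  } catch (err) {
    console.error('one of the calls failed:', err);
    throw err;
  }
}
```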

Tobias Nickel (@bias)

I disagree with putting try/catch around every await. Most of the time, when there is an error, it should be logged, the current operation should stop, and the frontend should get an error response. That is done by frameworks, and frameworks can often be configured to do this standard handling.

Only when actually doing real handling of an error, like a retry or trying an alternative solution, should a try/catch be used.

Otherwise, try/catch blocks all over the place make code difficult to read (bloated) and often lead to inconsistencies in how errors are logged or discarded and how responses are generated. Because we develop programs in teams, not only on our own...
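For example, assuming an Express-style app (the service name is just a placeholder), the kind of setup I mean looks roughly like this:

```js
const express = require('express');
const app = express();

// One small wrapper forwards any rejected promise to the error middleware,
// so individual routes don't need their own try/catch.
const asyncHandler = (fn) => (req, res, next) =>
  Promise.resolve(fn(req, res, next)).catch(next);

app.get('/employees/:id', asyncHandler(async (req, res) => {
  const employee = await employeeService.findById(req.params.id); // placeholder service
  res.json(employee);
}));

// One central place decides how errors are logged and what the client sees.
app.use((err, req, res, next) => {
  console.error(err);
  res.status(500).json({ error: 'Internal server error' });
});

app.listen(3000);
```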

Oyinlola Olasunkanmi (@olasunkanmi) • Edited

When you use a try/catch block with async/await, you can catch the exception and log it.

Necmettin Begiter (@necmettin)

How did you manage to disagree with something that said the exact same thing you said?

Necmettin Begiter (@necmettin) • Edited

There are fundamental and much more basic problems with the data structure.

First of all, unless you have multiple nextofkin for your employees (which you don't), you must not keep employeeid in the nextofkin data; instead, you must keep nextofkinid in the employee data. This means one less index for the nextofkin data, and you already have a reference to the nextofkin by the time you read the employee data.

Second, if you have a list of employees you have already read from one table, and multiple nextofkin you need to read from another table, you should never, ever read them one by one. In your data structure, the correct way is to create a list of employee ids and fetch all of them from the nextofkin table at once.

Also, as far as I can see, most programmers confuse blocking the event loop with blocking the current response.

If the current response needs to read data from a database, that is not a blocking operation. The event loop will work on other things while ASYNChronously AWAITing for the data. Again, while that current response is waiting for the data, NodeJS will keep working on other requests and responses.
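A rough sketch of the second point (db is a placeholder for whatever client is in use, and the parameter syntax depends on the driver):

```js
// One round trip for the whole list instead of one query per employee.
async function loadNextOfKin(db, employees) {
  const employeeIds = employees.map((employee) => employee.id);
  return db.query(
    'SELECT * FROM nextofkin WHERE employeeid IN (?)',
    [employeeIds]
  );
}
```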

Christian Kozalla (@christiankozalla)

This article got me started on the event loop in Node, so I read up on it in the official documentation (nodejs.org/en/docs/guides/blocking...)

I'd like to quote an example:

"As an example, let's consider a case where each request to a web server takes 50ms to complete and 45ms of that 50ms is database I/O that can be done asynchronously. Choosing non-blocking asynchronous operations frees up that 45ms per request to handle other requests. This is a significant difference in capacity just by choosing to use non-blocking methods instead of blocking methods.
The event loop is different than models in many other languages where additional threads may be created to handle concurrent work."

ilia faramarzpour (@iliafaramarzpour)

Interesting article, I am waiting for more interesting articles from you. 😉🌹

Michael Owolabi (@imichaelowolabi)

Thank you @iliafaramarzpour, glad you found it interesting.

Tobias Nickel (@bias) • Edited

The changes you make are good; they make a single API call faster, but they will not help you get more requests per minute.

That only changes when you use a transaction and, for the duration of the request/task, block some other resource that subsequent requests need to wait for to free up.

Also, when you are using mysql for example, there is not going to be any difference, because the db driver (on each connection) sends one query, waits for the result, then sends the next query. The postgres module (not the pg module) can send all queries instantly and profit from the code style in this article.
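To be clear, the pattern in question is roughly this (db and the queries are placeholders):

```js
// Both queries are created before either is awaited. Whether they actually
// overlap on the wire depends on the driver: a mysql connection still sends
// them one at a time, while a driver that pipelines (like the postgres module
// mentioned above) can send both immediately.
async function loadDashboard(db, userId) {
  const userPromise = db.query('SELECT * FROM users WHERE id = ?', [userId]);
  const ordersPromise = db.query('SELECT * FROM orders WHERE user_id = ?', [userId]);
  const [user, orders] = await Promise.all([userPromise, ordersPromise]);
  return { user, orders };
}
```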

Tobias Nickel (@bias)

github.com/porsager/postgres/issue...

We had some discussion about the topic in this issue.

tldr: technically it is possible to do concurrent/pipelined transactions on a single connection; however, error handling can cause unwanted behavior.

Comment deleted
Michael Owolabi (@imichaelowolabi)

Thank you @romeerez for your concern for good content; I agree with you that we should hold ourselves to a better standard. However, your opening statement just isn't true, because the article had already been updated before your comment.

The article isn't describing the N+1 problem, as that isn't peculiar to Node.js.

Anyway, thank you for your concern.

Roman K (@romeerez)

As for N+1, it's the way you're loading nextOfKin with Promise.all; it's not a proper solution to the problem.

In the example you're using locally defined data and setTimeout to simulate delay; in the real world it will most likely be a database or API call.

In the case of an API call, Promise.all would be OK only if you have no other choice, and in the case of a database call this Promise.all will consume the whole connection pool and block other users from accessing the database; Necmettin Begiter and Tobias Nickel mentioned this in the comments.

Like, if you're awaiting 100 promises and have just 10 available connections, it won't be fast, and other users won't be able to make a query in the meantime.

So I was concerned that newcomers might take this article as teaching material and go and implement it this way on real projects. But since you're editing it and taking care of it, that's good. I wish you success.
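And for the 'no other choice' API-call case, a rough way to keep the concurrency bounded instead of firing all 100 at once (the chunk size and fetchNextOfKin are made up for the example):

```js
// Process the items in chunks so at most `size` calls are in flight at a time.
async function mapInChunks(items, size, worker) {
  const results = [];
  for (let i = 0; i < items.length; i += size) {
    const chunk = items.slice(i, i + size);
    results.push(...(await Promise.all(chunk.map(worker))));
  }
  return results;
}

// e.g. const kin = await mapInChunks(employeeIds, 10, (id) => fetchNextOfKin(id));
```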

Roman K (@romeerez)

Sorry! My bad, really. I read it before, and yesterday I just looked through it very briefly and saw the same problem; I saw the section about the event loop, so I thought it hadn't been changed.

cesar-queb (@cesarqueb)

Thank you for this useful and great article!

J.O.E (@caohoangnam)

Thanks for sharing, that was helpful to me.