
Optimize your JS code in 10 seconds

Arthur Kh on November 21, 2023

Performance is the heartbeat of any JavaScript application, wielding a profound impact on user experience and overall success. It's not just about ...
Eckehard

You should distinguish between application performance and Javascript performance. These are NOT the same:

  • Usually, an application needs some data. Fetching data may cause a significant delay.
  • Then, Javascript can handle this data, which is usually the fastest part of the show. This is where your "optimization" takes place!
  • After that, maybe you cause complex state changes. It may take much more time to handle all the dependencies in your app. Luckily, all this is done on the VDOM, so it will not cause any further delay.
  • If you use React, all your interactions have to be diffed against the real DOM.
  • After the DOM is ready, the browser has to render it. This is usually the most time-consuming part.
  • If you change some resources, the browser may need to fetch new images. Even that cannot start before the DOM is ready, so the user has to wait for all elements to load.

Do you think cutting the time of the smallest part (the Javascript routines) will cause any measurable performance boost?

Rense Bakker

Actually, continuously updating the VDOM (rerendering) is the biggest performance bottleneck for React apps (other than fetching data). The proposed optimizations help to decrease the number of rerenders as well as reduce the cost of each rerender.

Eckehard

If you look at React's history, the VDOM was invented to free developers from the burden of writing performant code. So it's really an interesting question whether streamlining your Javascript will have much impact at all. If so, React did a bad job.

Arthur Kh

It was invented to save time on browser DOM search and updating, because DOM search methods are slower than a plain JS tree/array search.
And VDOM updating is based on the basic, fast === comparison operator instead of deep equality, which is way slower; it's easier to delegate reference updates to devs than to invent a fast deep equality.
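That reference-vs-deep-equality tradeoff can be sketched in plain JS. A minimal illustration, with made-up function names (this is not React's actual implementation):

```javascript
// A diff can skip a whole subtree when props are referentially
// equal (===), an O(1) check, instead of walking the structure.
function sameProps(prev, next) {
  return prev === next; // reference check, O(1)
}

// Deep equality has to visit every key, which costs O(size of object).
function deepEqual(a, b) {
  if (a === b) return true;
  if (typeof a !== "object" || typeof b !== "object" || a === null || b === null) {
    return false;
  }
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  if (keysA.length !== keysB.length) return false;
  return keysA.every((k) => deepEqual(a[k], b[k]));
}

const props = { items: [1, 2, 3] };
console.log(sameProps(props, props));                // true: same reference
console.log(sameProps(props, { items: [1, 2, 3] })); // false: new object, treated as "changed"
console.log(deepEqual(props, { items: [1, 2, 3] }));  // true, but had to walk the structure
```

This is why updating state immutably (creating a new reference) is what signals "changed" to the diff, even when the contents happen to be equal.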

Rense Bakker

VDOM was invented to make DOM updates more efficient, and it did. How often you update the VDOM and how long the VDOM update takes is still up to the developer. If you loop over a million table rows on every render, obviously it's going to cause problems.

Eckehard

You are totally right. But that means you should avoid looping unnecessarily over millions of table rows, not just loop faster!

Schemetastic (Rodrigo)

Hey! Great tips! There are other methods I like too, such as .trim() and .some()

JorensM • Edited

This gave me an idea for a library: one that implements all the common array utils but uses the Builder pattern, so you can chain the util functions together and they run together in a single pass.
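A minimal sketch of that idea (all names here — ArrayChain, collect — are invented for illustration, not a real library):

```javascript
// A builder that records map/filter steps without running them,
// then applies every step in a single pass over the source array.
class ArrayChain {
  constructor(source) {
    this.source = source;
    this.steps = []; // queued operations; nothing runs yet
  }
  map(fn) {
    this.steps.push({ kind: "map", fn });
    return this; // chainable
  }
  filter(fn) {
    this.steps.push({ kind: "filter", fn });
    return this;
  }
  collect() {
    const out = [];
    // One loop over the source; each element flows through every step.
    outer: for (let value of this.source) {
      for (const step of this.steps) {
        if (step.kind === "map") value = step.fn(value);
        else if (!step.fn(value)) continue outer; // filtered out
      }
      out.push(value);
    }
    return out;
  }
}

const result = new ArrayChain([1, 2, 3, 4, 5])
  .map((n) => n * 2)
  .filter((n) => n > 4)
  .collect(); // [6, 8, 10]
```

The key design point is that map and filter only queue work; collect is the single terminal operation that actually iterates.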

Anyhow great article and thanks for the tip!

 
Arthur Kh

I completely agree with that Promise.then point; async/await syntax is fantastic and widely compatible across browsers and Node environments, so it should be prioritized.

When it comes to naming and organizing components, I'm a fan of atomic design. Its structure is exceptional, clear, and easily shareable within a team.

Chalist

Your Telegram channel is in Russian, so unfortunately I can't understand it.

Arthur Kh • Edited

Hey!
I've developed a bot which automatically translates my posts into popular languages so anyone can read them.

It's sometimes hard to choose which language to use when you're trilingual :D

Ben Sinclair

Personally I steer clear of using reduce for all but the simplest cases because it gets unreadable pretty quickly.

Arthur Kh

Using .reduce might lead to unreadable code, particularly when the reducer function grows larger. However, the reducer can be refactored into a more readable form.
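For example, one such refactoring is extracting the reducer into a named function (the data and names below are made up for illustration):

```javascript
const orders = [
  { status: "paid", total: 40 },
  { status: "open", total: 25 },
  { status: "paid", total: 10 },
];

// Inline reducer: harder to scan once the callback grows.
const paidInline = orders.reduce(
  (sum, o) => (o.status === "paid" ? sum + o.total : sum),
  0
);

// Extracted, named reducer reads like a sentence:
function addPaidTotal(sum, order) {
  return order.status === "paid" ? sum + order.total : sum;
}
const paidNamed = orders.reduce(addPaidTotal, 0);

console.log(paidInline, paidNamed); // 50 50
```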

Luke

Great tips, great article, thanks!

Arthur Kh

Thank you for your feedback, @lnahrf! Are there any JavaScript antipatterns you're familiar with and dislike?

François-Marie de Jouvencel • Edited

Whatever the number of loops, you'll do the exact same operations!

This code:

for (let i = 0; i < 2; ++i) {
  opA();
  opB();
}

Is almost exactly the same as

opA();
opB();
opA();
opB();

Which again is the same as (assuming order of operations doesn't matter):

for (let i = 0; i < 2; ++i) {
  opA();
}

for (let i = 0; i < 2; ++i) {
  opB();
}

The only thing you're saving is a counter increment. It's peanuts.

Akash Kava • Edited

Compare reduce with for...of, and you'll realize that for...of performs faster than reduce. I would give reduce the lowest priority of the three: filter, map, and reduce can all be done in a single for...of.
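To illustrate the point (data made up), one for...of can do the work of map + filter + reduce in a single pass:

```javascript
const nums = [1, 2, 3, 4, 5, 6];

// Chained array methods: three passes over the data.
const chained = nums
  .map((n) => n * n)
  .filter((n) => n % 2 === 0)
  .reduce((sum, n) => sum + n, 0);

// Single for...of: one pass, same result.
let single = 0;
for (const n of nums) {
  const sq = n * n;           // map step
  if (sq % 2 !== 0) continue; // filter step
  single += sq;               // reduce step
}

console.log(chained, single); // 56 56
```

Whether the one-pass version is actually measurably faster depends on the engine and data size; the readability tradeoff is discussed in the replies below.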

Arthur Kh

The choice between array methods like map, filter, and reduce versus explicit loops often comes down to code style. While both approaches achieve similar results, the array methods offer clearer intent: map transforms each element, filter removes specific elements from an array, and reduce accumulates a value. A plain loop like for, on the other hand, can perform any task, so it might require more effort to grasp: readers have to spend time understanding the logic within the loop's body.

Ghaleb

I don’t believe this to be accurate.

For starters, just because you're looping once doesn't mean the total number of operations changed. In your reduce function, you have two operations per iteration. In the chained methods there are two loops with one operation per iteration each.

In both cases the time complexity is O(2N), which is O(N). That means their performance will be the same except for the absolutely minor overhead of having two array methods instead of one.

Even if we assume you managed to reduce the number of operations, you can't reduce it by a full order of magnitude. At best (if at all possible) you can make your optimization O(N) while the chained version is O(xN), and for any constant x the difference will be quite minimal and only observable in extreme cases.

This is just a theoretical analysis of the proposal here. I haven't done any benchmarks, but my educated guess is that they will be in line with this logic.
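The analysis can be sanity-checked with a rough micro-benchmark sketch (function names and data are made up; timings are indicative only and vary by engine):

```javascript
const data = Array.from({ length: 1_000_000 }, (_, i) => i);

// Two loops, one operation per iteration each: ~2N visits total.
function chainedVersion(arr) {
  return arr.filter((n) => n % 2 === 0).map((n) => n * 2);
}

// One loop, two operations per iteration: also ~2N operations.
function reducedVersion(arr) {
  return arr.reduce((acc, n) => {
    if (n % 2 === 0) acc.push(n * 2);
    return acc;
  }, []);
}

console.time("chained");
const a = chainedVersion(data);
console.timeEnd("chained");

console.time("reduce");
const b = reducedVersion(data);
console.timeEnd("reduce");

console.log(a.length === b.length); // same output either way
```

Both versions grow linearly with N; any constant-factor gap between them is exactly the "minor overhead" described above.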

François-Marie de Jouvencel • Edited

I made a very similar comment, it pleases me to see it here already :)

I'm baffled by the number of people who seem to think that if they cram more operations into one loop they'll somehow reduce overall complexity in any meaningful way... But no: it will still be O(n), where n is the length of the array.

Many people don't understand what reduce does; if you're on a team it can thus make more sense to use a map and a filter, which are extremely straightforward to understand and not a performance penalty.

I love obscure one-liners as much as any TypeScript geek with a love for functional programming, but I know I have to moderate myself when working within a team.

Shit, is big O complexity still studied in computer science?

Calvin Lee Fernandes

The code you start off with is readable, but as you optimize it, it becomes less readable, since you're fusing many operations into a single loop. Consider using a lazy collection or a streaming library that applies these loop-fusion optimizations for you, so you don't have to worry about intermediate collections being generated and can keep your focus on readable, single-purpose code. Libraries like lazy.js and Effect.ts help you do this.
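This is not the API of lazy.js or Effect.ts — just a generator-based sketch of the same idea: each element flows through all stages lazily, so no intermediate arrays are built, yet each stage stays a small single-purpose function:

```javascript
// Lazy map: yields transformed elements one at a time.
function* lazyMap(iter, fn) {
  for (const x of iter) yield fn(x);
}

// Lazy filter: yields only elements that pass the predicate.
function* lazyFilter(iter, fn) {
  for (const x of iter) if (fn(x)) yield x;
}

// Stages compose without materializing intermediate arrays;
// work happens only when the pipeline is consumed.
const pipeline = lazyFilter(
  lazyMap([1, 2, 3, 4, 5], (n) => n * 10),
  (n) => n > 20
);

console.log([...pipeline]); // [30, 40, 50]
```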

François-Marie de Jouvencel • Edited

While doing more operations in one loop will save you incrementing a counter multiple times, you'll still have O(n) complexity, where n is the size of your array, so you will most likely not save any measurable amount of resources at all.

Sometimes it is clearer to write a map then a filter just for readability, because not everyone understands what reduce does, unfortunately.

Code that cannot be read fluently by the whole team is in my experience way more costly than a few sub-optimal parts in an algorithm that takes nanoseconds to complete.

You should study the subject of algorithmic complexity if you don't know much about this concept.
