
Sebastian Vargr

Ignoring performance

I see a pattern where developers ignore performance if it is not seen as an imminent problem.

Examples of this could be ignoring components that re-render excessively because it is not yet a real performance problem for the target users, accepting that derived state is recalculated unnecessarily, sending cookies/headers on requests that have no need for them, etc.
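To make the derived-state case concrete, here is a minimal React sketch (the component and data shapes are made up) of the kind of cheap memoization that often gets skipped:

```tsx
import React, { useMemo } from "react";

type Order = { id: string; total: number };

// Hypothetical component. Without useMemo, the grand total would be
// recomputed on every render, even when `orders` has not changed.
function OrderSummary({ orders }: { orders: Order[] }) {
  const grandTotal = useMemo(
    () => orders.reduce((sum, order) => sum + order.total, 0),
    [orders]
  );

  return <p>Total: {grandTotal}</p>;
}

export default OrderSummary;
```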

From a business perspective this makes good sense.
There are usually better business gains elsewhere.

I also do this myself at work.

But 10+ browser tabs with these kinds of shenanigans going on in them, plus other applications, add up.

Especially when we start to consider mobile devices.

That leaves me wondering...

  • Isn't this kind of lying to ourselves by only looking at best-case scenarios?
  • When does this become too purist?
  • How could we mitigate this in a way that makes sense business-wise?

Edit: extra question.

  • How would we measure if this is a problem for users in a non-intrusive way?

Top comments (2)

Spyros Argalias

Interesting questions.

From one point of view, I think you answered the questions yourself. "There are usually better business gains elsewhere". You also implied that when performance does become a problem, it will be fixed. This seems like a reasonable approach to me from a business perspective.

About user behaviour (multiple tabs)... I think I would challenge this. On mobile I personally only use one tab at a time. On a PC, if I have 20 tabs open on YouTube, I expect some slowdown. In other words, I'm not immediately convinced that this case is a problem. Maybe it is, maybe it's not.

One problem is that performance is not free. It costs additional code, potentially more complicated code, and development time, and it can be finicky and error-prone. E.g. with Redux, if we don't use "reselect" properly, the result is worse than not using it at all (extra overhead from calling a function unnecessarily).
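For what it's worth, a minimal sketch of what "using reselect properly" can mean (the state shape here is made up): the selector is created once at module level, so the derived list only recomputes when its input changes. Creating the selector fresh on every call throws the memoization away and only adds overhead.

```ts
import { createSelector } from "reselect";

// Hypothetical state shape.
interface Todo { id: number; done: boolean; }
interface AppState { todos: Todo[]; }

const selectTodos = (state: AppState) => state.todos;

// Created once at module level, so the filter only reruns when
// `state.todos` changes identity between calls.
export const selectCompletedTodos = createSelector(
  [selectTodos],
  (todos) => todos.filter((todo) => todo.done)
);
```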

One way we could have better performance is if it is automatic (or close to automatic). This is one of Gatsby's goals: "performance by default". E.g. it is easy to have lazy-loaded images with Gatsby. Anything else is a business case, the same as any feature in the software. Another way is if it's a one-time thing, such as setting up webpack.
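As a rough illustration of "performance by default" (a sketch using the gatsby-plugin-image API; the image path and component name are made up), lazy loading and responsive sizes come almost for free:

```tsx
import * as React from "react";
import { StaticImage } from "gatsby-plugin-image";

// Hypothetical component: the plugin generates responsive sizes at
// build time and lazy-loads the image, with almost no extra code.
const Hero = () => (
  <StaticImage src="../images/hero.jpg" alt="Hero banner" loading="lazy" />
);

export default Hero;
```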

How would we measure it? So far I've trusted statistics from Google and other large companies, e.g. the 3-second threshold for page load. They have the ability to measure such things better than individual companies.
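On the non-intrusive measurement question, one option is real-user monitoring with the standard PerformanceObserver API; a minimal sketch (the /metrics endpoint is made up):

```ts
// Report largest-contentful-paint timings from real users' browsers.
const observer = new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // sendBeacon posts in the background without blocking the page.
    navigator.sendBeacon(
      "/metrics",
      JSON.stringify({ type: entry.entryType, value: entry.startTime })
    );
  }
});

// `buffered: true` also delivers entries recorded before observing began.
observer.observe({ type: "largest-contentful-paint", buffered: true });
```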

Sebastian Vargr • Edited

Thanks, these are very similar to my own thoughts.

The puritan in me keeps nagging, though.
It forces me to reconsider my ideas at intervals.

Probably also why I practically answered some of the queries myself, as you mention. :)