DEV Community

Discussion on: filter, map and reduce in JS. When and Where to use??

captainyossarian profile image
yossarian • Edited on

You will get a noticeable performance hit if your array has more than 1K elements. In the worst-case scenario you will iterate twice over those 1K elements. Btw, what makes you think that a reducer is not readable?

type User = {
  name: string;
  city: string;
  birthYear: number;
};

declare const users: User[]

const currentYear = new Date().getFullYear();

// Returns a boolean so it can be used both as a filter
// predicate and as the condition inside the reducer.
const olderThan25 = (user: User) =>
  Boolean(user.birthYear) && (currentYear - user.birthYear) > 25

const getName = ({ name }: User) => name

const userNames = users.reduce((acc, user) =>
  olderThan25(user)
    ? acc.concat(getName(user))
    : acc,
  [] as Array<User['name']>
)

You can chain map and filter freely in functional languages such as F#, because no intermediate collection is created between the steps.
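In JS you can get a similar single-pass effect with Array.prototype.flatMap: return an empty array to drop an element and a one-element array to keep it. A minimal sketch, reusing the User shape and helpers from the snippet above (assumed to exist as shown):

```typescript
type User = {
  name: string;
  city: string;
  birthYear: number;
};

const currentYear = new Date().getFullYear();

const olderThan25 = (user: User): boolean =>
  Boolean(user.birthYear) && currentYear - user.birthYear > 25;

const getName = ({ name }: User): string => name;

// flatMap folds the filter and the map into one iteration:
// [] drops the user, [name] keeps their name.
const userNames = (users: User[]): string[] =>
  users.flatMap((user) => (olderThan25(user) ? [getName(user)] : []));
```

One pass over the array, no intermediate array between the filter and the map, and each step stays a small named function.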

nehal_mahida profile image
Nehal Mahida Author

That's a good point with a great code presentation. 👍🏻

lukeshiru profile image

I mean, I tried with 40,000 elements and it's still not a huge gap. Browsers nowadays have lots of optimizations for these kinds of operations. And idk about you, but I still believe that array.filter(olderThan25).map(getName) is more readable than array.reduce((acc, user) => olderThan25(user) ? acc.concat(getName(user)) : acc, []) X_X

Thread Thread
uuykay profile image
William Kuang

Performing a filter and then a map means two iterations over the list. Despite the small readability gains, the performance loss is just too great.

Thread Thread
lukeshiru profile image

"performance loss is too great" ... have you actually run the code above? Browsers optimize these kinds of chained operations. The performance difference might be noticeable maybe with, idk, 100k items, but if you have to filter/map that many items, your problem is elsewhere.
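The quickest way to settle this is to measure it yourself. A rough micro-benchmark sketch (timings will vary by engine and machine, and the 100k synthetic users are made up for illustration):

```typescript
type User = { name: string; birthYear: number };

const currentYear = new Date().getFullYear();
const olderThan25 = (u: User) => currentYear - u.birthYear > 25;
const getName = (u: User) => u.name;

// 100k synthetic users with birth years spread over 1950-2019.
const users: User[] = Array.from({ length: 100_000 }, (_, i) => ({
  name: `user${i}`,
  birthYear: 1950 + (i % 70),
}));

console.time("filter + map (two passes)");
const a = users.filter(olderThan25).map(getName);
console.timeEnd("filter + map (two passes)");

console.time("reduce (one pass)");
const b = users.reduce<string[]>(
  (acc, u) => (olderThan25(u) ? (acc.push(getName(u)), acc) : acc),
  []
);
console.timeEnd("reduce (one pass)");
```

Both produce the same list of names, so any timing difference is pure iteration overhead; the reduce version mutates its accumulator with push to avoid allocating a new array on every kept element.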