Davide de Paolis

Refactoring chronicles: Extract unique values from an array of objects

A quick note from a code review I did today.
The use case was extracting the list of all unique IDs from the rows of a CSV file.

After loading and parsing the CSV, the array looked like this:

const rows = [
  {"id":1,"date":"03.03.2019","otherProps":483},
  {"id":2,"date":"22.02.2019","otherProps":573},
  {"id":3,"date":"11.03.2019","otherProps":645},
  {"id":4,"date":"03.03.2019","otherProps":483},
  {"id":2,"date":"08.03.2019","otherProps":573},
  {"id":4,"date":"26.02.2019","otherProps":645},
  {"id":5,"date":"13.03.2019","otherProps":483},
  {"id":3,"date":"22.01.2019","otherProps":573},
  {"id":1,"date":"01.03.2019","otherProps":645}
];

The implementation in the pull request was this:

const findUnique = (arr) => {
  return arr.reduce((acc, row) => {
    if (typeof row.id === 'number' && acc.indexOf(row.id) < 0) {
      acc.push(row.id)
    }
    return acc
  }, [])
}

I really appreciated that the dev tried to use reduce here, but as useful and cool as reduce is, I found the code too verbose.

IMHO a more readable solution would have been to first extract only the IDs to remove the clutter, and then filter for the first occurrence of each one, ignoring the duplicates.

const findUnique = (arr) => arr.map(r => r.id).filter((id, i, ar) => ar.indexOf(id) === i)
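To make the filter trick a bit more explicit, here is roughly what happens with the sample rows above, step by step (intermediate values shown as comments):

const ids = rows.map(r => r.id)
// ids: [1, 2, 3, 4, 2, 4, 5, 3, 1]

// indexOf returns the index of the FIRST occurrence of a value,
// so the predicate is true only the first time each id shows up.
const unique = ids.filter((id, i, ar) => ar.indexOf(id) === i)
// unique: [1, 2, 3, 4, 5]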

But my suggestion was this magical ES6 trick that turns the function into a one-liner:

const findUnique = (arr) => [...new Set(arr.map(r => r.id))]

What is this doing?
We are extracting the IDs via map and creating a Set from the result.
Since, as stated in the docs, "A value in the Set may only occur once; it is unique in the Set's collection", all duplicates are automagically removed.
Then, using the spread (...) operator, we convert the Set back into an Array.
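Broken down into separate steps with the sample rows from above, it looks something like this:

// duplicates are dropped as the values are inserted into the Set
const idSet = new Set(rows.map(r => r.id))  // Set(5) { 1, 2, 3, 4, 5 }
// the spread operator turns the Set back into an Array
const unique = [...idSet]                   // [1, 2, 3, 4, 5]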

tadaa

PS: Always be cautious when using map, reduce, filter and other magic tricks (like here the conversion of an Array into a Set and back), because with very big arrays performance might be affected. In such cases, it is best to sacrifice readability and coolness and execute all the necessary operations in a single loop, so that the array is traversed only once.
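A single-pass version could look something like this (just a sketch, not the code from the actual PR; findUniqueSinglePass is an illustrative name):

// One pass over the rows: a Set tracks what we have already seen,
// the result array preserves the order of first appearance.
const findUniqueSinglePass = (arr) => {
  const seen = new Set()
  const result = []
  for (const row of arr) {
    if (typeof row.id === 'number' && !seen.has(row.id)) {
      seen.add(row.id)
      result.push(row.id)
    }
  }
  return result
}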

PPS: Always have unit tests for your methods, so that when trying out other solutions you are sure your code still works as intended.

import test from "ava"

test('return list of unique IDs', t => {
    const expected = [1, 2, 3, 4, 5]
    const result = findUnique(rows)
    t.deepEqual(result, expected)
})

You can play around with the code and try out other solutions with this CodePen.
