In this challenge, you will remove the left-most duplicates from a list of integers and return the result.
```javascript
// Remove the 3's at indices 0 and 3,
// followed by removing a 4 at index 1
solve([3, 4, 4, 3, 6, 3]); // => [4, 6, 3]
```
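For reference, here is one way to satisfy the example (a sketch of one possible approach, not an official solution): since the expected output keeps only the last occurrence of each value, we can scan from the right, keep the first sighting of each value, and reverse at the end.

```javascript
// Keep only the LAST occurrence of each value, preserving order.
// Sketch only; `solve` is the name used in the challenge statement.
function solve(xs) {
  const seen = new Set();
  const out = [];
  for (let i = xs.length - 1; i >= 0; i--) {
    if (!seen.has(xs[i])) {
      // First time we meet xs[i] coming from the right
      // = its last occurrence overall.
      seen.add(xs[i]);
      out.push(xs[i]);
    }
  }
  return out.reverse(); // restore left-to-right order
}

console.log(solve([3, 4, 4, 3, 6, 3])); // => [4, 6, 3]
```

Each element is visited once and `Set` lookups are effectively constant time, so this is O(n).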
This challenge comes from KenKemau on CodeWars. Thank you to CodeWars, who has licensed redistribution of this challenge under the 2-Clause BSD License!
Want to propose a challenge idea for a future post? Email email@example.com with your suggestions!
Top comments (33)
This solution is in Python.
Hi, you could have used the `set` method, like this:
But `set` gives `[3, 4, 6]`, not `[4, 6, 3]`, which is the expected answer.
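For what it's worth, the same pitfall shows up in the JavaScript version of the challenge: a `Set` remembers insertion order, so it keeps each value's first occurrence, while the challenge wants the last. A quick sketch (my illustration, not anyone's posted code):

```javascript
const input = [3, 4, 4, 3, 6, 3];

// A Set preserves insertion order, so it keeps the FIRST occurrence
// of each value...
console.log([...new Set(input)]); // => [3, 4, 6]

// ...but the challenge expects the LAST occurrence of each value
// to be the one that survives: [4, 6, 3].
```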
3n, looping three times over the length of the input: two identical `reduce`s and one
Epic, but also this relies on the `array` being sparse in the first `reduce` (so, a map really), since `new Array(Number.MAX_SAFE_INTEGER)` is an error.
I'd express that as
which is just a funny version of
Wouldn't this have O(n^2) time complexity? This could be done in O(n), using a
This is my attempt at writing a similar function but with O(n) time complexity (`m <= 64`, where `m` is the number of elements in
Hi and thanks for your reply. This looks like a very good solution.
Indeed, this algorithm would have O(n²) time complexity if the `xs` list remained the same. I believe the time complexity is O(n log n), since we shrink the `xs` list each time in the solution I proposed. But I'm bad at time complexity, so I wouldn't know.
I didn't know about `Data.IntSet`; I'll look into that. Thanks for sharing.
You are forgetting the O(log n) steps used by `IntMap` internally; it is never O(n).
Perhaps you're mistaking it for `Map`: in the `Map` documentation, most lookup and insertion operations are indeed O(log n). Looking at the `IntMap` documentation, you can see that the actual complexity is O(min(n, W)), where W is the size of the integer type (32 or 64). This effectively means that once the `IntMap` contains more than 64 elements, the time complexity is constant.
Interesting... I wasn't aware of that.
In C, with O(n²):
Thanks, I am new to programming.
```javascript
const originalArray = [3, 4, 4, 3, 6, 3];
const distinctArray = [...new Set(originalArray)];
```
Simple JS O(n) solution (I think... 2n?) that will remove the left-most duplicates from a list of integers and return the result in the right order.
This is not O(n), because you're iterating over `seen` at every iteration of your for loop, so it's O(n²) except in the degenerate case where all the elements are duplicates. Also, it isn't specified, but `unshift` itself is most likely an O(n) operation in most implementations.
I had my doubts, but it seems the JS implementation uses a linear search algorithm.
FWIW, I also had to check if Sets keep insertion ordering since in principle they shouldn't... but in JS they do!
Good to know, I didn't know that was actually documented behavior of the Set.
Here's my Python solution.
O(n). The idea is to iterate over the input and keep track of the index of the last time you saw a given number. If you see a number again, replace the old value with `None`, and do a second pass at the end to remove all the `None`s. The two-phase approach is necessary to keep it O(n): if we remove items as we find them, it could end up costing O(n²).
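The two-pass idea reads roughly like this in JavaScript (my sketch of the approach described above, not the commenter's actual Python code; `null` is a safe blank here because the input is all integers):

```javascript
// Two-phase removal of left-most duplicates, O(n) overall.
function removeLeftmostDuplicates(xs) {
  const lastIndex = new Map(); // value -> index of its most recent sighting
  const out = xs.slice();

  // Phase 1: whenever a value reappears, blank out its previous position.
  out.forEach((x, i) => {
    if (lastIndex.has(x)) out[lastIndex.get(x)] = null;
    lastIndex.set(x, i);
  });

  // Phase 2: one more linear pass to drop the blanks.
  return out.filter((x) => x !== null);
}

console.log(removeLeftmostDuplicates([3, 4, 4, 3, 6, 3])); // => [4, 6, 3]
```

Blanking instead of deleting is what keeps it linear: each splice-style removal from the middle of an array is itself O(n), which is exactly the O(n²) trap the comment mentions.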
The question remains: does the `for...in` kill the man? What if it sorts lexically and not numerically? Should I have just written `for (const x of sparse) if (typeof x !== "undefined") out[o++] = x`?
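For what it's worth, as of ES2020 the enumeration order of `for...in` is specified, and integer-like keys come first in ascending numeric order (as strings), not lexically, so the sparse-array trick does preserve order. A quick check:

```javascript
// Integer-like keys enumerate in ascending NUMERIC order, even though
// they are strings ("10" < "2" lexically, but 2 comes before 10 here).
const sparse = [];
sparse[10] = "a";
sparse[2] = "b";

const keys = [];
for (const k in sparse) keys.push(k);
console.log(keys); // => ["2", "10"]
```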
Here is a simple solution in PHP:
Here goes, my solution:
That gives `set([3, 4, 6])`, i.e. `[3, 4, 6]`, but the expected answer is `[4, 6, 3]` (you only keep the last occurrence of each duplicate, not the first). To get the correct result you'd have to do

which is... quite a lot.