Saulo Dias

Time complexity of two by two comparison

You have probably been faced with a very simple and common problem: comparing unique elements in a list, two by two, when the order doesn't matter.

Comparing n elements with n elements has a complexity of n². But when the order doesn't matter, comparing element A to B is the same thing as comparing B to A, so the complexity is going to be less than n².

For this algorithm all we have to do is implement a nested loop where the inner loop starts one position after the current element, so each element is only compared to the items that come after it in the list.
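
A minimal sketch of that loop in Python (the name `pairwise_comparisons` and the example list are mine; the actual comparison is whatever you need to do with each pair):

```python
def pairwise_comparisons(items):
    """Yield every unique pair from `items`, order not mattering."""
    for i in range(len(items)):
        # Start the inner loop at i + 1, so each pair is visited
        # only once and no element is compared with itself.
        for j in range(i + 1, len(items)):
            yield items[i], items[j]


# Example: 4 elements produce 4 * 3 / 2 = 6 pairs.
for a, b in pairwise_comparisons(["A", "B", "C", "D"]):
    print(a, b)
```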

But the question remains: how much faster is it going to be in theory?

I didn't know the complexity of this second approach, so I googled it, and it turns out to be n(n-1)/2, where n is the number of elements. Now I wanted to know how much faster it would be relative to n².
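
As a quick sanity check of that formula, here is a small Python sketch that just counts the iterations of the nested loop instead of doing any real comparison:

```python
import math

n = 1000
count = 0
for i in range(n):
    for j in range(i + 1, n):
        count += 1

# The loop runs n(n-1)/2 times, which is also "n choose 2".
assert count == n * (n - 1) // 2 == math.comb(n, 2)
print(count)  # 499500
```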

So all I had to do was divide one complexity by the other to see how they grow relative to each other. See the graph below:

[Graph: the ratio n² / (n(n-1)/2) plotted against n, approaching 2 as n grows]

It turns out that when n goes to infinity, this ratio, n² / (n(n-1)/2) = 2n/(n-1), goes to 2. So that means the second approach is about two times faster.
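
You can also see the convergence numerically; this small sketch just evaluates the ratio for growing n:

```python
for n in (10, 100, 1_000, 1_000_000):
    ratio = n**2 / (n * (n - 1) / 2)
    print(n, ratio)

# 10         2.2222...
# 100        2.0202...
# 1000       2.0020...
# 1000000    2.000002...
```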

Now it is obvious to me that it is 2 times faster: in the naive approach, each comparison appears twice. For example, with the elements A, B, and C we would have AB, AC, BA, BC, CA, and CB, and half of those are the same comparison when the order doesn't matter.

In this case, if I had been smarter, I wouldn't have needed to do any calculations. However, this same exercise of comparing different algorithmic complexities can be useful when the theoretical gains are not so obvious, and it might help you decide whether the extra implementation complexity outweighs the gains in processing time.
