Using the big O notation in the wild

Peter Nycander on June 15, 2019

I see a lot of people getting scared by the big O notation, and I get it. It is a math thing, and math is scary for a lot of people. I have a mast...
Nested Software • Edited

It can be surprising how effective this kind of technique can be. I suppose it can be overdone, but in general the wisdom of modern computing seems to be that trading increased memory usage for lower CPU usage is often a good idea. I was just playing around with a minimax algorithm for tic-tac-toe, and it was a similar story. I hadn't really thought things through and just dove into the implementation. My (admittedly naive) Python code took 1.5 minutes to play a game, which I did not expect. Simply adding caching for positions in a dict took that number down to 0.3 seconds!
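
For anyone curious what that speedup looks like in code, here's a minimal sketch of the idea: a toy tic-tac-toe minimax where the position cache is just a dict, via functools.lru_cache. To be clear, this is an illustrative reconstruction, not my actual code; the board encoding and scoring are stand-ins.

```python
# Minimal sketch of memoizing minimax per position (illustrative, not the
# original code). The board is a 9-character string, e.g. "X O  OX  ".
from functools import lru_cache

WIN_LINES = [(0, 1, 2), (3, 4, 5), (6, 7, 8),   # rows
             (0, 3, 6), (1, 4, 7), (2, 5, 8),   # columns
             (0, 4, 8), (2, 4, 6)]              # diagonals

def winner(board):
    for a, b, c in WIN_LINES:
        if board[a] != " " and board[a] == board[b] == board[c]:
            return board[a]
    return None

@lru_cache(maxsize=None)  # the dict-based position cache doing the heavy lifting
def minimax(board, player):
    """Best achievable score for the side to move ('X' maximizes, 'O' minimizes)."""
    w = winner(board)
    if w is not None:
        return 1 if w == "X" else -1
    if " " not in board:
        return 0  # draw
    scores = [
        minimax(board[:i] + player + board[i + 1:], "O" if player == "X" else "X")
        for i, cell in enumerate(board)
        if cell == " "
    ]
    return max(scores) if player == "X" else min(scores)

print(minimax(" " * 9, "X"))  # 0: perfect play from the empty board is a draw
```

The cache works because many different move orders reach the same position (transpositions), so each distinct (board, player) pair is evaluated exactly once instead of once per path.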

James McPherson

A minor niggle - I had to re-read your comment about the prework adding an extra loop iteration, because of the way you expressed it.

You had O(n*2), which my brain read as O(n^2), so seeing the comment about dropping constants was jarring. If you'd expressed it as O(2*n) or O(2n), it would have read as standard mathematical notation and the comment would have flowed.

As to whether we need to take account of time complexity in day-to-day work - that depends on what your problem space is :-)
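
(Spelling out the algebra for anyone skimming: big O absorbs constant factors but not exponents, which is why the two readings diverge: O(2n) = O(n), but O(n^2) ≠ O(n).)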

Peter Nycander

Valid point about the n*2 vs. 2*n thing; I changed it to 2*n. Thanks for the feedback.

Yeah, maybe I'll work with something that requires thinking more about time complexity in the future. But I think most people are like me, and work with things that rarely create the opportunity to utilize any knowledge of time complexity.

tiborveres-ired

lookup[id] is not constant but O(log n)

Peter Nycander

Interesting! I had no idea, thanks! Which data structure is actually being used? I thought it was a hash table...

tiborveres-ired

Well, it could be a hash table, but then it's even worse (for big n): with a hash table and big n there will be hash collisions, so it needs to do a find in the collision list. The best case would be a balanced tree.
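
A toy way to see that effect, in case it helps: the hypothetical BadKey class below forces every key into the same bucket (CPython shown here; the exact probing details are implementation-specific, but the degradation is the general point).

```python
# Toy demonstration: when every key collides, dict lookup degrades from
# average O(1) toward O(n), because the probe has to walk the whole
# collision chain. BadKey is a contrived stand-in, not real-world code.
import time

class BadKey:
    def __init__(self, n):
        self.n = n
    def __hash__(self):
        return 0  # every instance lands in the same bucket
    def __eq__(self, other):
        return isinstance(other, BadKey) and self.n == other.n

for size in (1_000, 2_000, 4_000):
    d = {BadKey(i): i for i in range(size)}
    probe = BadKey(size - 1)          # inserted last, so found last in the chain
    start = time.perf_counter()
    for _ in range(1_000):
        _ = d[probe]
    print(size, round(time.perf_counter() - start, 3))  # grows roughly linearly
```

With well-distributed hashes the same lookups stay effectively constant time on average; the collision list is what drags the worst case toward O(n), and a balanced tree caps it at O(log n) instead.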

Peter Nycander

Hmm, I wonder at what n you start to get enough collisions for it to matter in practice. I guess it is up to the implementation, and up to the browsers to create a good heuristic.

When I have used SpiderMonkey (and node.js to some extent) to solve problems on Kattis, I have never run into object lookup being a bottleneck. Maybe in some special case?

tiborveres-ired

In practice it never causes a problem. I suspect modern JS engines are able to switch implementations based on the object's "shape".

taragurung

Honestly, it went over my head :D