Apologies for bad notation. Also, I'm not a native speaker.
I started reading this book:
I am currently reading the first chapter, which is about time complexity / asymptotic notation.
So I'm told that for an algorithm whose running time is described by the function T(n) = 3n³ + 2n², we have a worst-case complexity of O(n³). Fine.
I'm told that I can easily prove that T(n) ≤ 5n³. Fine. (I'm pretty confident in my high-school maths skills.)
Basically, I ended up with T(n) ≤ 3.0...01·n³ as n tends to +∞, i.e. a different constant than the book's 5.
It might sound dumb that 3n³ + something could be bounded by (roughly) 3n³, but I found this using limits.
My reasoning was that, after manipulating the inequality T(n) ≤ x·n³, one could end up with the following statement: x ≥ 3 + 2/n.
Now n is just a positive integer, so let it be 1, the smallest input. We end up with x = 5, the value provided by the book.
But let's say that n gets bigger and bigger, eventually tending to +∞. Since 2/n would then tend to 0, x tends to 3.
So T(n) ≤ x·n³ with x a real number that tends to 3 from above but is never exactly 3. In fact, it wouldn't work with x = 3.
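As a sanity check, this reasoning can be verified numerically. A quick sketch in plain Python (the helper names are mine, not from the book); `fractions.Fraction` keeps the arithmetic exact, since floats can round the wrong way right at the boundary:

```python
from fractions import Fraction

def T(n):
    return 3 * n**3 + 2 * n**2

def smallest_x(n):
    # Divide T(n) <= x*n^3 through by n^3: the bound holds iff x >= 3 + 2/n.
    return Fraction(3) + Fraction(2, n)

for n in [1, 10, 1000, 10**6]:
    x = smallest_x(n)
    assert T(n) <= x * n**3                        # the bound holds with this x...
    assert T(n) > (x - Fraction(1, 10**9)) * n**3  # ...and any smaller x fails
    print(n, float(x))  # x = 5.0 at n = 1, then shrinks toward 3
```

At n = 1 this reproduces the book's constant 5, and for larger n the smallest workable x creeps down toward 3 without ever reaching it.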
In GeoGebra
C is the point at which x·n³ gets bigger than T(n).
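That crossover point can also be found numerically. A small sketch in plain Python (the helper name `crossover` is hypothetical, not from the post): for a constant x strictly between 3 and 5, scan for the first integer n where x·n³ catches up with T(n) = 3n³ + 2n²:

```python
def T(n):
    return 3 * n**3 + 2 * n**2

def crossover(x):
    # Assumes x > 3, otherwise the loop never terminates
    # (as noted above, x = 3 alone never works).
    n = 1
    while T(n) > x * n**3:
        n += 1
    return n

print(crossover(5))    # 1  : with x = 5 the bound holds from the very first input
print(crossover(3.5))  # 4  : T(n) <= 3.5*n^3 once n >= 4
print(crossover(3.1))  # 20 : the closer x is to 3, the later the crossover
```

This matches the algebra: T(n) ≤ x·n³ exactly when n ≥ 2/(x − 3).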
My question
First of all, is my reasoning correct? Can I use this kind of reasoning when dealing with algorithm complexity? I'm pretty confident that it holds mathematically speaking, but does it apply here?
I understand that it's not really important, since we end up with the same O() either way, but still, just curious.
Thank you.
Top comments (5)
Your understanding of limits is correct. As n approaches infinity, c can get closer and closer to 3. n³ is bigger than n² by a factor of n, which means the 2n² part becomes closer and closer to negligible.
Now, the definition of big O states:
T(n) belongs to O(n³) if there exist finite constants c and n_0 such that every n >= n_0 satisfies T(n) <= c*n³.
If n_0 is 1, then c has to be at least 5, as you showed.
As n_0 gets bigger, c can get closer to 3, as you also showed.
The important part is that there exists some pair of n_0 and c satisfying the requirement.
In fact, if one such pair exists, infinitely many pairs exist!
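To make the "infinitely many pairs" point concrete, here is a small spot-check in Python (the helper names are hypothetical; checking a finite sample is not a proof, but for this T(n) the algebra above guarantees the bound for all n >= n_0):

```python
def T(n):
    return 3 * n**3 + 2 * n**2

def is_witness(c, n0, up_to=10_000):
    # Spot-check T(n) <= c*n^3 for every n in [n0, up_to).
    return all(T(n) <= c * n**3 for n in range(n0, up_to))

print(is_witness(5, 1))     # True : the book's pair (c=5, n_0=1)
print(is_witness(3.5, 4))   # True : a tighter constant, later starting point
print(is_witness(3.1, 20))  # True : tighter still; any c > 3 works eventually
print(is_witness(3, 1))     # False: c = 3 alone never works
```

Each `True` line is one valid (c, n_0) pair witnessing T(n) ∈ O(n³).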
Hope this helped, cheers
Ok I think I get it now.
Thank you very much!
I'm glad to hear!
If you want some practice, do have a look at a challenge I uploaded:
Remove terrible bus routes (find an algorithm)
Håvard Krogstie ・ Mar 9 ・ 2 min read
The O(n²) solution is trivial, but see if you can come up with one that's O(n log n) (hint: sorting). I have posted a Python solution in the comments, with explanations. Good luck!
There's a much more succinct and formal way of putting what I think your reasoning is:
if the limit of f(n) / g(n) is a finite number as n goes to infinity, then f is O(g).
So for example, if you're comparing 3n³ + 2n² to 5n³, the limit of the ratio is 3/5, which is a finite number, so 3n³ + 2n² is O(5n³).
On the other hand, if we compare 10n² and n, the limit is +infinity. So that means 10n² is NOT O(n).
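Here is a quick numeric illustration of that ratio test in plain Python (a sketch; the helper names are mine). Compute f(n)/g(n) for growing n and watch whether it settles on a finite value or keeps growing:

```python
def ratio(f, g, n):
    return f(n) / g(n)

f1 = lambda n: 3 * n**3 + 2 * n**2   # same leading power as g1
g1 = lambda n: 5 * n**3
f2 = lambda n: 10 * n**2             # grows a full factor of n faster than g2
g2 = lambda n: n

for n in [10, 1000, 10**6]:
    print(n, ratio(f1, g1, n), ratio(f2, g2, n))

# f1/g1 settles near 3/5 = 0.6 (finite), so f1 is O(g1);
# f2/g2 = 10n keeps growing without bound, so f2 is NOT O(g2).
```

Of course a few sample points only illustrate the limit; to prove it, you'd compute the limit symbolically as in the statement above.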
Thank you for this. It doesn't exactly help regarding my reasoning, but I think your method makes it easier to prove that f is O(g). I didn't know about it, so thanks for introducing it to me.