In computer science, **Big-O** complexity analysis is the usual benchmark for algorithms and data structures.

But Big-O analysis is not always the best option.

In this article I will discuss how amortized complexity makes much more sense in some cases.

🎯 **Amortized** time is a way to express the time complexity of an algorithm that exhibits a very **bad time complexity** only **once** in a while, alongside the time complexity that applies most of the time. A good example is an **ArrayList**, a data structure backed by an array that can be extended.

In a normal array, the size is predefined.

Hence insertions into the array are O(1), or constant time.

But when we use an ArrayList, the size is variable and can increase or decrease depending on the input.

Every ArrayList has an initial capacity.

By default, the initial capacity is 10.

Now, when we want to insert more than 10 elements, three steps need to be performed:

1. Create a new array of double the size (2N)

2. Copy all N previous elements

3. Add the new element

When the ArrayList hits the capacity of its internal array, it creates a new array with double the size of the old one and copies all the items from the old array to the new array. In an ArrayList, therefore, two time complexities exist: one is O(1) and the other is O(n).

The first kind is O(1) as shown below:
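A minimal sketch of the O(1) case, using a hypothetical `DynamicArray` class (not the real `java.util.ArrayList` source): while there is spare capacity, an append is just one array write and a size increment, regardless of how many elements are stored.

```java
public class DynamicArray {
    private int[] data = new int[10]; // initial capacity of 10
    private int size = 0;

    // O(1) path: capacity has not been reached yet
    public void add(int value) {
        data[size] = value; // a single array write
        size++;
    }

    public int size() { return size; }

    public static void main(String[] args) {
        DynamicArray list = new DynamicArray();
        for (int i = 0; i < 10; i++) list.add(i); // fits in the initial capacity
        System.out.println(list.size()); // prints 10
    }
}
```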

The second kind is O(n), as shown below:
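A sketch of the O(n) case, again with a hypothetical minimal class: when the capacity is exhausted, all N existing elements must be copied into a new array of size 2N before the new element can be appended.

```java
import java.util.Arrays;

public class GrowDemo {
    private int[] data = new int[10]; // initial capacity of 10
    private int size = 0;

    public void add(int value) {
        if (size == data.length) {
            // O(n) path: allocate an array of size 2N and copy all N elements
            data = Arrays.copyOf(data, data.length * 2);
        }
        data[size++] = value; // the append itself is O(1)
    }

    public int capacity() { return data.length; }
    public int size() { return size; }

    public static void main(String[] args) {
        GrowDemo list = new GrowDemo();
        for (int i = 0; i < 11; i++) list.add(i); // the 11th element forces a grow
        System.out.println(list.capacity()); // prints 20
    }
}
```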

Thus in case of Dynamic Array we can say that:

## Insertion takes O(n) when the capacity has been reached, but the amortized time for each insertion is O(1).
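To see why the average works out to O(1): with doubling, the copies triggered while growing to hold n elements sum to roughly 10 + 20 + 40 + … < 2n, so n insertions cost O(n) work in total, i.e. O(1) each on average. A small sketch (the `copiesFor` helper is hypothetical) that counts the copies empirically:

```java
// Counts how many element copies n appends trigger in a doubling array.
public class AmortizedCount {
    public static long copiesFor(int n) {
        int capacity = 10; // assumed initial capacity
        int size = 0;
        long copies = 0;
        for (int i = 0; i < n; i++) {
            if (size == capacity) {
                copies += size;  // growing copies all current elements
                capacity *= 2;   // double the capacity
            }
            size++;              // the append itself
        }
        return copies;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        long copies = copiesFor(n);
        // Total copies stay below 2n, so the amortized cost per add is O(1).
        System.out.println(copies < 2L * n); // prints true
    }
}
```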

## Final Words:

In Oracle terms (which are implied tacitly), and speaking about List:

🎯 the **add** method (synonym: "append" method) always means boolean add(E)

🎯 the **insert** method always means void add(int index, E)

When Oracle writes:

**The add operation runs in amortized constant time, that is, adding n elements requires O(n) time.**

it means:

**Complexity of a single boolean add(E) operation is amortized O(1).**

It cannot be plain O(1) asymptotically (always), because occasionally we need to grow the array capacity. That single add operation, which is in fact a "create a new bigger array, copy the old array into it, then add one element to the end" operation, has O(n) asymptotic complexity: copying the array when increasing the List capacity is O(n), and the complexity of growing plus adding is O(n) + O(1) = O(n).

Without this capacity-growing operation, add would be O(1), because the element is always added (appended) to the end of the array (at the maximum index). If we "added" (i.e. inserted) somewhere other than the end, we would need to move the rightmost elements (those with bigger indexes), and the complexity of a single such operation would be O(n).

So, for a single add operation, the asymptotic complexity is O(1) for an add without increasing capacity and O(n) for an add with increasing capacity (which happens very rarely).

The amortized complexity of a single add operation is O(1). It reflects the fact that the rare O(n) grow-and-add operations get "diluted" by the far more numerous O(1) add-without-grow ones, so "on average" a single operation is O(1).

"Asymptotic complexity" of n add operations is O(n). But here we speak about complexity of n operations, not complexity of one operation. It is not a strict way to put it like this ("Asymptotic complexity"), but anyway. Amortized complexity of n operations is even less sense.

Finally, the complexity of a single void add(int index, E) operation is always O(n). If it triggers a grow, it is O(n) + O(n) [grow + insert], but 2·O(n) is the same as O(n).
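The shifting cost of a positional insert can be sketched with `System.arraycopy`, again using a hypothetical minimal class rather than the real `java.util.ArrayList` source: all elements at and after the insertion index move right by one, which is O(n) whether or not the array also has to grow.

```java
import java.util.Arrays;

public class InsertDemo {
    private int[] data = new int[10];
    private int size = 0;

    public void add(int index, int value) {
        if (size == data.length) {
            data = Arrays.copyOf(data, data.length * 2); // O(n) grow
        }
        // O(n) shift of every element at or after the insertion point
        System.arraycopy(data, index, data, index + 1, size - index);
        data[index] = value;
        size++;
    }

    public int get(int index) { return data[index]; }

    public static void main(String[] args) {
        InsertDemo list = new InsertDemo();
        for (int i = 0; i < 3; i++) list.add(i, i); // 0, 1, 2
        list.add(0, 99);                            // shifts everything right
        System.out.println(list.get(0) + "," + list.get(3)); // prints 99,2
    }
}
```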

So, in such cases, amortized analysis makes more sense.


If you liked this post, then please leave a heart ❤️

And if you have any queries or suggestions for me, please leave them in the discussions panel.
