Anshuman Mahato
Scalability, Complexity and Complexity Analysis

As we know, good code has two key characteristics: it should be readable and scalable. Readability means writing code that is understandable, not just to you but to others on your team as well. Scalability means ensuring that the code performs efficiently as the demands on it grow. That's it.

Now you might say, "That's vague. What exactly is Scalability? How do I make my code more Scalable?" Let's break this down step by step.

What is Scalable Code?

When talking about scalable code, the question we try to answer is, "How will my code perform when the load increases?" It is all about performance. To perform well, code must do two things:

  1. It should manage memory well.
  2. It should perform in optimum time.

Memory is a limited resource. If the code does not manage memory well, it may run out of it. The same holds for processing time: if a request takes too long to process, the wait becomes tedious for the user.

The memory requirement and the runtime of a piece of code are known as its Space Complexity and Time Complexity, respectively. Collectively, they are called computational complexities, and we determine them through Complexity Analysis.

So we have once again hit a bunch of jargon. Let's move along and demystify these as well.

Time Complexity and Space Complexity

In simple terms, Time Complexity represents the amount of time your code needs to complete its processing, and Space Complexity represents the amount of memory it occupies while doing so. Neither gives the actual processing time or memory requirement of an algorithm. Instead, they describe the trend in which the processing time and memory requirement change as the length of the input changes. In other words, each is a function of the input length.

Time Complexity counts the number of operations required rather than the actual runtime. This is because the actual runtime of an algorithm depends on the hardware; it differs from one system to another, so it is not a constant value. The number of operations required, on the other hand, stays the same for a given input. Since the execution time is proportional to the number of steps performed, counting operations lets us indirectly predict how long processing a given input will take.
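
For instance, here is a minimal sketch (the `linear_search` function is purely illustrative) that counts comparisons instead of measuring time. Run it on any machine and the counts stay the same, even though the wall-clock time will differ.

```python
def linear_search(arr, target):
    """Return (index, comparisons) for a linear search.

    The comparison count depends only on the input,
    not on the hardware running the code.
    """
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == target:
            return i, comparisons
    return -1, comparisons


print(linear_search([4, 8, 15, 16, 23, 42], 42))  # (5, 6)
print(linear_search(list(range(1000)), 999))      # (999, 1000)
```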

Similarly, while calculating the Space Complexity of an algorithm, we usually consider only the space used by the variables and constants the algorithm itself creates. So, the question Space Complexity answers is, "For a given input length, how many new variables does the algorithm create, that is, how much additional memory does it acquire?"
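
As a rough illustration (both functions below are hypothetical examples, not from the original article), the first sum uses a constant amount of extra space, while the second builds an intermediate list that grows with the input, so it acquires more memory as the input gets longer.

```python
def sum_in_place(numbers):
    """Constant extra space: one extra variable, whatever the input size."""
    total = 0
    for n in numbers:
        total += n
    return total


def sum_of_squares(numbers):
    """Extra space grows with the input: the intermediate list has one
    element per input element."""
    squares = [n * n for n in numbers]  # new list with len(numbers) elements
    return sum(squares)
```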

Complexity Analysis and Asymptotic Notations

Complexity Analysis, then, is the process of determining the computational complexities of an algorithm. The complexity of an algorithm depends on its input: for some inputs the code runs fast, while for others it can get very slow. To account for this, we calculate complexity for three different cases: the Best case, the Average case and the Worst case.

The Best case is the input of size n for which the algorithm takes the minimum number of steps. The Worst case is the input of size n for which the algorithm performs the maximum number of operations. The Average case describes the expected behaviour over a random input of size n.
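
Taking linear search as a small, illustrative example (the `count_comparisons` helper is assumed here, not part of the original text): the best case is a target at the front of the list, the worst case is a target at the end or absent, and the average case assumes the target is equally likely to be anywhere.

```python
def count_comparisons(arr, target):
    """Count how many elements a linear search inspects before stopping."""
    steps = 0
    for value in arr:
        steps += 1
        if value == target:
            break
    return steps


data = list(range(1, 101))           # 100 elements: 1, 2, ..., 100
print(count_comparisons(data, 1))    # best case: 1 comparison
print(count_comparisons(data, 100))  # worst case: 100 comparisons
# Average case: if the target is equally likely to be anywhere, a linear
# search needs about (n + 1) / 2 comparisons, i.e. roughly 50 here.
```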

Now the question is, how do we represent these complexity values? For this purpose, we use Asymptotic Notation. Let's build an understanding of this as well.

Asymptotic notations are mathematical notations used to describe the computational complexity of an algorithm as the input grows towards a limiting value. Three notations are commonly used:

  • The Omega Notation (Ω-notation) gives a lower bound on the growth rate and is commonly used for the Best case.
  • The Theta Notation (Θ-notation) gives a tight bound, where the upper and lower bounds coincide, and is commonly used for the Average case.
  • The Big-O Notation (O-notation) gives an upper bound on the growth rate and is commonly used for the Worst case.

Most of you must have read or heard about Big-O notation; it is the most commonly used of the three. Since the main purpose of Complexity Analysis is to keep the worst cases under control and make the code more scalable, we programmers are particularly interested in Big-O.
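
As a rough illustration of how Big-O reads in practice (both functions are made up for this example): the first performs a number of comparisons proportional to n, i.e. O(n), while the second performs on the order of n² comparisons, i.e. O(n²), so its worst case grows much faster as the input scales.

```python
def has_adjacent_duplicate(arr):
    """O(n): a single pass over a sorted list, comparing neighbours."""
    for i in range(len(arr) - 1):
        if arr[i] == arr[i + 1]:
            return True
    return False


def has_duplicate_naive(arr):
    """O(n^2): compares every pair, so doubling the input size roughly
    quadruples the number of comparisons in the worst case."""
    for i in range(len(arr)):
        for j in range(i + 1, len(arr)):
            if arr[i] == arr[j]:
                return True
    return False
```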

Conclusion

In this article, we discussed Scalability and Algorithmic Complexity. We looked at Time Complexity and Space Complexity and how we represent them using asymptotic notations, and we briefly covered the Best case, Worst case and Average case scenarios.

I hope this article was helpful to you. If you think something is missing, feel free to add it in the comments. I'm new to all this and would love to enhance my knowledge. And with that, I would like to conclude. Happy Coding!!!
