Babloo Kumar
Space Complexity

Whenever we write an algorithm, or want to analyze one, we need to work out its complexity. But calculating the complexity of an algorithm does not tell us the exact amount of resources it will consume. So instead of using exact amounts, we express the complexity in a general form (asymptotic notation) that captures the basic nature of the algorithm, and we use that general form for the analysis.
The asymptotic notation of an algorithm is a mathematical representation of its complexity. When we represent a complexity asymptotically, we keep only the most significant terms and ignore the least significant ones (here the complexity can be space complexity or time complexity).

For example, consider the following time complexities of two algorithms:

  • Algorithm 1 : 4n^2 + 4n + 1
  • Algorithm 2 : 10n^2 + 8n + 3

Generally, when we analyze an algorithm, we consider its time or space complexity for large values of the input size (the 'n' value). In the two complexities above, for large 'n', the term '4n + 1' in Algorithm 1 is far less significant than the term '4n^2', and the term '8n + 3' in Algorithm 2 is far less significant than the term '10n^2'.
For large 'n', the most significant terms (4n^2 and 10n^2) are much larger than the least significant terms (4n + 1 and 8n + 3), so we drop the least significant terms when expressing the overall time or space an algorithm requires. In asymptotic notation, only the most significant terms are used to represent the time/space complexity of an algorithm.
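To see why this is safe, take n = 1,000: in Algorithm 1 the dominant term 4n^2 contributes 4,000,000 while 4n + 1 contributes only 4,001, about 0.1% of the total. Dropping the smaller terms barely changes the result, and both algorithms simplify to the same asymptotic form, O(n^2).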

Today we will be discussing space complexity in more detail. Space complexity is the amount of memory a system requires to execute a program and produce a legitimate output. Like time complexity, space complexity is usually expressed asymptotically in Big O notation, such as O(1), O(log n), O(n), O(n log n), O(n^2), O(n^3), O(n^y), O(2^n), or O(n!), where n is a characteristic of the input that influences the space required and O denotes the worst-case growth rate.

Why is it important to know the space complexity? Because any computing device has limited memory, and overall cost increases with the amount of storage used, so it is important to know beforehand how much space will be required for a given input size.
For any algorithm, memory is used for the following purposes:

  1. To store variables (constant values, temporary values).
  2. To store program instructions (or steps).
  3. For program execution.

Mathematically, space complexity can be defined as:

Space Complexity = Auxiliary Space + Input Space

Auxiliary space is often confused with space complexity. However, auxiliary space is only the extra or temporary space used by the algorithm during its execution, not the space occupied by the input itself.
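To make the distinction concrete, here is a small sketch in C (the function names are just for this illustration): reversing an array in place touches only the O(n) input space plus O(1) auxiliary space, while reversing into a new array needs an extra O(n) of auxiliary space on top of the input.

   #include <stdlib.h>

   /* In-place reversal: O(n) input space, O(1) auxiliary space
      (just two indices and one temporary). */
   void reverse_in_place(int a[], int n) {
       for (int i = 0, j = n - 1; i < j; i++, j--) {
           int tmp = a[i];
           a[i] = a[j];
           a[j] = tmp;
       }
   }

   /* Reversal into a copy: O(n) input space plus O(n) auxiliary space
      for the newly allocated output array. */
   int *reverse_copy(const int a[], int n) {
       int *b = malloc(n * sizeof(int));   /* error handling omitted for brevity */
       for (int i = 0; i < n; i++) {
           b[i] = a[n - 1 - i];
       }
       return b;   /* the caller is expected to free() this buffer */
   }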

When a program is under execution, it uses the computing device's memory for three main purposes:

  1. Instruction Space: when the code is compiled, the resulting instructions need to be kept somewhere in memory so that they can be executed later. Instruction space is where they are stored.

  2. Environmental Stack: it is used to store the return addresses and local variables of partially executed functions. Sometimes one algorithm (function) is called inside another; in such a situation, the caller's current variables are pushed onto the system stack, where they wait for execution to resume, and then the call to the inner algorithm (function) is made.
    For example, if a function A() calls a function B() inside it, then all the variables of A() are stored on the system stack temporarily while B() is called and executed inside A().

  3. Data Space: it is the place where the program's data, variables, and constants are stored, and it is updated during execution.

When we analyze an algorithm based on its space complexity, we usually consider only the data space and ignore the instruction space as well as the environmental stack. That means we count only the memory required to store variables, constants, structures, and so on. (For recursive algorithms, though, the stack frames themselves often dominate the space cost, as the sketch below and Examples 3 and 4 show.)
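As a minimal illustration (the function name is hypothetical), a recursive sum over n elements declares almost no variables, yet at its deepest point the call stack holds one frame per element, so it still needs O(n) extra space:

   /* Recursive sum: the recursion goes n levels deep before unwinding,
      so the call stack holds O(n) frames at its peak. */
   int recursive_sum(const int a[], int n) {
       if (n == 0) {
           return 0;                  /* base case: empty range */
       }
       return a[n - 1] + recursive_sum(a, n - 1);
   }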

Cheat sheet, to keep handy.
Below is a list of common space complexity classes, which we knowingly or unknowingly use in our day-to-day coding. It is arranged in order of how quickly the space grows as the input size increases.

S.No.  Big O Notation   Name
1.     O(1)             Constant Space Complexity
2.     O(log n)         Logarithmic Space Complexity
3.     O(n)             Linear Space Complexity
4.     O(n log n)       Linearithmic Space Complexity
5.     O(n^2)           Quadratic Space Complexity
6.     O(n^3)           Cubic Space Complexity
7.     O(n^y)           Polynomial Space Complexity
8.     O(2^n)           Exponential Space Complexity
9.     O(n!)            Factorial Space Complexity

For a better understanding, see the graph below: it combines the curves for the different space complexities, and you can clearly see how the space required varies as the input data set grows. Choose your algorithm wisely based on the expected amount of input data.

[Graph: growth of the common space complexity classes as the input size n increases]

Now let's walk through the calculation of space complexity with some examples.
To calculate space complexity, we need to know how much memory each data type occupies. This generally varies across compilers and operating systems, but the method of calculation stays the same. For example, a typical (though not universal) C compiler uses the following sizes:

Type                                                      Size
bool, char, unsigned char, signed char, __int8            1 byte
__int16, short, unsigned short, wchar_t, __wchar_t        2 bytes
float, __int32, int, unsigned int, long, unsigned long    4 bytes
double, __int64, long double, long long                   8 bytes

However, you are always free to choose some sample unit (say, a size T) for each kind of variable when working out the size.
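If you want to confirm the sizes on your own machine, a quick check with sizeof works; the output will vary by compiler and platform:

   #include <stdio.h>

   int main(void) {
       /* Print the size of a few common types on the current platform. */
       printf("char:   %zu byte(s)\n", sizeof(char));
       printf("short:  %zu byte(s)\n", sizeof(short));
       printf("int:    %zu byte(s)\n", sizeof(int));
       printf("long:   %zu byte(s)\n", sizeof(long));
       printf("float:  %zu byte(s)\n", sizeof(float));
       printf("double: %zu byte(s)\n", sizeof(double));
       return 0;
   }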
Now let's take some examples and decode how to calculate space complexity:

Example 1-

  int sum(int a, int b, int c)
  {
     int z = a + b + c;   // a, b, c and z are all ints
     return z;
  }

In the function above, the variables a, b, c, and z are all integer types, so they take 4 bytes each; the total memory requirement is (4 * 4) + 4 = 20 bytes, where the additional 4 bytes are for the return value. Because this space requirement is fixed regardless of the input values, it is called constant space complexity.
If an algorithm requires a fixed amount of space for all input values, its space complexity is said to be constant, or O(1).

Example 2 - a bit more involved than the first one:

  // n is the length of array a[]
  int sum(int a[], int n)
  {
      int x = 0;                      // 4 bytes for x
      for (int i = 0; i < n; i++)     // 4 bytes for i
      {
          x = x + a[i];
      }
      return x;
  }
  • In the above code, 4*n bytes of space are required for the elements of array a[].
  • 4 bytes each are needed for x, n, i, and the return value, which adds 16 bytes. Hence the total memory requirement is (4n + 16), which grows linearly with the input size n, so this is called linear space complexity, or O(n) space complexity.

Example 3 - when you implement binary search recursively, the search range halves on every call, so the call stack goes roughly log(n) levels deep and the total extra memory requirement comes out in terms of log(n). Hence it is called logarithmic space complexity, or O(log n) space complexity. (An iterative binary search, by contrast, gets by with constant extra space.)
Putting this in more formal terms:
LOGSPACE is the set of problems that can be solved by a deterministic Turing machine using only O(log n) memory space with respect to the input size. Even a single counter that can index the entire n-bit input requires log n bits, so a LOGSPACE algorithm can maintain only a constant number of counters or other variables of similar bit complexity.
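A minimal recursive sketch in C (the function name is illustrative): because the range halves on every call, at most about log2(n) stack frames are alive at once, which is where the O(log n) space comes from.

   /* Recursive binary search over a sorted array.
      The search range halves on every call, so the recursion depth,
      and hence the stack space, is O(log n). */
   int binary_search(const int a[], int lo, int hi, int key) {
       if (lo > hi) {
           return -1;                       /* not found */
       }
       int mid = lo + (hi - lo) / 2;
       if (a[mid] == key) {
           return mid;
       } else if (a[mid] < key) {
           return binary_search(a, mid + 1, hi, key);
       } else {
           return binary_search(a, lo, mid - 1, key);
       }
   }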

Example 4 - let's take merge sort as an example (this one is in JavaScript):

   // Recursively splits the array, then merges the sorted halves.
   function mergeSort(arr) {
     if (arr.length <= 1) {
       return arr;
     }

     const center = Math.floor(arr.length / 2);
     const left = arr.slice(0, center);   // new copy of the left half
     const right = arr.slice(center);     // new copy of the right half

     return merge(mergeSort(left), mergeSort(right));
   }

   // Merges two already-sorted arrays into one sorted array.
   function merge(left, right) {
     const results = [];

     while (left.length && right.length) {
       if (left[0] < right[0]) {
         results.push(left.shift());
       } else {
         results.push(right.shift());
       }
     }

     return [...results, ...left, ...right];
   }


Here, new subarrays are created (by slice and the spreads in merge) at each of the roughly log(n) levels of recursion, and each level copies on the order of n elements, so the total memory allocated over the run is some constant * (n * log(n)). Hence it is called linearithmic space complexity, or O(n log n). (A merge sort that reuses a single temporary buffer can bring the auxiliary space down to O(n).)

Similarly, we can run into quadratic and even more demanding space complexities as an algorithm grows more complex; anything that builds an n x n table, for instance, needs quadratic space, as in the sketch below.
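A small sketch of quadratic space in C (the function name is hypothetical): storing a value for every pair out of n items takes an n x n table, i.e. O(n^2) space.

   #include <stdlib.h>

   /* Allocates an n x n table, e.g. to hold a value for every pair
      of n items: n * n ints in total, i.e. O(n^2) space. */
   int **make_pair_table(int n) {
       int **table = malloc(n * sizeof(int *));   /* error handling omitted */
       for (int i = 0; i < n; i++) {
           table[i] = calloc(n, sizeof(int));
       }
       return table;   /* caller must free each row and then the table */
   }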

But we should always try to write our algorithms in a way that keeps the space complexity to a minimum.

To conclude, I would say that the better the space complexity of an algorithm, the better its performance tends to be. Before finalizing the algorithm for your code, the worst-case scenario should be considered. In the long run this helps with the scalability and sustainability of the overall system, and it keeps resources in a healthy condition without burning them out.

If you have an algorithm and want to calculate its space complexity, put the code in the comment section and we can decode it together.
