Two important things to remember while coding are

- Readability and
- Scalability.

There are many ways to solve a problem, but a good developer always strives for efficient, performant code.

This is where Big-O notation comes in: it describes the time and space complexity of an algorithm and helps you understand how its performance scales with input size.

Some common Big-O classes are

- O(1) - Constant time
- O(n) - Linear time
- O(n^2) - Quadratic time

# O(1): Constant Time Complexity

Constant time complexity means that **no matter the size of the input, the execution time and the resources used stay the same**. No matter how many times or where the algorithm runs, it delivers the same performance. For example:
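A minimal sketch of a constant-time operation (the function name is just an illustration): accessing a list element by index does the same amount of work whether the list has ten items or ten million.

```python
def get_first(items):
    # Indexing takes the same time regardless of the list's length: O(1).
    return items[0]

print(get_first([7, 1, 9]))  # 7
print(get_first(list(range(1_000_000))))  # 0 - same cost as above
```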

# O(n): Linear Time Complexity

If an algorithm has linear complexity, then **execution time and/or resources used are directly proportional to the input size**: double the input, and the work roughly doubles. For example:
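A simple linear-time sketch (a hypothetical search helper): in the worst case every element is inspected once, so the work grows in step with the input length.

```python
def contains(items, target):
    # Worst case we look at every element exactly once: O(n).
    for item in items:
        if item == target:
            return True
    return False

print(contains([4, 8, 15, 16], 15))  # True
print(contains([4, 8, 15, 16], 99))  # False - scanned all n elements
```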

# O(n^2): Quadratic Time Complexity

An algorithm has quadratic complexity when its **running time is directly proportional to the square of the input size**.

This complexity is common in simple sorting algorithms like bubble sort, insertion sort, and selection sort.
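Since bubble sort is mentioned above, here is a sketch of it: the nested loops compare pairs of neighbours, giving on the order of n * n comparisons for n elements.

```python
def bubble_sort(items):
    # Outer loop runs n times, inner loop up to n times: O(n^2) comparisons.
    result = list(items)  # sort a copy, leave the input untouched
    n = len(result)
    for i in range(n):
        for j in range(n - 1 - i):
            if result[j] > result[j + 1]:
                # Swap out-of-order neighbours
                result[j], result[j + 1] = result[j + 1], result[j]
    return result

print(bubble_sort([5, 1, 4, 2]))  # [1, 2, 4, 5]
```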

Here are some tricky examples:

If a function loops over two different inputs one after the other, the Big-O becomes **O(input1 + input2)**.
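A sketch of the two-input case (the function name is illustrative): the loops run in sequence, so the total work is the sum of the two input sizes, not their product.

```python
def collect_both(list1, list2):
    # First loop: len(list1) steps; second loop: len(list2) steps.
    # Total work is O(len(list1) + len(list2)).
    collected = []
    for item in list1:
        collected.append(item)
    for item in list2:
        collected.append(item)
    return collected

print(collect_both(["a", "b"], [1, 2, 3]))  # ['a', 'b', 1, 2, 3]
```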

If instead those loops are **nested**, the Big-O becomes **O(input1 * input2)**.
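The nested version, as a sketch: for every element of the first input we walk the entire second input, so the step count is the product of the two sizes.

```python
def all_pairs(list1, list2):
    # Inner loop runs len(list2) times for each of the len(list1)
    # outer iterations: O(len(list1) * len(list2)) total steps.
    pairs = []
    for a in list1:
        for b in list2:
            pairs.append((a, b))
    return pairs

print(len(all_pairs([1, 2], ["x", "y", "z"])))  # 6, i.e. 2 * 3
```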

# Big-O Cheat Sheet and Graph

Feel free to discuss more tricky examples.

Thank you for reading.

## Discussion

If `O(log n)` is logarithmic, what do you call `O(n log n)`?

Edit: It's called `linearithmic`.
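A classic linearithmic algorithm is merge sort; here is a sketch of it. The input is halved about log n times, and each level of halving does O(n) merge work, giving O(n log n) overall.

```python
def merge_sort(items):
    # About log n levels of splitting, O(n) merging per level: O(n log n).
    if len(items) <= 1:
        return list(items)
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])
    # Merge the two sorted halves in linear time.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([8, 3, 5, 1, 9, 2]))  # [1, 2, 3, 5, 8, 9]
```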