Data structures can be viewed as containers used to store data, while algorithms are the techniques used to manipulate that data. The study of data structures and algorithms (DSA) helps computer scientists understand and discover better ways of storing and manipulating data so that problems can be solved as efficiently as possible.

One of the most popular techniques for measuring the efficiency of an algorithm is Big-O notation. Big-O describes how an algorithm's time and space usage grows as the input size grows.

The most common complexities are:

O(1) - Constant Time/Space Complexity: the time/storage used by the algorithm does not change when the input size changes.

O(n) - Linear Time/Space Complexity: the time/storage grows proportionally to the input size.

O(n^2) - Quadratic Time/Space Complexity: the time/storage grows quadratically with the input size.

O(n!) - Factorial Time/Space Complexity: the time/storage grows factorially, which is even faster than exponential growth such as O(2^n).

O(log n) - Logarithmic Time/Space Complexity: the time/storage grows with the logarithm (typically base 2) of the input size.
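To make these concrete, here is a small JavaScript sketch (the function names are illustrative) showing an operation at three of these complexities:

```javascript
// O(1): accessing an element by index takes the same time regardless of array size
function firstElement(array) {
  return array[0];
}

// O(n): summing all elements touches each value exactly once
function sum(array) {
  let total = 0;
  for (let i = 0; i < array.length; i++) {
    total += array[i];
  }
  return total;
}

// O(log n): binary search halves the remaining range on each step
// (the array must already be sorted)
function binarySearch(array, target) {
  let low = 0, high = array.length - 1;
  while (low <= high) {
    const mid = Math.floor((low + high) / 2);
    if (array[mid] === target) return mid;
    if (array[mid] < target) low = mid + 1;
    else high = mid - 1;
  }
  return -1;
}
```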

For example, we can be asked to check whether an array of numbers contains duplicates (the question can be found on LeetCode).

An array is a data structure that stores a sequential list of values. In this case our input is an array of size n. We can start with the naïve approach of looping through the array and checking whether each value is repeated.

```
function containsDuplicates(array){
  // Compare every element with every other element
  for(let i = 0; i < array.length; i++){
    for(let j = 0; j < array.length; j++){
      if(i != j){ // don't compare an element with itself
        if(array[i] == array[j]){
          return true
        }
      }
    }
  }
  return false
}
```

The code uses two `for` loops, meaning that in the worst case the loops run n*n times, giving us an O(n^2) runtime, where n is the length of the input array. For example, for the array [1,2,6,4,9] of length 5, the loops run 5*5 times, so the first `if` statement is called 25 times; for the array [2,3,6,4,9,4] of size 6, it would be called 36 times. However, this is not always the case for all inputs. If we are lucky and a duplicate appears early, e.g. in [2,2,6,4,9], a match is found sooner: the first `if` statement is called only 2 times before the function returns. Big-O notation considers the worst case, though, so the best case does not matter in Big-O analysis. We can slightly improve the code by starting the inner loop at index i+1 instead of zero.

```
function containsDuplicates(array){
  for(let i = 0; i < array.length; i++){
    // Start at i+1 so each pair is only checked once
    for(let j = i + 1; j < array.length; j++){
      if(array[i] == array[j]){
        return true
      }
    }
  }
  return false
}
```

This improves our code slightly: we are left with one `if` statement and we no longer check each pair twice. For an array of size 5 the `if` statement is now called at most 4+3+2+1 = 10 times rather than 25, i.e. n*(n-1)/2 times in general. Since that count still grows quadratically with n, our runtime remains O(n^2).
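We can verify this comparison count ourselves with a small sketch (`countComparisons` is a hypothetical helper, written just for illustration) that counts how many times the equality check runs in each version:

```javascript
// Counts how many times the equality check would execute
// in the naive version (full inner loop, skipping i == j)
// versus the improved version (inner loop starts at i + 1).
function countComparisons(array) {
  let naive = 0, improved = 0;
  for (let i = 0; i < array.length; i++) {
    for (let j = 0; j < array.length; j++) {
      if (i != j) naive++;       // naive: n*(n-1) equality checks
    }
    for (let j = i + 1; j < array.length; j++) {
      improved++;                // improved: n*(n-1)/2 equality checks
    }
  }
  return { naive, improved };
}
```

For a 5-element array the naive version performs 5*4 = 20 equality checks (25 loop iterations, 5 of which are skipped because i == j), while the improved version performs only 10.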

The storage space used does not depend on the array size, so we can say both of the algorithms above use constant, O(1), space.

We can improve the algorithm to run in linear time, meaning we loop through the array only once. One way to do that is by using a hashmap or a set. Hashmaps are data structures that store data as key-value pairs; sets are data structures that contain only unique values. However, using a hashmap or a set means the storage space used by the algorithm is no longer constant but linear, i.e. it grows with the array size.

Hashmap Solution

We first need to create an instance of a hashmap before using it. In JavaScript this is done by calling `new Map()`. We then use two methods of the Map class:

`Map.has(key)`, which checks whether the key already exists in the hashmap.

`Map.set(key, value)`, which stores the key-value pair in the hashmap.

```
function containsDuplicates(array){
  let hashMap = new Map()
  for(let i = 0; i < array.length; i++){
    if(hashMap.has(array[i])){
      // We have already seen this value
      return true
    }else{
      // Record the value; the boolean is just a placeholder
      hashMap.set(array[i], true)
    }
  }
  return false
}
```

Using the hashmap, we will notice that we have gotten rid of one `for` loop. This reduces our runtime from quadratic O(n^2) to linear O(n), although it increases our memory usage from constant O(1) to linear O(n). With a hashmap, the `if` statement is called only 5 times in the worst case for an array of size 5. We can simplify the code further by using a set.

Using a Set

As with the hashmap solution, we first need to create an instance of the set. In JavaScript this is done by calling `new Set()`. We then use two methods of the Set class:

`Set.has(value)`, which checks whether the value already exists in the set.

`Set.add(value)`, which adds the value to the set if it is not already present.

```
function containsDuplicates(array){
  let set = new Set()
  for(let i = 0; i < array.length; i++){
    if(set.has(array[i])){
      // We have already seen this value
      return true
    }else{
      set.add(array[i])
    }
  }
  return false
}
}
```

Using a set we store only the values, unlike a hashmap where we store both a key and a value. In the hashmap solution the keys are the array values and the values are just boolean flags, so a set is the more natural fit here.

There is another way we can utilize the properties of sets to filter out duplicates. In JavaScript we can create a set by passing it an array of values. Remember that a set cannot contain duplicates, so any duplicates in the array are dropped from the created set. If the resulting set is smaller than the array, the array must contain a duplicate.

```
function containsDuplicates(array){
  // Duplicates are dropped, so a smaller set means the array had duplicates
  let set = new Set(array)
  return set.size != array.length
}
```
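As a quick sanity check, this one-line version behaves the same as the earlier solutions on the example arrays (a small usage sketch, repeating the function so it runs on its own):

```javascript
// Compare the set size to the array length: a mismatch means duplicates exist
function containsDuplicates(array) {
  let set = new Set(array);
  return set.size != array.length;
}

console.log(containsDuplicates([1, 2, 6, 4, 9]));    // all values unique → false
console.log(containsDuplicates([2, 3, 6, 4, 9, 4])); // 4 appears twice → true
```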
