This week's punny title relates to the topic, but also to the sort of mood I've been in lately. I'm currently juggling studying a bunch of computer science topics in preparation for my first software engineering role, and as I do so, I feel as if much of the learning I'm doing is of a sorting variety. As I learn more and grow in proficiency, I've found that the true challenge is not just taking in a bunch of information, but carefully figuring out how it applies to real situations. Writing these blogs has been really enjoyable, particularly because of that: being able to connect the dots and understand where each piece of information fits in the larger picture. I just wanted to mention that as we step into the technical topic, but this is not a personal blog. Let's talk about sorting algorithms.
This week, I learned about sorting algorithms, and to be honest, they were pretty tough. Usually, we're lucky enough to have some array that needs to be sorted and a language with a built-in function to sort it for us. For basic sorting tasks we admittedly don't need to write our own algorithms, and that's probably why I had never bothered to learn them.
However, I've come to find that they are actually pretty important. As people online often suggest, understanding sorting algorithms matters for coding interviews, since most interviewers expect you to know what is going on under the hood. And for good reason: you might not need a clever algorithm to list your demo app's 10 users in order, but once there are multiple parameters and much larger inputs, your future company is going to want an algorithm that can handle those requirements efficiently when dealing with millions of times your current sample data.
Today I want to cover 3 of the sorting algorithms I've learned so far:
1) The Bubble Sort
2) The Insertion Sort
3) The Merge Sort
I selected these 3 because they have 3 different best-case time complexities, and I believe a look at them is a good overview of how much the choice of sorting algorithm can affect efficiency. As we go down the list, the time complexity improves, but the algorithm itself becomes more involved to write. Let's take a look at bubble sort.
When I first heard the name of this algorithm, I got easy-going vibes, because how could anything with a name that starts with "bubble" not be fun and easy-going? My suspicions were confirmed when I dove in and found that this was the simplest, and also the slowest, of the three, with O(n^2) time complexity. The bubble sort essentially consists of two parts. 1) The algorithm iterates through the array, compares each adjacent pair, and swaps them if they are out of order. 2) It repeats this pass over the array a number of times equal to the length of the array. This is necessary because, if the smallest number starts at the right-most end, it can only move one position to the left per pass, so it takes that many passes to carry it all the way to the left-most end.
Let's take a look at how we would code this:
function bubbleSort(array) {
  const length = array.length;
  // repeat the pass once for every element in the array
  for (let i = 0; i < length; i++) {
    // compare each adjacent pair and swap them if they are out of order
    for (let j = 0; j < length - 1; j++) {
      if (array[j] > array[j + 1]) {
        // swap using a temporary variable
        let temp = array[j];
        array[j] = array[j + 1];
        array[j + 1] = temp;
      }
    }
  }
  return array;
}
We first added the length variable just for convenience's sake, but the rest of the code is exactly the two steps we already talked about, with that length variable as the loop bound. The swapping mechanism sets a temporary variable and then assigns each of the two indices the value previously held at the other. Like I said, the most harmless and easy-to-understand solution. That said, it is extremely inefficient. First of all, even if we ran it on an array that was already sorted, it would still check every single pair array.length times, so even its best-case time complexity is O(n^2). Along the same lines, we end up looking at the numbers near the front of the array over and over, because each swap only moves a number 1 spot closer to where it needs to be instead of moving it there directly.
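As a quick aside, a common tweak (not part of the version I learned above, just a standard variant I'm sketching here) is to track whether a pass made any swaps and stop early if it didn't. That way an already-sorted array only needs a single pass:

function bubbleSortWithEarlyExit(array) {
  const length = array.length;
  for (let i = 0; i < length; i++) {
    // assume the array is sorted until a swap proves otherwise
    let swapped = false;
    for (let j = 0; j < length - 1; j++) {
      if (array[j] > array[j + 1]) {
        let temp = array[j];
        array[j] = array[j + 1];
        array[j + 1] = temp;
        swapped = true;
      }
    }
    // no swaps on this pass means the array is already sorted
    if (!swapped) break;
  }
  return array;
}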
That being said, let's take things up a notch and look at the insertion sort. This algorithm has a worst-case time complexity of O(n^2) as well, but a best case of O(n), because it works by insertion and can move a number directly to where it should be each time. It's also notable that this solution is more like how we sort by hand. We still look at every number in the list, but one at a time we put each number in its right place, namely somewhere between or before the appropriate numbers.
The insertion sort consists of 3 simple parts. 1) We loop through the array, 2) check whether the value at the current index is less than the first value in the array (and if so move it to the front), or 3) if it's not, find the two adjacent indices whose values are respectively less than and greater than the current value, and insert it between them.
function insertionSort(array) {
  for (let i = 0; i < array.length; i++) {
    if (array[i] < array[0]) {
      // smaller than everything sorted so far, so move it to the front
      array.unshift(array.splice(i, 1)[0]);
    } else {
      // find the spot in the already-sorted portion where the value fits
      for (let j = 1; j < i; j++) {
        if (array[i] >= array[j - 1] && array[i] < array[j]) {
          // remove the value from position i and insert it at position j
          array.splice(j, 0, array.splice(i, 1)[0]);
          break;
        }
      }
    }
  }
  return array;
}
This one has more bells and whistles, and those are what give us that best possible time complexity of O(n). The first thing that catches our attention is the first condition, where we simply move a number to the front of the array whenever it is smaller than the current lowest number. For this implementation, the quickest input is actually an array whose numbers are in the exact opposite of the order they should be, because then every element just gets moved to the front as we loop through the array once. The more expensive work happens in the else branch: when the current number isn't the new lowest, we have to walk through the sorted portion to find the spot where it should be inserted, and that could potentially mean checking almost the whole length of the array every time, which is where the O(n^2) worst case comes from.
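For what it's worth, the version you'll see most often in textbooks (not the one from my course above, just added here for comparison) scans the sorted portion from the right and shifts bigger values over, which is what gives the classic O(n) best case on an already-sorted array:

function insertionSortShift(array) {
  for (let i = 1; i < array.length; i++) {
    const current = array[i];
    let j = i - 1;
    // shift every value bigger than current one spot to the right
    while (j >= 0 && array[j] > current) {
      array[j + 1] = array[j];
      j--;
    }
    // drop the current value into the gap we opened up
    array[j + 1] = current;
  }
  return array;
}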
Now the last sorting algorithm I wanted to cover is the merge sort, and before we get into it: if you haven't learned about recursion, I'll try my best to explain what is going on in the recursive functions in the context of this problem, but I recommend checking out this comprehensive article with visuals if you want to dive deeper or are still a bit confused.
I assume the merge sort algorithm can be solved with iteration, but the recursive function that I learned allows the steps of the problem to be simplified. For this problem, we just need to know that recursion is basically a function that calls itself, and that it 1) performs an action every time it calls itself and 2) provides a condition for stopping.
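To make that concrete, here's a tiny example of my own (not from the course) that just sums the numbers in an array recursively, with those same two ingredients, a stopping condition and a self-call:

function sum(array) {
  // stopping condition: an empty array adds up to 0
  if (array.length === 0) {
    return 0;
  }
  // the action: take the first number and add the sum of the rest
  return array[0] + sum(array.slice(1));
}

console.log(sum([4, 1, 3, 2])) // 10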
The merge sort algorithm consists of two parts: 1) recursion is used to split the array in half until the arrays passed into the function are single elements, then 2) pairs of halves (starting with the individual elements and then the larger portions of the array as they are merged) are merged together and returned back up to the recursive calls as sorted arrays. Recursion is really hard to visualize, but the best way I can word this function is that it's like a tournament bracket, except that in our merge sort both contestants advance after being sorted. For example, [4, 1, 3, 2] splits into [4, 1] and [3, 2], then into [4], [1], [3] and [2]; merging pairs gives [1, 4] and [2, 3], and the final merge gives [1, 2, 3, 4]. These two parts are also why the algorithm is O(n log n): the repeated halving of the array is the log n part, and the comparing and merging that happens at each level is proportional to the size of the input, which is the n part.
Anyways here's the solution for merge sort:
function mergeSort(array) {
  // base case: an array with 0 or 1 elements is already sorted
  if (array.length <= 1) {
    return array;
  }
  // split the array in half
  const halfLength = Math.floor(array.length / 2);
  const left = array.slice(0, halfLength);
  const right = array.slice(halfLength);
  // sort each half recursively, then merge the two sorted halves
  return merge(mergeSort(left), mergeSort(right));
}

function merge(left, right) {
  let leftCounter = 0;
  let rightCounter = 0;
  const merged = [];
  // while both halves still have elements, push the smaller front element
  while (leftCounter < left.length && rightCounter < right.length) {
    if (left[leftCounter] < right[rightCounter]) {
      // left's front element is smaller, push it to the merged array
      merged.push(left[leftCounter]);
      leftCounter++;
    } else {
      // right's front element is smaller (or equal), push it instead
      merged.push(right[rightCounter]);
      rightCounter++;
    }
  }
  // one half ran out, so append the remainder of the other (already sorted)
  return merged.concat(left.slice(leftCounter)).concat(right.slice(rightCounter));
}
There are two portions to the solution. The first function splits the array in half, over and over, until we return the length-1 (or empty) pieces. Those returned values are then fed into the merge comparison function, which takes two sorted portions of the array and merges them into a new, sorted array to return. When one side has been fully used up, we just concatenate the remainder of the other side onto the end of the merged array; it is already sorted and everything in it is at least as large as what we've merged so far. We wrote both the left and right remainders to be concatenated, but at least one of them will always be empty. We keep doing this comparison between bigger and bigger portions of the array until the final merge of the last two halves produces the fully sorted array.
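And just to show it in action, with a made-up input of my own:

console.log(mergeSort([5, 2, 9, 1, 7, 3])) // [1, 2, 3, 5, 7, 9]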
That's all I had for this time, but I hope you were able to internalize at least the steps of each solution. I believe exposure to these algorithms will be a big help, directly or indirectly, in coding interviews.