Sorting algorithms are a mainstay of computer programming. There are countless situations where one might want to sort things. After all, arrays are one of the most important data structures. In this blog I will discuss a couple of different sorting methods, how to implement them, and their merits.
It is probably obvious to most people why one would want to sort something in the first place. If we are dealing with an array that has many things in it, we might want it sorted so that there is some logic to our storage. Or we might not even know what is inside it, and sorting would make it easier to find out. To start with something easy, let's just look at a simple array of numbers.
Sorting can be very expensive with large arrays, but for something small almost any approach will work. In JavaScript there is a built-in sort method. With an array this size we can sort the numbers in ascending order quite easily using this method, with one caveat: by default, sort compares elements as strings, so for numbers we need to pass in a compare function.
A simple hand-rolled sort is iterative and has to compare each element against the others in order to get its work done. Imagine if we had a massive array with millions of elements: it would become very expensive. In general this type of naive sorting will be O(n^2). We start with the first element, then compare it to all the others in order to know where to place it, and we do this for every element. (It is worth noting that the built-in sort in modern JavaScript engines is not this naive; V8, for example, uses an O(n log n) algorithm under the hood.)
What we really want to do is get our algorithm to O(n log n). Sorting methods that achieve O(n log n) include merge sort, quick sort (mostly), and heap sort. To be honest I am not sure which one is used the most; I imagine it is a case-by-case thing, or maybe programmer preference. In my relatively new time coding (~8 months) I would say that I have seen quick sort the most.
You might have noticed that I said quick sort is O(n log n) mostly. This is because of the way we choose our pivot element. The most common strategies I have seen are picking the median element or a random element. We then partition the list around the pivot: elements smaller than the pivot go to its left, larger ones to its right, and we recursively sort each side. Because each partition splits the work into smaller and smaller sublists, the total cost comes out to O(n log n) on average.
The idea behind these better sorting methods is that we split our list into smaller and smaller pieces, deal with those pieces, and put them back together in sorted order. This is much more efficient than comparing all the elements to each other. However, with quick sort, if we choose the first element as the pivot and the list is already sorted, every partition is maximally lopsided because the pivot is always the minimum, and the algorithm degrades to O(n^2).