Merge Sort was introduced by John von Neumann in 1945, primarily to improve the efficiency of sorting large datasets. Von Neumann's algorithm aimed to provide a consistent and predictable sorting process using the divide and conquer method. This strategy allows Merge Sort to handle both small and large datasets effectively, guaranteeing a stable sort with a time complexity of O(n log n) in all cases.
Merge Sort employs the divide and conquer approach, which splits the array into smaller subarrays, recursively sorts them, and then merges the sorted arrays back together. This approach breaks the problem into manageable chunks, sorting each chunk individually and combining them efficiently. As a result, the algorithm performs well even on large datasets by dividing the sorting workload.
Recursion is a process where a function calls itself to solve a smaller version of the same problem. It keeps breaking the problem down until it reaches a point where the problem is simple enough to solve directly, which is called the base case.
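As a quick illustration of a base case before we get to Merge Sort itself, here is a minimal recursion sketch (`sumTo` is a hypothetical helper used only for this example):

```javascript
// Sum the integers from 1 to n using recursion.
function sumTo(n) {
  if (n <= 1) return n;      // base case: simple enough to solve directly
  return n + sumTo(n - 1);   // recursive case: a smaller version of the same problem
}

console.log(sumTo(5)); // → 15
```

Merge Sort follows the same pattern: its base case is an array with one element, which is trivially sorted.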
Below is an implementation of Merge Sort in JavaScript, showing how the array is recursively split and merged:
```javascript
function mergeSort(arr) {
  // Base case: an array of 0 or 1 elements is already sorted
  if (arr.length <= 1) return arr;

  // Split the array at the midpoint and sort each half recursively
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  // Combine the two sorted halves
  return merge(left, right);
}

function merge(left, right) {
  const result = [];
  while (left.length && right.length) {
    // <= keeps the sort stable: equal elements from the left half
    // stay ahead of equal elements from the right half
    if (left[0] <= right[0]) result.push(left.shift());
    else result.push(right.shift());
  }
  // At most one side still has elements, and they are already sorted
  return result.concat(left, right);
}
```
To better understand how Merge Sort works, let's walk through the process using the array: [38, 27, 43, 3, 9, 82, 10]
Step 1: Recursive Division (the mergeSort function)
The array is recursively split into smaller subarrays until each subarray has only one element. The base case at the top of mergeSort is what stops the recursion:

```javascript
function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case: stops the recursion
  // ...
}
```
Here’s how the recursive division unfolds:
- Initial Call: mergeSort([38, 27, 43, 3, 9, 82, 10])
  - The array is split at the midpoint: [38, 27, 43] and [3, 9, 82, 10]
- First Half: mergeSort([38, 27, 43])
  - Split at the midpoint: [38] and [27, 43]
  - mergeSort([27, 43]) splits into [27] and [43]
  - Subarrays [38], [27], and [43] are now individual elements and ready to be merged.
- Second Half: mergeSort([3, 9, 82, 10])
  - Split at the midpoint: [3, 9] and [82, 10]
  - mergeSort([3, 9]) splits into [3] and [9]
  - mergeSort([82, 10]) splits into [82] and [10]
  - Subarrays [3], [9], [82], and [10] are now ready to be merged.
Step 2: Merging the Sorted Subarrays (the merge function)
Now, we start merging the subarrays back together in sorted order using the merge function:

```javascript
function merge(left, right) {
  const result = [];
  while (left.length && right.length) {
    if (left[0] <= right[0]) result.push(left.shift());
    else result.push(right.shift());
  }
  return result.concat(left, right);
}
```
Here’s how the merging process works:
First Merge (left half):
- Merge [27] and [43] → result is [27, 43]
- Merge [38] with [27, 43] → result is [27, 38, 43]

At this point, the left half of the array is fully merged: [27, 38, 43].
Second Merge (right half):
- Merge [3] and [9] → result is [3, 9]
- Merge [82] and [10] → result is [10, 82]
- Merge [3, 9] with [10, 82] → result is [3, 9, 10, 82]

Now, the right half is fully merged: [3, 9, 10, 82].
Step 3: Final Merge
Finally, the two halves [27, 38, 43] and [3, 9, 10, 82] are merged using the merge function:
- Compare 27 (left[0]) and 3 (right[0]). Since 3 < 27, add 3 to the result.
- Compare 27 and 9. Add 9 to the result.
- Compare 27 and 10. Add 10 to the result.
- Compare 27 and 82. Add 27 to the result.
- Compare 38 and 82. Add 38 to the result.
- Compare 43 and 82. Add 43 to the result.
- Add the remaining element 82 from the right array.

The fully merged and sorted array is: [3, 9, 10, 27, 38, 43, 82].
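To confirm the walkthrough end to end, the full implementation can be run on the example array (a self-contained sketch repeating both functions):

```javascript
function mergeSort(arr) {
  if (arr.length <= 1) return arr; // base case
  const mid = Math.floor(arr.length / 2);
  return merge(mergeSort(arr.slice(0, mid)), mergeSort(arr.slice(mid)));
}

function merge(left, right) {
  const result = [];
  while (left.length && right.length) {
    if (left[0] <= right[0]) result.push(left.shift());
    else result.push(right.shift());
  }
  return result.concat(left, right);
}

console.log(mergeSort([38, 27, 43, 3, 9, 82, 10]));
// → [3, 9, 10, 27, 38, 43, 82]
```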
Time Complexity: Best, Average, and Worst Case: O(n log n)
Let's look at each part more closely:
Dividing (O(log n)): Each time the array is split into two halves, the size of the problem is reduced. Since the array is divided in half at each step, the number of times you can do this is proportional to log n. For example, if you have 8 elements, you can divide them in half 3 times (since log₂(8) = 3).
Merging (O(n)): After dividing the array, the algorithm merges the smaller arrays back together in order. At each level of recursion, merging touches every element exactly once, so the total work per level is O(n).
Overall Complexity (O(n log n)): Since there are O(log n) levels of division and each level merges all n elements, the total time complexity is the product of the two: O(n log n).
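One way to see this bound empirically is to count comparisons during merging. This is a rough sketch for illustration only; `mergeSortCounted` and the `comparisons` counter are hypothetical names added for the experiment:

```javascript
// Count comparisons made during merging and compare with n * log2(n).
let comparisons = 0;

function mergeSortCounted(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  return mergeCounted(
    mergeSortCounted(arr.slice(0, mid)),
    mergeSortCounted(arr.slice(mid))
  );
}

function mergeCounted(left, right) {
  const result = [];
  while (left.length && right.length) {
    comparisons++; // one comparison per element placed (minus leftovers)
    if (left[0] <= right[0]) result.push(left.shift());
    else result.push(right.shift());
  }
  return result.concat(left, right);
}

for (const n of [1000, 2000, 4000]) {
  comparisons = 0;
  mergeSortCounted(Array.from({ length: n }, () => Math.random()));
  // The comparison count stays close to (and below) n * log2(n)
  console.log(n, comparisons, Math.round(n * Math.log2(n)));
}
```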
Space Complexity: O(n)
Merge Sort requires extra space proportional to the size of the array because it needs temporary arrays to store the elements during the merge phase.
Comparison to Other Sorting Algorithms:
QuickSort: While QuickSort has an average time complexity of O(n log n), its worst case is O(n²). Merge Sort avoids that worst-case scenario, but QuickSort sorts in place and is often faster in practice when extra memory is a concern.
Bubble Sort: Much less efficient than Merge Sort, with a time complexity of O(n^2) for average and worst-case scenarios.
Real World Usage
Merge Sort is widely used for external sorting, where large datasets need to be sorted from disk, as it efficiently handles data that doesn’t fit into memory. It's also commonly implemented in parallel computing environments, where subarrays can be sorted independently, taking advantage of multi-core processing.
Moreover, libraries and languages such as Python (Timsort), Java (Arrays.sort for object arrays, which uses a Timsort variant), and C++ (std::stable_sort) rely on variations of Merge Sort to guarantee stability, making it particularly suitable for sorting objects and complex data structures.
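Stability is easy to demonstrate with objects. Below is a small sketch that adapts the merge step to compare by a key function; `mergeSortBy` and `key` are hypothetical names introduced just for this example:

```javascript
// Stable sort of records by age: equal keys keep their original order.
function mergeSortBy(arr, key) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSortBy(arr.slice(0, mid), key);
  const right = mergeSortBy(arr.slice(mid), key);
  const result = [];
  while (left.length && right.length) {
    // <= preserves the relative order of equal keys (stability)
    if (key(left[0]) <= key(right[0])) result.push(left.shift());
    else result.push(right.shift());
  }
  return result.concat(left, right);
}

const people = [
  { name: "Ana", age: 30 },
  { name: "Ben", age: 25 },
  { name: "Cal", age: 30 },
];
// Ana (age 30) still precedes Cal (age 30) after sorting.
console.log(mergeSortBy(people, p => p.age).map(p => p.name));
// → ["Ben", "Ana", "Cal"]
```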
Conclusion
Merge Sort continues to be a fundamental algorithm in both theoretical computer science and practical applications due to its stability, consistent performance, and adaptability for sorting large datasets. While other algorithms like QuickSort may perform faster in certain situations, Merge Sort’s guaranteed O(n log n) time complexity and versatility make it invaluable for memory-constrained environments and for maintaining the order of elements with equal keys. Its role in modern programming libraries ensures it remains relevant in real-world applications.
Sources:
- Knuth, Donald E. The Art of Computer Programming, Vol. 3: Sorting and Searching. Addison-Wesley Professional, 1997, pp. 158-160.
- Cormen, Thomas H., et al. Introduction to Algorithms. MIT Press, 2009, Chapter 2 (Merge Sort), Chapter 7 (Quicksort).
- Silberschatz, Abraham, et al. Database System Concepts. McGraw-Hill, 2010, Chapter 13 (External Sorting).
- "Timsort." Python Documentation, Python Software Foundation.
- "Arrays.sort." Oracle Java Documentation.