1. Introduction: today's hook
If searching tells you where something is, sorting tells you how to arrange everything else around it.
Sorting is one of the oldest and most fundamental problems in computer science. Every database, browser, and spreadsheet performs sorting millions of times a day.
Today we start from the ground up with the intuitive quadratic sorting algorithms.
They are slow for large data but perfect for learning how comparisons and swaps interact, and for seeing how more advanced algorithms improve on them later.
Pro Tip: follow along and try it yourself! You can experiment with all sorting algorithms directly here:
👉 Open Sorting Algorithms Playground
2. The naive idea
The most direct way to sort is to repeatedly look for items that are out of order and swap them until everything is sorted.
This leads to the three fundamental O(n²) sorting algorithms:
- Bubble Sort: repeatedly swap adjacent out-of-order elements.
- Selection Sort: repeatedly find the smallest element and move it to its position.
- Insertion Sort: grow a sorted prefix by inserting one element at a time.
All three rely on nested loops and simple comparison logic.
Let’s walk through each.
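The naive idea can be written almost verbatim as code. Here is a minimal sketch (the name `naive_sort` is ours, not a standard routine): keep scanning for adjacent out-of-order pairs and swapping them until one full scan finds nothing left to swap.

```python
def naive_sort(arr):
    # Repeat full scans until a scan completes with no swaps.
    done = False
    while not done:
        done = True
        for i in range(len(arr) - 1):
            if arr[i] > arr[i + 1]:
                arr[i], arr[i + 1] = arr[i + 1], arr[i]
                done = False  # something moved, so scan again
    return arr

print(naive_sort([3, 1, 4, 1, 5, 9, 2, 6]))  # [1, 1, 2, 3, 4, 5, 6, 9]
```

Bubble sort below is exactly this loop, tightened so that each pass scans a shorter range.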
3. Bubble Sort
3.1 Idea
Bubble sort repeatedly compares adjacent elements and swaps them if they are in the wrong order.
After each pass, the largest unsorted element “bubbles up” to its correct position at the end of the list.
3.2 Implementation
```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n):
        swapped = False
        # Each pass bubbles the largest remaining element to index n - i - 1.
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        # Early exit: a pass with no swaps means the list is already sorted.
        if not swapped:
            break
    return arr
```
3.3 Step-by-step
- Inner loop compares adjacent pairs and swaps if necessary.
- After the first pass, the largest element is at the end.
- Each subsequent pass ignores the sorted tail (n - i - 1).
- If no swaps happen in a full pass, the list is already sorted.
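The steps above can be watched directly. Here is a small sketch that records the list after each pass; `bubble_trace` is an illustrative helper name, not part of the implementation above.

```python
def bubble_trace(arr):
    # Same logic as bubble_sort, but snapshot the list after every pass.
    n = len(arr)
    passes = []
    for i in range(n):
        swapped = False
        for j in range(0, n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        passes.append(arr.copy())
        if not swapped:
            break
    return passes

for state in bubble_trace([5, 1, 4, 2, 8]):
    print(state)
```

After the first pass the largest element, 8, is already in place; the final pass makes no swaps and triggers the early exit.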
3.4 Complexity
- Worst and average case: O(n²) comparisons and swaps.
- Best case: O(n) with the early-exit optimization, when the input is already sorted.
- Space: O(1), sorts in place.
- Stability: stable, since equal elements are never swapped past each other.
3.5 Why keep it around
- Great for teaching because the swap logic is visible.
- Easy to implement for very small lists.
- Provides a foundation for optimized hybrid sorts.
4. Selection Sort
4.1 Idea
Selection sort divides the list into a sorted and unsorted section. At each step it selects the smallest remaining element and places it at the beginning of the unsorted section.
4.2 Implementation
```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted region [i, n).
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Move it to the front of the unsorted region.
        if min_idx != i:
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr
```
4.3 Step-by-step
- Find the smallest element in the unsorted region [i, n).
- Swap it with the element at index i.
- Increment i and repeat until the end.
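To see the sorted prefix grow, here is a variant of the implementation that records the list after each outer step; `selection_trace` is our illustrative name for it.

```python
def selection_trace(arr):
    # Same logic as selection_sort, snapshotting after each outer iteration.
    n = len(arr)
    states = []
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        if min_idx != i:
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
        states.append(arr.copy())
    return states

for state in selection_trace([29, 10, 14, 37, 13]):
    print(state)
```

After step i, the first i + 1 elements are the i + 1 smallest, in final sorted order.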
4.4 Complexity
- Comparisons: always n(n−1)/2, so O(n²) in the best, average, and worst case.
- Swaps: at most n − 1, which is O(n).
- Space: O(1), sorts in place.
- Stability: not stable.
4.5 Why it matters
- Simple, predictable number of comparisons (always n(n−1)/2).
- Not stable because equal elements can be swapped out of order.
- Rarely used in practice, but conceptually introduces the idea of finding minima efficiently.
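The instability claim is easy to demonstrate. Here is a small sketch, assuming (key, tag) pairs compared by key only; the helper name is ours.

```python
def selection_sort_pairs(arr):
    # Selection sort on (key, tag) pairs, comparing keys only.
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            if arr[j][0] < arr[min_idx][0]:
                min_idx = j
        if min_idx != i:
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

# (2, 'a') appears before (2, 'b') in the input...
print(selection_sort_pairs([(2, 'a'), (2, 'b'), (1, 'c')]))
# ...but the swap that moves 1 to the front carries (2, 'a') behind (2, 'b').
```

The long-distance swap is exactly what breaks stability: an equal element can be thrown past its twin.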
5. Insertion Sort
5.1 Idea
Insertion sort builds the sorted array one element at a time.
For each new element, it shifts larger elements one position to the right until the correct place is found, then inserts it.
This is how most people sort playing cards in their hand.
5.2 Implementation
```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift elements larger than key one position to the right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert key into its correct slot
    return arr
```
5.3 Step-by-step
- Start with the second element as the first “unsorted” value.
- Compare it with previous elements, shifting larger ones right.
- Insert it where it belongs.
- Repeat until the list is sorted.
5.4 Complexity
- Worst and average case: O(n²).
- Best case: O(n) on already sorted input, with one comparison per element.
- Space: O(1), sorts in place.
- Stability: stable.
5.5 Why it’s special
- Adaptive: faster on nearly sorted data.
- Stable: equal elements preserve their relative order.
- Used inside advanced algorithms like Timsort and IntroSort for small partitions.
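Adaptivity can be made concrete by counting shifts. A rough sketch, using shift count as our chosen proxy for work done (the helper name is ours):

```python
def insertion_sort_shifts(arr):
    # Same insertion sort, but return how many element shifts it performed.
    shifts = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            shifts += 1
            j -= 1
        arr[j + 1] = key
    return shifts

nearly_sorted = list(range(10))
nearly_sorted[3], nearly_sorted[4] = nearly_sorted[4], nearly_sorted[3]
print(insertion_sort_shifts(nearly_sorted))           # one misplaced pair: 1 shift
print(insertion_sort_shifts(list(range(9, -1, -1))))  # reversed: n(n-1)/2 = 45 shifts
```

The work scales with how out of order the input is, not just with its length.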
6. Comparing all three
Pro Tip: If you want to watch these algorithms in motion, the free tool Visualgo is perfect.
You can choose Bubble, Selection, or Insertion Sort, adjust the array size and animation speed, and observe how swaps and comparisons happen step by step.
Visualgo lets you slow down or speed up the animation, highlight comparisons, and even switch to other algorithms like Merge or Quick Sort when you move on to Part 2.
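If you prefer numbers to animations, here is a rough sketch that instruments all three implementations to count comparisons on the same input. The counting helpers and the seeded test data are ours, added for illustration.

```python
import random

def bubble_comparisons(arr):
    # Bubble sort with early exit, counting every comparison.
    arr = arr[:]
    n = len(arr)
    count = 0
    for i in range(n):
        swapped = False
        for j in range(n - i - 1):
            count += 1
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:
            break
    return count

def selection_comparisons(arr):
    # Selection sort: always n(n-1)/2 comparisons, whatever the input.
    arr = arr[:]
    n = len(arr)
    count = 0
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):
            count += 1
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return count

def insertion_comparisons(arr):
    # Insertion sort: one comparison per shift, plus the failing comparison
    # that ends each inner loop (when the loop stops before the left edge).
    arr = arr[:]
    count = 0
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
            count += 1
        if j >= 0:
            count += 1
        arr[j + 1] = key
    return count

random.seed(0)
data = [random.randint(0, 99) for _ in range(50)]
print(bubble_comparisons(data), selection_comparisons(data), insertion_comparisons(data))
```

On random data, selection sort's count is fixed at 1225 for n = 50, while bubble and insertion sort vary with the input; on an already sorted list, both drop to just 49 comparisons.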
7. Quick comprehension check
Which of the three sorts is stable and adaptive?
→ Insertion sort.

Why does bubble sort’s early-exit optimization improve the best case?
→ Because it stops after one pass when no swaps occur, giving O(n).

Why is selection sort not stable?
→ Because it can swap equal elements from different positions, reversing their order.
8. Final summary and review
Bubble, Selection, and Insertion Sort share the same big-O performance, but they teach different lessons about algorithmic design.
- Bubble Sort shows how repeated local swaps lead to global order.
- Selection Sort teaches how to find minima systematically.
- Insertion Sort demonstrates adaptivity and the value of shifting instead of swapping.
They all run in O(n²) time but form the mental foundation for divide-and-conquer sorts like Merge Sort and Quick Sort, where comparisons and movement are organized more efficiently.
9. Hook for tomorrow
Next time we leave the quadratic world behind. We will learn how divide and conquer reduces comparisons from n² to n log n.
You will see how Merge Sort builds order from merging halves, and how Quick Sort partitions the array around pivots to achieve the same speed in place.