Doogal Simpson

Posted on • Originally published at doogal.dev

Why Your Linked List Wants to Be a Bloody Tree

Quick Answer: A linked list hits a wall when searching because it’s stuck in linear O(n) time. By giving every node two pointers instead of one, you create a Binary Search Tree (BST). This wacky structure uses its branching logic to cut your search space in half with every step, turning a billion-element crawl into a 30-step sprint.

So, what happens if you make a linked list where, instead of every node pointing to a single node, every node points to two more? You’re a human being, you’ve got free will, and you can do what you want. But if you do that, you’ve not got a linked list anymore, you’ve got to have made a bloody tree, haven’t you?

Trees are super interesting data structures. They are flexible, they have loads of use cases, and they are basically the reason your computer doesn't catch fire when you try to find a file. One of the coolest forms is the Binary Search Tree (BST), which takes the biggest problem we have with linked lists—the search time—and absolutely guts it.

What happens if I just give a node two pointers?

You’ve stopped being linear and started building a hierarchy. Instead of a single path from head to tail, you’ve created a structure that fans out, where one parent node leads to two children, effectively turning a simple queue into a tree.

The linked list is a linear crawl. If you want the item at the end, you have to walk past everyone else first. By giving nodes two pointers, you’re allowing the data to branch. This means you can start making decisions about where to go next based on the value you're looking for, rather than just blindly following a single wire.

Why is searching a linked list so slow?

Searching a linked list is Order n (O(n)), which is a fancy way of saying it’s a slog. If you have a billion elements and the thing you want is at the end, you’re performing a billion checks before you can go home.

We don’t like that. It's slow, it's inefficient, and it doesn't scale. If your data grows, your wait time grows at exactly the same rate. This is the bottleneck that makes linked lists frustrating for anything other than simple stacks or queues where you only care about the ends.

How do trees handle a billion elements in 30 steps?

By using a Binary Search Tree, you ensure that one path from a node contains values larger than the current one, and the other path contains smaller values. This means every time you jump from one node to the next, you’re reducing your search space by half.

After the first node, you’ve gone from a billion elements to half a billion. Then 250 million. Then 125 million. Log base 2 of a billion is about 30. So instead of making a billion checks, you’re making about 30. That is the power of the "wacky linked list" approach.

| Data Structure | Search Complexity | Logic for Lookups |
| --- | --- | --- |
| Linked List | O(n) | Linear walk; check every single node. |
| Binary Search Tree | O(log n) | Binary split; discard half the data per jump. |
| Sorted Array | O(log n) | Binary search; requires contiguous memory. |

Is my computer’s folder structure really just a tree?

Yes, it’s the most obvious real-world example of this logic. Every folder contains folders, which contain more folders, which eventually contain files.

In this analogy, your files are the "leaves" (the nodes at the very end with no children) and your folders are the internal nodes. Imagine if your OS didn't have this tree structure. To find a single cat photo on a 2TB drive, it would have to scan every single sector on the disk in a straight line. You’d be waiting forever. Because it's a tree, the OS just follows a few branches (Root -> Users -> Doogal -> Pictures) and finds it instantly.

To build this in code, you’re basically just doubling your pointer overhead. Here is how a node looks when it stops being a list and starts being a tree:

```c
struct Node {
  int value;
  struct Node *left;  // Smaller stuff goes here
  struct Node *right; // Larger stuff goes here
};
```
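With that struct in place, a lookup is just a loop that follows the comparisons down the tree. A minimal sketch (the `bst_contains` name is mine, and it assumes the tree keeps smaller values on the left, larger on the right):

```c
#include <stdbool.h>
#include <stddef.h>

struct Node {
    int value;
    struct Node *left;  // Smaller stuff goes here
    struct Node *right; // Larger stuff goes here
};

// Walk down the tree, discarding half the remaining
// nodes at every comparison.
bool bst_contains(const struct Node *node, int target) {
    while (node != NULL) {
        if (target == node->value)
            return true;
        node = (target < node->value) ? node->left : node->right;
    }
    return false;
}
```

Note that there's no loop over all the data anywhere: each iteration picks one branch and never looks back at the other.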

FAQ

Does a tree use more memory than a linked list?

Yes, you're literally doubling the pointer overhead. In a standard linked list, you have one pointer per node; in a binary tree, you have two (left and right). You’re trading a bit of memory for a massive increase in search speed.

What if my tree isn't organized by value?

Then it’s just a regular binary tree, not a Binary Search Tree. You’ll still have a hierarchical structure, but you lose that O(log n) search advantage because you won't know which branch to pick to find your data.

Can a tree become as slow as a linked list?

It can. If you insert numbers in order (1, 2, 3, 4...), the tree just grows in one long line to the right. We call that a "degenerate" tree. It’s basically just a linked list with extra steps, which is why we usually use balancing logic to keep the tree wide instead of long.

Cheers!
