Majority Element: Easy Problem, Sneaky Insight

The Majority Element problem on LeetCode looks like one of those questions you expect to solve in two minutes and immediately forget about.

Input: an array of numbers
Output: the number that appears more than n/2 times
(n is the size of the array)

That’s it. No tricks hiding in the corner.

LeetCode even tells you upfront:
👉 “The majority element always exists.”

So your brain does the obvious thing:

“Cool. I’ll just count.”

99% of the time, that’s exactly the right instinct.


The Obvious Solution (And Why It’s Perfectly Fine)

You loop through the array, count how many times each number appears, and return the one whose count is greater than n/2.

Example:

Input:  [1, 1, 2, 2, 2, 2, 3]
Counts: {1: 2, 2: 4, 3: 1}

2 appears 4 times out of 7, which is more than half. Done.

Go Implementation

func majorityElement(nums []int) int {
    counts, n := make(map[int]int), len(nums)

    for _, num := range nums {
        counts[num]++
    }

    for k, v := range counts {
        if v > n/2 {
            return k
        }
    }

    return -1 // unreachable due to problem guarantee
}
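If you want to sanity-check it against the example above, a tiny driver does the job. This snippet isn't part of the solution itself; it just assumes the function above sits in package main:

package main

import "fmt"

func main() {
    // The example from earlier: 2 appears 4 times out of 7.
    nums := []int{1, 1, 2, 2, 2, 2, 3}
    fmt.Println(majorityElement(nums)) // prints 2
}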

Complexity

  • Time: O(N)
  • Space: O(N)

This solution is clean, readable, and correct.

In real life, you’d probably stop here, ship it, and move on.


The Follow-Up That Changes the Mood

Then LeetCode adds one extra line:

Could you solve it in linear time and O(1) space?

Translation:

“Same problem. No hashmap.”

This is where the problem quietly stops being trivial and starts testing how you think.


⏸️ Pause Here

Before reading further, I’d genuinely recommend pausing for a moment and thinking about this yourself:

  • What does “more than half” actually guarantee?
  • What information do you really need to keep around?
  • If counting everything was forbidden, what would you track instead?

If you already know the solution, wonderful: read along for a fun mental model.
If you don’t, that’s completely fine. It wasn’t obvious to me the first time either.

When you’re ready, keep going.


A Mental Model Instead of Math

Let’s stop thinking in terms of arrays and counters for a second.

Instead, imagine this.


Teams, Players, and Rounds

Think of each number as a team.
Each occurrence of that number is a player on that team.

  • There are multiple teams
  • One team has more than half of all players
  • The game plays out in rounds

Rules of the Game

  1. Players step forward one at a time
  2. Whenever two players from different teams face off, both are eliminated
  3. Eliminated players never come back
  4. A team can only keep playing if it still has unused players

Here’s the key part:

The outcome of a round doesn’t matter.
What matters is who still has players left.


What Inevitably Happens

Smaller teams run out of players pretty quickly.
Once they’re empty, they’re done.

The majority team:

  • Loses players too
  • Gets beaten
  • Cancels others out

But because it started with more players than all the other teams combined, it can never be eliminated completely.

By the end:

  • Either it’s the only team left
  • Or it’s the only team that can still send a player

That team is the majority element.

No tricks.
No luck.
Just inevitability.


Turning That Into Code

Now let’s map that mental model back to code.

  • candidate → the team currently sending players
  • count → how many more players that team has relative to others

As you walk through the array:

  • If count == 0 → pick the current number as the new team
  • Same number again → that team sends another player (count++)
  • Different number → two players clash and are gone (count--)

You’re not tracking wins.
You’re tracking who can still play.


Go Implementation (O(1) Space)

func majorityElement(nums []int) int {
    candidate, count := 0, 0

    for _, num := range nums {
        if count == 0 {
            candidate = num
        }

        if num == candidate {
            count++
        } else {
            count--
        }
    }

    return candidate
}
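Walking the earlier example through this loop makes the candidate/count dance concrete:

nums = [1, 1, 2, 2, 2, 2, 3]

num  candidate  count
1    1          1      (count was 0, so 1 becomes the team on the field)
1    1          2
2    1          1
2    1          0
2    2          1      (count hit 0, so 2 takes over)
2    2          2
3    2          1

Final candidate: 2, which is indeed the majority.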

Complexity

  • Time: O(N)
  • Space: O(1)

Because the problem guarantees a majority element, no second pass is needed.
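If that guarantee ever goes away (say, you reuse this on arbitrary data), the standard fix is a second pass that verifies the candidate. A minimal sketch, reusing the function above (the wrapper name is mine, not from the problem):

func majorityElementOrFail(nums []int) (int, bool) {
    // First pass: Boyer–Moore produces a candidate.
    candidate := majorityElement(nums)

    // Second pass: confirm the candidate really is a majority.
    count := 0
    for _, num := range nums {
        if num == candidate {
            count++
        }
    }
    return candidate, count > len(nums)/2
}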


Trivia

This algorithm was introduced by Robert S. Boyer and J. Strother Moore, and is known as the Boyer–Moore Voting Algorithm.

You can think of it as a different way of running an election:

  • Instead of counting every vote up front,
  • candidates face off round by round,
  • and you only keep track of who still has supporters left.

By the end, the candidate who survives isn’t just popular —
they’re mathematically guaranteed to be the majority.

Less bookkeeping.
Same result.
Much more clever.


Side-by-Side Comparison

Approach             Time   Space   Obvious?   Interview Value
HashMap counting     O(N)   O(N)    Yes        Medium
Boyer–Moore voting   O(N)   O(1)    No         High

The hashmap solution is practical.
The Boyer–Moore solution is conceptual.

Interviewers like the second one not because it’s “better”,
but because it shows you can reason from constraints and guarantees.


Appendix: When Would You Actually Use This?

By now, you might be wondering:

“This is clever… but when would I ever use this instead of a hashmap?”

That’s a fair question.

The Boyer–Moore Voting Algorithm is not a general-purpose replacement for counting. It shines only when a strong guarantee exists.

You usually reach for it when all three of these are true:

  1. You’re dealing with large or streaming data
  2. A majority element is guaranteed
  3. You want predictable, minimal memory usage

Streaming or One-Pass Data

If data comes in as a stream (logs, events, messages), you may not want to store everything just to count it.

Boyer–Moore lets you:

  • process elements one at a time
  • keep constant memory
  • stop at any point with a meaningful candidate
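To make that concrete, the same two pieces of state work as a tiny tracker you feed one element at a time. A rough sketch (the type and method names here are made up for illustration, not from the post):

// majorityTracker carries Boyer–Moore state across a stream of values.
type majorityTracker struct {
    candidate int
    count     int
}

// Observe feeds one element from the stream into the tracker.
func (t *majorityTracker) Observe(num int) {
    if t.count == 0 {
        t.candidate = num
    }
    if num == t.candidate {
        t.count++
    } else {
        t.count--
    }
}

// Candidate returns the current best guess. It is only guaranteed to be
// the true majority if one actually exists in the data seen so far.
func (t *majorityTracker) Candidate() int {
    return t.candidate
}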

Large Inputs and Performance-Sensitive Code

Hashmaps aren’t just “memory”.

They involve:

  • allocations
  • resizing
  • hashing
  • cache misses
  • garbage collection

Boyer–Moore avoids all of that. It’s fast, cache-friendly, and predictable.
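If you'd rather measure than take that on faith, a standard Go benchmark is a low-effort way to compare the two. A sketch, assuming you've renamed the implementations above to majorityElementMap and majorityElementVote so they can live in one package (both names are mine):

package majority

import "testing"

// A large input where 42 is the guaranteed majority value.
var input = func() []int {
    nums := make([]int, 1_000_000)
    for i := range nums {
        if i%3 == 0 {
            nums[i] = 7
        } else {
            nums[i] = 42
        }
    }
    return nums
}()

var sink int

func BenchmarkMap(b *testing.B) {
    for i := 0; i < b.N; i++ {
        sink = majorityElementMap(input)
    }
}

func BenchmarkVote(b *testing.B) {
    for i := 0; i < b.N; i++ {
        sink = majorityElementVote(input)
    }
}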


Memory-Constrained Environments

In embedded systems or edge devices, even small data structures matter.

An O(1) space algorithm:

  • uses fixed memory
  • behaves consistently
  • fails less spectacularly

That simplicity is a feature.


When Not to Use This Algorithm

Just as important:

Don’t use Boyer–Moore if:

  • a majority element is not guaranteed
  • you need exact frequencies
  • you need top-K elements
  • you care about distribution, not dominance

In those cases, a hashmap or sorting is the right tool.


A Simple Rule of Thumb

Use Boyer–Moore when dominance matters
Use a hashmap when distribution matters

Both are good tools. They just solve different problems.


Final Thought

My goal here was to help you actually understand why this algorithm works and not just walk away with something to memorize.

Once that idea clicks, implementing it in your favorite language becomes a fun little exercise.

If this helped you see the problem differently, feel free to share it with someone who might enjoy the same “aha” moment. And if you have questions, feedback, or a different mental model altogether, drop a comment. Those conversations are half the fun.

Happy coding 👋
