
From Coin Toss to LLM — Understanding Random Variables

A beginner-friendly guide to probability and random variables — no math background needed.


1. What is Probability?

Probability is a number that measures how likely something is to happen.

This number is always between 0 and 1:

| Value | Meaning |
|-------|---------|
| 0 | Impossible — will never happen |
| 1 | Certain — will always happen |
| 0.5 | Equal chance — may or may not happen |

Example

Flip a fair coin. Two outcomes are possible — heads or tails. Neither is more likely than the other.

So the probability of heads = 1/2 = 0.5
And the probability of tails = 1/2 = 0.5

One important rule — the probabilities of all possible outcomes must add up to 1.

0.5 + 0.5 = 1 ✓

What happens in real life?

If you flip a coin 10 times, you might get 6 heads and 4 tails. That is normal.

But if you flip 10,000 times, the result will get very close to 50% heads and 50% tails.

The more experiments you run, the closer your results get to the true probability.
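You can watch this happen with a short simulation (a sketch using Python's standard `random` module — `heads_fraction` is just an illustrative helper name):

```python
import random

def heads_fraction(n_flips: int) -> float:
    """Flip a fair coin n_flips times and return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(n_flips))
    return heads / n_flips

# With few flips the fraction wanders; with many it settles near 0.5.
for n in (10, 100, 10_000):
    print(n, heads_fraction(n))
```

Run it a few times: the 10-flip result jumps around, while the 10,000-flip result stays very close to 0.5.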


2. What is a Random Variable?

Think of a random variable as an empty slot.

  • Before the experiment — the slot is empty
  • You run the experiment
  • After the experiment — the slot is filled with a number

Why do we need it?

Outcomes are often text — "heads", "tails", "win", "lose". Mathematics works with numbers, not words.

So we assign a number to each possible outcome. This is what a random variable does.

Example — Coin Toss

Before the experiment: Slot is empty. Two possible outcomes exist.

| Possible outcome | Assigned number |
|------------------|-----------------|
| Heads | 1 |
| Tails | 0 |

Experiment: Flip the coin.

After the experiment: Landed heads → Slot is filled with 1


New experiment, new slot.

Before: Fresh empty slot.

Experiment: Flip again.

After: Landed tails → Slot is filled with 0


Key point

A random variable is not a fixed number. It can be different every time you run the experiment. That is why it is called a variable.
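The "empty slot" idea can be sketched in a few lines of Python (the function names here are just for illustration):

```python
import random

# A random variable for a coin toss: assign a number to each text outcome.
OUTCOME_TO_NUMBER = {"heads": 1, "tails": 0}

def toss_coin() -> str:
    """Run the experiment: return the raw outcome as text."""
    return random.choice(["heads", "tails"])

def coin_random_variable() -> int:
    """Fill the slot: convert the outcome into a number."""
    return OUTCOME_TO_NUMBER[toss_coin()]

# Each call is a fresh experiment — the slot may hold a different value every time.
print(coin_random_variable())
print(coin_random_variable())
```

Notice that `coin_random_variable()` is not a fixed number: calling it twice can give different values, which is exactly the "variable" part.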


3. Discrete vs Continuous

Not all random variables behave the same way. There are two types.

Discrete Random Variable

The slot can only be filled with countable, specific values. No values exist in between.

Examples:

  • Dice roll → only 1, 2, 3, 4, 5, 6. There is no 2.5 on a die.
  • Number of goals in a football match → 0, 1, 2, 3... you cannot score 1.7 goals.
  • Number of students in a classroom → always a whole number.

Continuous Random Variable

The slot can be filled with any value in a range. There are always more precise values possible.

Examples:

  • Weight of a mango → 182g, 182.1g, 182.13g, 182.137g... it never stops.
  • Height of a person → 170.0cm, 170.01cm, 170.001cm...
  • Time taken to run 100 meters → infinite precision possible.

Quick test

| Question | Answer |
|----------|--------|
| Number of eggs in a basket | Discrete |
| Exact temperature of water | Continuous |
| Number of SMS messages sent today | Discrete |
| Height of a building | Continuous |
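The difference also shows up in how you would simulate each type (a sketch with Python's `random` module; the mango weight range is made up for illustration):

```python
import random

# Discrete: a die roll can only land on one of six specific values.
die_roll = random.randint(1, 6)

# Continuous: a mango's weight can be any value in a range (here 150g to 250g),
# with ever more decimal places possible.
mango_weight = random.uniform(150.0, 250.0)

print("die:", die_roll)
print("mango:", mango_weight)
```

`randint` can only ever return one of six integers, while `uniform` can return any of the infinitely many values in between.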

4. Real World Examples — Coin, Dice, LLM

Now let us walk through three experiments using the same structure:

  • Start point
  • What was done as the experiment
  • What filled the slot at the end

Experiment 1 — Coin Toss

Start point: Slot is empty. Two possible values — 1 or 0.

Experiment: Flip the coin.

Result: Landed heads → Slot filled with 1

Type: Discrete — only two possible values.


Experiment 2 — Dice Roll

Start point: Slot is empty. Six possible values — 1, 2, 3, 4, 5, 6.

Experiment: Roll the dice.

Result: Landed on 4 → Slot filled with 4

Type: Discrete — six countable values.


Experiment 3 — LLM picks the next word

You type: "The sky is"

Start point: Slot is empty. The LLM has a fixed list of words called a vocabulary — roughly 50,000 words. Always the same list.

Experiment: LLM runs an internal calculation. It assigns a probability to every single word in the vocabulary.

Example result of that calculation:

| Possible next word | Probability |
|--------------------|-------------|
| blue | 0.60 |
| clear | 0.25 |
| dark | 0.10 |
| falling | 0.05 |
| ... 49,996 more words | very small values |

All probabilities add up to 1 — same rule as always.

Result: One word is picked based on these probabilities → Slot filled with "blue"

Type: Discrete — vocabulary is a fixed, countable list.
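The picking step can be sketched with a weighted random choice (using the tiny four-word vocabulary from the table above rather than a real 50,000-word one):

```python
import random

# Hypothetical probabilities from the example table above.
vocabulary = ["blue", "clear", "dark", "falling"]
probabilities = [0.60, 0.25, 0.10, 0.05]

# random.choices picks one item, weighted by the given probabilities:
# "blue" wins most of the time, but "falling" is still possible.
next_word = random.choices(vocabulary, weights=probabilities, k=1)[0]
print("The sky is", next_word)
```

Run this repeatedly and you will see "blue" roughly 60% of the time — the slot is filled by a weighted random draw, not by always taking the top word.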


The pattern across all three

| | Coin | Dice | LLM |
|---|------|------|-----|
| Slot before experiment | Empty | Empty | Empty |
| Possible outcomes | 2 | 6 | 50,000 words |
| Experiment | Flip | Roll | Internal calculation |
| Slot after experiment | 0 or 1 | 1 to 6 | One word |
| Type | Discrete | Discrete | Discrete |

5. How LLM Uses Random Variables

Why does ChatGPT give different answers every time?

Even the word "falling" has probability 0.05 — not zero. So occasionally it gets picked.

The slot can be filled by any outcome. Just some are far more likely than others.

Same prompt → same probabilities → but picking is random → different word each time.

The temperature setting

There is a setting in LLMs called temperature that controls how random the picking is.

| Temperature | Effect |
|-------------|--------|
| Low (near 0) | Almost always picks the highest-probability word — predictable, repetitive |
| High (1.0+) | Picks lower-probability words more often — creative, unpredictable |

This is the same as controlling how random your experiment is.
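Here is a minimal sketch of how temperature reshapes the probabilities before the pick — a simplified version of what LLMs do internally, applied to the four-word example from earlier:

```python
import math

def apply_temperature(probs, temperature):
    """Re-weight probabilities: low temperature sharpens them, high flattens them."""
    # Move to log space, scale by temperature, then re-normalize (a softmax).
    logits = [math.log(p) / temperature for p in probs]
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

probs = [0.60, 0.25, 0.10, 0.05]  # blue, clear, dark, falling
print(apply_temperature(probs, 0.5))  # sharper: "blue" dominates even more
print(apply_temperature(probs, 2.0))  # flatter: rarer words get a bigger share
```

At temperature 0.5, "blue" climbs above 0.8; at temperature 2.0, it drops below 0.5 — the same probabilities, made more or less random by one knob.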


Conclusion

You started with a simple coin toss and ended with understanding how a large language model generates text. The same idea runs through all of it.

Three things to remember:

| Concept | Simple definition |
|---------|-------------------|
| Random Variable | An empty slot filled with a number after an experiment |
| Discrete | Slot can only take countable, specific values |
| Continuous | Slot can take any value in a range — infinite precision |

Every time an LLM generates a word, it is filling a random variable slot — from a vocabulary of 50,000 words, each with a probability, picked by a calculation.

That is the entire connection — from coin toss to LLM.
