<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: siva1b3</title>
    <description>The latest articles on DEV Community by siva1b3 (@siva1b3).</description>
    <link>https://dev.to/siva1b3</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1104302%2Fddb1311f-5d96-4b7b-a69a-6987a4b8a375.png</url>
      <title>DEV Community: siva1b3</title>
      <link>https://dev.to/siva1b3</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/siva1b3"/>
    <language>en</language>
    <item>
      <title>From Coin Toss to LLM — Understanding Random Variables</title>
      <dc:creator>siva1b3</dc:creator>
      <pubDate>Wed, 01 Apr 2026 13:48:28 +0000</pubDate>
      <link>https://dev.to/siva1b3/from-coin-toss-to-llm-understanding-random-variables-k91</link>
      <guid>https://dev.to/siva1b3/from-coin-toss-to-llm-understanding-random-variables-k91</guid>
      <description>&lt;h1&gt;
  
  
  From Coin Toss to LLM — Understanding Random Variables
&lt;/h1&gt;

&lt;p&gt;A beginner-friendly guide to probability and random variables — no math background needed.&lt;/p&gt;




&lt;h2&gt;
  
  
  1. What is Probability?
&lt;/h2&gt;

&lt;p&gt;Probability is a number that measures how likely something is to happen.&lt;/p&gt;

&lt;p&gt;This number is always between 0 and 1:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Value&lt;/th&gt;
&lt;th&gt;Meaning&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;0&lt;/td&gt;
&lt;td&gt;Impossible — will never happen&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;td&gt;Certain — will always happen&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;0.5&lt;/td&gt;
&lt;td&gt;Equal chance — may or may not happen&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;h3&gt;
  
  
  Example
&lt;/h3&gt;

&lt;p&gt;Flip a fair coin. Two outcomes are possible — heads or tails. Neither is more likely than the other.&lt;/p&gt;

&lt;p&gt;So the probability of heads = &lt;strong&gt;1/2 = 0.5&lt;/strong&gt;&lt;br&gt;
And the probability of tails = &lt;strong&gt;1/2 = 0.5&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;One important rule — the probabilities of all possible outcomes must always &lt;strong&gt;add up to 1&lt;/strong&gt;.&lt;/p&gt;

&lt;p&gt;0.5 + 0.5 = 1 ✓&lt;/p&gt;

&lt;h3&gt;
  
  
  What happens in real life?
&lt;/h3&gt;

&lt;p&gt;If you flip a coin 10 times, you might get 6 heads and 4 tails. That is normal.&lt;/p&gt;

&lt;p&gt;But if you flip 10,000 times, the result will get very close to 50% heads and 50% tails.&lt;/p&gt;

&lt;p&gt;The more experiments you run, the closer you get to the true probability.&lt;/p&gt;
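&lt;p&gt;A quick way to see this convergence is to simulate it. The sketch below (plain Python, standard library only) flips a simulated fair coin a small number of times and then a large number of times, and compares the fraction of heads:&lt;/p&gt;

```python
import random

def fraction_of_heads(flips):
    """Flip a fair simulated coin `flips` times and return the fraction of heads."""
    heads = sum(random.random() < 0.5 for _ in range(flips))
    return heads / flips

random.seed(0)  # fixed seed so the run is repeatable
print(fraction_of_heads(10))       # a small sample can be far from 0.5
print(fraction_of_heads(10_000))   # a large sample lands very close to 0.5
```

&lt;p&gt;Run it a few times without the seed: the 10-flip result jumps around, while the 10,000-flip result barely moves away from 0.5.&lt;/p&gt;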




&lt;h2&gt;
  
  
  2. What is a Random Variable?
&lt;/h2&gt;

&lt;p&gt;Think of a random variable as an &lt;strong&gt;empty slot&lt;/strong&gt;.&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Before the experiment — the slot is empty&lt;/li&gt;
&lt;li&gt;You run the experiment&lt;/li&gt;
&lt;li&gt;After the experiment — the slot is filled with a number&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Why do we need it?
&lt;/h3&gt;

&lt;p&gt;Outcomes are often text — "heads", "tails", "win", "lose". Mathematics works with numbers, not words.&lt;/p&gt;

&lt;p&gt;So we assign a number to each possible outcome. This is what a random variable does.&lt;/p&gt;

&lt;h3&gt;
  
  
  Example — Coin Toss
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Before the experiment:&lt;/strong&gt; Slot is empty. Two possible outcomes exist.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Possible outcome&lt;/th&gt;
&lt;th&gt;Assigned number&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Heads&lt;/td&gt;
&lt;td&gt;1&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Tails&lt;/td&gt;
&lt;td&gt;0&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;&lt;strong&gt;Experiment:&lt;/strong&gt; Flip the coin.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;After the experiment:&lt;/strong&gt; Landed heads → Slot is filled with &lt;strong&gt;1&lt;/strong&gt;&lt;/p&gt;




&lt;p&gt;&lt;strong&gt;New experiment, new slot.&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Before:&lt;/strong&gt; Fresh empty slot.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Experiment:&lt;/strong&gt; Flip again.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;After:&lt;/strong&gt; Landed tails → Slot is filled with &lt;strong&gt;0&lt;/strong&gt;&lt;/p&gt;




&lt;h3&gt;
  
  
  Key point
&lt;/h3&gt;

&lt;p&gt;A random variable is not a fixed number. It can be different every time you run the experiment. That is why it is called a &lt;em&gt;variable&lt;/em&gt;.&lt;/p&gt;
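&lt;p&gt;The slot idea fits in a few lines of Python. Here the function &lt;code&gt;coin_toss&lt;/code&gt; is the experiment, and each call fills a fresh slot with 1 (heads) or 0 (tails):&lt;/p&gt;

```python
import random

# The random variable's mapping: each text outcome gets a number.
OUTCOME_TO_NUMBER = {"heads": 1, "tails": 0}

def coin_toss():
    """Run the experiment once and return the number that fills the slot."""
    outcome = random.choice(["heads", "tails"])
    return OUTCOME_TO_NUMBER[outcome]

slot = coin_toss()  # a fresh slot, filled with 0 or 1
```

&lt;p&gt;Each call is a new experiment, so &lt;code&gt;slot&lt;/code&gt; can hold a different value every run — exactly why it is called a variable.&lt;/p&gt;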




&lt;h2&gt;
  
  
  3. Discrete vs Continuous
&lt;/h2&gt;

&lt;p&gt;Not all random variables behave the same way. There are two types.&lt;/p&gt;

&lt;h3&gt;
  
  
  Discrete Random Variable
&lt;/h3&gt;

&lt;p&gt;The slot can only be filled with &lt;strong&gt;countable, specific values&lt;/strong&gt;. No values exist in between.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Examples:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Dice roll → only 1, 2, 3, 4, 5, 6. There is no 2.5 on a die.&lt;/li&gt;
&lt;li&gt;Number of goals in a football match → 0, 1, 2, 3... you cannot score 1.7 goals.&lt;/li&gt;
&lt;li&gt;Number of students in a classroom → always a whole number.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Continuous Random Variable
&lt;/h3&gt;

&lt;p&gt;The slot can be filled with &lt;strong&gt;any value in a range&lt;/strong&gt;. There are always more precise values possible.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Examples:&lt;/strong&gt;&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Weight of a mango → 182g, 182.1g, 182.13g, 182.137g... it never stops.&lt;/li&gt;
&lt;li&gt;Height of a person → 170.0cm, 170.01cm, 170.001cm...&lt;/li&gt;
&lt;li&gt;Time taken to run 100 meters → infinite precision possible.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Quick test
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Question&lt;/th&gt;
&lt;th&gt;Answer&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Number of eggs in a basket&lt;/td&gt;
&lt;td&gt;Discrete&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Exact temperature of water&lt;/td&gt;
&lt;td&gt;Continuous&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Number of SMS messages sent today&lt;/td&gt;
&lt;td&gt;Discrete&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Height of a building&lt;/td&gt;
&lt;td&gt;Continuous&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;
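&lt;p&gt;Python's standard &lt;code&gt;random&lt;/code&gt; module mirrors this split: &lt;code&gt;randint&lt;/code&gt; fills a slot with countable whole numbers, while &lt;code&gt;uniform&lt;/code&gt; can return any value in a range. A small sketch (the mango weight range is made up for illustration):&lt;/p&gt;

```python
import random

# Discrete: a die roll can only be one of six countable values.
dice = random.randint(1, 6)

# Continuous: a mango's weight in grams can be any value in a range.
weight = random.uniform(150.0, 250.0)

print(dice)    # always a whole number from 1 to 6
print(weight)  # a float with many decimal places, anywhere in the range
```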




&lt;h2&gt;
  
  
  4. Real World Examples — Coin, Dice, LLM
&lt;/h2&gt;

&lt;p&gt;Now let us walk through three experiments using the same structure:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Start point&lt;/li&gt;
&lt;li&gt;What was done as the experiment&lt;/li&gt;
&lt;li&gt;What filled the slot at the end&lt;/li&gt;
&lt;/ul&gt;




&lt;h3&gt;
  
  
  Experiment 1 — Coin Toss
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Start point:&lt;/strong&gt; Slot is empty. Two possible values — 1 or 0.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Experiment:&lt;/strong&gt; Flip the coin.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Result:&lt;/strong&gt; Landed heads → Slot filled with &lt;strong&gt;1&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Type: &lt;strong&gt;Discrete&lt;/strong&gt; — only two possible values.&lt;/p&gt;




&lt;h3&gt;
  
  
  Experiment 2 — Dice Roll
&lt;/h3&gt;

&lt;p&gt;&lt;strong&gt;Start point:&lt;/strong&gt; Slot is empty. Six possible values — 1, 2, 3, 4, 5, 6.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Experiment:&lt;/strong&gt; Roll the dice.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Result:&lt;/strong&gt; Landed on 4 → Slot filled with &lt;strong&gt;4&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Type: &lt;strong&gt;Discrete&lt;/strong&gt; — six countable values.&lt;/p&gt;




&lt;h3&gt;
  
  
  Experiment 3 — LLM picks the next word
&lt;/h3&gt;

&lt;p&gt;You type: &lt;em&gt;"The sky is"&lt;/em&gt;&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Start point:&lt;/strong&gt; Slot is empty. The LLM has a fixed list of entries called a &lt;strong&gt;vocabulary&lt;/strong&gt; — roughly 50,000 of them. (Real models actually use word pieces called tokens, but whole words keep the idea simple.) Always the same list.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Experiment:&lt;/strong&gt; LLM runs an internal calculation. It assigns a probability to every single word in the vocabulary.&lt;/p&gt;

&lt;p&gt;Example result of that calculation:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Possible next word&lt;/th&gt;
&lt;th&gt;Probability&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;blue&lt;/td&gt;
&lt;td&gt;0.60&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;clear&lt;/td&gt;
&lt;td&gt;0.25&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;dark&lt;/td&gt;
&lt;td&gt;0.10&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;falling&lt;/td&gt;
&lt;td&gt;0.05&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;... 49,996 more words&lt;/td&gt;
&lt;td&gt;very small values&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;All probabilities add up to &lt;strong&gt;1&lt;/strong&gt; — same rule as always.&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Result:&lt;/strong&gt; One word is picked based on these probabilities → Slot filled with &lt;strong&gt;"blue"&lt;/strong&gt;&lt;/p&gt;

&lt;p&gt;Type: &lt;strong&gt;Discrete&lt;/strong&gt; — vocabulary is a fixed, countable list.&lt;/p&gt;
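&lt;p&gt;The picking step can be sketched with &lt;code&gt;random.choices&lt;/code&gt;, which samples one item in proportion to a list of weights. The four words and probabilities below are the illustrative numbers from the table above, not real model output:&lt;/p&gt;

```python
import random

# Illustrative distribution for the prompt "The sky is" (not real model output).
words = ["blue", "clear", "dark", "falling"]
probs = [0.60, 0.25, 0.10, 0.05]

# Sample one word according to its probability — this fills the slot.
next_word = random.choices(words, weights=probs, k=1)[0]
print(next_word)  # usually "blue", occasionally one of the others
```

&lt;p&gt;Run it repeatedly and "blue" appears about 60% of the time, "falling" about 5% — the slot can be filled by any outcome, just not equally often.&lt;/p&gt;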




&lt;h3&gt;
  
  
  The pattern across all three
&lt;/h3&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;&lt;/th&gt;
&lt;th&gt;Coin&lt;/th&gt;
&lt;th&gt;Dice&lt;/th&gt;
&lt;th&gt;LLM&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Slot before experiment&lt;/td&gt;
&lt;td&gt;Empty&lt;/td&gt;
&lt;td&gt;Empty&lt;/td&gt;
&lt;td&gt;Empty&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Possible outcomes&lt;/td&gt;
&lt;td&gt;2&lt;/td&gt;
&lt;td&gt;6&lt;/td&gt;
&lt;td&gt;50,000 words&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Experiment&lt;/td&gt;
&lt;td&gt;Flip&lt;/td&gt;
&lt;td&gt;Roll&lt;/td&gt;
&lt;td&gt;Internal calculation&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Slot after experiment&lt;/td&gt;
&lt;td&gt;0 or 1&lt;/td&gt;
&lt;td&gt;1 to 6&lt;/td&gt;
&lt;td&gt;One word&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Type&lt;/td&gt;
&lt;td&gt;Discrete&lt;/td&gt;
&lt;td&gt;Discrete&lt;/td&gt;
&lt;td&gt;Discrete&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;




&lt;h2&gt;
  
  
  5. How LLM Uses Random Variables
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Why does ChatGPT give different answers every time?
&lt;/h3&gt;

&lt;p&gt;Even the word &lt;em&gt;"falling"&lt;/em&gt; has probability 0.05 — not zero. So occasionally it gets picked.&lt;/p&gt;

&lt;p&gt;The slot can be filled by any outcome. Just some are far more likely than others.&lt;/p&gt;

&lt;p&gt;Same prompt → same probabilities → but picking is random → different word each time.&lt;/p&gt;

&lt;h3&gt;
  
  
  The temperature setting
&lt;/h3&gt;

&lt;p&gt;There is a setting in LLMs called &lt;strong&gt;temperature&lt;/strong&gt; that controls how random the picking is.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Temperature&lt;/th&gt;
&lt;th&gt;Effect&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Low (near 0)&lt;/td&gt;
&lt;td&gt;Almost always picks the highest probability word — predictable, repetitive&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;High (1.0+)&lt;/td&gt;
&lt;td&gt;Picks lower probability words more often — creative, unpredictable&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;This is the same as controlling how random your experiment is.&lt;/p&gt;
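&lt;p&gt;Here is a minimal sketch of how temperature reshapes the distribution before the pick. Real models apply this scaling to raw scores (logits) before converting them to probabilities; applying it directly to the example probabilities from earlier still shows the effect:&lt;/p&gt;

```python
import math

def apply_temperature(probs, temperature):
    """Rescale a probability list: low T sharpens it, high T flattens it."""
    scaled = [math.exp(math.log(p) / temperature) for p in probs]
    total = sum(scaled)
    return [s / total for s in scaled]  # renormalize so the sum is 1

probs = [0.60, 0.25, 0.10, 0.05]        # the "The sky is" example
print(apply_temperature(probs, 0.2))    # top word dominates even more
print(apply_temperature(probs, 2.0))    # probabilities move closer together
```

&lt;p&gt;At low temperature the top word's share grows toward 1, so picks become predictable; at high temperature the shares even out, so unlikely words get chosen more often.&lt;/p&gt;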




&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;You started with a simple coin toss and ended with understanding how a large language model generates text. The same idea runs through all of it.&lt;/p&gt;

&lt;p&gt;Three things to remember:&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;Concept&lt;/th&gt;
&lt;th&gt;Simple definition&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;Random Variable&lt;/td&gt;
&lt;td&gt;An empty slot filled with a number after an experiment&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Discrete&lt;/td&gt;
&lt;td&gt;Slot can only take countable, specific values&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;Continuous&lt;/td&gt;
&lt;td&gt;Slot can take any value in a range — infinite precision&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;Every time an LLM generates a word, it is filling a random variable slot — from a vocabulary of 50,000 words, each with a probability, picked by a calculation.&lt;/p&gt;

&lt;p&gt;That is the entire connection — from coin toss to LLM.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>beginners</category>
      <category>machinelearning</category>
      <category>tutorial</category>
    </item>
  </channel>
</rss>
