
Sreekar Reddy

Posted on • Originally published at sreekarreddy.com

👻 Hallucinations Explained Like You're 5

When AI confidently makes things up

Day 57 of 149

👉 Full deep-dive with code examples


The Confident Liar

You ask your friend about a book.

Your friend hasn't read it, but says confidently:
"Oh yeah! The main character is named David, and he lives in Paris!"

...The book has no David. It's set in Tokyo. 🤷

Your friend made it up but sounded certain!

AI does this too. It's called hallucination.


Why It Happens

AI predicts the next word based on patterns in its training data. It has no built-in fact-checker, so sometimes those patterns produce information that looks plausible but is false:

  • Fake citations that don't exist
  • Made-up statistics
  • Wrong facts stated confidently
  • Names of people who aren't real
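
To see why pattern-matching alone can produce confident nonsense, here's a toy sketch. It is not a real language model; the tiny corpus, bigram table, and sampling loop are all illustrative stand-ins. The model only learns which words follow which, so it can stitch true sentences into a fluent false one:

```python
import random

# Toy next-word predictor: a bigram table built from a tiny "training" corpus.
# The model knows word-pair statistics, not facts, so it can recombine
# pieces of true sentences into a fluent but false one (a "hallucination").
corpus = (
    "Einstein developed the theory of relativity . "
    "Bell invented the telephone . "
    "Einstein won the Nobel Prize ."
).split()

bigrams = {}
for prev, nxt in zip(corpus, corpus[1:]):
    bigrams.setdefault(prev, []).append(nxt)

def generate(start, max_words=8, seed=0):
    random.seed(seed)
    words = [start]
    while len(words) < max_words and words[-1] in bigrams:
        words.append(random.choice(bigrams[words[-1]]))
        if words[-1] == ".":
            break
    return " ".join(words)

# Depending on the seed, this may print something like
# "Einstein developed the telephone ." -- fluent, confident, and wrong.
print(generate("Einstein"))
```

Every word pair in the output appeared in the training text, yet the whole sentence can still be false. Real LLMs work at vastly larger scale, but the failure mode is the same.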

Examples

โŒ "The paper by Smith et al. (YEAR) shows..."
(Paper doesn't exist)

โŒ "The population of Sydney is way higher than it really is."
(The number is made up)

โŒ "Einstein invented the telephone."
(Nope, that was Bell)


How to Avoid

  1. Verify important facts
  2. Use RAG (give AI real documents)
  3. Ask AI to cite sources (and check them!)
  4. Be skeptical of specific numbers/names
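
Step 2 above (RAG, retrieval-augmented generation) can be sketched in a few lines. This is a toy illustration, not a real system: the `documents` list and the word-overlap retriever are stand-ins for a real document store, vector search, and an LLM. The key idea is that the answer is grounded in a retrieved source instead of the model's memory:

```python
import string

# Illustrative "knowledge base" -- a real RAG system would index
# many documents in a vector database.
documents = [
    "Alexander Graham Bell patented the telephone in 1876.",
    "Albert Einstein published the theory of relativity.",
]

def tokens(text):
    # Lowercase and strip punctuation so "telephone?" matches "telephone".
    return {w.strip(string.punctuation) for w in text.lower().split()}

def retrieve(question, docs):
    # Toy retriever: pick the document with the most words in common
    # with the question.
    q = tokens(question)
    return max(docs, key=lambda d: len(q & tokens(d)))

def answer(question, docs):
    # Ground the reply in the retrieved text and cite it, so the reader
    # can verify the claim instead of trusting a confident guess.
    source = retrieve(question, docs)
    return f"According to the source: {source}"

print(answer("Who invented the telephone?", documents))
```

Because the reply quotes a real document, a made-up fact would have to survive a check against the source, which is exactly what steps 1 and 3 ask you to do by hand.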

In One Sentence

Hallucinations are when AI generates false information that looks and sounds completely believable.


🔗 Enjoying these? Follow for daily ELI5 explanations!

Making complex tech concepts simple, one day at a time.
