When AI confidently makes things up
Day 57 of 149
Full deep-dive with code examples
The Confident Liar
You ask your friend about a book.
Friend hasn't read it, but says confidently:
"Oh yeah! The main character is named David, and he lives in Paris!"
...The book has no David. It's set in Tokyo. 🤷
Friend made it up but sounded certain!
AI does this too. It's called hallucination.
Why It Happens
AI predicts the next word based on patterns.
Sometimes those patterns create plausible-looking but false information:
- Fake citations that don't exist
- Made-up statistics
- Wrong facts stated confidently
- Names of people who aren't real
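This "plausible but unchecked" behavior can be sketched with a toy next-word model. Everything here is made up for illustration: the transition table is a stand-in for the statistical patterns a real model learns, and no real language model works from a lookup table this small. The point is that the generator only asks "what word usually comes next?", never "is this true?".

```python
import random

# Toy next-word model: it only knows which words tend to follow which words.
# The transition table below is invented for this sketch.
transitions = {
    "The": ["paper", "study"],
    "paper": ["by"],
    "study": ["by"],
    "by": ["Smith"],
    "Smith": ["et"],
    "et": ["al."],
    "al.": ["shows"],
    "shows": ["that"],
}

def generate(start: str, max_words: int = 8) -> str:
    words = [start]
    while words[-1] in transitions and len(words) < max_words:
        # Pick a statistically plausible next word.
        # Notice: no step here checks whether the claim is true.
        words.append(random.choice(transitions[words[-1]]))
    return " ".join(words)

print(generate("The"))  # fluent and citation-shaped, but entirely unverified
```

The output looks like the start of a real citation, which is exactly why hallucinated references are so convincing: fluency and truth are produced by completely different processes, and the model only optimizes for the first.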
Examples
❌ "The paper by Smith et al. (YEAR) shows..."
(The paper doesn't exist)
❌ "Sydney's population is far larger than it actually is."
(A real hallucination would state a specific, confident, wrong number)
❌ "Einstein invented the telephone."
(Nope, that was Bell)
How to Avoid
- Verify important facts
- Use RAG (give AI real documents)
- Ask AI to cite sources (and check them!)
- Be skeptical of specific numbers/names
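The RAG idea from the list above can be sketched in a few lines: instead of letting the model free-associate, you retrieve a real document and answer only from it, refusing when no source matches. The documents and the keyword-matching "retrieval" below are toy stand-ins, not a real retrieval system.

```python
# Toy document store -- stands in for a real knowledge base.
documents = {
    "telephone": "The telephone was patented by Alexander Graham Bell in 1876.",
    "relativity": "Albert Einstein published special relativity in 1905.",
}

def grounded_answer(question: str) -> str:
    # Retrieve: find a document whose topic appears in the question.
    # (Real systems use embeddings/search, not substring matching.)
    for topic, text in documents.items():
        if topic in question.lower():
            return text  # the answer comes from a real source, not a guess
    # No matching source: refuse instead of hallucinating.
    return "I don't have a source for that."

print(grounded_answer("Who invented the telephone?"))
print(grounded_answer("What is the population of Sydney?"))
```

The key design choice is the fallback: a grounded system is allowed to say "I don't know", while a bare next-word predictor will always produce *something*, true or not.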
In One Sentence
Hallucinations are when AI generates false information that looks and sounds completely believable.
Enjoying these? Follow for daily ELI5 explanations!
Making complex tech concepts simple, one day at a time.