The Famous JavaScript Surprise ⚡
Have you ever come across a tutorial or blog post saying "JavaScript math is weird"? One where you typed 0.1 + 0.2 into your console and it didn't give you 0.3, well, not exactly?
And chances are you have never seen anything of the sort in C++, Python, Java, or whatever other language you normally use. The answer never seems to defy basic addition; it is always 0.3, just as it should be. Isn't that right?
Expectation:
> 0.1 + 0.2
0.3
Reality:
> 0.1 + 0.2
0.30000000000000004
Let's unravel this weirdness!
What JavaScript Is Really Doing With Numbers 🔢
Unlike many other languages, JavaScript has just one numeric type for everyday math: Number. Under the hood, every Number is stored as a 64-bit IEEE 754 double-precision floating-point value, whether it looks like an integer or not.
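You can see this for yourself in the console (a quick sketch, plain JavaScript, nothing exotic):
> typeof 42
"number"
> typeof 0.5
"number"
> 42 === 42.0
true
An integer literal and a decimal literal give you exactly the same kind of value: a 64-bit double.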
Understanding where things go wrong requires understanding how values are stored in bits, especially values with a fractional part.
The curse of binary 💻
Just like pi can never be written out completely in decimal, there are values that, when you try to represent them in binary, turn into infinitely repeating, non-terminating fractions.
One such value is 0.1:
0.1 (base 10) = 0.0001100110011001100110011… (base 2)
with "0011" repeating forever.
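You don't have to take that expansion on faith. Number's toString accepts a radix, so you can ask for the binary digits of whatever actually got stored for 0.1 (a quick sketch; the output below is what V8-based consoles such as Chrome and Node print):
> (0.1).toString(2)
"0.0001100110011001100110011001100110011001100110011001101"
The 0011 pattern repeats until the available bits run out, and the tail gets rounded, which is exactly where the trouble starts.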
Since JavaScript stores numbers in the IEEE 754 floating-point format, which only has a fixed number of bits for the significand (53 bits of precision in a 64-bit double), the non-terminating sequence has to be cut off and rounded.
Thus, the value 0.1 itself does not get stored as 0.1. The stored value is a tiny bit above the real 0.1.
The same thing happens with 0.2.
So when you add the two stored values, you never get exactly 0.3. You get something like 0.30000000000000004 instead.
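You can peek at the stored values by asking for more digits than the default output shows. A quick sketch using toFixed (the digits below are what an IEEE 754 double actually holds, so any modern engine should agree):
> (0.1).toFixed(20)
"0.10000000000000000555"
> (0.2).toFixed(20)
"0.20000000000000001110"
> (0.1 + 0.2).toFixed(20)
"0.30000000000000004441"
> (0.3).toFixed(20)
"0.29999999999999998890"
Both operands start out a touch too big, the sum inherits both errors, and the literal 0.3 is itself stored a touch too small, so the two sides can never match.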
Want to dive deeper into how the representation works?
Read more about floating-point representation
And guess what? JavaScript isn't the only language that behaves this way.
It Happens in Other Languages Too 🌐
This behavior is not unique to JavaScript.
You’ll see the same thing in:
- Python
- Java
- C++
- C#, etc.
So why does this look fine?
cout << (0.1 + 0.2); // prints 0.3, because cout rounds to 6 significant digits by default
Because printing and comparing are different things.
- Floating-point math produces tiny inaccuracies
- Printing rounds values for humans
- Equality checks don’t round
Most languages round floating-point numbers when formatting them for display. JavaScript's console instead prints the shortest decimal string that still identifies the stored value exactly, so the raw result stays visible.
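You can reproduce the printing-versus-comparing split without leaving JavaScript. A quick sketch using toPrecision, which rounds to a given number of significant digits, much like C++'s default output does:
> const sum = 0.1 + 0.2;
> sum
0.30000000000000004
> sum.toPrecision(6)
"0.300000"
> sum === 0.3
false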
Still don't believe me? Try compiling and running this:
#include <iostream>
using namespace std;

int main() {
    if ((0.1 + 0.2) == 0.3) {
        cout << "Pranav was wrong! Haha 😅";
    } else {
        cout << "Damn! How come I never noticed it? 🤯"; // this is the branch that runs
    }
}
The issue exists everywhere—JavaScript just doesn’t hide it as much.
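If you ever do need a comparison like this to pass, the usual approach is to check that two numbers are close enough rather than exactly equal. A minimal sketch in JavaScript; the helper name nearlyEqual and the choice of Number.EPSILON as the default tolerance are just illustrative, not a universal rule:
function nearlyEqual(a, b, tolerance = Number.EPSILON) {
  // Number.EPSILON is the gap between 1 and the next representable double
  return Math.abs(a - b) < tolerance;
}

nearlyEqual(0.1 + 0.2, 0.3); // true
0.1 + 0.2 === 0.3;           // false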
Conclusion: JavaScript isn't weird 😎 (well, not for this)
If 0.1 + 0.2 !== 0.3 still feels wrong, that’s okay—it should. It’s a clash between human intuition and machine reality. JavaScript just happens to be honest about it. Learn the rule, respect the edge cases, and your code will be better for it.