Doan Trong Nam

Do you know that 0.1 + 0.2 is not equal to 0.3?

Yes, this is a well-known issue in JavaScript (and many other programming languages that use binary floating-point numbers). The problem is that some decimal numbers cannot be represented exactly in binary floating point, which leads to small rounding errors.
Here's an example:

console.log(0.1 + 0.2);  // Outputs: 0.30000000000000004

Floating-point numbers are a way of representing real numbers in computer systems. They approximate real numbers, but they can't always do so exactly because of the way they are stored in memory.

In many programming languages, including JavaScript, floating-point numbers are represented according to the IEEE 754 standard. This standard represents numbers in binary, which means they are expressed as a sum of powers of 2.

While some numbers can be represented exactly in this system (like 0.5, which is 2^-1), others cannot. For example, the decimal number 0.1 cannot be represented exactly as a finite binary fraction. In binary, 1/10 is an infinitely repeating fraction.
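You can see the stored approximation directly in plain JavaScript by printing 0.1 with more precision than its default string representation:

// 0.1 is stored as the nearest representable double, which is slightly larger than 0.1
console.log((0.1).toPrecision(25)); // 0.1000000000000000055511151

// 0.5 is a power of 2, so it is stored exactly
console.log((0.5).toPrecision(25)); // 0.5000000000000000000000000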

When you perform arithmetic operations with these approximated numbers, the small errors can accumulate, leading to unexpected results. For example, in JavaScript, the expression 0.1 + 0.2 doesn't exactly equal 0.3.
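Here is a minimal sketch of how the error accumulates when the same rounded value is added repeatedly:

// Adding 0.1 ten times does not give exactly 1
let sum = 0;
for (let i = 0; i < 10; i++) {
  sum += 0.1;
}
console.log(sum);        // 0.9999999999999999
console.log(sum === 1);  // false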

This is not a flaw in the programming languages themselves, but a consequence of using binary floating-point numbers. To deal with this, programmers often use special libraries for high-precision arithmetic, or use techniques to compare floating-point numbers with a certain tolerance, rather than expecting them to be exactly equal.
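For example, a tolerance-based comparison can be written with Number.EPSILON. The nearlyEqual helper below is just an illustrative sketch, not part of any library:

// Compare two floats within a small tolerance instead of using ===
function nearlyEqual(a, b, epsilon = Number.EPSILON) {
  return Math.abs(a - b) < epsilon;
}

console.log(0.1 + 0.2 === 0.3);           // false
console.log(nearlyEqual(0.1 + 0.2, 0.3)); // true

Note that Number.EPSILON works here because the values are close to 1; for numbers of larger magnitude, a relative tolerance is usually the better choice.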

There are libraries in JavaScript that can help with precision issues in floating-point arithmetic. One of them is decimal.js.

Here's an example of how you can use it:

First, install the library:

npm install decimal.js

Then, use it in your code:

const Decimal = require('decimal.js');

// Passing strings (e.g. new Decimal('0.1')) avoids any chance of binary rounding in the input
let result = new Decimal(0.1).plus(new Decimal(0.2));

console.log(result.toString());  // Outputs: 0.3

This library provides arbitrary-precision decimal arithmetic, which avoids the rounding errors of JavaScript's native Number type.
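decimal.js also provides comparison methods, so you don't need a tolerance check when values are built from exact decimal strings. A small sketch, assuming the same decimal.js setup as above:

const Decimal = require('decimal.js');

// Build the sum from decimal strings and compare it exactly
const sum = new Decimal('0.1').plus('0.2');
console.log(sum.equals('0.3'));  // true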
