That's what I said: it depends on the use case. And I wholeheartedly agree that we should be conscious of the data formats we use every day and the implications of their use - especially in the context of coercion in weakly typed languages like JavaScript.
As you point out, the implicitness of type coercion (and of falsy/truthy values) can lead to surprising behaviour for people coming from other languages. However, the notion of strongly vs. weakly typed isn't all that useful.
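As a quick illustration of the kind of surprises loose equality and truthiness can produce:

```javascript
// Loose equality (==) coerces its operands before comparing:
console.log(0 == "");   // true  - "" coerces to the number 0
console.log(0 == "0");  // true  - "0" coerces to the number 0
console.log("" == "0"); // false - string-to-string comparison, no coercion
// So == isn't even transitive across these three values.

// Truthiness has its own traps: a non-empty string is always truthy,
// even when it "looks" falsy:
console.log(Boolean("0")); // true
console.log(Boolean(0));   // false
```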
JavaScript is dynamically typed (as opposed to statically typed) and it's loosely typed, as non-`const` bindings can change the type they refer to at run-time. But everything has a type in the sense that it is either a primitive value, a structural type, or `null` - a structural root primitive.

The issue is that before `BigInt` (caniuse: BigInt - note that iOS prior to 14 doesn't support it) there wasn't a dedicated integer type, and even now there are performance implications to using `BigInt`. So for the general `Number` use case anything outside the [-(2^53 - 1), 2^53 - 1] range is effectively a floating point value due to the lack of precision, and constant care has to be taken that an "integer" value isn't inadvertently turned into a "floating point" value (e.g. `dividend / divisor` instead of `Math.trunc(dividend / divisor)`).

And then bitwise operators only operate on 32-bit values (`Int32`; `Uint32` for unsigned right shift, `>>>`; see "Integers and shift operators in JavaScript").
Douglas Crockford went as far as proposing an alternate number format some time around or before 2014 - DEC64.
Before `BigInt`, there was asm.js (which automatically handled numbers that were floored with `|0` in any operation as integers) and typed arrays. But other than that, you're right.

i.e. truncated, not floored - and that tactic is limited to signed 32-bit values.
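To illustrate that distinction: `|0` truncates toward zero rather than flooring, and wraps around outside the signed 32-bit range:

```javascript
// `x | 0` applies ToInt32, which truncates toward zero - it does not floor:
console.log(-7.5 | 0);         // -7
console.log(Math.floor(-7.5)); // -8

// And the trick only holds within signed 32-bit range; beyond it, it wraps:
console.log((2 ** 31) | 0);    // -2147483648
```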
Interestingly enough, ReScript adopted 32-bit signed integers, while TypeScript never bothered with them.
`TypedArray`/`ArrayBuffer`/`DataView` are less about integers and more about a performant means of sharing/moving binary data between execution contexts.