JavaScript has been my main work language for years now (and of course these days it’s mostly TypeScript 😉). I started working with it before ES6, ...
Yes.
People making fun of JS often reveal their ignorance and stupidity.
My favorite is everything NaN related, like `NaN !== NaN` and `typeof NaN === "number"`. If you think about it, it totally makes sense. You do some mathematical operations and something goes wrong (e.g. `0 / 0`). The result is NaN. You continue to calculate and all following results are also NaN. The final result is NaN. And if you compare this result to anything, the result is false. This makes perfect sense.
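Roughly what that silent propagation looks like in practice (just a quick sketch; the values are made up):

```js
// One failed conversion quietly poisons everything downstream:
const price = Number("oops");      // NaN (the conversion failed)
const withTax = price * 1.23;      // NaN
const total = Math.round(withTax); // NaN

// ...and comparing the final result to anything is simply false:
console.log(total === 100); // false
console.log(total === NaN); // false (even this!)
```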
I disagree that "it totally makes sense". If you divide by zero, you should receive an explicit error, not a special value that silently propagates through subsequent computations as if nothing went wrong.
The core issue with `NaN` is not that it exists, but that it represents a failure state without forcing the developer to acknowledge it. Silent failure is precisely what leads to subtle, hard-to-debug bugs... I think this behavior only "made sense" to the language designers, who opted to propagate `NaN` rather than surface a hard error or exception, accepting silent failure as a trade-off. For everyone else, it is simply a poor decision that we now have to live with...

Of course, throwing an error would be the "correct" way.
But: JS was not designed to be "correct" - it was designed to "work". In a language where you can add strings to numbers (!), you shouldn't be bothered with exceptions just because you divided by 0 or tried to `parseInt("g01")`.

And this coercion stuff made sense: you get some user `input.value` and you just want to work with it. (There was no `input.valueAsNumber` back then.) It is still possible to use this behavior to write very brief code that does what it's supposed to do.
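Something like this, for example (a sketch, assuming a page with a single `<input>`):

```js
// input.value is always a string, e.g. "42"
const input = document.querySelector("input");

// Coercion lets you use it as a number right away; brief, and it usually works:
const doubled = input.value * 2;    // "42" * 2 → 84

// The flip side: bad input quietly becomes NaN instead of throwing
const parsed = parseInt("g01", 10); // NaN, no exception
console.log(doubled, parsed);
```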
As said earlier: JS follows a philosophy that aligns well with the philosophies of HTML and CSS: Make the best of what the developer provides.
I started to write some sentences on how JS was not designed for writing complex server or client applications. But I guess @sylwia-lask will elaborate on this.
Yes, exactly - and I’ll definitely expand on this, but in a separate post 😄 I can already see that the discussion there will be just as interesting as this one!
Totally fair take 👍 And I definitely get the frustration with silent failure - it can absolutely hide bugs. I’d just add one tiny bit of context: those “language designers” were basically Brendan Eich… in the mid-90s… building JavaScript in about a week 😅
Different era, different constraints, very browser-first priorities - and we’re all still living with those decisions today.
Exactly! 😄 I totally agree - NaN behavior actually makes perfect sense once you understand it, and I kind of love it too! I even debated making NaN the very first episode of this “series”… but in the end `true + true === 2` won the opening slot 😅 NaN is definitely coming soon though!

The NaN thing is mostly because of the IEEE float standard, and the weirdness comes from the fact that NaN has multiple bit representations. Moreover, according to the standard, all operations involving NaN should result in NaN. Equality is really just checked by subtracting and checking whether the result is 0, which it never is when NaN is involved.
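In practice that means ordinary equality can never catch a NaN, so you need a dedicated check (quick sketch):

```js
const result = 0 / 0; // NaN, per IEEE 754

console.log(result === NaN);         // false: NaN never equals anything, itself included
console.log(result !== result);      // true:  the classic self-inequality test
console.log(Number.isNaN(result));   // true:  the explicit, reliable check
console.log(Object.is(result, NaN)); // true:  SameValue treats NaNs as equal
console.log(typeof result);          // "number"
```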
That being said, the way JS plays fast and loose with types is generally a downside, as it makes it easy to shoot yourself in the foot and not catch it. Conversely, the entire benefit of strong static typing is the fact that it makes sure only well defined operations are applied, which eliminates an entire class of bugs that dynamically or weakly typed languages have to contend with to various extents.
Thanks a lot for the thoughtful comment! I completely agree that strong static typing has huge benefits - that’s exactly why I mostly work in TypeScript these days. It helps catch mistakes early and saves a lot of pain later.
At the same time, it’s also worth remembering the historical context: JavaScript was created in the ’90s, in about a week, for a very different web than the one we have today. Back then, its loose typing was arguably even a feature, because it made the language more approachable and quick to use. We’re just still living with some of those legacy design choices 🙂
I agree that historical context is important here. JS's philosophy is very much in line with that of HTML and CSS, where the browser tries its best to still interpret "incorrect" code, for example by inserting missing closing tags. JS tries its best not to throw an error, which is perfectly fine for what it was originally designed for (the early web), but in a lot of the contexts where it's used today it's more of a downside, especially backend development through Node.js.
TypeScript addresses most of the correctness issues, but it's still weak, as one can just assert that an object is of a given type by using the `as` keyword, which makes it a static but weak type system, akin to the one used by C, where casts also pretty much never fail. A decent example of a stronger cast is C#'s `is` operator, which returns a boolean indicating whether the cast succeeded and lets you bind a reference of the target type via pattern matching, e.g. `if (obj is string s) { /* use s */ }`.

C# also has an `as` operator; it too can fail, in this case by returning `null`, but since C# has nullability checks, it again forces you to handle the failing case.

I agree with the article that JS is not as confusing as it is often made out to be, but most of the problems people point to are real problems if you're not careful, the same way that data races are a problem if you're not careful. They both can lead to unexpected or even non-deterministic behaviour, which is plain bad in almost every case.
Yesss - you nailed it with TypeScript as well! It definitely helps a lot, but it’s still not bulletproof. In my experience, the `as` keyword is actually the least of the problems - plenty of juniors simply go for `any`, and that’s where the real slippery slope begins 😅

Really love how thoughtful and insightful your comments are here - honestly, you should totally consider writing a post about this on dev.to, I’d read it in a heartbeat 😊
I'm also tired of the ones who do `[] + []`, wonder why it shows `''` - like, first off, why? Why would you do that? - and then go on calling JS broken.
stupid twats.
Haha, I get the frustration 😄
But yeah - most of these cases are really about understanding coercion rules, not JS being broken.
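For anyone curious what's actually happening under the hood, here's roughly how the coercion plays out (a simplified sketch):

```js
// `+` on two arrays falls back to string concatenation:
// each operand is converted to a primitive via toString(), and [].toString() is "".
console.log([].toString());   // ""
console.log([] + []);         // "" + ""  →  ""

// The same rules explain a few other "wat" classics:
console.log([] + {});         // "" + "[object Object]" → "[object Object]"
console.log([1, 2] + [3, 4]); // "1,2" + "3,4"          → "1,23,4"
```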
We loved your post so we shared it on social.
Keep up the great work!
"computer science behavior"
No, this is not computer science behavior; these are examples of compromises common in many programming languages, often due to the legacy of the C language. Regardless, some languages do not allow this, such as: Java, C#, Rust, Go, Swift, TypeScript*, Kotlin, Scala, Dart, Ruby, Haskell, Elixir, Zig, Ada, and Fortran.
*TypeScript will complain if the boolean type is specified.
Thanks a lot for this thoughtful comment! 🙌
You’re absolutely right - not every language allows boolean → number conversion, and many modern languages deliberately avoid it for safety and clarity. And yes, a lot of the permissive behavior in older ecosystems definitely comes from C heritage.
What I mainly wanted to highlight in the post is that this isn’t just JavaScript randomly being weird. There are quite a few languages where booleans behave numerically in some contexts, so the idea itself isn’t completely alien to programming - it just feels surprising if you’re used to stricter type systems 🙂
But I totally agree with you: this behavior is a compromise, not “pure computer science truth”. Great addition to the discussion - thanks for bringing it up!
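A quick JS illustration of those numeric contexts, just for fun (a small sketch; the data is made up):

```js
const answers = [true, false, true, true];

// Summing booleans is a common trick to count how many are true:
const yesCount = answers.reduce((sum, a) => sum + a, 0); // 3

// Unary plus and other arithmetic operators treat booleans as 1 / 0:
console.log(+true);     // 1
console.log(true * 10); // 10
console.log(yesCount);  // 3
```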
Your explanation is valid, but misses the thing that confused me: why three equals?
One: "takes the value". A = B + C means add B to C, store in A.
Two: "value compare". IF (A==B) compares A and B, returns Boolean True or False.
Three: Huh? You lost me.
Haha, fair question! 😄
In JavaScript, `===` is simply “strict equality”: it compares both the value and the type, with no implicit conversions. `==` can quietly coerce values (like `"2" == 2 → true`), while `===` says “no tricks, no guessing - they must be exactly the same”.

So here it’s basically: `true + true` → `2`, and `2 === 2` is strictly, boringly true 😊
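If you want to poke at the difference yourself, here’s a tiny sketch in plain JS:

```js
// Loose equality coerces before comparing:
console.log("2" == 2);    // true ("2" is converted to a number first)
console.log(0 == false);  // true (false coerces to 0)
console.log("" == false); // true (both end up as 0)

// Strict equality compares value AND type, no conversions:
console.log("2" === 2);   // false
console.log(0 === false); // false

// Which is exactly why the post's example works out:
console.log(true + true);       // 2 (each true coerces to 1 under `+`)
console.log(true + true === 2); // true
```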
Coercion... :D
Since you mentioned Kyle Simpson, here is my favorite post about how this could be done better
davidwalsh.name/fixing-coercion
Ahh, sweet memories! 😄 A 2015 article teasing the upcoming Types & Grammar — love it!
Saving this to reread and refresh my brain for sure 😀
And yeah… whenever I see the “only ===!!!!!1” mantra, Kyle Simpson instantly comes to mind - but I’ll behave and stay quiet… for now xD
I generally agree that it's not magic, it's coercion. I just think that if someone finds this behaviour weird, it's not because booleans shouldn't be interpreted as numbers, but because it's purely implicit and JS will not tell you what it's going to do with your expressions (you need to know the rules for that). In C this makes some sense, because there are no booleans. But JS has a dedicated boolean type and still tries to cast it into a number. I think that implicit coercion doesn't have a real use (if you need to cast, you can do it explicitly). So yes, `true + true == 2` is logical; what is not logical is having implicit casting in the first place. (Maybe there is a reason why it exists, but I just don't know it.) But yes, it is a thing and it will probably not be removed anytime soon.

Totally fair points 👍 For many people the “weird” part isn’t that booleans can map to numbers, but that it all happens implicitly and silently - and yeah, that’s very much part of JavaScript’s early “be convenient, not strict” design heritage. These days I agree that explicit casting often feels clearer, especially in a world with TypeScript and stricter tooling.
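For what it’s worth, the explicit versions stay nice and short too (just a sketch):

```js
// Implicit: relies on coercion rules the reader has to know
const implicitSum = true + true;                 // 2

// Explicit: the conversion is spelled out
const explicitSum = Number(true) + Number(true); // 2
const asNumber = Number("42");                   // 42
const asString = String(42);                     // "42"

console.log(implicitSum === explicitSum); // true: same result, clearer intent
```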
And you’re also right: this behavior is basically here to stay. One of TC39’s core principles is “don’t break the web”, so changing something as fundamental as coercion would probably break a non-trivial chunk of the internet 😅
So it’s definitely a compromise - maybe not everyone loves it - but once you understand where it comes from, it stops feeling like randomness and starts feeling like a historical design choice. Thanks for the thoughtful discussion!
Great article, Sylwia! It makes JavaScript easier for beginners.
Thanks, Ben! That's the point, always happy to help 😊
You are welcome :)
You are one of the few devs I've found who isn't bashing JavaScript and understands its nuances! Had to create an account just to follow your posts!
Wow, thank you so much! This absolutely made my day 💚
Thanks for this post. JS does indeed have some peculiarities :) It is important to learn about them.
Thanks a lot! 😊 And yep - JS definitely has its peculiarities, but that’s what makes it fun to learn. Always happy to see you here! 💚
Yep, type coercion. If you want to circumvent this behavior, this may be useful
Yep, totally - type coercion doing its thing 😄 Thanks for sharing, always nice to have extra tools around!
This is something I saw and immediately thought of typecasting as the culprit (I do a lot of Java coding rn).
Exactly 😄 Once you realize it’s just type coercion doing its job, it suddenly feels much less mysterious. Java training definitely helps here!
JavaScript isn’t broken, it’s just type coercion at work. Once you understand it, the logic makes sense.
Exactly! 😄 It’s just coercion doing its job - once you understand it, everything suddenly feels logical.
(true + true === 2) === (1 + 1 === 2)
Boolean gang, but make it math 😎
JS: “Trust me bro, I know what I’m doing.” ✔️
Once you understand type coercion, true + true === 2 feels obvious, not weird.
JS isn’t broken — people just skip learning how it thinks. Looking forward to the series 😄
Couldn’t agree more! 😄 Once you get how JS “thinks”, so many things stop being weird and start being fun. Thanks — and I’m glad you’re looking forward to the next parts!
destroyallsoftware.com/talks/wat