Yes... because everything is a string and everything is a single type. 🙃
All this post comes down to is “I don’t like null”. You don’t seem to have an understanding of its use, so you make up reasons for it not needing to exist.
Don’t like it? Then don’t use it. But it exists for a reason, and the fact that you can’t accept that shows what kind of developer you are.
It seems somebody skipped over the disclaimer at the bottom of the post! 🤣 ... Should I assume that you can't live without two different nullish values, then? How do you think languages that only have one of them work? How do you think languages that don't have nullish at all work? If other languages can work without two different types of nullish values, why can't JavaScript?
Obviously there are different types, and different possibilities for optional values. That doesn't mean we then need to make all types `Type | null` and rely on falsy; it only means that if you have different types, you need to deal with them in different ways. Always using `null` is a really poor solution.
And if the number of downloads of the ESLint rule to avoid null tells us something, it's that it's not "only me".
If other languages can work without two different types of nullish values, why can't JavaScript?
That's an argument that doesn't hold up under further scrutiny, though. The "single nullish type" causes its own problems that JavaScript actually resolves.
Let's take Java as an example. It has only the single `null`, which is the default value for any object type. The Map interface in the Java standard library (Java's dictionary data structure) specifies that it returns `null` when client code gets a value from the map using a key that the map doesn't know about. Unfortunately, it's well known that `null` is often also an actual value associated with keys in the map. This single flaw is one of the leading reasons why NullPointerException is the number one exception observed in the logs of production Java applications.
That situation is resolved in JavaScript, because the absence of a key is represented as `undefined` rather than `null`, which disambiguates the two cases.
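In JavaScript terms, the disambiguation looks something like this (a minimal sketch; the `settings` object and its property names are made up):

```javascript
// An object property distinguishes "explicitly cleared" from "never set":
const settings = { theme: null }; // theme was deliberately emptied

console.log(settings.theme);    // null      -> key exists, value intentionally nullish
console.log(settings.fontSize); // undefined -> key was never set at all
```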
You mean to say that all the languages with only one nullish value, or no nullish value at all, didn't figure out something that JavaScript did? In other languages, if something is "optional" (meaning it can be a certain type or nullish), you just have to check it before using it. If you don't, you run into issues (one of the main reasons some languages have the concept of `Maybe` instead of nullish). In JavaScript you can still run into exceptions (TypeError) if you try to call something that's nullish or access a property on a nullish value. Both are solved using nullish coalescing and optional chaining, so you can still use either one of them. The solution to the problem isn't having two different nullish values X_X
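Both nullish values are indeed handled uniformly by those two operators, e.g. (a sketch with made-up names):

```javascript
const user = { profile: null }; // profile is explicitly nullish

// Optional chaining short-circuits on null or undefined instead of throwing:
const city = user.profile?.address?.city; // undefined, no TypeError

// Nullish coalescing supplies a fallback for null or undefined only:
const displayCity = city ?? "Unknown"; // "Unknown"
```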
What I'm pointing out is that it's not a very good argument to say, "if other languages can work with a single null type, why can't JavaScript?" Just because other languages figure out ways to exist with only a single nullish type doesn't mean anything in terms of the advantages of having more than one nullish type.
As a complete aside: you're actually making the argument for more than one nullish type yourself when you deflect to creating an optional or maybe type. There's also the "Null Object Pattern." These are all attempts to grapple with legitimate issues in programming. Using "null" and "undefined" together is just a different attempt.
At the end of the day, I could avoid the use of both null and undefined in my programs. That doesn't make it a good idea. I saw that someone made an HTTP server in pure BASH. Interesting? Sure. Does that mean I don't "need" HTTP servers? No.
But the argument about languages having only one nullish value, or none at all, is to point out that generally "nullish" should be avoided, and when dealing with it we should be as straightforward as possible. Having two nullish values is the contrary of avoiding nullish 😅
The article shows several scenarios in which you might need nullish, and how you can deal with them using only a single nullish value (`undefined`). Those same problems could be solved using only `null` and avoiding `undefined`, but then you end up doing extra work when you could just use the nullish value the language itself uses.
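As a concrete illustration, one way to cover "not set yet", "intentionally unset", and a real value with a single nullish (a sketch; `describeOption` is a made-up name, and the empty string stands in for "intentionally unset"):

```javascript
// A single check against undefined plus an ordinary empty value
// covers all three states without ever needing null:
function describeOption(option) {
  if (option === undefined) return "Option: Not set yet";
  if (option === "") return "Option: Intentionally unset";
  return `Option: ${option}`;
}

describeOption(undefined); // "Option: Not set yet"
describeOption("");        // "Option: Intentionally unset"
describeOption("foo");     // "Option: foo"
```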
At the end of the day, I could avoid the use of both null and undefined in my programs. That doesn't make it a good idea.
Why not? It's commonly known that nullish values are a bad idea (Tony Hoare, the creator of the null reference, regrets that decision and calls it his "billion-dollar mistake"). My point in the article is that if you have to use nullish, at least you can use the one that JS itself uses.
Is there an example in this article that's using `undefined` and makes you think, "You need `null` for that"?
The whole "other languages do well without two non-values" argument doesn't mean that JavaScript should do the same; it just means that there's no need for JavaScript to specifically choose this solution to the problem.
If you want an actual reason not to have two non-values, it's that this just seems very arbitrary. Why two, why not three? And if you're ascribing more and more semantics to the different kinds of value-absence, shouldn't you also expose this possibility to the user? At this point, just let users define their own values that are treated as falsey by the language and add a FalseySymbol counterpart to Symbol.
If you want an actual reason not to have two non-values, it's that this just seems very arbitrary.
I don't think it seems arbitrary at all. The language specification is not arbitrary when it describes `null` as meaning something explicitly set and `undefined` as meaning something, well, undefined. Just because they both evaluate to falsy doesn't mean there's not a clear difference in their actual meaning. The reason those two exist (and not three or four or five) is that there's no other clear-cut, solidly-defined use case. But `null` and `undefined` both have a clear, unambiguous meaning and plenty of use cases.
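The distinction isn't just in the specification's prose; the language itself treats the two values differently in a few places. For example (a small sketch):

```javascript
// Default parameters trigger on undefined only, not on null:
function greet(name = "world") {
  return `hello ${name}`;
}
greet(undefined); // "hello world" -> the default kicks in
greet(null);      // "hello null"  -> null counts as a provided value

// JSON serialization keeps null but drops undefined properties:
JSON.stringify({ a: null, b: undefined }); // '{"a":null}'
```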
You misunderstood my point; both null and undefined are not at all arbitrarily defined; what is arbitrary is exactly having two of them.
If you're going to distinguish non-values of different kinds, why only two? Why not add a third one for "no selection", or a NaN-like value for "unknown" values? And I'm sure many problem domains would come up with their own non-values that make absolute sense to distinguish from `null` and `undefined`.
As I said, at that point a more elegant solution would be adding FalseySymbol or NullishSymbol (or both) and leaving it up to the user to define specific non-values that hold whatever semantic meaning they want.
Gotta love how you completely skipped over the issue.
So how, in the above, do you suppose isSet would work, since all you're talking about is TypeScript types, which don't actually exist at runtime?
const settings = something;
if (isSet(something.option)) { }
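For what it's worth, a runtime isSet can be an ordinary function rather than a TypeScript type (a sketch, under the assumption that "set" means "not undefined"; isSet here is the hypothetical helper from the comment above):

```javascript
// A possible runtime counterpart to the hypothetical isSet helper:
// a value counts as "set" when it isn't undefined.
const isSet = (value) => value !== undefined;

isSet(undefined); // false -> never assigned
isSet(null);      // true  -> explicitly assigned a nullish value
isSet("");        // true  -> explicitly assigned an empty string
```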
Using falsy values is like pointing a gun directly at your foot ...
And you'll get:
"Option: Intentionally unset" if the value is "".
"Option: foo" if the value is "foo".
"Option: Not set yet" if the value is undefined.
Really, you don't need null, or at least not for this.
I've been using Lua for years and never felt the need to have a second non-value. You either have nil, or you have a value. If you want a special value that's distinct from everything else, you just use an empty object and compare to it explicitly.
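The same sentinel technique works in JavaScript (a sketch; UNSET is a made-up name):

```javascript
// A unique sentinel value that cannot collide with any real value,
// including null and undefined:
const UNSET = Symbol("unset");

let option = UNSET;
console.log(option === UNSET); // true  -> nothing assigned yet

option = null;
console.log(option === UNSET); // false -> a real (if nullish) value was assigned
```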