In JavaScript, it is common to declare and initialize a variable at the same time. It is also commonplace to declare a variable, leave it uninitialized, and then assign it later. Evaluating an undeclared variable, however, throws a ReferenceError.
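A quick sketch of all three situations (the variable names here are just illustrative):

let a = 1;      // declared and initialized at the same time
let b;          // declared, left uninitialized: b holds undefined
b = 2;          // assigned at a later point
console.log(c); // Uncaught ReferenceError: c is not defined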
null and undefined are JavaScript primitives, and they differ from each other in their types and in the values they represent. Undeclared, on the other hand, is plain English, not a JavaScript keyword.
Differences
There are a number of differences between variables that are null and undefined:
- A null value's type is object, whereas an undefined variable is of type undefined.
- null represents the intentional absence of an object (hence its type, object), while undefined indicates the absence of any assigned value.
- null has to be assigned to a variable explicitly. In contrast, undefined is set automatically at declaration, though it can also be assigned explicitly.
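The first point is easy to verify with typeof. A minimal sketch (typeof reporting object for null is a well-known historical quirk of the language):

let empty = null;
let unassigned;

console.log(typeof empty);         // object
console.log(typeof unassigned);    // undefined
console.log(empty === unassigned); // false: they are distinct values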
Undeclared variables differ from null and undefined in how JavaScript treats them. Evaluating an undeclared variable throws a ReferenceError, yet its type, as reported by typeof, is undefined.
console.log(x); // Uncaught ReferenceError: x is not defined
console.log(typeof x); // undefined
Notice how typeof x doesn't throw an error, because typeof can be applied to an undeclared identifier safely; it simply returns 'undefined' instead of evaluating x as a reference.
Checking for them
Falsiness
null and undefined both represent the absence of a value. When coerced to a boolean they evaluate to false, so they are termed falsy, as opposed to truthy values.
To decide whether a variable is null or undefined, we can therefore start with a truthiness test: if the test fails, the variable holds a falsy value. Keep in mind, though, that null and undefined are not the only falsy values; 0, NaN, '' and false fail the same test.
let x;
if (x) {
  console.log(`Hi, this is ${x}.`);
} else {
  console.log(x); // undefined
}

x = null;
if (x) {
  console.log(`Hi, this is ${x}.`);
} else {
  console.log(`Now I'm ${x}`); // Now I'm null
}
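Because other values are falsy too, a bare truthiness test cannot tell null or undefined apart from them. A short sketch of the pitfall (count is an illustrative name):

let count = 0;
if (!count) {
  // Reached even though count is neither null nor undefined.
  console.log('count is falsy, but 0 is a perfectly valid value');
}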
To decide between the null and undefined states of a variable, we have to test it with the strict equality operator ===:
let x;
if (x === undefined) {
  console.log(`Hi, this is ${x}.`); // Hi, this is undefined.
} else {
  console.log(x);
}

x = null;
if (x === null) {
  console.log(`Hi, this is ${x}.`); // Hi, this is null.
} else {
  console.log(x);
}
This is because the loose equality operator, ==, is ambiguous in deciding between the two: it considers null and undefined equal to each other, so it returns true for either value.
let x;
if (x == null) {
  console.log(`Hi, this is ${x}.`); // Hi, this is undefined.
} else {
  console.log(x);
}

x = null;
if (x == undefined) {
  console.log(`Hi, this is ${x}.`); // Hi, this is null.
} else {
  console.log(x);
}
Notice how x == null returns true when x is undefined, and x == undefined returns true when x is null. This is because the loose equality rules define null == undefined as true.
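This quirk can also be put to deliberate use: because null and undefined compare equal to each other under == and to nothing else, x == null is a common shorthand for "x is either null or undefined". A small sketch (describe is a hypothetical helper):

function describe(x) {
  if (x == null) {
    return 'null or undefined';
  }
  return `a real value: ${x}`;
}

console.log(describe(undefined)); // null or undefined
console.log(describe(null));     // null or undefined
console.log(describe(0));        // a real value: 0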
Undeclared
Undeclared variables in the global scope can be checked without throwing a ReferenceError by testing for the name on the global object with the in operator:
if ('x' in window) {
  console.log('Hi');
} else {
  console.log(`Hi, x doesn't live here.`); // Hi, x doesn't live here.
}
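Note that window only exists in browsers. As shown earlier, typeof never throws for undeclared identifiers, so a typeof check works in any environment and any scope, with the caveat that it cannot distinguish an undeclared variable from a declared one holding undefined:

if (typeof x === 'undefined') {
  console.log(`x is undeclared, or declared but undefined.`);
}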