null. The concept of "there is no value" is necessary, but the implementation of null as a pointer to nowhere is very problematic in most languages.
The problem is that nearly anything can be null, and most languages give you no way to restrict a type declaration to a non-null value. The result is the same null guard clause repeated at every call site — the quintessential DRY violation.
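Here's a sketch of that repetition in TypeScript (the domain types `User` and `Address` are made up for illustration): every layer has to re-check for null before it can do anything, even though the "no value" case is handled identically each time.

```typescript
// Hypothetical domain types; any of these references could be null.
interface Address { city: string | null; }
interface User { name: string | null; address: Address | null; }

// Every level repeats the same guard clause before doing anything
// useful -- it's the check that gets duplicated, not the logic.
function cityOf(user: User | null): string {
  if (user === null) return "unknown";               // guard #1
  if (user.address === null) return "unknown";       // guard #2
  if (user.address.city === null) return "unknown";  // guard #3
  return user.address.city;
}

console.log(cityOf(null)); // "unknown"
console.log(cityOf({ name: "Ada", address: { city: "London" } })); // "London"
```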
Worse, if you are handed a null value, there is no way to recover the type it was declared as.
In conclusion, null is the original showcase of what accidental complexity looks like. For alternatives, see the Null Object Pattern for OOP and Maybe/Option for FP. C# also has Nullable&lt;T&gt; for value types, and nullable annotations are supposedly coming to reference types in a future release.
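To make the Maybe/Option alternative concrete, here is a minimal hand-rolled sketch in TypeScript (not any particular library's API): absence becomes part of the type, so the compiler makes you handle it exactly once, at the end, instead of guarding at every step.

```typescript
// A minimal Option type: a value is either present ("some") or absent ("none").
type Option<T> = { kind: "some"; value: T } | { kind: "none" };

const some = <T>(value: T): Option<T> => ({ kind: "some", value });
const none: Option<never> = { kind: "none" };

// map only applies f when a value is present, so transformations
// chain without a guard clause at each step.
function map<T, U>(opt: Option<T>, f: (t: T) => U): Option<U> {
  return opt.kind === "some" ? some(f(opt.value)) : none;
}

// The "no value" case is handled once, at the boundary.
function getOrElse<T>(opt: Option<T>, fallback: T): T {
  return opt.kind === "some" ? opt.value : fallback;
}

const length = getOrElse(map(some("hello"), s => s.length), 0);           // 5
const missing = getOrElse(map(none as Option<string>, s => s.length), 0); // 0
```

The design point is that `map` and `getOrElse` absorb the null checks that would otherwise be duplicated across the codebase.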
I agree that nothingness needs to have some sort of representation (even typed nothingness), but null is definitely not the way to do it. Tony Hoare introduced null references in ALGOL W in 1965 simply because they were easy to implement, but he's since called it his billion-dollar mistake. I think that may be an underestimation, honestly. I'm definitely looking forward to C# adopting nullable annotations for reference types in the near future, and I definitely lean towards Maybe/Option. Thanks for the comment!