Note: this is going to be a long discussion. I enjoy it very much, but I don't understand DEV enough to know whether it is a good etiquette to hijack a top comment like this. So, Lydia, if you feel we're not providing value to your readers, just say so and we'll find a different venue.
> C++ is a fantastic language for beginners to learn because it exposes them to fundamentals that are very important to learn early on
I agree with everything in that sentence except "early on". Seriously, was the first language you learned C++? I know it wasn't in my case. The concepts that are important in the early stages of programming are quite different from the ones that are important in the long term, and that's fine -- it's the same in all human endeavors. That is what separates apprentices from masters.
If you knock down a beginner (who is just trying to grasp the syntax) with undefined behavior, there is a good chance they'll never get up again. They'll just conclude programming isn't for them. (I've seen it!) You might say that's only fair, but I think that is the real shame. Just a clash of value systems, I guess.
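To make this concrete, here is a minimal sketch of the kind of innocent off-by-one a beginner writes in week one. In C++ it is undefined behavior, so it may print garbage, crash, or quietly appear to work:

```cpp
#include <iostream>

int main() {
    int a[3] = {1, 2, 3};
    // Reads one past the end of the array when i == 3.
    // Undefined behavior: may print garbage, crash, or "work",
    // and the compiler owes the beginner no warning.
    for (int i = 0; i <= 3; ++i)
        std::cout << a[i] << '\n';
}
```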
> it's still very much relevant today.
Absolutely. Languages are relevant for many different reasons. COBOL was relevant because of the huge body of existing business code, FORTRAN because of its numeric capabilities, C because of portability, JS because it runs in a browser, Python because the programs are a joy to read, Lisp because... well, Lisp was never relevant. :-Q
C++ is mainly relevant because modern C compilers (which were goaded into compiling C++ too) are the most complex software humankind has ever produced, by an order of magnitude. Many decades will pass before we accomplish something similar for any other paradigm. (See Rust as a perfect example of how hard it is to get such a project off the ground, even with intelligent, hardworking people and a much better paradigm.) So if we just want to get as close to the metal as possible without tying our code to a particular type of metal, there really is no choice.
But of course, those criteria mean precisely nothing to beginners. For them, "runs in a browser" or "natural readability" are much more relevant. And we cannot blame them for that. For me, the criterion was "it is hardwired into my machine, so I don't have to wait while it loads from tape", so I chose BASIC.
> When drawing diagrams of objects in memory, for example, we tend to naturally use arrows (pointers).
So run a test. Take someone without a C-centered education, draw such a diagram, and ask them to put into words what they see. They'll say "connected to", "describes", or even "is a value of", and you might get them to produce the word "reference", but you'll never get them to say "pointer". Simply because they've never heard that word in this context.
> Unfortunately, the true meaning of "reference" has been lost.
It hasn't. Bibliographers use it all the time. Even programmers use it: when they write LaTeX documents, they use \ref to refer to a label. They don't use \point. :-P It only seems lost if you live in a C-centered world.
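For instance, in a minimal LaTeX document:

```latex
\documentclass{article}
\begin{document}

\section{Results}\label{sec:results}
Some results.

% We refer to the label; there is no \point command.
As Section~\ref{sec:results} shows, the verb has always been ``refer''.

\end{document}
```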
> Again, it's not pedantry. Terminology needs to be precise.
Absolutely. Only, in my opinion, "precise" means precisely the opposite of what you say. To be precise is to call the concept we're discussing a reference. To use "a reference" to mean "an alias" is imprecise to the point of being incorrect. Nobody does that, except C++ programmers.
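For the record, here is what C++ calls a reference -- a second name for the same object, which everyone else would call an alias:

```cpp
#include <iostream>

int main() {
    int x = 42;
    int& r = x;  // what C++ calls a "reference": a second name for x
    r = 7;       // writes through the alias
    std::cout << x << '\n';  // prints 7: r and x are the same object
    // r can never be reseated to name another object -- it *is* x.
}
```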
About passing by reference: that's a completely separate can of worms, but if you insist, we can discuss that too. There are two undeniable truths. First, everything is "pass by value", if we call whatever is passed "a value". Pass by address in C, for example, is nothing more than passing the address by value. The phrase only becomes meaningful when speaking of a concrete variable, say "pass x by ...". Then we want to say that the ... of x is passed (by value, of course :-). The ... might be x's address, its reference, its value, or even its name (in some old versions of ALGOL).
The second undeniable truth is that if you only know the two kinds of passing from Pascal, and you speak about them in the context of the previous paragraph, you'll inevitably be confused here, because passing in JS is neither of those. It's only by serious language twisting ("when you declare a Dog d, d is not really a dog, but a pointer to a dog") that you manage to shoehorn it into one of those buckets. But it explains nothing, precisely because of the first undeniable truth: every pass is by value if we call whatever is passed "a value". And if that doesn't correspond to what people usually mean when they speak about the value of an object, it's simply misleading.
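Here is a minimal C++ sketch of that "d is really a pointer to a dog" reading (the Dog type and rename function are mine, purely for illustration). It shows that what is copied is the address, which is precisely how JS objects behave:

```cpp
#include <iostream>
#include <string>

struct Dog { std::string name; };

// The address is passed by value: the function gets its own copy of it.
void rename(Dog* d) {
    d->name = "Rex";  // visible to the caller: both copies point to one Dog
    d = nullptr;      // invisible to the caller: only the local copy changes
}

int main() {
    Dog dog{"Fido"};
    Dog* p = &dog;
    rename(p);
    std::cout << dog.name << '\n';     // prints "Rex"
    std::cout << (p == &dog) << '\n';  // prints 1: p itself was untouched
}
```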
> Your suggestion is for us to continue shielding beginners from the truth for fear that they'll get scared away by the complexity.
That's emphatically not what I'm saying. I'm asking us to use the proper terminology, with "proper" defined by a broader authority than Bjarne Stroustrup.
> One last thing to note: Pointers exist in all languages, not just C++. Call them whatever you want.
You probably mean "pointers exist in the implementations of all languages", and that's really true if those implementations are written in C (which they usually are). But they don't have to be. You can write a Python interpreter in Java (Jython), and its implementation really has no pointers. Anywhere.
If you really think pointers exist at the language level, then you're simply not thinking broadly enough. There are no pointers in PROLOG, Haskell, or Turing machines. Really. :-)
> At the lowest possible level, there is no way to store an "object" or an array or a string in a CPU register; all you can do is store bytes, sequentially or otherwise. Hence the universal need for pointers.
Ah, good old reductionism. It has never got us anywhere, but people still try. :-) OK, let's do this one too.
There is no "lowest possible level" (that we know of). I could just as easily say that there is no way to store bytes; you can only store electrical impulses or magnetic orientations -- or, lower still, quantum states. At an even lower level, you don't store anything, you simply collapse the wave function of the memory into some observable state.
Now you'll say that this is ridiculously low and, more importantly, irrelevant to the topic. And that is completely true. But it is equally true for the modern beginner when you tell them about registers, bytes (in the context of memory), or machine instructions. None of it enters their mental model until much later. And that's a good thing. Our ability to progress depends on constantly raising the level of abstraction, leaving the implementation details deep below. We must: there is a long journey upwards ahead of us. ;-)