DEV Community

Discussion on: ⭐️ Interactive JavaScript Quiz #1

 
Vedran Čačić

About beginning with C++: it seems you agree with me. Only, I wouldn't say it's "irony"; it's just the normal way things are. It would be surprising if it were different. And wouldn't it be a great shame if you weren't a programmer today, just because C++ knocked you over one time too many when you were fragile? For example, you wouldn't be having this discussion with me. :-F

I realized that learning to program isn't necessarily trivial and that it requires a good bit of self-study and hard work.

Absolutely! But if you face the same circumstances on your first encounter with programming, that's not the lesson you will extract from it. Trust me, I've seen it (I teach many young people programming, usually using languages I didn't choose -- as you've probably realized by now). C++ has the highest "screw this, programming is just not for me" rate. (I'm sure Malbolge would have an even higher one, but no school in my vicinity starts with that. ;-))

I'm biased because I like to think at a low level.

I hope I showed you that "low level" is very relative. I've seen people telling other people things like "Yeah, Django is very nice, but I like to think at a low level, how the Python beneath it actually executes -- it really helps me in my work." :-D

Yes, I see where you're coming from, but I must regretfully inform you that most of the things you've learned are by now false or at least obsolete. L1 cache is more important than registers, vectorized operations are a thing, execution is not sequential, memory access is not atomic, processors actually use quantum effects when executing some instructions, and ALUs do speculate about what's going to come next, being at some points more superstitious than my grandma. :-P

Low-level thinking might be useful when it is correct. With modern advances in technology, it's less and less so.

In the long term, it's probably safe to continue using the term "reference" in JavaScript because it's become idiomatic. Plus, it is, as you say, a natural way to talk about objects—a variable "refers to" an object in memory.

That's all I wanted you to admit. But in the meantime we've opened a bunch of other subjects... :-D

Actually, it's not. In pass by reference, the formal parameter is a "true" reference (alias). Hence the name (and behavior) of pass by reference.

I understand that. I just hoped I could avoid this line of discussion since JS is neither. (Yes, it is pass by value if you redefine value, but I hope I have explained why I think it's unsatisfactory.) We don't pass values, and we don't pass references. We pass objects themselves. It's just a false dilemma (from my perspective, though I understand yours).
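To make the "we pass objects themselves" point concrete, here is a minimal sketch in Python, which uses the same calling convention as JavaScript (sometimes called "call by object sharing"). The function names are mine, just for illustration: rebinding the parameter does nothing for the caller (so it's not a true reference/alias), yet mutating the object is visible to the caller (so it's not a copied value either).

```python
def reassign(lst):
    lst = [0]         # rebinds the local name only; the caller is unaffected

def mutate(lst):
    lst.append(4)     # mutates the shared object; the caller sees the change

a = [1, 2, 3]
reassign(a)
print(a)              # [1, 2, 3]    -- no alias semantics, so not pass by reference
mutate(a)
print(a)              # [1, 2, 3, 4] -- no copy either, so not pass by (classical) value
```

The same two-function experiment, written in JavaScript, gives exactly the same pair of outcomes, which is why neither traditional term fits without redefinition.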

Most modern programming languages (save for C++ and PHP, for example) don't have true references (aliases), and hence they do not have any notion of pass by reference.

... and so the notion is finally free to be used in its true linguistic sense. Hooray! :-)

Java programmers, when taught that there are these two ways of passing things into functions

So the solution is simple: don't teach it to them. :-) In fact, you can teach it to them using the true notion of reference. In Java, you can say truthfully that primitives are passed by value and objects are passed by reference -- it will only confuse them if they've heard from some C++ programmers that "reference" means "alias". But I'm sure no one would do that to poor Java programmers. :-]

I've never implemented a language, or examined a language's implementation,

Hm, interesting. For a low level thinker, that's really strange. May I suggest you do that sometime? I think it will be a fulfilling experience. :-) I teach a course on compiler/interpreter design, but unfortunately it's in Croatian. I'm sure you can find decent ones in your preferred language. (If you want tips, Appel cs.princeton.edu/~appel/modern/ is really good, though verbose.)

Pointers are not just a "C thing"—they exist at the assembly level, where a pointer is literally a CPU register that happens to store a memory address (which is a number like any other).

Ah, in that sense, of course. Though even that's not a given: some early versions of Fortran had no dynamic memory allocation at all. Everything was static, and so could run directly from memory. Yes, you lose various niceties like recursion (you can't have a stack without indirection, of course), and functions could call other functions only to a depth of two :-), but people actually programmed in that monstrosity 50 years ago. (I'm sure someone will say that sentence about C in 50 years. :-))

And about strings: do learn about Hollerith constants some time. Loaded directly from the source, together with the length, statically. Beautiful stuff. :-D There are more things in heaven and Earth, Horatio, Than are dreamt of in your philosophy. ;-)

Comment deleted
Vedran Čačić

So, it seems we agree on everything after all. How nice. :-D

About that compiler course... I didn't tell you the most important thing. I teach it using Python (and a little framework of mine, just to do bureaucratic tasks like counting lines and characters for nice error messages, and buffering characters and tokens in case you want to go back in the stream). That's the only way I can produce a working compiler in 20 hours. And that's why I told you there are no pointers in it. (Yes, we use CPython, so there are pointers in the implementation of Python, but as you said, if you just mean "indirect access to memory", it's bound to appear somewhere down the ladder of abstraction.) The point (yeah, bad pun :-) is that there are no pointers in the source language, no pointers in the target language (we mostly compile expressions to some stack-based VM, and transpile commands to things like JS, which is pretty much the universal architecture nowadays -- see WebAssembly ;-)), and no pointers in the implementation language. The people writing those certainly don't think in terms of pointers.
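This is not the author's actual course framework -- just a toy sketch, with names of my own choosing, of what "compiling expressions to a stack-based VM" looks like in Python. Note that nothing in it is a pointer: expression trees are tuples, the emitted program is a list of instructions, and the VM is a plain list used as a stack.

```python
import operator

# Operators the toy VM understands.
OPS = {'+': operator.add, '*': operator.mul}

def compile_expr(node, code):
    """Compile an expression tree (a number, or a tuple (op, left, right))
    into postfix stack-VM instructions, appended to `code`."""
    if isinstance(node, tuple):
        op, left, right = node
        compile_expr(left, code)
        compile_expr(right, code)
        code.append(('OP', op))
    else:
        code.append(('PUSH', node))
    return code

def run(code):
    """Execute the instruction list on a simple operand stack."""
    stack = []
    for instr, arg in code:
        if instr == 'PUSH':
            stack.append(arg)
        else:                       # 'OP': pop two operands, push the result
            b, a = stack.pop(), stack.pop()
            stack.append(OPS[arg](a, b))
    return stack.pop()

program = compile_expr(('+', 2, ('*', 3, 4)), [])
print(program)       # [('PUSH', 2), ('PUSH', 3), ('PUSH', 4), ('OP', '*'), ('OP', '+')]
print(run(program))  # 14
```

A real course adds tokenizing, parsing, error reporting, and statements on top, but the core idea -- walk the tree, emit postfix code, interpret it on a stack -- stays this small.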

Of course, being a low-level thinker, you'll probably say it's cheating, and you'd want to use C++ as the implementation language and (if I estimated your age correctly) something like x86 as the target. But my perspective is that you'd just spend more time (and increase the risk of giving up) while not learning anything essentially new. "I guess it depends on what type of thinker you are and what you enjoy in programming," as you said. :-)

Wow, that's... actually pretty cool! I didn't even think that you could do that.

As they say, necessity is the mother of invention. You don't know you can do many things (like survive 4 days without food) until you have to. ;-) BTW both things (surviving 4 days without food and using Hollerith constants regularly) are extremely bad for your health. There are good reasons we don't do them anymore. Kids, don't do this at home. ;-)

Comment deleted
Vedran Čačić

Youth is neither good nor bad -- at least that's something we old farts say to each other for consolation. ;-)

x86, on the other hand, is mostly obsolete, but that shouldn't necessarily stop you -- since your aim is not to produce something useful, but to understand things better.