It honestly irks me how people cite the infamous `0.1 + 0.2` floating-point arithmetic error as a justification to ostracize JavaScript, even though most programming languages suffer from the same issue by virtue of floating-point limitations.

Like, seriously? Are tiny quirks really so evil as to dismiss a language altogether? No, right? Python suffers from the exact same error, yet it is one of the most loved languages out there.
You are definitely correct that people don't use it effectively. Arguments against a language usually stem from not using it idiomatically, like writing non-"Pythonic" code in Python.
No, I would cite -- github.com/aemkei/jsfuck
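For context, JSFuck works by abusing JavaScript's type-coercion rules to write any program using only the six characters `[]()!+`. A few of its basic building blocks (my own sketch, not taken from the linked repo):

```javascript
// Coercion tricks that JSFuck is built from:
console.log(+[]);      // 0      (empty array coerces to the number 0)
console.log(![]);      // false  (an array is truthy, so negation gives false)
console.log(+!![]);    // 1      (true coerces to the number 1)
console.log([] + []);  // ""     (array-to-string coercion gives empty strings)
console.log(![] + []); // "false" (boolean + array coerces both to strings)
```

From fragments like `"false"` and `"true"`, individual letters can be indexed out and recombined into arbitrary identifiers, which is how the full encoder works.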
Most importantly, there's `[1, 2] + [3, 4]`, which I learnt from Python. I know that there is `.concat()` and the spread operator, but still... Another thing is `var` hoisting, but that can be understood, really.

I hope I'm misunderstanding, but this style of example "learnt from Python" always puzzles me. Specifically, why do devs expect things to work the way they do in some other language, rather than learning the constructs of the language they're presently working in?
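A sketch of both behaviors mentioned above, for anyone unfamiliar with them:

```javascript
// 1) The + operator on arrays does string coercion, not concatenation:
//    each array becomes its string form ("1,2" and "3,4"), then they join.
console.log([1, 2] + [3, 4]);        // "1,23,4"  (a string, not an array!)

// The idiomatic ways to actually concatenate:
console.log([1, 2].concat([3, 4]));  // [1, 2, 3, 4]
console.log([...[1, 2], ...[3, 4]]); // [1, 2, 3, 4]

// 2) var hoisting: the declaration (not the assignment) is moved to the
//    top of its function scope, so this logs undefined rather than
//    throwing a ReferenceError.
console.log(y); // undefined
var y = 42;
```

In Python, `[1, 2] + [3, 4]` returns `[1, 2, 3, 4]`, which is the mismatch of expectations the commenter is describing.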
I do suppose that a language should be fail-safe. Even TypeScript doesn't warn here, let alone throw an error. Python is safer in this regard. TypeScript is partly safer than Python and partly more dangerous, because it is built on JavaScript.
RTFM is OK, up to a point. But reading the whole ECMA specification is crazy.
Also, I cannot expect tutorials to teach everything.
I also expect a language to be "guessable" rather than requiring me to be told how everything works. Otherwise, it should throw an error early.
Productivity doesn't wait for you to finish learning...
But knowing additional paradigms might be helpful. Knowledge should add on rather than replace.
Still, in the end, I love JavaScript (not even counting TS) more than Python.