DEV Community

What common programming concept has the wrong name?

Ben Halpern on September 05, 2019

Everybody calls it this, but it's not a good name. What are some examples?

Ryan McConnell

JavaScript has nothing to do with Java.

Kevin Ard

I blew the designers' minds a few weeks ago with the "car and carnation" analogy 😅

Adam Outler

I prefer Java is to JavaScript as car is to carpet. Just like there is some carpet in a car, there is some minor overlap, but they're totally different things.

Shaurya

There's also grape and grapefruit: they're both fruits, but totally different ones.

Taken from the r/learnprogramming FAQ.

Charles Wood

This is perfect because they're more similar to each other than either one is to, say, Haskell, so they're not totally unrelated.

Ruby Rubenstahl

I generally just tell people plainly that it was a marketing ploy and that they have nothing to do with each other aside from the fact that they are both programming languages.

Chris C

That would explain why I don't want to touch it with a 50-foot pole lol

Soatok Dreamseeker

Not directly programming, but relevant.

In asymmetric cryptography, we have the terms "private key" and "public key". But the term "private key" is actually wrong.

  • Privacy is something you don't want the whole world to know.
  • Secrecy is something you don't want anyone to know.

Since what academics call a "private key" is something that you don't reveal to anyone else, the term "private" is wrong.

It should be, therefore, "secret key" and "public key".

The "private key" should, instead, be what you agree on with e.g. Diffie-Hellman. It makes more sense than "shared secret".

It also allows you to use sk and pk as variable names in cryptography APIs, instead of having to spell out pubKey and privKey since they both abbreviate to pk.

This is my weird hill okay?
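A minimal sketch of what the sk/pk convention buys you in an API (TypeScript; the types and helper are made up for illustration, not from any real crypto library):

type SecretKey = Uint8Array; // sk: revealed to no one
type PublicKey = Uint8Array; // pk: safe to publish

interface KeyPair {
  sk: SecretKey;
  pk: PublicKey;
}

// "privKey" and "pubKey" both tempt the abbreviation "pk"; with sk/pk the
// parameter below is unambiguous at a glance.
function fingerprint(pk: PublicKey): string {
  return Array.from(pk.slice(0, 4), (b) => b.toString(16).padStart(2, "0")).join("");
}

console.log(fingerprint(new Uint8Array([222, 173, 190, 239]))); // "deadbeef"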

Ernesto λrroyo

"Secret key" is mostly used for symmetric schemes. I get your point, but I think using private-key is right.

The real epic fail in cryptography is the misused "certificate" term:

A certificate is just the public key plus the proven subject (owner) of the certificate, signed by a Certification Authority. The certificate does not contain the private key, but we still find things like
"install the certificate on the web server to configure TLS..." or "sign with a certificate"

Soatok Dreamseeker

I get your point, but I think using private-key is right.

From A Cypherpunk's Manifesto by Eric Hughes (1993):

Privacy is necessary for an open society in the electronic age. Privacy is not secrecy. A private matter is something one doesn't want the whole world to know, but a secret matter is something one doesn't want anybody to know. Privacy is the power to selectively reveal oneself to the world.

Ernesto λrroyo

Yes but... I could be wrong, I'm not a native English speaker... but I think privacy and private are not related. They seem to be, but they are indeed different and unrelated concepts. That is, a private key is not "something for keeping privacy" but "something for keeping confidentiality or secrets"...
Not sure at all about this...

Soatok Dreamseeker

I'm not a native English speaker... but I think privacy and private are not related

Private is an adjective, privacy is a noun, but they refer to the same thing.

A private matter is one that requires privacy.

A secret matter is one that requires secrecy.

Edwin Buck

Shared key and private key achieve the same goal, without stepping on the historical meaning of secret key.

Mike Lockhart

There are two hard problems in computing...

Ezequiel Maraschio

It’s not a fully “programming” concept but pull request on Github sounds weird for me

Ian Fabs

I 100% agree. GitLab calls them Merge Requests, which is more accurate

Oliver

Exactly! And you aren't really requesting to pull anything; if anything, you are requesting to push something. Merge request sounds better, but at the same time I've gotten so used to pull request, even though it makes no sense

Ben Halpern

I guess it’s a request that the other person pull, right? But either way, merge request 100% better.

Oliver

Oh truue, I didn't think about that. It's like a "hey i did a thing, can you please pull this into master?"

Ben Halpern

Still totally weird. I had the same confusion when first introduced.

Fernando Maia

I asked myself the same thing a while ago, and I found a good explanation on SO which links to a GitLab article that discusses this topic.

Elliot Derhay

I think GitHub's feature name comes from git request-pull. Both are completed by the receiver rather than the author, so it's kind of pulling, even though it's still a merge...

But git pull is also git fetch && git merge FETCH_HEAD, so who knows...?

I guess GitLab's name for it does make more sense for the resulting action, though.

Jesse Phillips

One should always remember that GitHub brought this workflow idea to the mainstream. I'm curious whether request-pull actually came after GitHub.

I don't have a memory of the early days, but I expect the merge feature was not in the initial UI, meaning you literally asked the person to run git pull on your branch, i.e. fetch and merge.

Edwin Buck

From my estimation, pull-request came from the author imposing his viewpoint on others. He assumed that he would be in control of the upstream repository and named it as a hybrid of what it was to him and what he had to do with it.

A better approach is to consider the user's point of view.

Git is a mess of badly structured commands, but a useful tool. Look at Mercurial if you want an example of git workflows with a better command-line interface.

Jesse Phillips

You'll need to help me on this one. What does hg call it?

mercurial-scm.org/wiki/ReviewWorkf...

"Create a "Pull request" on data in the repo"

Nathan K. Campbell

Don't bother looking at Mercurial if you're a BitBucket user though:
bitbucket.org/blog/sunsetting-merc...

Austin S. Hemmelgarn

You're requesting that somebody pull in a branch from your fork of the repo. The exact command-line Git operation that merging the request equates to is git pull, not git merge, so Pull Request is technically correct.

Shameera Anuranga

Ha haaa... from the beginning this sounded weird to me too

Firas M. Darwish

So true, I couldn't agree more

Jacob Herrington (he/him)

Serverless is the most obvious. 🤦‍♂️

Also Object-Oriented Programming: its inventor (Dr. Alan Kay) even said later in life that it should have been called message-oriented programming.

Les Carbonaro

Scrolling through this list, I'm surprised "serverless" is like three-quarters of the way down. It's clearly the most glaring misnomer in recent programming memory.

Danny McGee

Can you think of a better name for it? I think re: developer experience it's pretty fitting, since in practical terms it means not needing to set up, manage, maintain, or pay for a server (or even write any back-end code, depending on the implementation). It's not really accurate in technical terms of what's going on behind the curtain but I don't really think that's the point.

Jacob Herrington (he/him)

Infrastructure as a Service.
On-demand Computing.
idk, not serverless. :)

Danny McGee

Infrastructure as a Service

Doesn't have quite the same marketable ring to it, but not bad!

Craig Nicol (he/him)

It's server-less (i.e. you have less than 100% of the server, and even less than you'd have with a VM) ;-)

João Ferreira

you mean it should've been called MOP ooooooooo

Mike Lockhart

Yes! And "calling a method on an instance of a class" instead of just "send an object a message"

Nate Clark

darkgrey

Jason C. McDonald

Apparently, "compiled language". Nearly all languages are compiled in many situations to something, so it's a meaningless phrase.

"Interpreted language" (final result executed via some form of a software interpreter) vs. "assembled language" (final result executed directly by the CPU as machine code) is more useful. Python and Java are examples of interpreted languages, although both are compiled in some fashion. C++ and FORTRAN are examples of assembled languages.

And yes, the designations are murky. Interpreters exist for typically assembled languages, and (generally impractical) means exist to assemble typically interpreted languages to machine code.

EDIT: Based on this thread, I'll add one more: an interactive language is one that is primarily run through a REPL (interactive shell).

This should not be confused with an interpreted language, which simply has a software layer between the shipped files (sometimes, bytecode that was compiled to) and direct execution.

Also, dependencies don't even enter into this discussion; any language has to resolve them somehow on the target machine.

EDIT 2: This thread (and related conversations) led to this...

Chris Seaton

Python and Java are examples of interpreted languages ... and (generally impractical) means exist to assemble typically interpreted languages to machine code

But Java is assembled to machine code at runtime, and it's extremely practical. I'd say interpreting Java is impractically slow!

Jason C. McDonald

No, Java is not assembled to machine code at any point, except in extraordinary implementations. It is compiled to bytecode, which is what is run through the interpreter, often the JVM.

EDIT: I omitted two words by mistake: at any point before shipping. JIT-compiling is pretty typical for interpreted languages; it doesn't make it an assembled language. So, yes, runtime still counts (towards interpreted.) My bad.

This was discussed and rehashed in some detail here:

Chris Seaton

Java is not assembled to machine code at any point, except in extraordinary implementations

No, common implementations do it. Run Java with -XX:+UnlockDiagnosticVMOptions -XX:+PrintAssembly to see it assemble your Java code to machine code.

Here's the assembler in HotSpot for example github.com/AdoptOpenJDK/openjdk-jd....

And see this blog post about how another Java compiler compiles and assembles Java to machine code chrisseaton.com/truffleruby/jokerc....

I used to work at Oracle on compilers and assemblers for Java and other languages.

Henry 👨‍💻

Java is not assembled to machine code at any point

I'm sorry, but this is just plainly incorrect. Even an interpreter compiles down to machine code before execution; it just does so line by line as it executes, instead of all at once before execution starts. Do you think a CPU can understand bytecode?

Jason C. McDonald

That's a misinterpretation of "compiles". The interpreter executes the bytecode, but it is not producing an actual standalone executable binary of assembly code. If it did, why on earth would Java deployment always involve the added trouble of ensuring the Java interpreter is installed on target machines, and then having the interpreter execute the .JAR?

"Runtime 'compiling'" is called "interpreting," and it really shouldn't be confused with compiling.

Henry 👨‍💻

Definition of compile:

Convert (a program) into a machine-code or lower-level form in which the program can be executed.

That's its quite literal definition, so any language which is eventually executed by a CPU (so all of them) is compiled at some stage. The topic of "Compiled vs. Interpreted" languages is really about how they are compiled: generally, "compiled" languages are compiled before execution and "interpreted" ones as they are executed.

This doesn't take away from the fact that saying "Java is never assembled to machine code" is wrong because Java wouldn't be able to be executed by the CPU at all if it wasn't. I don't mean for this to come across as combative (if that's how it's been ahem interpreted); I think healthy and reasoned debate on such topics is good.

I'll end by quoting a blog post already posted in this thread where former Oracle compiler dev Chris Seaton explains how Java compiles bytecode to machine code:

To run your Java program then the JVM interprets the bytecode. Interpreters are often a lot slower than native code running on a real processor, so the JVM at runtime can also run another compiler, this time compiling your bytecode to the machine code that your processor can actually run.

Jason C. McDonald

Curious for your source. Here's the one most people work off now.

A compiler is a computer program that translates computer code written in one programming language (the source language) into another language (the target language).

I chose to specifically use the term assemble for compiling to machine code, since "compiling to bytecode is still compiling!" is a favorite distraction technique employed in the Java debate.

This doesn't take away from the fact that saying "Java is never assembled to machine code" is wrong because Java wouldn't be able to be executed by the CPU at all if it wasn't.

EDIT: Okay, some confusion may have resulted from an unintentional omission on my part: earlier, I meant to type Java is not compiled to machine code at any point before shipping, which if you look at context, is what I've been saying all along. Just correcting that here.

All an interpreter ever does is convert to machine instructions in some fashion or another, either just before execution (AOT compiler) or as it executes (JIT compiler or traditional interpreter, depending on implementation). It's still relying on that software layer. That's all I've been claiming.

If you read my posts again, I've been repeatedly making the point that an assembled language ships a binary of machine code. If it has to be AOT/JIT-compiled (interpreted) on the end-user's machine, it's an interpreted language.

Also, to requote Chris Seaton...

To run your Java program then the JVM interprets the bytecode. Interpreters are often a lot slower than native code running on a real processor, so the JVM at runtime can also run another compiler, this time compiling your bytecode to the machine code that your processor can actually run. (Emphasis mine.)

Ergo, interpreted. Thanks for providing an official source to prove my point all along.

No wonder so many people are confused by "interpreted," "compiled," "assembled," "machine code," et al. The terms keep getting changed (esp. by various Java devs I've spoken to over the years) to dodge the obvious. :(

Jason C. McDonald

Update: So, reading over the thread again, I had to put in a couple of edits to my posts. I missed a couple of words in reading, and also in writing. My mistake. So, apologies to @chrisgseaton for missing his point.

Java is (sometimes/often) compiled to machine code on the end-user computer. This is referred to as Ahead-Of-Time (AOT) compiled. Under other circumstances, it may be Just-In-Time (JIT) compiled. This technically contrasts with a "traditional" interpreter, which will execute the bytecode directly; however, the difference between a JIT compiler and a "traditional" interpreter often comes down to implementation; it may be hard to distinguish without knowing implementation details.

In any case, as I said originally, an interpreted language still places a software layer between the shipped end result and the machine code; an AOT-compiler, JIT-compiler, and "traditional" interpreter all fulfill this requirement, which clearly marks Java as an interpreted (granted, interpreted-compiled) language.

This is still in contrast to assembled languages, say, C and C++, in which the executable machine code file is the shipped end result.

The difference all comes down to what is shipped to the end-user.

(Reminder: runtime dependency resolution is irrelevant to this topic.)

Interpreted (or interpreted-compiled) languages are not implicitly inferior nor superior to assembled languages. They are merely different.

Henry 👨‍💻

Here's my source:
definition of compile
I'd be curious as to where you got the data that "most people" use the first sentence of the wikipedia page for compiler as the definition. Also, the second sentence on that very same page is almost verbatim the definition I gave anyway:

The name compiler is primarily used for programs that translate source code from a high-level programming language to a lower level language (e.g., assembly language, object code, or machine code) to create an executable program.

So I think arguing about that is a bit silly. From the tone of your response it seems as though you think that I'm some kind of Java fanboy trying to obfuscate the truth using technicalities, far from it. In fact, my favourite language to use day-to-day is Python. It is however an unavoidable fact that Java is almost always faster than Python at runtime, due in big part to the way that it is compiled.

Saying "HAH! The quote you gave me has the word interpreter in it, I win!" is not really a productive way of having a sensible discussion. I was only ever trying to correct you on the omission that you have since edited into the last response, Java is in fact compiled/assembled to machine code before execution.

It feels like you have misinterpreted my intentions greatly, I'd rather not have this discussion devolve into petty "my language is better than your language" mudslinging, in which I have no interest in partaking.

Jason C. McDonald

I'd be curious as to where you got the data that "most people" use the first sentence of the wikipedia page for compiler as the definition.

From mass usage of the term. Ironically some of them are well-meaning Java devs who assert that compiling to bytecode implicitly makes the language "compiled" in the same way that assembling to machine code makes the language "compiled". Thus why I avoid the term "compiled language".

Compiled has come to mean many things. Typescript is compiled to Javascript. Python is compiled to Python bytecode. Java is compiled to Java bytecode, which is compiled (assembled) to machine code later. C++ is compiled to object code, which is compiled (assembled) to machine code. You see where this gets confusing? That's why I took the care to use "assembled," from the C and C++ term for the part of the compilation toolchain that converts object code to machine code.

You can compile to anything, but you can only assemble to machine code. I'm using "assembled" to unobfuscate the difference, instead of trying to unring the bell wherein the old definition of "compiled" became wholly antiquated.

From the tone of your response it seems as though you think that I'm some kind of Java fanboy trying to obfuscate the truth using technicalities, far from it.

I'm just rather worn out by the sheer mass of Java developers who have a knee-jerk "but but but" to the obvious point that Java is an interpreted language by my initial definition (JIT-compiled still qualifies). There seems to be an insecurity many Java devs have, that somehow their language is "terrible", so they have to find ways to defend its validity to themselves. (Which is unnecessary - it was already valid, and not terrible.)

You may well have been correcting the mis-point I produced with my omission, and I mistook it for the usual "but but but." It read the same to me, but if that's not your goal, I concede as such.

I'd rather not have this discussion devolve into petty "my language is better than your language" mudslinging, in which I have no interest in partaking.

Happily, as I've said all along, interpreted is not inherently inferior to anything. I'm a Python and C++ developer. I never even implied that Java is inferior, merely that it is not assembled. (I'm not even a fan of Java, as it were, but I feel no need to decry it; it serves its purpose well, regardless of my personal preferences.)

It is however an unavoidable fact that Java is almost always faster than Python at runtime, due in big part to the way that it is compiled.

Unrelatedly, I'd be curious what the benchmark between AOT-compiled Java and pre-compiled (to bytecode) PyPy would be. CPython usually loses benchmarks.

Nathan Tamez

Put simply, do you ship machine code?
Yes: it's compiled
No: it's interpreted

Jason C. McDonald

I'd say, Yes: it's assembled.

Compiled is really, really confusing, because of the varying usage. As I said, you can compile to another language, to bytecode, to object code, to machine code...but you can only assemble to machine code.

I'm getting this distinction from the conventional compilation toolchain, btw. We compile first to some form of intermediary code, and then we assemble to the machine code.

Nathan Tamez

Just bear in mind that most in academia use the term compiled, as the term assembled means assembling assembly into machine code.

Jason C. McDonald

Oh, I'm well aware. I've historically used it the same...in fact, I'd prefer if that's all it meant.

Unfortunately, that distinction is lost on most...and sadly, part of that is from communities surrounding some interpreted languages misusing the term "compiled" for so long to dodge the fact their language is interpreted.

That, in turn, is because "interpreted" has been used in an almost derogatory manner for so long.

Language evolves, and not always for the better. Trying to cram the linguistic refrigerator biscuit dough back in the tube is a losing fight.

Michiel Hendriks

I'd say interpreting Java is impractically slow!

Which is true and why they created the JIT Compiler in Java 2.

Nowadays, in a standard JVM process, a lot of bytecode is assembled to machine code. And a large part will be heavily optimized and recompiled to better machine code at runtime. That's why a JVM program quite often outperforms "native" languages, unless those are optimized by hand (or after gathering enough trace information).

But calling Java an interpreted language is wrong. A common characteristic of interpreted languages is executing part of the code before running into a compile error.

Jason C. McDonald

Sigh

I've been in this conversation about a dozen plus times now. Every person tries to make the same claim, "Java isn't interpreted," but then they have to move the targets and shuffle the definitions around to prove it.

If Java were an assembled language, the end result of your compilation would be a standalone executable binary consisting purely of machine code, which could be run by itself on any machine of the matching architecture. Java doesn't do this; end users have to have some sort of runtime on their machine which executes the .JAR file they shipped. Either the process was made arbitrarily more complex than it needs to be, or Java is not an assembled language. (Please pick one.)

It's compiled-interpreted, purely on the merit that it isn't an assembled language. Just because its compile/interpret process is so complicated that practically every Java developer has their own unique understanding doesn't mean that it isn't compiled-interpreted.

And, it's worth mentioning, interpreted does NOT mean slow, bad, or unprofessional! I think a lot of people are afraid if they admit that Java is compiled-interpreted, they'd be admitting it's somehow "bad," but that is most assuredly not the case.

A common characteristic of interpreted languages is executing part of the code before running into a compile error.

Nope. That's an interactive language...so, I guess that's another entry to this list?

Michiel Hendriks

If your claim was that an assembled language has hardware machine code as its shippable artefact, then I would not make a big point out of it. But...

C and C++ programs also depend on a common runtime. They do not work without it.

You can include the Java runtime with a Java application if you want.

The runtime for .NET applications is included with Windows these days, so you can ship your compiled C# program without users needing to get a runtime. By your definition, is it then an assembled language or an interpreted language?

Sun made hardware which natively runs Java bytecode. It was not a big success, but it could run your .jar directly on the CPU. So it's an assembled language after all.

Jason C. McDonald

The runtime for .NET applications is included with Windows these days, so you can ship your compiled C# program without users needing to get a runtime. By your definition, is it then an assembled language or an interpreted language?

C#, C++, and C are three different languages, btw.

As to C++ and C, they often rely on dynamically linked libraries, but dependencies are an unrelated issue for any language. You'll note I never brought it into the Java discussion; going into dependencies is another moving-the-target distraction tactic (don't worry, I won't blame you for inventing it...it's been in use for a while.)

C++ and C can produce binaries which require no dependencies, simply by avoiding the standard library.

That said, I will admit to using the wrong term a moment ago; runtime is a dependency library, whereas what I'm referring to with Java is the software responsible for executing the bytecode on the target machine.

Dependencies shouldn't enter into this discussion, all languages (practically speaking) have to be able to resolve dependencies.


Sun made hardware which natively runs Java bytecode.

Also a case of moving the target. I already stated earlier that there are impractical means by which virtually any assembled language can be interpreted, and virtually any interpreted or interpreted-compiled language can be assembled. This does not count, because it has no bearing on standard deployment.


The inescapable point: Java is, for all practical intents and purposes, never assembled to a machine code binary file, such that it can be shipped to an end-user (w/ or w/o dependency libraries compiled to the same), and executed directly on a standard Intel, AMD, or ARM processor without needing an additional program to interpret (or, if you want to get pedantic, AOT/JIT-compile) the (byte)code.

An interpreted language has a software layer between the final, shipped result of compilation and its ultimate execution on the target machine.

Ergo, Java is compiled-interpreted. It is NOT assembled. It is also NOT interactive. (But, yes, it's still a real language.)

Mike Lockhart

Even machine code is interpreted by the processor's microcode, which is exchangeable on some processors and FPGAs. The machine executes nano-code on most processors these days, even RISC ones.

Steven Solomon

Extreme Programming. It is about valuing the people on your team and working effectively together; that is far from extreme.

Ghost

Haha, I guess "extreme" is a commentary on historically common practices. We even pay extra hours here! And no whips! Even water to drink! Extreeme!

Keith

Extreme is a reference to taking a best practice and applying it to the extreme. Extreme code reviews become pair programming. Extreme testing becomes TDD, with comprehensive coverage. Etc.

Kat Marchán

Map.

Literally every single different definition. I can think of three that programmers run into, off the top of my head.

anpos231

I personally don't think Map is a wrong name.
macmillandictionary.com/dictionary...

Kat Marchán

The problem is, "which map?" If I'm talking about Maps, am I talking about:

  1. A "functional"/persistent key/value data structure
  2. A regular key/value data structure
  3. The higher-order function map()
  4. Cartography
  5. "Hidden Classes" (called Maps by Self and the internal V8 code)

The problem is that map is so unspecific in our field, there's a lot of ambiguity and that can lead to confusion.
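Three of those meanings collide in just a few lines of TypeScript (a rough sketch; the "persistent" map is faked by copy-then-update rather than structural sharing):

const ages = new Map<string, number>(); // 2. a regular key/value data structure
ages.set("Ada", 36);

const doubled = [1, 2, 3].map((n) => n * 2); // 3. the higher-order function map()

// 1. a "functional"/persistent map: an update yields a new map and leaves the
// original untouched (real persistent maps share structure instead of copying)
const withGrace = new Map(ages).set("Grace", 45);

console.log(doubled, ages.has("Grace"), withGrace.get("Grace")); // [ 2, 4, 6 ] false 45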

Chris C

Map: Pooh Bear

Cartography: i.kym-cdn.com/entries/icons/origin...

Fernando Maia

There's a server in serverless.

Elliot Derhay

Correct!

Chris Achard

I always felt like "polymorphism" was an unnecessarily complex word for what it really is 🤣
I guess it's not inaccurate - just overly confusing.

Kirill Shestakov

Maybe we should simply call it "likeness"? It might make the discussion around the whole term easier, because you can more easily explain why "likeness" leads to more reusability and less complexity.

It could also make the term more general and maybe even lead to other "likeness" patterns.

Ben Lovy

I feel like this whenever I learn anything about category theory. The words are so much fancier and more complicated sounding than the concepts, but...I guess we need to call 'em something.

lepinekong

=> (fat arrow) means the math-logic predicate "implies" (rapidtables.com/math/symbols/Basic...); it has nothing to do with functions. What did the JavaScript committee drink that day, or was there one guy using CoffeeScript who made that silly choice? ;)

Hugues Chabot

By the Curry–Howard correspondence, logical implication and function types are related. But I doubt that is the rationale for the fat arrow.
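The correspondence itself is easy to see in code: read a function type as an implication, and function application as modus ponens (a TypeScript sketch; the names are invented):

// The proposition "A implies B" corresponds to the function type (a: A) => B.
// Modus ponens (from A ⇒ B and A, conclude B) is just application:
function modusPonens<A, B>(ab: (a: A) => B, a: A): B {
  return ab(a);
}

const length = (s: string) => s.length; // "string implies number"
console.log(modusPonens(length, "fat arrow")); // 9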

Elliot Derhay

Don't look now, but PHP...

Anant Jain

Dynamic Programming.

Backstory:

I (Richard Bellman) spent the Fall quarter (of 1950) at RAND. My first task was to find a name for multistage decision processes. An interesting question is, Where did the name, dynamic programming, come from? The 1950s were not good years for mathematical research. We had a very interesting gentleman in Washington named Wilson. He was Secretary of Defense, and he actually had a pathological fear and hatred of the word research. I’m not using the term lightly; I’m using it precisely. His face would suffuse, he would turn red, and he would get violent if people used the term research in his presence. You can imagine how he felt, then, about the term mathematical. The RAND Corporation was employed by the Air Force, and the Air Force had Wilson as its boss, essentially. Hence, I felt I had to do something to shield Wilson and the Air Force from the fact that I was really doing mathematics inside the RAND Corporation. What title, what name, could I choose? In the first place I was interested in planning, in decision making, in thinking. But planning, is not a good word for various reasons. I decided therefore to use the word “programming”. I wanted to get across the idea that this was dynamic, this was multistage, this was time-varying. I thought, let's kill two birds with one stone. Let's take a word that has an absolutely precise meaning, namely dynamic, in the classical physical sense. It also has a very interesting property as an adjective, and that it's impossible to use the word dynamic in a pejorative sense. Try thinking of some combination that will possibly give it a pejorative meaning. It's impossible. Thus, I thought dynamic programming was a good name. It was something not even a Congressman could object to. So I used it as an umbrella for my activities.

Source: goodreads.com/book/show/5049990-ey...

Tommy May III

this in JavaScript is the confusion of a lot of JavaScript devs, which is evident since it's not uncommon to see that = this or self = this (my personal favorite when I was just starting to learn JavaScript).

Personally, I think a really expressive keyword would have been invocationContext, but that would be ugly to have everywhere, so I would settle for context or ctx

Noah Ginsburg

that = this is actually generally used for referencing a this from another scope.

It creates really annoying code to read, but it's useful. Here's an example off of Stack Overflow:
stackoverflow.com/questions/163611...
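A minimal sketch of the pattern next to the modern arrow-function alternative (TypeScript; the counter object is invented):

const counter = {
  count: 0,
  startOld() {
    const that = this; // capture the outer "this" before entering another scope
    setTimeout(function () {
      that.count++; // a plain function gets its own "this", so we go through "that"
    }, 100);
  },
  startNew() {
    setTimeout(() => {
      this.count++; // an arrow function keeps the enclosing "this"
    }, 100);
  },
};

counter.startOld();
counter.startNew();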

Edwin Buck

For object comparison methods like equals(...), I use if (this.equals(that)) and it seems quite readable.

self.equals(other) doesn't sound as good to my ears

Oleg Yuzvik

I find Inversion of Control (IoC) to be too abstract and, as a result, confusing.

Also, a "Bean" is a confusing concept in the Java world for newcomers. It seems that the name has nothing to do with reality.

Samuel

Nerds like to run with the themes. See: almost anything in Ruby or Python. Also see: the Cucumber ecosystem.

Casey Brooks

The word "bean" in programming needs to die. I'm all for puns, but this one was taken wayyyyy too far.

Matt Eland

I despise the names Dependency Injection and Inversion of Control, not because they're inaccurate, but because they're intimidating to the new developer and thus feel a bit inaccessible. Names like these foster impostor syndrome by being a bit too pretentious for something as simple as moving construction of an object to outside the object via constructor parameter.

And no... I don't have a better name, I just hate the ones we currently use.
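For what it's worth, the whole intimidating-sounding pattern fits in a few lines (a TypeScript sketch; the classes are invented):

// Without injection, the class would build its own dependency with
// "new ConsoleSender()" inside. With injection, construction moves outside
// the object and comes in through a constructor parameter. That's it.
interface Sender {
  send(to: string, body: string): void;
}

class ConsoleSender implements Sender {
  send(to: string, body: string): void {
    console.log(`to ${to}: ${body}`);
  }
}

class Signup {
  constructor(private sender: Sender) {}
  register(email: string): void {
    this.sender.send(email, "Welcome!");
  }
}

new Signup(new ConsoleSender()).register("someone@example.com"); // the caller picks the dependency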

Kirill Shestakov

Auto-wiring might be better? What do you think?

Matt Eland

For IoC yeah, that or just provider container.

For DI, I think you could still say constructor / parameter / property injection, possibly. Maybe instead of DI you call it a "decoupling pattern".

Either way, our current names are here to stay; I just wish they induced less anxiety in those who don't understand them yet.

Kirill Shestakov

Yeah I really like the "decoupling pattern". I'm gonna use it from now on.

Fran González

I agree that inversion of control might not be the best name, mainly because it's too generic and can get confusing because of that. But what's wrong with dependency injection? It's literally telling what it does: you inject those dependencies from the outside instead of creating them yourself, and dependency is pretty self-explanatory.

There's much worse names out there than these, IMO.

Matt Eland

My concerns as stated weren't around the accuracy of the names.

Fran González

I know, maybe I didn't explain myself. I don't see anything pretentious or intimidating about them. All technical fields are full of technical words and if the accuracy of the words is on point then I don't know what the problem with them is.

Heiker

Reducer function.

It's not completely wrong; you can actually use it to "reduce" something, but the name implies some sort of limitation that isn't there.
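For example, nothing stops a reducer from building something bigger than its input (TypeScript; the data is made up):

const words = ["ant", "bee", "cat", "ape"];

// reduce() here grows an index out of a flat list; nothing gets "reduced"
const byFirstLetter = words.reduce<Record<string, string[]>>((acc, w) => {
  (acc[w[0]] ??= []).push(w);
  return acc;
}, {});

console.log(byFirstLetter); // { a: [ 'ant', 'ape' ], b: [ 'bee' ], c: [ 'cat' ] }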

Jake Varness

Swift and Obj-C have protocols whereas literally every other language has interfaces.

Michiel Hendriks

React(ive).

As currently used a lot in web development. It comes from reactive programming which is a form of data flow programming, generally a declarative programming principle.

In reactive programming, when you have an expression like a = b + c, the value of a will change when b and c change, not just when the expression is evaluated.

This important characteristic is commonly not the case in what is currently often called reactive. While a lot of frameworks have support for pushing/pulling and reacting to model changes, this is hardly used to set up a reactive system. It does not go beyond showing new data; it does not react to it.

So the system has reactive properties, but it is not reactive programming. We had been building systems with reactive properties for years before the current reactive trend showed up.
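A hand-rolled toy version of that a = b + c behaviour (TypeScript; not any particular framework's API):

class Cell {
  private listeners: Array<() => void> = [];
  constructor(private v: number) {}
  get value(): number {
    return this.v;
  }
  set value(n: number) {
    this.v = n;
    this.listeners.forEach((f) => f()); // notify dependents on every change
  }
  onChange(f: () => void): void {
    this.listeners.push(f);
  }
}

const b = new Cell(1);
const c = new Cell(2);
let a = b.value + c.value;
const recompute = () => { a = b.value + c.value; };
b.onChange(recompute);
c.onChange(recompute);

b.value = 10; // "a = b + c" stays true without anyone re-evaluating it by hand
console.log(a); // 12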

Kirill Shestakov

That's a very good point. Unidirectional data flow is one way to achieve reactiveness and declarativeness, but it doesn't have to be like that. You can also write React in an imperative and non-reactive way, so why is it even called React?

I personally like a different reactive pattern more, namely, based on observed and computed variables (or independent and dependent variables if you want), similar maybe to MobX and the like.

Michiel Hendriks

Cloud computing.

In networking diagrams, the "outside" network (usually the internet) is commonly shown as a cloud symbol. It is a network of unknown computers that can connect to your network. This symbolism is the basis of the term "cloud computing".

But it makes no sense, as the computers involved in cloud computing are known in your system design. They are named entities to and from which you connect. These computers might be dynamically allocated, and not yours, but this was also the case in large mainframe setups in "the old days".

Samuel

Then use the term "cluster computing". "Cloud computing" was a term made up by non-programmers. Early books on Beowulf clusters use the analogy in the illustrations, but not in the technical aspects.

DannyMeister

Agreed, El Muerte! I'm on a cloud project now, and have to find a way for the name to stop irking me.

still-dreaming-1

Pretty much everything in programming could be named better.

class -> type

I think classes reap the greatest benefit from renaming them. It is not helpful to think of things as being in that "class", in the traditional sense of the word. And somehow this also makes people think of hierarchies of classes, but that is not the primary point of classes. They should be called types. The main advantage of classes is that you are not stuck with the types built into the language; you can make your own.

method -> operator

Another way in which classes actually allow you to make your own types is that they let you define the "operators" that work with that type via methods. So methods should be called operators or operations. In existing languages, a type has operations you can perform on it, and that is basically what methods are. Even the order things appear in is the same: name of variable/value/object, operator/method, other variable/value/object if relevant.

object oriented programming (OOP) -> type oriented programming (TOP)

With that in mind, the form that object oriented programming has taken in modern languages like Kotlin and C#, would be better called type oriented programming, or TOP.

function -> action

Functions should be called actions, and a good keyword might be "act" or "action". I think the analogy to math is less helpful than just calling them actions.

variable -> box

A friend of mine was recently learning how to program. He was confused by variables at first, and then later asked why they are not just called boxes. While "variables" does make some sense, once again I think the analogy to math is not the most helpful thing, and "boxes" would be a better name.

Neil Syiemlieh

I think calling them types makes a lot more sense than classes. But as for variables, I don't think "boxes" would be appropriate since generally, many variables can be aliases to a single object.

still-dreaming-1

At first I didn't agree with what you said about variables, but after thinking about it for a while, I think I do agree. Perhaps in the case of variables, the analogy to variables in an equation is pretty helpful. I'm still somewhat torn, though. Variables in equations typically don't change values, whereas the value stored in a variable can change. Then again, that is mutable state, which is typically an anti-pattern. So maybe you are right.

George Mauer

Inheritance.

Which works nothing like regular inheritance, where the originator is both dead and no longer possesses the thing.

When we use the term for Java-style classes, it is really a conflation of three concepts: DRY by sharing definitions of data shape, a specific form of polymorphism, and namespacing of behaviors to the data shapes they operate on.

None of those maps super well to inheritance, or even to is-a relationships, and that is the source of much confusion

Austin S. Hemmelgarn

Which works nothing like regular inheritance where the originator is both dead and no longer possesses the thing.

I find your definition of 'regular' inheritance interesting. It seems to completely ignore genetics, which is more likely to be where programming borrowed the term from than the legal concept.

George Mauer

I suppose I did, mostly because I'm thinking of Java/C++ as the prototype for inheritance and well...the generics story there is not great. I don't know that I've ever heard the term used specifically about generics though, what is the context you're thinking of?

Austin S. Hemmelgarn

You inherit genetic traits from your parents. The exact term 'inheritance' isn't often used directly, but the verb 'inherit' is pretty common in discussions of classical genetics, and the concept of inheriting traits from some parent entity fits pretty well with the concept of inheritance in most programming languages.

George Mauer

A variable creates a symbol that can bind to values but does not actually itself vary. It typically uses the = syntax while not implying equality. Also, most usages never change the bound values, so nothing varies.

Abstract syntax trees are the result of parsing syntax; they are structures that contain only the building blocks of the language, with all the syntax stripped out.

Lisp sexps have nothing to do with sex, nor Lisp with speech impediments

Computer babaji

The most misunderstood programming concept for beginners is that if they learn programming, they can be like Neo in the Matrix movies. But in reality, you cannot make more than one major product all by yourself. Even the computer is a collective product: the software is made by Apple, Microsoft, etc., and the hardware by Intel, AMD, and other PC manufacturers.

Philip

The No. 1 candidate that comes to my mind is monad. I know it's a mathematical concept, but I sometimes get the feeling that it is used to appear smart rather than to give a precise definition of what is happening. Purely functional programming (hello Haskell) can be a big thing to learn, and I don't see the point in adding unnecessary complexity, as it makes learning more intimidating.

Kai Oswald

Microservices.

Phil Ashby

"slightly smaller than before" services? :)

I find jumping to hype-driven prefixes (Ultimate, Micro, Pico) annoying.

Edwin Buck

Extreme came from the core tenet that if something is fundamentally good for producing software, that idea should be carried to its applicability limit.

The pair programming in Extreme Programming is the concept of code review pushed so far forward that it starts as the code is written.

Automated unit testing over unfinished products was once an extreme too. Some shops still do manual testing after the code is "complete", and would only automate after the manual testing became too expensive.

Chris C

My 2019 pet peeve is the new trend of all these startups and corporations using "AI" and "Machine Learning" as buzzwords to churn profits.

AI has a very specific definition and use case. It's a concept, not a reality. There are no quantum, blockchain, machine learning compooters that perform magic. It takes a lot of hardware to run that level of IBM Watson-like software. Just like it takes a lot of wires to run a sufficient wireless network.

At the end of the day, it's all code, and the nightmare that is YouTube's suggestion algorithm shows that we are very far from some futuristic self-driving Facebook car.

Brian

I've always thought that 'push request' seems more intuitive than 'pull request'

Nested Software

I agree! I think "pull request" is the kind of terminology that comes from thinking about how the underlying technology works first, as opposed to thinking of a use-case from a typical programmer's point of view. Also, welcome to Dev!

Yoandy Rodriguez Martinez

Object Oriented Programming

Manda Putra

But it uses class

Markel F.

Serverless

Jean-Michel 🕵🏻‍♂️ Fayard

Social media. Being social and being on Facebook/Twitter are polar opposites

Rane Wallin

In conversation, 'closure' generally means ending something and putting it behind you; in JavaScript it means almost the opposite. Very confusing to beginners.
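A tiny example of why (TypeScript; makeCounter is invented): the inner function "closes over" its surrounding scope and keeps it alive, the opposite of shutting something down:

function makeCounter(): () => number {
  let count = 0; // this local variable survives after makeCounter returns...
  return () => ++count; // ...because the returned function closed over it
}

const next = makeCounter();
console.log(next(), next(), next()); // 1 2 3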

Joseph Thomas

Not a concept but a command: Git's checkout has never made sense to me.

Ben Dowen

We say "refactoring" we mean "rewriting".

E.g much refactoring work is done to solve a problem, and make improvements. But refactoring wouldn't change result's.

Samuel

Refactor is a math thing, iirc. bobobobo.wordpress.com/2009/07/08/...

Ben Dowen

Yes indeed!

Kareem Ali

Backwards compatibility is really about forward portability.

Elliot Derhay

But are you porting something forward or are you making something new that has to be compatible with something old?

This has been your unsolicited confusing question of the day.

Usman Suleiman

Lambda functions. Reminds me of Math.

Samuel

It is. For some amazing fun, see youtu.be/FITJMJjASUs

Dave Stewart

Vue should be called React because it's truly reactive, and React should be called Vue because it's only concerned with the view

Antonio Radovcic

I'm-not-even-mad-thats-amazing.jpg

Ulisses Cavalcante

Serverless: they run on a server, yet the name makes no sense.

Firoz Ansari

AJAX (Asynchronous JavaScript and XML): XML is not a requirement for implementing AJAX, though.
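The canonical AJAX object happily speaks JSON; a quick sketch (TypeScript, with a made-up endpoint):

const xhr = new XMLHttpRequest(); // the name promises XML...
xhr.open("GET", "/api/users/1"); // hypothetical endpoint
xhr.responseType = "json"; // ...but JSON comes back just fine
xhr.onload = () => console.log((xhr.response as { name: string }).name);
xhr.send();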

Phil Ashby

XMLHttpRequest is going to argue with you on that one...

Madelene Campos

"Full stack developer". Hardly anyone is a "full stack developer". Does one person really specialize in all 7 OSI layers?

Mike Sherov

Deprecation. Oftentimes people mistake "removed" for "deprecated".

Keith

MVC in PHP is not really related to the original.

scottshipp

Dependency Injection.

Simply because people have widely differing opinions on what actually comprises DI.

Dan Silcox

DevOps. It's supposed to be about removing silos and different parts of the business working together - not just about developers having access to NewRelic...

Michael Caveney

I hate that React Hooks were named thusly. So, so much.

Kirill Shestakov

Why? Aren't they hooking up to certain events / actions?

Matt Ellen-Tsivintzeli

Programme.

The word is derived from the ancient Greek for "public notice". Proprietary software is not public 😛

Gajanan

What about Logistic Regression?

Nick Hodges

Generics were always weird to me.

They are really "Parameterized Types" if you ask me.
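"Parameterized type" really is the whole idea; a quick TypeScript sketch (Box and unwrap are invented names):

interface Box<T> { // T is the type parameter
  value: T;
}

function unwrap<T>(box: Box<T>): T {
  return box.value;
}

const n = unwrap({ value: 42 }); // T inferred as number
const s = unwrap({ value: "hello" }); // T inferred as string
console.log(n, s); // 42 hello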

Mike Lockhart

Generics make the types of an object's method arguments specific instead of un-typed

milosophical.me/blog/2008/09/10/ge...

Antonio Radovcic

"Clean" Code. It promotes cleanliness of code for its own sake, and pretends there is a subjective "clean" state of code. Thus I think it could be called somehow different.

Edwin Buck

"Call by name" parameter passing. Thank goodness it isn't used much. It should be called "macro parameter" or "call with macro"

Alexandro Disla

debugging

Kevin K. Johnson

Comes from the days when computers were large enough that you literally had to remove bugs from them.

Dave Stewart

"Senior Developer" because most, aren't.