DEV Community

Jason Steinhauser

What is today's "goto"?

50 years ago, Edsger Dijkstra published a controversial (at the time) editorial entitled "Go To Statement Considered Harmful". In it, he stated what others had already hinted at - notably Tony Hoare (co-designer of ALGOL W and author of the "billion dollar mistake", the null reference) and John McCarthy (creator of LISP) - that goto statements in high-level languages impeded understanding of the code. Since then, we've seen movement away from goto... but we're still not fully clear of it.

One of my first projects as a software engineering intern was to take legacy FORTRAN and convert it into true OO C++... and make it a drop-in replacement to be called from other legacy FORTRAN while a full conversion of the codebase was performed. This was around 30 years after Dijkstra's editorial, and yet... in 1999, my partner and I found excessive use of gotos in functions that were hundreds of lines long. We printed the code out, drew lines between labels and their respective gotos, and spent the better part of the summer adapting the logic into if...else and return, continue, and break equivalents.
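To give a feel for the untangling involved, here's a tiny, made-up C# sketch (not the original FORTRAN, and far shorter than anything we actually dealt with) of a goto-style loop next to its structured equivalent:

```csharp
class GotoUntangling
{
    // Illustrative only: the goto-style version...
    static int SumPositives(int[] values)
    {
        var i = 0;
        var total = 0;
    Top:
        if (i >= values.Length) goto Done;
        if (values[i] <= 0) goto Skip;
        total += values[i];
    Skip:
        i++;
        goto Top;
    Done:
        return total;
    }

    // ...and the same logic after untangling: continue replaces "goto Skip",
    // and falling out of the loop replaces "goto Done".
    static int SumPositivesStructured(int[] values)
    {
        var total = 0;
        foreach (var v in values)
        {
            if (v <= 0) continue;
            total += v;
        }
        return total;
    }
}
```

Now imagine that, but spread across functions hundreds of lines long.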

Unfortunately, that wasn't my last encounter with that dreaded four-letter word: goto. On one of my former projects, I used my "free time" during long release cycles to try to remove one goto that we had in a relatively new C# project. It was buried deep inside a numerical method. I spent the better part of a day trying to figure out how in the world I could get rid of it, and... there wasn't a good, clean way. I called up the developer who'd written the code (he had moved to another company, but we're still good friends) and asked him, "What in the world were you thinking?" After a long laugh at my misfortune, he said he'd ported that method from C++ at the recommendation of one of our data scientists, and that was the one goto he was unable to remove cleanly. And so it stayed, presumably to this day.

It's actually pretty amazing to me that C#, a language developed around the turn of the millennium, still allows developers to use goto! Java actually has goto as a reserved word, but keeps it marked as unused. JavaScript, with all its faults, doesn't have goto defined as a keyword. Of course, that hasn't stopped someone from adding it in for some reason.

This kind of led me to an interesting question: what do we do in everyday development that we can do, but shouldn't? People have since posited that several things are "considered dangerous," from NULL references, to IPv4-mapped addresses, to even entire chipset families. I've found one that I agree with (to an extent): the overuse of Electron, due to its severe bloat.

What practices and processes do you "consider harmful"? JavaScript everywhere? Measuring the wrong metrics for software quality? Crypto-everything? Let's discuss in the comments!

Oldest comments (22)

Dian Fay

Object-relational mapping, aka the Vietnam of computer science.

Honestly, I don't think even goto is an excommunicable offense. It's a very simple, very powerful tool that's very easy to do terrible things with, and it's almost always better to use other control structures to manage execution flow in more readable terms, but there are exceptions to everything. Notably, breaking out of nested while loops is only easily accomplished with goto. And of course all loops reduce down to jmp statements in assembly, so we're technically all guilty anyway.
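A minimal sketch of that escape hatch (in C#, since it still has goto; hypothetical example):

```csharp
using System;

class NestedBreak
{
    static void Main()
    {
        int[][] rows = { new[] { 3, 1, 4 }, new[] { 1, -5, 9 } };
        var r = 0;
        while (r < rows.Length)
        {
            var c = 0;
            while (c < rows[r].Length)
            {
                if (rows[r][c] < 0)
                {
                    Console.WriteLine($"negative value at [{r}][{c}]");
                    goto Done;              // leaves both loops in one jump
                }
                c++;
            }
            r++;
        }
        Console.WriteLine("all values non-negative");
    Done:
        Console.WriteLine("scan finished");
    }
}
```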

Jason Steinhauser

I'll admit that at first glance, I thought that the linked article was going to be hyperbolic. However, the parallels that it drew were quite interesting and it was a really insightful read. It definitely helped me solidify some thoughts I'd had about ORMs, specifically the issue of schema ownership. Thank you for the link!

To your other point about the utility of goto, I'd argue that multiple nested while loops are more than likely a code smell. I agree, everything does get boiled down to jmp instructions at the bare metal. However, I think that the big distinction with Dijkstra's statement is that we should strive to remove goto in high-level languages. They definitely have a place in low-level languages.
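For example, here's a small, goto-free sketch of the same kind of hypothetical scan: extract the nested loops into their own method and let an early return do the escaping.

```csharp
class NestedBreakRefactored
{
    // Pulling the nested loops into their own method means a plain return
    // exits both loops (and the method) at once - no label needed.
    static (int Row, int Col)? FindNegative(int[][] rows)
    {
        for (var r = 0; r < rows.Length; r++)
            for (var c = 0; c < rows[r].Length; c++)
                if (rows[r][c] < 0)
                    return (r, c);

        return null;   // nothing found
    }
}
```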

Finally, while I can pretty much agree with your stance on ORMs, I'm curious about alternatives to them. Do you have some suggestions? I'm always interested in learning more. Do you think the RDBMS is more the issue with ORMs, and that we should use more of a document store à la MongoDB, or a graph DB such as Neo4j? And do you share similar feelings about GraphQL as you do about ORMs?

Thanks again for the comment and the article!

Dian Fay

We had a thread about O/RMs and other data access patterns fairly recently, so I'll just point you there rather than repeat myself :) But I love working with relational databases; they're the Swiss army knife of data storage solutions, whereas every other form of database I've encountered has been a specialized tool for working with specific architectural patterns: document databases for hierarchical information with few to no interconnections, BigTable derivatives for accumulation of flat records, and so on. They do well in the situations they're designed to operate in, and very poorly anywhere else.

I haven't explored GraphQL much at all. I like SQL and can do basically anything I need to with that, so until someone creates a database that uses GraphQL as a native query language instead (it's probably not too far off), I don't have a reason to use it. But it seems helpful for others, and doesn't have the fundamental impedance mismatch that makes O/RMs a problem.

rhymes

I've played a bit with GraphQL and Django's ORM (meh) today and I'm quite sure everything is going to explode when queries become really complicated :D

Kasey Speakman

Thanks for that article link.

Max Cerrina

I am so excited to read that you have no idea.

Meghan (she/her)

BitCoin is a disaster and should never have become a thing

Jason Steinhauser

There are two good things, in my opinion, that have come from BitCoin:

  • A way for people in economically-starved countries to receive payment for goods and services in a non-fiat currency
  • The widespread concept of an immutable chain of trust

Despite those, I mostly agree with you - the way that BitCoin has been used, as a whole, has had a negative impact on the world economy and the world's power consumption.

Meghan (she/her)

The widespread concept of an immutable chain of trust

Blockchains have existed in many forms for years; see: git. The only reason for the proof of work is to guard against fraud in a decentralized system, but I'll get back to that.

A way for people in economically-starved countries to receive payment for goods and services in a non-fiat currency

This is a failure of established countries to provide access to modern payment services, imo.

This may be an unpopular opinion, but decentralized currency will never take off. Money as we know it was designed to be centralized. It's a feature, not a bug, to disconnect merchants from buyers and eliminate bartering. And the harm it has done to both the energy and hardware industries is absolutely horrific.

Dian Fay

Git and blockchain are very different -- it's the easiest thing in the world to rewrite history in git, an operation blockchains are expressly designed to prevent. I've got more fingers on one hand than I've heard well-considered use cases for a blockchain, but it is its own data structure.

Jason Steinhauser

I can't argue with you on those points, honestly. I hadn't thought about git as a chain of trust, and that's a failure on my part because... that's exactly what it is.

This may be an unpopular opinion, but decentralized currency will never take off. Money as we know it was designed to be centralized. It's a feature, not a bug

As much as Gilfoyle and I would love to think that we could get to a universal/decentralized currency, or introduce bartering back into mainstream trading, there's no plausible path to that happening.

Meghan (she/her)

I always saw Git as a blockchain because it's still using a hash-verified chain of "blocks" to send data, and it also happens to be decentralizable. Bitcoin sends financial transactions, and git sends code. What do you see that makes them fundamentally different?

Meghan (she/her)

I also don't completely disagree with why Bitcoin was invented: to get away from government and to make transactions easier on a global scale. But I feel this is more of an issue with the banks that making a new currency cannot solve. The problem with money is that it really doesn't have any value; it only has the value it does because we all use it and accept it from others. Furthermore, I wholeheartedly agree my bank should not charge me for minimum balances or inter-bank transfers.

Dian Fay

ah, you got me -- the commit hash does depend on history, so the major distinction is in usage (whether you care about modifying history) not in nature.
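For concreteness, here's the hash-chaining idea as a toy C# sketch - not git's actual object format, just the "hash covers content plus parent" property:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

class HashChain
{
    // Each entry's hash covers its content *and* its parent's hash, so
    // rewriting an early entry changes every hash downstream of it.
    static string Hash(string parentHash, string content)
    {
        using (var sha = SHA256.Create())
        {
            var bytes = sha.ComputeHash(Encoding.UTF8.GetBytes(parentHash + "\n" + content));
            return BitConverter.ToString(bytes).Replace("-", "");
        }
    }

    static void Main()
    {
        var first  = Hash("",     "initial commit");
        var second = Hash(first,  "add feature");
        var third  = Hash(second, "fix bug");

        // "Rewrite history" by editing the first entry: every later hash changes.
        var firstEdited  = Hash("",          "initial commit (edited)");
        var secondEdited = Hash(firstEdited, "add feature");

        Console.WriteLine(second == secondEdited);                  // False
        Console.WriteLine(third == Hash(secondEdited, "fix bug"));  // False
    }
}
```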

Thorsten Hirsch

git as a chain of trust [...] that's exactly what it is

I don't agree. Git can only be trusted as long as everybody is playing fair. Git can neither prevent DoS attacks (pushing thousands of useless commits) nor double spending (uhm... code duplication in this analogy???) as long as we're talking about a public repo. The only way to prevent attacks in git is by centralising it (restricting access, forcing a pull request workflow). Referencing earlier data by a pointer to its hash is the only parallel I see between git and blockchains. And that's obviously not what makes blockchains special.

And the harm it has done to both the energy and hardware industries is absolutely horrific.

What's the harm for an industry in selling millions of additional devices at higher prices? The hardware industry LOVES Bitcoin. But I totally agree (and I guess everybody working on blockchains does, too) that the energy consumption needs to be cut down. Migrating from PoW to PoS or other consensus algorithms that don't rely on computing power must have top priority in blockchain research.

edit: For further clarification - it looks to me as if you think blockchains are a new persistence layer, something one should compare to filesystems, databases, and yes, maybe git. But blockchains are the worst persistence layer, no question about it. Nobody in their right mind would swap the persistence layer of an existing app for a blockchain.

What makes blockchains great is the possibility of building apps that were not possible with common persistence layers - decentralised apps, whose functionality is so beneficial that it's okay to base them on the technically worst persistence layer that ever existed, blockchains. (To put it in melodramatic terms.)

Jason Steinhauser

Blockchain tech can also only be trusted if everyone is playing fair - there have been successful 51% attacks.

Miners are also negatively impacting astronomy and other academic research. The cost and wait times for GPU acquisition are definitely good for hardware vendors, but bad for a broad swath of GPU consumers.

Additionally, Dapps are not really a new thing. BitTorrent, Kazaa, etc., have existed for nearly two decades.

Thorsten Hirsch

Well, 51% means more than half of the users. That's different to "everyone". Git really needs everyone (100%) to play fair. And while Bittorrent, Kazaa, etc. shared data, Dapps also share logic (code).

Fabian Holzer

Using field injection instead of constructor injection. Well, on the harmful scale it's not quite at goto's level.

Violating the "single level of abstraction" principle within modules.

Recklessly introducing third-party libraries. Or the other extreme: the infamous "Not invented here" syndrome, which in some sense is a form of procrastination: "Hey boss, before I write the app, I've first got to write my own crypto/auth/orm/di/you name it..."

Jason Steinhauser

Using field injection instead of constructor injection.

SO MUCH THIS. It gets so much more difficult to reason about your program.
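A rough sketch of the contrast I mean, with hypothetical types and an [Inject] attribute standing in for whatever your container actually uses:

```csharp
using System;

// Hypothetical stand-ins so the sketch compiles on its own.
interface IReportRepository { }
[AttributeUsage(AttributeTargets.Field)]
class InjectAttribute : Attribute { }

// Field injection: the dependency is invisible at construction time. The
// object can exist half-wired, and the requirement only shows up by reading
// the fields (or via a container error at runtime).
class FieldInjectedReportService
{
    [Inject] private IReportRepository _repository;
}

// Constructor injection: the dependency is part of the type's contract.
// You simply can't construct the object without supplying it, which also
// makes it trivial to build by hand in a test.
class ConstructorInjectedReportService
{
    private readonly IReportRepository _repository;

    public ConstructorInjectedReportService(IReportRepository repository)
    {
        _repository = repository ?? throw new ArgumentNullException(nameof(repository));
    }
}
```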

Recklessly introducing third-party libraries. Or the other extreme: the infamous "Not invented here" syndrome.

Unfortunately, I've been bitten on both of these in the past. Do you know how many times I've had to write BLAS (basic linear algebra subprograms) libraries because "we think we have a slightly better way to do these"? Thankfully, I didn't have to deal with frivolous 3rd party libraries like LeftPad, but I have used React or some React components when they were definitely not necessary.

Alain Van Hout

It gets so much more difficult to reason about your program.

Could you elaborate on that? I've never seen field versus constructor injection have any effect except on PO(J)O unit testing and the amount of code.

Kasey Speakman

null. The concept of "there is no value" is necessary, but the implementation of null as a pointer to nowhere is very problematic in most languages.

The problem is that nearly anything can be null, and in most languages there is no way to restrict a type declaration to a non-null value. A null guard clause is the quintessential example of a DRY violation.

Also, if you are given a null value, there is no way to determine its declared type.

In conclusion, null is the original showcase of what accidental complexity looks like. For alternatives, see the Null Object Pattern for OOP, and Maybe/Option for FP. C# also has the concept of Nullable for value types, and it is supposedly going to be applicable to reference types in a future release.
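To make the guard-clause point concrete, a small sketch (hypothetical names) of the repeated null check next to the Nullable<T> treatment that value types already get:

```csharp
using System;

class NullGuards
{
    // The guard clause in question: every consumer has to re-check, because
    // (pre-C# 8) the type system can't promise "this string is never null".
    static int WordCount(string text)
    {
        if (text == null) throw new ArgumentNullException(nameof(text)); // ...again and again
        return text.Split(new[] { ' ' }, StringSplitOptions.RemoveEmptyEntries).Length;
    }

    // Value types already get "might be absent" spelled out in the type:
    static string Describe(int? maybeAge) =>
        maybeAge.HasValue ? $"age {maybeAge.Value}" : "age unknown";
}
```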

Jason Steinhauser

I agree that nothingness needs to have some sort of representation (even typed nothingness), but null is definitely not the way to do it. Tony Hoare introduced the concept in ALGOL W in 1965 because it was easy to implement, but he's since said it was his billion dollar mistake. I think that may be an underestimation, honestly. I'm definitely looking forward to C# adopting nullable reference types in the near future, and I definitely lean towards Maybe/Option. Thanks for the comment!