
Discussion on: 10 rules to code like NASA (applied to interpreted languages)

hidden_dude • Edited

These rules don't make sense for all types of software. NASA is working with embedded systems that are limited and don't use GCs. Also, I believe they aren't OO.

An anti-pattern in GC languages is to hold on to memory for a long time. If you aren't allocating and (implicitly) freeing objects rapidly in a modern OO language, you're holding on to many objects for a long time, and that can make the GC work much harder. GCs today are optimized to get rid of short-lived objects, and they perform poorly when too much data gets promoted to the long-lived generation.
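For illustration (a rough Python sketch of my own, nothing from the article): an unbounded cache is the classic way to promote everything to the old generation, while a bounded one keeps most objects short-lived.

```python
from functools import lru_cache

def expensive_computation(key):
    # Stand-in for real work.
    return key * 2

# Unbounded: every result is retained for the life of the process and
# eventually gets promoted to the long-lived generation, which is the
# expensive case for a generational GC.
_results = {}

def lookup_unbounded(key):
    if key not in _results:
        _results[key] = expensive_computation(key)
    return _results[key]

# Bounded: at most 1024 results stay alive; everything else remains
# short-lived, which is the case generational GCs are optimized for.
@lru_cache(maxsize=1024)
def lookup_bounded(key):
    return expensive_computation(key)
```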

Of course, the principle that you should limit your code to O(1) RAM when possible is a good one, because if you don't, you'll eventually get out-of-memory errors. But the impact of that showing up in a web app is not the same as your robot crashing right before it lands on Mars (a physical crash, and the loss of a billion dollars and 20 years of work).
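To make the O(1) RAM idea concrete (again just an illustrative Python sketch):

```python
def total_streaming(path):
    # O(1) RAM: only one line is held in memory at a time.
    total = 0
    with open(path) as fh:
        for line in fh:
            total += int(line)
    return total

def total_all_at_once(path):
    # O(n) RAM: materializes the whole file first; this is the version
    # that dies with an out-of-memory error at production volumes.
    with open(path) as fh:
        return sum(int(line) for line in fh.readlines())
```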

My point is that software differs, and rules need to reflect those differences. NASA, for example, is not known for staying within timelines and budgets. For commercial software, that is probably much more important than avoiding an improbable crash now and then.

Andrew Harpin

These rules are taken from MISRA, which is typically used for safety critical embedded software.

Yes, they aren't always applicable to all software and languages, but they're good things to consider.

Rémy 🤖

All I'm saying, in essence, is that you need to be accountable for the resources your program uses; otherwise you risk blowing things up. More than once I've seen a dev's app blow up when it reached production volumes, and I'm certainly not talking about Google scale.

So the advice is more about allowing O(whatever(n)) complexity in RAM and time, but with a bound on n and thus a bound on whatever(n).

If you work on data, work by pages. If you work on a stream, work on N items at once. If you do big data, do Map/Reduce.
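For example, working by pages can be as simple as this (an illustrative Python sketch; the batch size is whatever bound makes sense for your case):

```python
from itertools import islice

def in_batches(iterable, batch_size=100):
    """Yield fixed-size lists so memory stays bounded by batch_size,
    no matter how large the underlying stream is."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

# Usage: handle an arbitrarily large stream 100 items at a time.
for batch in in_batches(range(1_000_000), batch_size=100):
    pass  # replace with the real per-page work (write, aggregate, send, ...)
```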

Also, NASA was famously on time putting a man on the Moon, so I'm guessing that the methodology they used to push the boundaries of science and humankind won't be a hindrance for more conventional tasks. At least in my case, these rules help me on a daily basis to deliver quality code on time.

But yeah first time I read that I was like "bollocks nobody needs that". I guess denial is a first step :)

Mike Schinkel

NASA also had a huge budget relative to the time period when they were putting a man on the Moon, something that most conventional tasks do not have the luxury of.

Rémy 🤖

Funnily enough there was no budget allocated to software on the Apollo program.

But you need the budget to formally prove everything, not to apply these things in best-effort mode. In my case, applying those rules saves time, not the opposite.

Mike Schinkel

"there was no budget allocated to software on the Apollo program."

In those days software was an afterthought for the bean counters. Back then it was just rolled into "Development & Operations," and with $28.7 billion in inflation-adjusted dollars for that line item, let's just say they had enough money to get it right, which is rare for software projects today.

"applying those rules saves time and not the opposite"

Let me first say that I wrote a long comment that was subsequently eaten by my browser's interaction with this website and its failure to keep a cookie-based copy of comments in progress. That comment started out by saying that your article was great and had a lot of really good advice. Unfortunately I was too weary to type it all in again, so sadly it was lost to the ether.

But your article also had a few points of opinion, the nature of which makes it impossible to prove that applying them saves time. To assert otherwise would be hubristic and would illustrate nothing more than confirmation bias. #fwiw