DEV Community

Blaine Osepchuk

Posted on • Originally published at smallbusinessprogramming.com

With great power comes great responsibility

I recently got into a discussion in the comments section of someone else's blog where I argued that many software developers are overly confident in their abilities. I further argued that this overconfidence leads to the kind of behavior that would be considered professional malpractice if a lawyer, doctor, or professional engineer did something similar in their respective field.

A fellow software developer expressed an opposing view. He made the following points:

  • only a small percentage of software can cause as much damage as medical or legal malpractice, and that kind of software is already highly regulated
  • if we stop people from pursuing their interests, it will stifle innovation, which he was very much against

I hear variations on this view quite often and I think it is worth exploring.

Software as a force for good

Software has enabled our modern world. We can communicate with anyone virtually anywhere in the world for free or very low cost. It puts the world's knowledge at our fingertips. It multiplies our productivity, manages our supply chains, opens up new markets, and keeps track of our money. We use software to discover new science, improve our health, and fight disease. Plus, software reduces the cost of just about everything.

And we, the software developers, are right in the middle of it and it's glorious!

But we do have some problems

Let me paint a picture for you:

  • the industry average is 15-50 errors per KLOC for delivered software (that's one error every 20-67 lines of code on average!) [1]
  • only 29% of projects completed in 2015 were considered successful (on time, on budget, with a satisfactory result); 19% of projects outright failed and the rest were somewhere in between [2]
  • software failures cost hundreds of billions of dollars EACH YEAR [3]
  • 90% of mobile health and finance apps are vulnerable to critical security risks [4]
  • consumer routers. Need I say more? [5]

Do you see what I'm getting at here? As a profession, we're not exactly knocking it out of the park on every at bat.

Software developed for unregulated environments can't cause that much damage. Really?

I don't think that's a defensible position. We (software developers) are creating plenty of harm in unregulated industries through our mistakes and negligent practices.

While our software probably isn't directly responsible for killing people very often, I have no doubt we are indirectly responsible for countless deaths. Plus we enable property damage, theft of personal data and intellectual property, financial ruin, lost productivity, privacy violations, voyeurism, stalking, blackmail, discrimination, loss of reputation, interference in elections, espionage, and all kinds of criminal enterprises.

I can come up with links if you don't believe me, or you can just take a look at the thousands and thousands of entries in the ACM's Risks Digest database. The recent issues alone offer plenty of examples.

I purposely chose examples from unregulated industries to illustrate a point. We don't have to build drones with guns mounted on them or faulty autopilots that fly planes into mountains to hurt people and cause serious damage.

I know we kind of expect software to be terrible but keep in mind that none of these things are supposed to happen if we are doing our jobs correctly!

Evading responsibility

I expect that someone will want to split hairs in the comments about email not being secure, and about it not being the programmers' fault that someone broke into the real estate agent's email account and impersonated him because his password was "password123456". That might be true if you're looking at an individual developer. But we (software developers) know how people are using email, and we know better than anyone that it's not secure, yet we're doing very little about it.

We can also consider another plausible scenario. Perhaps the real estate agent created an account on some harmless website. Perhaps that website didn't store its users' passwords securely. Further imagine that a copy of the website's database ended up in the wrong hands, and the criminals either read the clear-text passwords straight from the database or cracked the unsalted MD5 hashes, recovering the password the real estate agent used for all of his accounts.

Here, again, we know people re-use passwords and we should know better than to store them in clear text or unsalted hashes, even if our website doesn't contain particularly sensitive information. But this could never happen, right?
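To make the storage side concrete, here's a minimal sketch, in Python with only the standard library, of salting and hashing passwords with a slow key-derivation function instead of keeping them in clear text or as unsalted MD5. The function names and parameters are my own illustrative assumptions, not taken from any particular codebase; a real system would typically reach for a vetted library such as bcrypt or Argon2.

```python
import hashlib
import hmac
import secrets

# Illustrative work factor; real systems should tune this for their hardware
# and prefer a dedicated library (bcrypt, scrypt, Argon2).
ITERATIONS = 600_000

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, hash) for storage; the plain-text password is never stored."""
    salt = secrets.token_bytes(16)  # unique, random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, stored)

# Example: what goes in the database is (salt, hash), never the password itself.
salt, stored = hash_password("password123456")
assert verify_password("password123456", salt, stored)
assert not verify_password("password1234567", salt, stored)
```

Because each user gets a unique salt and the hash is deliberately slow, an attacker who steals the database has to attack every password individually instead of looking them up in a precomputed table.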

The software we create is powerful, which means it can also be dangerous. So, you need to be thinking about security and the unintended consequences of any software you produce. Security isn't just for sensitive projects.

Stifling innovation?

The claim here is that I somehow want to stop new people and ideas from entering the field and stifle innovation. I haven't proposed any actual action and I have no power in the industry so I'd say that my power to stifle innovation is pretty minimal.

But let's say I did propose something for the sake of argument. Maybe you can't work on software where you'd have access to personal information if you've been convicted of identity theft or computer crimes. Is that an unreasonable rule where innovation is concerned?

How about this one: if you want to work with money or personal information in a system that's connected to the Internet, you have to pass a simple test. Passing the test would show you have a basic grasp of security principles (TLS, password hashing, SQL injection, cross-site scripting, maybe that it's a bad idea to post your private encryption keys on your blog, etc.). And let's throw in some ethics while we're at it. Unreasonable?
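For a flavor of what "a basic grasp" might mean in practice, here's a small, hedged sketch of SQL injection versus a parameterized query, using Python's built-in sqlite3 module. The table, column, and input values are made up purely for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, email TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'alice@example.com')")

user_input = "alice' OR '1'='1"  # hostile input a basic test should cover

# Vulnerable: concatenating user input into SQL lets the attacker rewrite the query.
unsafe = conn.execute(
    "SELECT email FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: a parameterized query treats the input as data, never as SQL.
safe = conn.execute(
    "SELECT email FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # [('alice@example.com',)]: the injected clause matched every row
print(safe)    # []: the hostile string matched no actual user name
```

The parameterized version never lets user input become part of the SQL text, which is exactly the kind of reflex a minimal competency test could check for.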

I can't think of any reason why a person who is capable of creating innovation in any area of software development would have any trouble demonstrating his or her basic competency as a programmer. Nor do I believe a reasonable testing process would stifle innovation.

Unleashing innovation

What if keeping certain people out of software development increases innovation? We know there are huge differences between the productivity of the best and the worst developers: 5 to 28 times by some estimates [6].

So what if we had some licensing mechanism to keep the worst of the worst from becoming professional software developers? Instead of having them bounce from job to job causing chaos wherever they go, maybe we could create our own version of the bar exam or something, but set the bar really low. The mechanism and details aren't important right now; just imagine if you never had to deal with another net-negative-producing programmer for the rest of your career. How much more could you accomplish?

Wrapping up

Pretty much every jurisdiction in the world requires people to demonstrate basic competency to drive a car. While cars are incredibly useful machines, they can also be incredibly dangerous. Ensuring the basic competency of all drivers is in the public interest. Likewise, I'd argue that ensuring the basic competency of computer programmers is also in the public interest for similar reasons.

Whether you agree with my view or not, licensing software developers is not going to happen any time soon. So, go build the next amazing thing! Please just keep in mind that any software worth building can also cause harm. So, I hope you'll skim the Risks Digest and maybe think about the specific risks of your software and how you can minimize them.

With great power comes great responsibility.

What do you think? Agree or disagree? I’d love to hear your thoughts.

Additional resources

Here are some links you might enjoy if you want to dig a little deeper:


  1. Code Complete (Steve McConnell), page 610. If you have a 100,000-line application, it will contain 1,500-5,000 errors! 

  2. Standish Group 2015 Chaos Report 

  3. The estimates are all over the map but they are all HUGE. See Software failures cost $1.1 trillion in 2016 and Worldwide cost of IT failure: $3 trillion 

  4. 90% of mobile health and finance apps are vulnerable to critical security risks 

  5. The quality of consumer routers is brutal in just about every way imaginable. And people can do bad things to you if they break into your router, really bad things. 

  6. Facts and Fallacies of Software Engineering (Robert Glass) Fact 2. But I'd actually argue that the spread is infinite. I've met programmers who couldn't solve a toy question in a screening interview. So their productivity is going to be zero (or negative), which makes the spread infinite. 

Top comments (8)

Blaine Osepchuk

Sure, we face tons of pressure but I don't think we should just let ourselves off the hook too easily.

Bridges and new buildings often go over budget. Engineers and project managers must face enormous pressure to keep costs under control. Yet, it's very rare that a bridge or building falls down.

There was a bridge upgrade project in my city that went something like 300% over budget for both money and time a few years ago. It was a big-ish deal here. The project manager got on TV and said he knew this wasn't the plan but the condition of the bridge was discovered to be worse than the initial inspections suggested and this was what was required to make the bridge safe.

 
Blaine Osepchuk

Sounds like you work at a tough place.

We push release dates back or trim features if we think the code isn't ready. And I haven't worked overtime in years (because we believe it leads to more mistakes and burnout).

Sue Loh

I agree about the costs, and I don't think your proposal is too onerous. I'm not sure it would make a real difference, to be honest. Like any test, people would cram, then forget everything after they've passed.

If you ask me, this is one of the greatest arguments for getting as much diversity (of all kinds) into software teams as possible. Overconfidence of one person is usually tamed by others. Overconfidence of a team and a system requires groupthink that happens more easily when there's too little diversity of thought and perspective.

Blaine Osepchuk

I agree that my idea probably wouldn't do much. My argument was structured to respond to the claim that any change would stifle innovation.

I don't know how we'll get there but we'll probably end up with some kind of governing and licensing body as software becomes more important, takes on more responsibilities of consequence, and as the profession matures. If you wanted to look into a crystal ball, I'd guess that we'll end up with something similar to what engineers have and it will probably only apply to developers doing higher risk stuff at first.

I had never thought of the diversity argument. Thank you. I'll have to think about that.

scottshipp

Great post and I agree big time! Like you referenced near the end, the cost of negligent software to society already runs into the billions and trillions of dollars. It's hard for me to fathom how we work in this industry every day and are just happy with business as usual when those kinds of damages are happening at our hands. I think it'd be a great starting point if all software developers were familiar with the "Software Engineering Code of Ethics and Professional Practice" and had to demonstrate meeting some continuing education credit requirements around ethics.

Blaine Osepchuk • Edited

Thanks, Scott.

I think we're getting away with it because, despite how bad we are at it, software just provides so much value.

That equation changes when people start to die or experience significant harm. So you see regulation on medical devices, avionics, weapon systems, cars (more and more), etc.

I think the list of regulated software areas is going to increase as we go forward.

But, yeah, there's a killing to be made if you can figure out how to reliably deliver working software with low defect rates and reasonable costs. Praxis tried but it's not clear to me how successful that venture was.

Justin Lam

This reminds me of a talk by Robert Martin called the Scribe’s Oath. Watch it here: youtu.be/Tng6Fox8EfI

He claims that we should start regulating ourselves as engineers before a major disaster happens and the governments start regulating our profession for us.

Blaine Osepchuk

Yes, that talk is a variation on his 'professionalism' theme. And I agree with him.