Sam Warner

Morality and Ethics - Caring is Everything

Computer ethics is an incredibly broad and subjective topic. Recent exposure to a few amazing educational events and talks has raised my awareness of exactly how vast and subjective it can be. At times during my journey into the mystic shrouds of what makes ethical technology (and how software developers can take responsibility for developing ethically), it felt like far more questions were appearing in my head - often multiplying by the second - than answers.

This in itself is somewhat scary (and kind of demotivating).

While I can't speak for everyone in the field, I would like to think of myself as a problem-solver. I can apply algorithms, established design patterns, or some kind of process to problems that are presented to me, and get a solution. An 'answer'. Something that solves a question.

When I can't do this, it is scary. The fact that I can't just fiddle around until I find an answer when these ethical questions are raised is incredibly worrying - especially when we consider how pervasive these unaddressed issues are.

A sea of questions with no clear answers can feel quite demotivating.

Before we dive any deeper, let's quickly define what I mean by 'morality and ethics' - these words are powerful, but they can sometimes be slightly too big to comprehend:

Ethics – moral principles that govern a person's behaviour or the conducting of an activity.
Morality – principles concerning the distinction between right and wrong or good and bad behaviour.
Moral Judgment – evaluations or opinions formed as to whether some action or inaction, intention, motive, character trait, or a person as a whole is (more or less) Good or Bad as measured against some standard of Good.
Personally, I think the terms 'Good' and 'Bad' are pretty subjective, and could even be to blame for the sea of ethical grey we see across the board in technology. A fantastic talk at QCon London this year by Yanqing Cheng, titled 'Responsibly Smashing Pandora's Box', showed the importance metrics can have when we are making these more human decisions: by evaluating cost against benefit, we can maximise the impact of our technology for good. Although cost-benefit analysis is commonly understood to be somewhat utilitarian (and as such incompatible with other moral viewpoints, for example those focused on deontological concepts such as ethics), and putting a 'value' on the potentially life-changing access or effects of new tools might initially seem impersonal, I can certainly see the use of it.
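
To make that cost-against-benefit idea concrete, here is a minimal sketch of what scoring candidate projects by a benefit-cost ratio might look like. The project names and numbers are entirely hypothetical illustrations of mine, not figures from Cheng's talk:

```python
from dataclasses import dataclass

@dataclass
class Project:
    name: str
    benefit: float  # e.g. estimated people helped per year (hypothetical)
    cost: float     # e.g. estimated cost to build and run, in £k (hypothetical)

def benefit_cost_ratio(project: Project) -> float:
    """Crude impact metric: benefit delivered per unit of cost."""
    return project.benefit / project.cost

# Hypothetical candidate projects - the numbers are illustrative only.
candidates = [
    Project("offline-first health records", benefit=50_000, cost=120),
    Project("premium analytics dashboard", benefit=2_000, cost=80),
    Project("low-bandwidth education portal", benefit=30_000, cost=40),
]

# Prioritise the work that maximises impact for the resources spent.
for p in sorted(candidates, key=benefit_cost_ratio, reverse=True):
    print(f"{p.name}: {benefit_cost_ratio(p):.0f} benefit units per £k")
```

Of course, reducing 'benefit' to a single number is exactly the impersonal step that makes many of us uneasy - the value of the exercise lies in forcing the comparison, not in the precision of the figures.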

What are we worth? Now there's a thinker...

Software engineering is an immature field. This is undeniable. Far too frequently we see irrational behaviour around new technologies - jumping on whichever bandwagon is shiniest - and over-engineered solutions with maddening levels of complexity follow. Far too frequently we see a lack of thoroughly defined specifications (often not through lack of trying). Far too frequently we see people turn a blind eye to the implications of their work.

Yet it still seems as though there is real interest in ethical technology. Most recently, during my time at QCon London 2018, the third day had an incredible 'Tech Ethics in Action' track, which was rated overwhelmingly highly and was home to some of the most inspiring talks of the entire event. Another ethical technology event I was lucky to be a part of, VHacks 2018 (the first ever hackathon at the Vatican), was designed exclusively to solve societal issues through software and hardware, and received international exposure in the form of articles in national newspapers, live news coverage, and even WIRED posts! StackOverflow's recent insights survey shows that developers are becoming more and more concerned about what their day-to-day development should involve, and how they should go about it.

All of this points to the idea that as professionals, we are interested in the ethical implications of what we make. We want to be proud of what we make. We want to make a difference.

We are the innovators and trendsetters.

As I mentioned earlier, these questions are not only far-reaching, but also numerous. While overall we as developers seem to have a reasonable understanding of 'right' and 'wrong', the StackOverflow insights survey mentioned above shows that we see a lot of ethical 'grey' - in real-world contexts we rarely see black and white.

[Four charts: responses to the ethics questions in StackOverflow's insights survey]
StackOverflow's insights survey devoted an entire section to ethics in software development. I believe this is indicative of a shift: from ensuring programming and computer science are taught and widely accessible at school level, both in the UK and elsewhere (there is already a very solid foundation here, thanks to huge developments in recent years such as the distribution of BBC micro:bits in UK classrooms), to maturing this new teaching channel by educating on some of the finer, yet still important, aspects of programming, including the ideas behind morality in this environment.

Of course, before we can educate on such an important subject (especially in classrooms around the world), it would be sensible to review the answers above and form an idea of what the current issues are, and of ways to begin resolving them.

As you can see, nearly 80% of those surveyed believe that developers as a whole have a responsibility to consider the ethical implications of their products. Now, I ask you: when was the last time you actively did this? Hold a little poll at your company, discuss it with your dev team, or just sit down and have a one-to-one about the ethical implications of your latest project.

It's very easy to tick a box in a survey that says 'yeah, of course I care', but do we actually practise what we preach? I'd be inclined to believe that fewer than this 80% of developers do.

Furthermore, I'd be interested in hearing from any of the 20% who believe that, as a profession, we have no obligation to consider the ethical implications of our work. Speaking generally, as a field we are intelligent and able. We should hold ourselves to a higher standard than to allow ourselves to be mercenaries, available to build anything for a price. By passing the 'blame' or 'responsibility' on to someone else, we step further away from controlling and regulating our own products - products we should be able to take pride in.

Another unsurprising but remediable issue that StackOverflow's survey brought to mind concerns two questions: what should developers do if asked to write code for an unethical purpose, and how can we report ethical issues with code? For both of these questions, over a third of respondents replied 'Depends on what it is'. This quite clearly illustrates the current uncertainty (and maybe even fear of speaking up) when it comes to computer ethics in action. As an industry, we need to make sure there are safe ways to air concerns that arise during the course of development. I am sure some infrastructure for this already exists in many companies, but we could (and possibly should) regulate this to allow ethical concerns to be raised without causing undue stress and worry.

While neither of these are concrete solutions to the problems we are currently facing, a little introspection and regulation would (in my opinion) be a great start to real progress.

And so here we have it. My final three takeaways from this whirlwind tour of ethical (and unethical) technology.

  1. We need to use the gift of technology to create accessible (and useful) services for all people, regardless of social status, ethnicity, gender or generation. By reducing the cost, energy consumption and resource requirements of our software, we widen the pool of access to potentially life-changing technology and offer quality-of-life improvements to some of the most disadvantaged people. Most importantly, we must understand that it is not only multinational billion-dollar companies that are capable of doing this - the smallest contributions to ethical technology build up to create local networks of carers, eventually funnelling into a global community which can tackle larger-scale problems.
  2. To achieve this, we must understand that technology requires diversity across nationality, gender and ethnicity. While some of these factors are now being recognised and remedied (check out John's recent blog on raising your daughter to be an engineer), this is not the end of the battle. Better technologies come from diverse teams. By developing (and managing, marketing and directing) in diverse teams, we are far more likely to realise the ethical implications of what we are doing, and more likely to flag any issues before they impact end users.
  3. A people-first approach is key to making technology benefit humanity. In recent years, we have looked to technology as a way to enhance our humanity. If we are to continue this trend, we must focus on designing accessible and widely available ethical technology. If we talk to users, consider the impact of our ideas, and understand who our products are made for and which human problems we are aiming to solve, we will reprogram the values of software engineering. By starting with studies of the ethical impact of the product we're creating, we can continually make projects that make a human difference to the world, and build companies that serve humanity rather than just the interests of shareholders. Most importantly, care. Care about your product. Care about how it is used. Care about the implications of what you are making.

When it comes to developing ethical software, care is everything.

Originally posted on the Black Pepper website.

Comments (6)

PNS11

Software engineering isn't immature.

The commercial-industrial applications of software engineering are.

Software engineering as a discipline is clearly mature compared to e.g. genetics or neuroscience, which got their breakthroughs in the eighties and nineties rather than the fifties and sixties - and which were also practically based on computers and software when they did.

One of the primary reasons for this is that software engineering is grounded in math and formal logic, an old and mature way to reason about systems, nature and thinking. Lambda calculus is almost a century old and Turing died in 1954.

The immaturities mainly come from corporate management and politicians refusing to learn the discipline and its theoretical groundwork.

Sam Warner

It's an interesting point you bring up. I do agree that the immaturities can be exacerbated by an unwillingness to learn and educate ourselves.

However, I do believe that we are (relatively) immature in more ways than just this, certainly in comparison to engineering, for example. Sixty or so years is not a huge period of time compared to many other disciplines, and we see evidence of this from time to time (check out this Medium article for some great examples).

Thanks for bringing up these extra points though - it is always interesting to hear other opinions on the matter!

Frank Carr

Here are some questions to consider:

  1. Would you feel confident in testifying truthfully about your company's software development practices in a civil deposition? In open court during a civil trial?

  2. Would you feel confident in talking to the FBI, SEC or other law enforcement agencies about your company's software development practices? What about testifying in a criminal trial where your bosses are facing charges and you might be accused as well?

Sam Warner • Edited

These are both interesting questions.

I personally think that by the time we get to any kind of civil trial, or law enforcement needs to be brought in, we have already failed as 'ethical' developers.

As we build our products and services, one of the constant questions we should be asking is "what are the implications of this?". I think to some extent we already do this quite well (we certainly consider the implications of design on a product's performance; we just need to think about the human implications too)!

This is why I mention the possibility that "we could (and possibly should) regulate this to allow ethical concerns to be raised without causing undue stress and worry" - ethical bodies might prevent us from needing to escalate these issues higher!

tcolligan-ap

Although cost-benefit analysis is commonly understood to be somewhat utilitarian (and as such incompatible with other moral viewpoints, for example those focused on deontological concepts such as ethics), and putting a 'value' on the potentially life-changing access or effects of new tools might initially seem impersonal, I can certainly see the use of it.

Not really sure what you mean by this here? Ethics isn't really a deontological concept. Utilitarianism is a Moral/Ethical Theory.

There are three major forms of normative ethics: consequentialism (of which Utilitarianism is a part), deontology (of which Kantianism and Divine Law are a part), and Virtue Ethics (of which Aristotelianism is a part).

Personally I have more of an interest in Virtue Ethics which focuses on the inherent character of a person rather than on specific actions, so that would be an interesting way to explore these questions.

Thanks for the post. Ethics doesn't come up often enough in the software field, and doesn't typically become a focus until after something questionable has happened.

Sam Warner

Hi @tcolligan-ap! Thanks for the comment.

Sorry, I was a bit vague there - I was referencing deontological ethics, i.e. the idea that an action's morality can be judged against rules or metrics. I hope that is clearer now!

I completely agree that Virtue Ethics is also an interesting way to look at some of these questions (though obviously far more personal).

You have a very good point that questions around ethics only seem to be asked after an incident. I hope that one day soon we will have mitigated the risk of this happening (and happening repeatedly) by being more mindful about ethics and relentlessly asking ourselves questions!