Discussion on: Unethical programming

Jilles van Gurp

Having personal boundaries and knowing them is definitely a good thing. For everything else there's the law. But there's a huge grey area in between.

My view on AI is that it is a question of when, not if, something bad happens. When that happens, the people who worked on it will be the ones best positioned to steer things for better or for worse. So, Google declining the opportunity to work on government projects just means they have plausible deniability but no real power. Is that ethical or just marketing?

You might argue that a better outcome would have been to take the contract and then execute it on their terms. I imagine they have a pretty good negotiating position from which to set some boundaries and ground rules. As it is, they sidelined themselves. The work will still go ahead, but it will be somebody with fewer ethical constraints doing it. IMHO that increases the chance of a bad outcome.

Same with the Chinese. They are building submarine drones. It's an obvious thing to be building and they are well positioned to be building them and probably not the only ones. Once they have them, they will be formidable weapons that will disadvantage all those who declined to build them for ethical reasons. I'd prefer several other countries to also have access to this. That requires somebody to set their ethics aside and start working on this.

Take the Manhattan Project, for example. Some very smart people worked on that and quite a few were understandably conflicted about their contributions after they saw what happened in Nagasaki and Hiroshima. Yet, it ended, in a matter of weeks, a bloody war that could have dragged on for years, and it discouraged the US and Russia from openly engaging with each other throughout the Cold War. Research would have eventually produced a bomb regardless; that cat was out of the bag as soon as scientists identified the mere possibility. It could have been the Nazis if they had been a bit smarter about not scaring away their scientific elite.

Not doing things is fine, but do it for the right reasons. Yes, AI-powered drones with guns are scary. Now, given that somebody is 100% guaranteed to build them at some point, would you like to have a say in who gets there first, or are you happy watching events unfold, crippled by your own ethics?

rhymes

My view on AI is that it is a question of when, not if, something bad happens. When that happens, the people who worked on it will be the ones best positioned to steer things for better or for worse.

I'm not sure I understand this part completely. When something bad happens with AI (it could be argued that it already has), then something bad will have happened with AI. I'm not trying to argue that people shouldn't program AI; I'm just trying to point out that the grey area, as you call it, is precisely what ethics is for.

You can be unethical yet perfectly within the bounds of the law, but that's not what we're talking about. We're talking about knowingly doing something unethical.

So, Google declining the opportunity to work on government projects just means they have plausible deniability but no real power. Is that ethical or just marketing?

I'm 100% convinced they dropped the contract because of three things: insanely bad press, fear of losing high-profile employees, and marketing. They are too big a company to be driven solely by ethics. But still, I'm more interested in individuals. A company that size is not a single entity; you can find good and bad.

You might argue that a better outcome would have been to take the contract and then execute it on their terms. I imagine they have a pretty good negotiating position from which to set some boundaries and ground rules. As it is, they sidelined themselves. The work will still go ahead, but it will be somebody with fewer ethical constraints doing it. IMHO that increases the chance of a bad outcome.

Well, they were already providing software used for weaponry; how much worse does it have to get before someone stops and thinks about what they are doing? The companies big enough to take on such a contract are the usual ones (the tech giants), so does it really matter which one takes it on? I think we're too far removed, at least I am, to make an articulate argument for favoring, for example, Microsoft over Google as the contractor for something like that. I don't think there's a difference in this instance. My point is about the what and the why, not about the who.

They are building submarine drones. It's an obvious thing to be building

Here you totally lost me: there's no "obvious thing" to be building (in my opinion) when the object being manufactured is a freaking robot that can murder other people while bypassing human supervision entirely. I'm miles away from you on this, sorry.

and they are well positioned to be building them and probably not the only ones. Once they have them, they will be formidable weapons that will disadvantage all those who declined to build them for ethical reasons. I'd prefer several other countries to also have access to this. That requires somebody to set their ethics aside and start working on this.

So I need to buy a rifle because someone else came home with a gun? Again, I feel like I can't possibly share your world view on this one.

Take the Manhattan Project, for example. Some very smart people worked on that and quite a few were understandably conflicted about their contributions after they saw what happened in Nagasaki and Hiroshima. Yet, it ended, in a matter of weeks, a bloody war that could have dragged on for years, and it discouraged the US and Russia from openly engaging with each other throughout the Cold War.

I like how you're justifying the killing of 130,000 people (not sure about the actual death toll...) with cold logic. If before you lost me with your argument about "the lesser evil", now I'm appalled by your reasoning. You're basically saying "since the US was running out of money and bodies to throw at the Second World War, it's a good thing that they used the atomic bomb so they didn't have to fight an unpopular conflict". Noted.

Now, given that somebody is 100% guaranteed to build them at some point, would you like to have a say in who gets there first, or are you happy watching events unfold, crippled by your own ethics?

No, a thousand times no. First: I don't feel crippled by my ethics at all; I feel that ethics are part of what makes us functioning human beings. Second: you're positing a false dichotomy, that you either become part of the killing machine or shut up when someone worse than you takes your place in it. Fortunately, far smarter people than me have found many other options in modern society, protest and education among them.

I'm sorry Jilles but I couldn't disagree with your comment more.

Jilles van Gurp

It's very simple. You can choose to not do things for ethical reasons. It just means that somebody else will and will get there first.

IMHO, with the Manhattan Project, an outcome where the US did not get there first for ethical reasons would have been disastrous. Hitler, Stalin, or the Japanese Emperor: take your pick as to who you'd prefer.

Einstein et al. could have done the ethical thing and decided not to work on that stuff. I'm glad they worked on it, though. That was the ethical thing to do.

rhymes

It's very simple.

It is not. Not at all. Ethics is a very complicated issue that far more knowledgeable people than me have been tackling for quite a long time. It also varies in time (it changes across the decades) and space (different cultures have different standards), so no, it's not very simple.

You can choose to not do things for ethical reasons. It just means that somebody else will and will get there first.

Again, it's a false equivalence. You see the matter in black and white, from the perspective of "getting somewhere", but it's how we get somewhere that matters. If there's really no difference at all, then the US could have saved a shitload of grief by wiping out the rest of the world too, being the only country on the planet and having no enemies forever :D

Einstein et al. could have done the ethical thing and decided not to work on that stuff. I'm glad they worked on it, though. That was the ethical thing to do.

Aside from the fact that some of them regretted what they did, I never said that Oppenheimer and Einstein should have said no. My quarrel is with your practical justification of what the US did (and no, Einstein didn't decide to actually drop the bomb). I'm not trying to play the game of what-if; otherwise we can go back to the old question: what if you could kill baby Hitler?

Jilles, my post grew out of two things:

  • the first is a series of programmers regretting that they took part in something they (and I) deemed unethical
  • the second is to highlight that yeah, some people do care about what they do
 
Jilles van Gurp

Well, I can point that same argument right back at you. You seem to pretend not to understand a very simple argument, which is that most ethical questions usually have arguments both ways, which indeed makes them far from simple. Simply put, there's no such thing as unethical. IMHO, I find most arguments involving ethics/moralism to be entirely unproductive.

Some of the things you are listing are not ethical problems but simply cases of being complicit in something criminal. Simple: don't do that unless you are willing to go to jail. Other things are a lot less clear-cut.

My observation is that whenever people are talking about ethics, what they are really worried about is feeling good about what they do. This includes how things are perceived by others. Killing people with AI is scary. Not something I'd want to be associated with either.

But I do acknowledge that it is technically feasible to build this stuff, and I have to assume that several regimes I'm not comfortable with at all are actively pursuing it. Given that, I think it is ethically responsible to try to keep up and not be defenseless. Washing your hands in innocence might seem ethically responsible, but sometimes there's value in parking the ethics and doing the pragmatic thing. I'm glad the Chinese have some competition when it comes to these things.

So when Google bowed to internal pressure, this had nothing to do with the outcome: building weaponized AI for the defense industry and killing terrorists, for whatever fashionable definition of that term. It had everything to do with how their users and employees felt about being a part of that.

The real ethical problem was not building this stuff but going against the will of these people and doing this behind their backs.

I find that interesting. To be clear, if I were a Google employee, I'd be interested in the fact that "do no evil" now included collaborating on stuff like this as well. There are very few ways to spin that right.

When it comes to programmers and doing the right thing (inherently subjective) or not doing the right thing in exchange for money/job security, this is a real problem.

rhymes

Simply put, there's no such thing as unethical. IMHO, I find most arguments involving ethics/moralism to be entirely unproductive.

I probably don't understand because I don't agree with your reference point. I do believe that unethical things exist, but I know they change in time and space. Fluidity is what makes this topic hard but important at the same time.

The real ethical problem was not building this stuff but going against the will of these people and doing this behind their backs.

I think it's both, and again, the fact that I don't agree with you is why we're having this conversation at all.

When it comes to programmers and doing the right thing (inherently subjective) or not doing the right thing in exchange for money/job security, this is a real problem.

Yes, and that's why we need to do better.