
Sarthak Sharma


If this happens, will you be loyal or ethical?

Well, privacy is the big thing in the industry right now. People have started caring about it and are becoming more aware of how their data is being used. New companies with more transparent data policies are coming forward, but that's not enough. Most big companies we know still store alarmingly high amounts of user data, with or without our consent. Because let's admit it, most people don't read those long T&C documents.
So let’s say the company where you work as a dev is doing something similar, and you are the one who is supposed to write such a module. What would you do?

Top comments (36)

Antonio Radovcic

Something similar to what exactly?

Sarthak Sharma

Well, say your app is a voice-converter app, so it has access to the microphone, and now you have to write a module that records voice in the background, searches the speech for keywords, and then serves ads based on those. (Very hypothetical, but still.) 😬
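
To make the hypothetical concrete, here is a minimal TypeScript sketch of the kind of module being described. It assumes the audio has already been transcribed to text elsewhere, and the keyword list and ad categories are invented for illustration, not taken from any real product.

```ts
// Hypothetical sketch of the module in question. Assumes the speech has
// already been transcribed to text; keywords and ad categories are invented.
const AD_KEYWORDS: Record<string, string> = {
  vacation: "travel-deals",
  sneakers: "sportswear",
  pizza: "food-delivery",
};

function adCategoriesFromTranscript(transcript: string): string[] {
  const matched = new Set<string>();
  for (const word of transcript.toLowerCase().split(/\W+/)) {
    const category = AD_KEYWORDS[word];
    if (category) matched.add(category);
  }
  return Array.from(matched);
}

// A casual conversation quietly becomes an ad profile.
console.log(adCategoriesFromTranscript("We should order pizza before our vacation"));
// -> ["food-delivery", "travel-deals"]
```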

Antonio Radovcic • Edited

Wouldn't say it's hypothetical. I'm certain stuff like this and much worse is already happening.

With the current situation (good job-market) I would certainly not continue on that project. (You know, were I smart enough to work on things like voice-detection in the first place 🙃)

I can't say how more dire circumstances would influence that, but recording voices in the background is already pretty unethical and I'd draw my line much earlier.
(I wouldn't work for ads/gambling/drugs/military.)

I already have mixed feelings about all the e-commerce front-ends I worked on ¯\_(ツ)_/¯

Sarthak Sharma

Good call, man.

Fred Ross

Well, in Europe it would be flat out illegal thanks to GDPR.

If you're on a mobile platform and have any kind of sensible quality assurance, it will get blocked there because it will kill the battery life of the phone, and there's no better way to kill your install base as an app developer than to be sucking up your users' battery.

Pritesh Usadadiya

you are the one who is supposed to write such a module.

You mean write a module that uses/reads such data?

All the stuff about data policy and user consent is something that the legal department, company lawyers, or upper management looks after, and it's well above the average dev's pay grade.

Sarthak Sharma

But then again, sometimes the same dev can be a user of another app that uses the same methods to manipulate him into buying stuff. Karma. 😀

It’s just about the culture we create. I’m more concerned about the transparency of these policies. Companies can be open about it; there’s no need to hide these things.

Mihail Malo

I don't think the fact that the targeted ad is super-effective is the issue. That just makes it a good ad. The issue is the actual data being listened to.

Sarthak Sharma

The good ad is just an illusion, bro. The data used to generate those good ads can be used for anything. Remember what Facebook did.

Pritesh Usadadiya

Yes, and people are still using it. It might be due to lack of awareness, no good alternative, or any number of reasons.

Unless something better comes along with the same standard of user experience, people will keep using it.

@levelsio came up with No More Google

There are many great alternatives to Google listed there. Maybe something similar can be done for other platforms.

Mihail Malo

Sorry, but it's *ad, not add.
And what I meant by "ad" is only the part you see.
Even when it's loaded in the same iframe as the ad, or even when the tracking happens via an external-src image load, I would still address the tracking mechanism separately, as "the tracker".

Do you mean that the ad is "evil" because of how the data was obtained?
Well, in a lot of cases it was obtained pretty legitimately. But sure, we should be more aware of how much data we are publishing. That's why I said:

The issue is the actual data being listened to.
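
To illustrate what "tracking the external-src image load" means in practice, here is a minimal sketch of a hypothetical tracking pixel; the domain and parameter names are made up. The 1x1 image itself is irrelevant, the HTTP request it triggers is the tracker.

```ts
// Minimal sketch of a hypothetical tracking pixel. The 1x1 "image" is
// irrelevant; the HTTP request it triggers is what carries the data.
function firePixel(userId: string, event: string): void {
  const params = new URLSearchParams({
    uid: userId,                // who
    ev: event,                  // what they did
    url: window.location.href,  // where they did it
    t: Date.now().toString(),   // when
  });
  const img = new Image(1, 1);
  img.src = `https://tracker.example.com/p.gif?${params.toString()}`;
}

firePixel("abc-123", "viewed-ad");
```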

 
Sarthak Sharma

Yeah, that’s why I also started nomoogle:

sarthology / nomoogle

🐻 Chromium extension to get rid of Google addiction

Nomoogle

A simple Chrome extension that can help you get rid of Google addiction.

For the Firefox extension, check this repo: nomoogle-firefox

Installation

  1. Clone or download the repo.
  2. Go to the Extensions settings page.
  3. Enable Developer mode.
  4. Click "Load unpacked" and add the folder you downloaded.

Features

  1. Get Google alternatives in one click
  2. Get a popup reminder every time you use a Google product
  3. Comes with two special modes

Strict Mode: Blocks the entire webpage completely and doesn't allow you to move forward.

Redirects: Automatically redirects the page to a popular alternative.

Thanks to

❤️ Special thanks to @levelsio. 🙅‍♀️ Nomoregoogle was an inspiration for this.

License

MIT License
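
Since the README above describes a "Redirects" mode, here is a minimal sketch of how such a content-script redirect could work. This is not the actual nomoogle code, and the alternative sites in the mapping are assumptions for illustration.

```ts
// Minimal sketch of a "Redirects"-style content script.
// The mapping of Google hosts to alternatives is illustrative only.
const ALTERNATIVES: Record<string, string> = {
  "www.google.com": "https://duckduckgo.com",
  "maps.google.com": "https://www.openstreetmap.org",
  "drive.google.com": "https://nextcloud.com",
};

const target = ALTERNATIVES[window.location.hostname];
if (target) {
  // Strict Mode could block the page instead; here we simply redirect.
  window.location.replace(target);
}
```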


Mihail Malo

@priteshusadadiya @sarthology Don't you think that's a bit extreme?
What tangible benefit does an individual get by not using all the Google products all the time?

Pritesh Usadadiya

Of course all Google products have their merits, and they are obviously very good.

I am just saying there might be equally good privacy-focused alternatives that people might not know about.

In the end it really all comes down to individuals and their choices.

Sarthak Sharma

Exactly. The key thing is good market competition.

Leighton Darkins

Generally, when approached by someone to do something that "requires" collecting, holding on to and working with large amounts of data, I'll point them to this article about Datensparsamkeit

"It's an attitude to how we capture and store data, saying that we should only handle data that we really need."

Typically I'll follow that up with a mention of GDPR and how holding and working with all of this data increases our responsibility and risk, as well as the complexity of our system.

Usually, the "requirement" for giant datasets evaporates reasonably swiftly. But if they still want the data-heavy approach, I'll build it, very carefully.
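
As a minimal sketch of what Datensparsamkeit can look like in code, assuming a hypothetical signup payload: only the fields the feature actually needs ever reach storage.

```ts
// Hypothetical payload from a signup form.
interface SignupRequest {
  email: string;
  password: string;
  fullName: string;
  birthDate: string;   // collected "because we might need it later"
  phoneNumber: string; // same
  homeAddress: string; // same
}

// What the feature actually requires.
interface StoredUser {
  email: string;
  passwordHash: string;
}

// Datensparsamkeit in one function: only the data we really need
// ever reaches the database.
function toStoredUser(req: SignupRequest, hash: (s: string) => string): StoredUser {
  return { email: req.email, passwordHash: hash(req.password) };
}
```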

DrBearhands

Ethical.
Anybody "loyal", or as I call it, a brown-nosing egotistical dimwit, should be prosecuted. In fact I think my local law has provisions for employee responsibility. You are at least required to point out security issues to your employer.

Sarthak Sharma

Oh that’s cool, where are you from? 😮

DrBearhands

I'd rather not disclose that, but the GDPR mentions employee responsibility:

[...] inform and advise [...] the employees who carry out processing of their obligations [...]

So employees have obligations.

Sarthak Sharma

That’s good.

Peter Harrison

I think this presumes a developer knows the law around privacy. I have a passing familiarity with the Privacy Act in New Zealand, so I know that collecting data that isn't necessary to do business isn't legitimate, but that is a pretty wide definition.

Generally you will be given a specification and it is up to the company to determine the legal implications in detail.

However, if you are asked to do something you are aware is illegal don't do it. Sometimes it isn't illegal but just very ill advised. For example, when you are given a requirement which is very difficult to do in a secure way but easy to do by 'relaxing' security.

In such situations you need to make it clear to management what the problem is and what the risks are, and have them make the decision. Clients have sometimes asked me to do ill-advised things. I've always been direct in my communications, and usually they reconsider.

Not everything needs to be built like Fort Knox, but certainly weight needs to be given when you are storing critical or confidential data. Sometimes having the client take explicit responsibility for a decision is needed. On one occasion I walked away from a project rather than be implicated in what might follow.

The real question is what happens when the data becomes of interest to law enforcement in a criminal case. That gets interesting from an integrity point of view.

Sarthak Sharma

I think that’s the reason governments also try to be soft when making these laws. After all, it’s the easiest way for them to track people.

Sébastien Vercammen • Edited

Almost all organisations, including governmental bodies, have privacy leaks in some form.

  • An application (how it stores/shows info and who it shows it to)
  • Bad company practices (companies processing personal data without legitimate interest)
  • Because of employees (ultimately, people will always be a weak link).

There are also companies with transparent policies that say all the right words, but with internal processes and applications that don't match their empty promises.

Here's why I feel that most will choose loyalty over whistleblowing (though I would call it self-preservation instead of loyalty):

  • The position of authority of bosses/companies
  • How devastating a sudden job loss can be for anyone with responsibilities (especially families)
  • The fear of your whistleblowing becoming a black mark on your future employability or treatment (e.g. Chelsea Manning, Edward Snowden)

After all, it's still ethical to care for yourself first before trying to take care of others.

Your real questions are in your comment about your "crazy idea" app: Would it be useful? Should you make it? Would we use it?

Whistleblowing is already a thing, but uncommon. Glassdoor is a company review website that allows honesty (though still very different).

More opportunities for whistleblowing = good, and I believe people would use it.

But someone will eventually try to hold you personally responsible, with reasonable success (see Julian Assange & Wikileaks), for the data you process, the information you get & publish, damages caused, anonymity of sources, validating your information, ...

So the answer to your question is: Are you an activist, or are you something else?

Sarthak Sharma

Well, I want it to be an open-source application, built and run by the developer community. I just want to be a maker, helping whoever wants to make some change in their company or society.

Sarthak Sharma

Note:
This discussion is to get a perspective on how much we care about user privacy. I also understand that jobs and money are important.

I was thinking, and I know this may sound absurd to you guys: what if there were an app that employees could use to expose big tech companies that exploit user privacy? They could use the app anonymously. Just a crazy idea 😬😬😬

Ross Henderson

Being in the UK, and still under GDPR (for now), both companies I've worked for during this period have taken GDPR incredibly seriously. Thankfully the punishment is so severe that I imagine a large number of companies are scared of it.

But if they weren't, I'd certainly look for another job and likely anonymously report them.

Sarthak Sharma

That’s bold, man. 😊

Mike Rispoli

Thankfully I'm not in this position, but if I were, I would plead my case to the higher powers and try to make a convincing argument for creating value without the security or privacy risks.

Now, let's say that failed and we're in the sprint where this module has to be written. Single me probably would quit and look for a new job. Married-with-children me does not have that luxury. Even though the job market for developers is great, it's still not easy. Let's be real, even an experienced developer has to do serious refreshers to pass some of these interviews. There are also tons of crappy companies out there to sift through. So I would likely have to continue doing my work until I found a new job, at which point I'd be free to blow the whistle. I wouldn't be able to quit on the spot, though. I think getting a senior-level developer job is not quite as easy as people make it sound in bootcamp advertisements.

Sarthak Sharma

I understand your point too, man. Clearly it's a problem. But what if there were a way to keep your job and still do good for the world, by blowing the whistle anonymously? 😈

Mihail Malo

For me, there is a huge divide between:

  1. Keeping data you were given:
    • storing all user activities
    • keeping content that they "deleted" and can't access themselves
    • tracking users on other domains via embeds
    • just trading data with other services
  2. Obtaining more data unethically:
    • Plain old violating agreements, especially informal ones. I don't care what the T&C says, if the front page says "Your X data never leaves your device" without an asterisk, it better not leave my damn device.
    • Something like what @niorad got as a response, turning on the microphone when not in use. Ditto with camera/location.
    • Searching the user's media and files from other applications.

I think the first kind should not face any prosecution. It's the default. How dare you lose company data that they could analyse? It's simply not your call as the designer of the product.
It's one of the reasons we use patterns such as Event Sourcing - to avoid losing potentially valuable data.
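
For anyone unfamiliar with the pattern mentioned above, a minimal, generic event-sourcing sketch (not tied to any particular library): every change is appended to a log, and the current state is derived from it, so "deleted" data never really disappears.

```ts
// Minimal event-sourcing sketch: the append-only log is the source of truth.
type CartEvent =
  | { kind: "added"; item: string }
  | { kind: "removed"; item: string };

const log: CartEvent[] = [];

function record(event: CartEvent): void {
  log.push(event); // nothing is ever overwritten or deleted
}

function currentCart(): string[] {
  const cart: string[] = [];
  for (const e of log) {
    if (e.kind === "added") {
      cart.push(e.item);
    } else {
      const i = cart.indexOf(e.item);
      if (i !== -1) cart.splice(i, 1);
    }
  }
  return cart;
}

record({ kind: "added", item: "book" });
record({ kind: "added", item: "pen" });
record({ kind: "removed", item: "pen" });
console.log(currentCart()); // ["book"]; the "removed" pen is still in the log
```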

The second kind is definitely fraud and should be dealt with at the technical, societal, and legal levels with extreme prejudice.

Finally, there are provisions in, for example, the GDPR (ew, disgusting) other than consent/privacy, such as the requirement that users must be able to download their data. I don't think this should be a legal requirement, but I do think it is a nice-to-have, and the market will reward a feature like that when it's convenient.

Craig Nicol (he/him)

I've been in similar situations in the past, and my first reaction is to ask why. Often someone will have an idea without considering the security, privacy, or other ethical implications, and on closer examination will adapt or drop their proposal.

If that's not enough, refuse to do it unless there's protection in writing: a clear change to the privacy policy, a very clear opt-in within the app. If you don't get that, don't write it, because those decisions aren't yours to make.

Disclaimer: I live in the EU, so I have legal obligations under the GDPR which make this easier for me. The quickest way to shut down a conversation is to ask "is this GDPR compliant?". Outside the EU, it's worth reminding your employer what Apple did when it found policy violations in apps from Facebook and Google and killed their certs, stopping all their apps from working.

Jen Chan

I suspect this kind of thing already happens... ?

The ethical route would be to provide users with informed choice...? (aka a modal or question about allowing push-notifications or microphone access)

But I guess the design decisions in the case of a crazy scenario would already be made... :(
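
As a minimal sketch of that informed-choice route, assuming a browser app that wants microphone access: ask explicitly, explain why, and only proceed on consent. The function name and wording are hypothetical.

```ts
// Minimal sketch: ask before touching the microphone, and respect "no".
async function enableVoiceFeature(): Promise<void> {
  const consent = window.confirm(
    "This feature needs microphone access to convert your voice. Allow?"
  );
  if (!consent) return; // the feature simply stays off

  // The browser shows its own permission prompt on top of ours.
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  // ...use the stream only for the feature the user asked for,
  // then release it as soon as that's done.
  stream.getTracks().forEach((track) => track.stop());
}
```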

Sarthak Sharma

I know 🙁 but change can be made anytime.

Sarthak Sharma

😂😂

Scot McSweeney-Roberts

Is there anything I'm legally liable for?

Mihail Malo

What unethical thing could possibly happen at a bank?! O.o