
Ruanna for TomTom Devs


We Can’t Talk About Privacy Without Developers

Did you attend Codeland? I did, and was quite inspired by Paula de la Hoz's talk ("Freedom of Security") and panel comments on privacy. I thought I'd share this piece my colleague Olivia wrote on why the developer community needs to be the catalyst to change how user data is handled.

You, For Sale

If you are 21 today, your data has been for sale since you were eleven years old.

Let that sink in for a moment. What exactly were you thinking about when you were 11? How your shiny new flip phone came equipped with some of the first semi-efficient mobile web browsers? Or maybe you were one of the lucky few who had the first iPhone. Most likely, not that. Rather, you were probably just excited to own a phone, and more worried about typing long text messages on your T9 keyboard.

Of course, if you are 31 today, your data has been for sale since you were at least 21, if not younger. Legally speaking, that’s less problematic, but again, what were you thinking about with regard to your data at 21? You may have had your first or second smartphone, ushering in a whole era of searching questionable things on your own personal, seemingly self-contained little web search engine.

Neither of these groups, nor likely anyone else with their first few “smart” or “dumb” phones, was substantially aware that the new features adapted from their full-size computer counterparts were in fact using data to optimize the experience and drive further development. Phone features have long aimed to mimic those of computers, right up to our current reality, and this cycle of research, development, releases, and progress requires customer data.

Thus begins the slippery slope at the start of every data privacy argument: companies require data, in some form, to refine their products and create the updates that (a) customers want and (b) drive revenue. At face value, this is a win-win. Turned on its head, freshly hatched free applications have a hidden fee: your data. Or, as our co-founder Corinne Vigreux put it in her interview at CES this year: “If you get something for free, people are happy, but I think there is a growing consciousness that perhaps free has a price and that price may be a little bit too high. Privacy is key.”

Data is the New Currency

The world runs on an inexhaustible ocean of data. Planes, trains, automobiles, spacecraft, and boats all produce and rely on constant streams of it, flowing in and out of highly calibrated systems to navigate the globe and beyond. These are usually benign collections: weather, storms, sky conditions, and other similar readings.

Where the issues begin is when data about people is used:

  • Without consent
  • Without control
  • Without knowledge
  • Without the effort to communicate the risks
  • As a transaction that is merely assumed to be understood

In one study of over 600 websites (selected by cross-referencing company presence on more than one publicly traded platform), 57% of visitors to those sites believed that merely having a privacy policy meant their data was not stored or sold. For what it’s worth, only 1% of users click on those policies at all.

We spoke earlier about the sale of minors’ data. In the US, it has been illegal since 1998 to collect online data from children under the age of 13 without parental consent (per COPPA). That age is claimed to be the demarcation after which young teens can decide for themselves to consent to having their data stored by a website. But websites profit from the sale of data regardless of the conditions around its consent or its subject: whether you knowingly or unknowingly contribute your data has no bearing on its value to the site using it.

Going a layer deeper, how can the increasingly data-aware public expect a younger generation that grew up with this understanding of “personal information for sale” to adapt to a privacy-savvy climate? One could argue that the problem is hindering its own solution, as young people are entering the workforce in great numbers as engineers, architecting the very systems that sell data.

The consequence is that collected data becomes a set of dots, and dots are easy to connect. These “dots” of information start with the simplest of answers, such as the colors someone prefers in a picture, or who or what is in it. Who do you like to date? What do you like to watch? When is your birthday? When do you do your shopping? Do you make a lot of money, or not so much? Questions that are innocuous by themselves, as if asked by a friend, together create a constellation of your identity.

It’s often said that information in the wrong hands is dangerous, but when it’s your information, don’t you find it dangerous in anyone else’s hands? Anyone can put up a sign and label collecting your information as research, but they don’t tell you where else your details end up along the way.

Then again, you may not mind this because it seems inescapable, but these broad data points in the wrong hands can lead to, or live alongside, more specific ones: your phone number, your email address, and from these, your home address and your financial information. In most cases, this small leap isn’t discovered until it’s too late. More than 20 million Americans have more than one Social Security number associated with their name; 100,000 Americans have had no fewer than five different Social Security numbers tied to theirs.

This invites exploring the mechanics of the situation: who is actually holding the steering wheel over how information is really collected online. It turns out that the path from creating privacy practices to implementing them ends in the hands of those very same architects, young and old.

Who Makes These Decisions? Who Implements Them?

All too often, you’ll see a press release or other news statement about a new data scandal and renewed efforts to protect the public’s privacy, given at the behest of review boards and carefully arranged by PR executives to ensure that companies don’t over-promise and under-deliver.

There is a motivation here: companies selling a technical product often rely on positive public opinion, which is in turn driven by the perception of a healthy relationship with the common topics of ethical data usage (GDPR compliance, anonymity, control over settings, and so on).

However, all of these very topical practices are, of course, embedded in the software of each application or in the code of each independent website. At the end of the day, it is developers who take these changes into account and decide what to implement and how to go about it.

It all comes down to the type of product and what kind of data is necessary to build or sell it. If you’re in the business of selling houses, addresses are a reasonable field to require; but if you’re marketing coffee mugs, wouldn’t a customer find it strange to be asked for their address before they even started browsing?

And so goes the matter of optional and required data fields: “optional” fields create a decision branch that only a developer can defend. All too often, codebases require more data than many engineers realize is really necessary, which is why we built our developer resources differently, giving developers more control over which fields they include for their end users.
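To make that concrete, here is a minimal sketch in TypeScript (the schema, field names, and `validate` helper are all hypothetical, not any real TomTom resource) of a signup form where fields are optional by default, so every required field is an explicit, reviewable decision:

```typescript
// Hypothetical signup schema: a field is optional unless a developer
// explicitly marks it required and documents why.
interface FieldSpec {
  required: boolean;
  reason?: string; // a required field should state its justification
}

const signupSchema: Record<string, FieldSpec> = {
  email: { required: true, reason: "login and account recovery" },
  shippingAddress: { required: false }, // needed at checkout, not at signup
  dateOfBirth: { required: false },     // collect only if a feature demands it
};

// Reject submissions that are missing required fields, and also reject any
// data the schema never asked for, so extra collection can't creep in quietly.
function validate(input: Record<string, string>): string[] {
  const errors: string[] = [];
  for (const [name, spec] of Object.entries(signupSchema)) {
    if (spec.required && !input[name]) {
      errors.push(`missing required field: ${name}`);
    }
  }
  for (const name of Object.keys(input)) {
    if (!(name in signupSchema)) {
      errors.push(`unexpected field: ${name}`);
    }
  }
  return errors;
}
```

The detail worth noticing is the default: an engineer has to argue a field into the required set, rather than argue one out of it.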

"What Information Do You Really Need?"

One of the first things a privacy advocate will tell you to do is give a website or service only the information it needs from you. Whenever you fill out an online form, more often than not there are required fields denoted by a *, and empty fields that, well, just don’t really need content.

Say you sign up for a music streaming service. Most will ask for your birthday, a common red flag. I don’t need to give my age to listen to the radio, right? You don’t, but music streaming services have established that they have the right to use your age to optimize advertising for your demographic and to verify your identity on your subscription. It’s up to you to decide whether you think this is fair, or whether you’ll cancel that subscription and head elsewhere for your listening.

Let's Pretend

Now that you know a little about required and optional data, let’s create a scenario.

When someone raises a red flag and a company is asked why X type of data was collected from users, there may well be a purpose, and an engineer could probably tell you the real one. A company’s marketing tactics may be at odds with what its website requires to run its software.

In this case, a fantasy playlist mixer called “Mix” does exactly that, first by innocuously placing advertisements every half hour of playback to amplify revenue. Then it begins to see simultaneous users on the same account, sharing subscriptions. With only ten subscribers this past quarter, “Mix” is now struggling, and these shared accounts are costing it money.

Advertising companies are reporting a bad return on the ads they place with the company. A team member suggests collecting listener age to target the ads and increase ROI, but the company ethos holds strong, and they decide against it: they don’t really need to collect user age for any other reason, so the logic doesn’t stand up to leadership’s scrutiny. This situation resonates because it mimics our own narrative at TomTom: there are data fields you just don’t need to collect, which is why the structure of our developer resources is more open-ended.

For the sake of this example, however, we’ll assume that the most viable way to verify the real account holder is by date of birth. To prevent further account sharing, the product team ships an update requiring each user to confirm the date of birth of the person listed on the Mix subscription, which is then stored for future use (e.g., if the subscriber logs out or resets a password).
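As an aside, the team could run this check without ever retaining the raw birthdate. The sketch below is hypothetical (and pointedly not what our fictional Mix team does, since in this story they store the plaintext): it keeps only a salted hash, which is enough to re-verify a date of birth at login or password reset.

```typescript
import { scryptSync, randomBytes, timingSafeEqual } from "node:crypto";

// What gets persisted instead of the raw date of birth.
interface StoredDobCheck {
  salt: string; // hex-encoded random salt, unique per account
  hash: string; // hex-encoded scrypt digest of the normalized DOB
}

function enrollDob(dob: string): StoredDobCheck {
  const salt = randomBytes(16).toString("hex");
  const hash = scryptSync(dob.trim(), salt, 32).toString("hex");
  return { salt, hash }; // persist this, discard the raw DOB
}

function verifyDob(attempt: string, stored: StoredDobCheck): boolean {
  const hash = scryptSync(attempt.trim(), stored.salt, 32);
  return timingSafeEqual(hash, Buffer.from(stored.hash, "hex"));
}
```

One honest caveat: birthdates have very low entropy (tens of thousands of plausible values), so hashing slows a determined attacker down rather than stopping one. But the company is at least no longer sitting on a plaintext age field it never disclosed.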

All of a sudden, if communication did not occur between this product team and marketing (or whoever decided how privacy practices are executed), a real snafu has been created: user ages are now stored on company servers, without an update to the privacy policy, and without users being informed beforehand.

Theoretically, in a bad case, one or more other teams could even use this age field to their advantage to build other features before an executive team finds out. Or, worse yet, a user exercises their GDPR right to erasure, and one or more teams realize that an age has been stored without their knowledge.
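That last failure mode is what a central registry of personal-data fields guards against. Here is a minimal sketch (hypothetical field names and storage comments, not any real product’s schema) in which every team that stores a personal field must register a deleter for it, so an erasure request can enumerate everything that has to go:

```typescript
// Every personal-data field the company stores, in one place. A field added
// quietly (like the DOB above) that never lands here is invisible to erasure.
type PersonalField = "email" | "displayName" | "dateOfBirth";

const deleters: Record<PersonalField, (userId: string) => Promise<void>> = {
  email: async (_userId) => {
    /* delete from the accounts store */
  },
  displayName: async (_userId) => {
    /* delete from the profiles store */
  },
  dateOfBirth: async (_userId) => {
    /* delete from the subscription-verification store */
  },
};

// Handle a GDPR erasure request by running every registered deleter.
async function eraseUser(userId: string): Promise<void> {
  const fields = Object.keys(deleters) as PersonalField[];
  await Promise.all(fields.map((field) => deleters[field](userId)));
}
```

The useful property is that the compiler does the auditing: adding a new value to `PersonalField` without writing a matching deleter is a type error, turning a silent privacy gap into a loud build failure.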

There are a lot of assumptions and mistakes made along this road, and while they aren’t necessarily replicable in such a dramatically short time, they are easy to fall into, especially for smaller companies and startups that need to provision for the public’s growing awareness of these topics.

Growing Awareness

So, what combats the rampant use of customer data, a problem that seemingly only changing laws can solve? The users themselves, and the developers who create actions and changes within a codebase.

Developers, whether they know it or not, have everything to gain from acknowledging their influential role in the conversation around evolving practices. Without exercising the decision-making authority they hold over a company’s codebase, mistakes like the one illustrated above will simply continue to happen, and efforts to improve privacy education among users will stay long and slow.

Finally, and most obviously, developers drive developer habits; the community has the power to start its own far-reaching trends. Perhaps this is needed more than ever: the younger developers rapidly rising through the workforce start at an ideological disadvantage, having grown up with their data already on the worldwide market. A change in education stands to be better received across the age strata, reshaping users’ understanding before ever-younger consumers accept that data for sale is just a fact of life.

Even more so, developers are the online community’s essential catalyst for change. Without changes taking place within codebases across the web, no one will see the right side of data ethics in online spaces anytime soon. Developers who want a shift toward substantial respect for privacy need to create that change where they can, when they can, and encourage their teams and companies to do the same.

Why Do We Care?

Change in the developer community starts with spaces where developers have the power to do good. We’d like to think of ourselves as architects of that kind of space.

TomTom has upheld strict privacy standards for over 15 years and is fully GDPR compliant, with an industry reputation for transparency and fair information for users. Based in the Netherlands, TomTom proudly aligns with the European movement toward corporate accountability for the rapid, unseen exchanges of personal information that have pervaded public awareness in recent years. As a developer evangelist for a company that cares about its users’ privacy, I think there’s no better time at TomTom to be a privacy advocate, as the worldwide market of software users sparks a change in the narrative of user education.

TomTom also maintains an extensive developer portal, where we make decades of mapping intelligence available for engineers to learn from and to grow their projects and personal portfolios of geospatial experience. Within our documentation, SDKs, and examples, we embed an open-ended approach that lets developers customize their users’ mapping experience from the ground up, providing a sandbox to build with privacy in mind. In reality, our own location-centric features require very few fields, leaving engineers free to build up or strip down the set of required fields for their own necessity, not ours.

Knowledge, information, and communicating the user’s control over their data are embedded in the process of using TomTom from start to finish: customers must provide informed consent before any location data is collected, even though the user is separated from their GPS trace and their location information is de-identified through multiple steps.
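Purely as an illustration (hypothetical endpoint and helper names, not TomTom’s actual pipeline), a consent gate of that kind might look like this: nothing is collected by default, and what is uploaded carries a random per-session ID instead of any account identity.

```typescript
interface GpsFix {
  lat: number;
  lon: number;
  timestamp: number;
}

interface ConsentStore {
  hasConsent(purpose: "trace-collection"): boolean;
}

// Upload a GPS trace only if the user has opted in; otherwise collect nothing.
async function maybeUploadTrace(
  consent: ConsentStore,
  fixes: GpsFix[],
): Promise<void> {
  if (!consent.hasConsent("trace-collection")) return;

  // De-identify: a random per-session ID replaces any account identifier,
  // so traces cannot be joined back to one user across sessions.
  const sessionId = crypto.randomUUID(); // global in modern Node and browsers
  await fetch("https://example.invalid/traces", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ sessionId, fixes }),
  });
}
```

A production pipeline would go further, segmenting traces and de-identifying them in multiple steps as described above, but the gate itself is the point: consent is checked before collection, not after.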

Because this collection is limited in the first place, we are able to publish simplified, readable privacy statements for a general audience, alongside a history of openness about the need for data within our technology, while respecting the user’s right to privacy and access to their data.

We do not need to know who you are; we only want to put data directly back into making our products and services safer and more efficient. We will never make customer data available to third parties for commercial use. We are innovators, not advertisers.

It's Your Turn

Just as everyday users are learning about data exchanges and making choices that protect their privacy, developers can work to bridge communication within their organizations and raise community awareness.

Recent years of high-profile data breaches and privacy legislation worldwide have set the stage for changes to come, both from within consumer-facing companies and outside them.

So, remember: Data privacy is not just leadership’s business or the business of the press, it’s everyone’s business.

What are your thoughts on data privacy? Let us know in our developer forum, or in the comments below.

This article originally appeared on https://developer.tomtom.com. The original author is Olivia Vahsen. Feel free to connect with her here on Dev.to, and/or reach out on Twitter!
