
Damien Cosset


Our algorithm overlords

An Apple/Goldman Sachs tale

Last Thursday, David Heinemeier Hansson, aka @dhh on Twitter, ranted about an Apple product. Apple partnered with Goldman Sachs to launch a credit card. So far, nothing out of the ordinary; a lot of companies offer this sort of service. However, DHH was surprised to find out that his wife was given a credit limit nowhere near the one he was authorized. You can find the whole thread there if you want.

After numerous tweets calling out the behavior, we discovered it was all because of the algorithm: the program Apple and Goldman Sachs used to determine how much credit you were allowed. A lot of testimonies reinforced DHH's first impression: women weren't given the same credit limits as men, for no reason other than their gender.

Unfortunately, this is something we shouldn't be surprised about.

Algorithms and freedom

Nowadays, we live in a society where data rules everything. We log in to social media platforms and websites. We think those services are free, but they are not. Whenever we use them, we agree to give away our data. The huge amount of data gathered by the big tech companies helps them refine their business model. Ads. Ads everywhere, all the time.

With more data, those big tech companies (Twitter, Facebook, Google...) can run better ads for you. They can analyse your behavior and make sure they create the right ad for you...

These datasets are then fed to programs created for a particular purpose. That purpose can be serving ads, or it can be deciding how much credit to give someone.

In DHH's quest to find out what was happening, one answer kept coming up: it's the algorithm... We have no control over it...

This is frightening... Apple and Goldman Sachs both claimed they have no control over the algorithm, that they don't understand why results like these come up... Is that the world we want to live in? A world where humans give up their freedom of choice to machines and programs whose inner workings we no longer even understand?

Of course, some people know how they work... But my guess is there are so few of them that it's almost as if nobody does...

These programs are designed to answer a question or solve a problem. But phrasing that question or problem is the most crucial part of the entire process. A biased question means discrimination at the end. And, unfortunately, we know that the teams creating those programs are not the most diverse... (cough rich white dudes cough). When the dust clears, it's always the same people who get hurt the most... the minorities...
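To make that concrete, here is a minimal sketch in Python. It is purely hypothetical: the numbers, the data, and the feature names are all invented, and it has nothing to do with Apple or Goldman Sachs' actual system. It just shows how a model trained to imitate biased historical decisions reproduces the bias, even though nobody wrote "discriminate" anywhere in the code.

```python
import random

random.seed(42)

# Invented historical data: everyone draws from the same income range,
# but past reviewers granted women a fraction of the limit men got.
# The bias lives entirely in the training labels, not in any rule below.
history = []
for _ in range(1000):
    income = random.uniform(40_000, 120_000)
    gender = random.choice(["man", "woman"])
    limit = income * 0.30
    if gender == "woman":
        limit *= 0.05  # the historical bias baked into the data
    history.append((income, gender, limit))

# "Training": learn the average limit-to-income ratio per group.
# Any model fitted to these labels, however sophisticated, inherits the gap.
ratios = {}
for income, gender, limit in history:
    ratios.setdefault(gender, []).append(limit / income)
model = {group: sum(r) / len(r) for group, r in ratios.items()}

def predict_limit(income, gender):
    """Predict a credit limit the way the historical decisions did."""
    return income * model[gender]

# Two applicants with identical income get wildly different limits.
print(round(predict_limit(90_000, "man")))    # ~27,000
print(round(predict_limit(90_000, "woman")))  # ~1,350
```

Nothing in that code says to treat women differently; it simply learns to repeat the decisions it was shown. That is exactly how a "neutral" algorithm ends up discriminating.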

Our world is changing, and dataism is coming at us hard. I'm not sure it's for the best anymore...

There is hope though

At the time of this writing, the state of New York has launched an investigation into the matter. And it all started because someone on Twitter felt something was wrong. Proof, if we needed it, that we are not helpless in this fight. We have the right to know how companies use our data to build their algorithms, algorithms that affect so many aspects of our daily lives. If we let them run free, we might lose our very own freedom in the end...

What do you think?

Top comments (6)

Tim Downey

"And it all started because someone on Twitter felt something was wrong. Proof, if we needed it, that we are not helpless in this fight."

Unfortunately, that someone had to already have been an influencer with 350k+ followers. πŸ˜ͺ Not sure how effective it would have been for us normal folks.

That's why I feel it's important that we fight to keep the Consumer Financial Protection Bureau (and its equivalents in non-US countries) and other regulators around:
consumerfinance.gov/

I think this particular issue is less about Big Tech and more about Big Finance, though, and that it's existed for a long time. Hopefully this helps raise awareness and moves us closer to fixing this broken system.

cnbc.com/2019/02/27/american-consu...

Ben Halpern

I’ll just leave this here too

Damien Cosset

Terrifying... How much control have we already lost?...

marcellothearcane

There has to be some human input - for one thing, the code wouldn't exist without a human programming it.

It is probably a machine learning system that was trained on biased data, or is too complicated to change, or else the company genuinely doesn't believe women should have equal credit limits and won't change.

Damien Cosset

There is absolutely human input. As for why this happened, I'll say: I believe there is discrimination towards certain groups of people when those systems are created...

We know what the tech industry is capable of producing in terms of discrimination in 2019, so I'm pretty sure these results are not random.

Manuele J Sarfatti

There is a statement by Jamie Heinemeier Hansson (the one who was denied the credit limit she should've had access to). It's worth reading in full: dhh.dk/2019/about-the-apple-card.html