There has to be some human input - for one thing, the code wouldn't exist without a human programming it.
It is probably a machine learning system that was trained on biased data, or one that is too complicated to change, or else the company genuinely doesn't believe women should have equal credit limits and won't change it.
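To illustrate the biased-training-data scenario, here is a minimal toy sketch (entirely hypothetical data and feature names, not the real system): even when the protected attribute is dropped, a model trained on biased historical decisions can reproduce the bias through a correlated proxy feature.

```python
# Hypothetical sketch: a 'model' trained on biased past credit limits.
# 'retail_share' is an invented proxy feature that happens to correlate
# with gender in this toy data; gender itself is never used at predict time.
import statistics

history = [
    # (gender, retail_share, granted_limit) -- biased human decisions
    ("m", 0.20, 20000), ("m", 0.30, 19000), ("m", 0.25, 21000),
    ("f", 0.70, 2000),  ("f", 0.80, 1800),  ("f", 0.75, 2200),
]

def train(rows):
    """'Train' by averaging past limits per proxy bucket (share >= 0.5)."""
    high = [lim for _, share, lim in rows if share >= 0.5]
    low = [lim for _, share, lim in rows if share < 0.5]
    return {"high": statistics.mean(high), "low": statistics.mean(low)}

def predict(model, retail_share):
    # Gender is never consulted, yet the output mirrors the biased history.
    return model["high"] if retail_share >= 0.5 else model["low"]

model = train(history)
print(predict(model, 0.75))  # applicant resembling the disadvantaged group -> 2000
print(predict(model, 0.25))  # applicant resembling the favored group -> 20000
```

The point of the sketch: removing the gender column doesn't make the model fair, because the bias lives in the labels and leaks through correlated features.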
There is absolutely a human input. As for why this happened: I believe there is discrimination against certain groups of people baked in when these systems are created...
We know what the tech industry is capable of producing in terms of discrimination in 2019, so I'm pretty sure these results are not random.