Discussion on: The Future (of AI) is Female: How Hiring Bias Mitigation in NLP Can Be Great for Women Now and in the Future

Nico S___

There is. Another example is Google Translate: if you take a phrase in English like "she is a doctor", translate it into a language with no gendered pronouns, and then translate it back to English, you get "he is a doctor".
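
For anyone who wants to reproduce the round trip, here is a minimal sketch in Python. It assumes the third-party deep_translator package (my choice of wrapper, not something Google ships) and uses Turkish, whose third-person pronoun "o" is genderless; the exact output depends on the live Google Translate model.

```python
# Minimal round-trip sketch, assuming `pip install deep-translator`.
# deep_translator is a community wrapper around Google Translate;
# results depend on the live model and may change over time.
from deep_translator import GoogleTranslator

original = "she is a doctor"

# English -> Turkish: the Turkish pronoun "o" carries no gender,
# so the information in "she" is lost at this step.
turkish = GoogleTranslator(source="en", target="tr").translate(original)

# Turkish -> English: the model has to guess a pronoun; the bias
# reported above is that it historically guessed "he" for "doctor".
back = GoogleTranslator(source="tr", target="en").translate(turkish)

print(original)  # she is a doctor
print(turkish)   # expected: "o bir doktor"
print(back)      # the reported biased output: "he is a doctor"
```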

Nijeesh Joshy • Edited

Just trying to understand here: what should the behavior be in the ideal case? ("He" 50% of the time and "she" 50% of the time?)

Since when we convert a sentence into a language with no gendered pronouns, there is no way to retain the gender when converting it back to English, right?
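
One behavior people have proposed, and which Google Translate later shipped for some language pairs, is to surface both readings instead of guessing one. Here is a toy sketch of that idea; the word lookup is invented for illustration (a real system would get the candidate pronouns from the translation model itself), and the implicit Turkish copula is folded into "bir" just to keep the output readable.

```python
# Toy sketch of "surface every reading" instead of silently picking one.
# The lexicon below is made up for illustration, not a real MT model.
TOY_LEXICON = {
    "o": ["he", "she"],   # Turkish third-person pronoun: genderless
    "bir": ["is a"],      # copula folded in for readable output
    "doktor": ["doctor"],
}

def toy_translate(sentence: str) -> list[str]:
    """Expand every genderless word into all of its English readings."""
    candidates = [""]
    for word in sentence.lower().split():
        options = TOY_LEXICON.get(word, [word])
        candidates = [f"{c} {o}".strip() for c in candidates for o in options]
    return candidates

print(toy_translate("o bir doktor"))
# ['he is a doctor', 'she is a doctor'] -- both readings shown, none hidden
```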

Pooria A • Edited

I tried it and can confirm you are right: there is a group attribution bias there, probably caused by the training data rather than by the developers. But I'm interested to know, what's your solution for this specific "problem"?

Nico S___

You got it. It's not that developers are mean and create biased AIs; it's that the data used to train them contains bias, and the AI picks that up.