There is; another example is Google Translate. If you take a phrase in English like "she is a doctor", translate it into a language with no gendered pronouns, and then back into English, you get "he is a doctor".
I tried it and confirmed you are right: there is a group attribution bias there, probably caused by the training data rather than by the developers. But I'm interested to know what your solution is for this specific "problem".
Just trying to understand here: what should the behavior be in the ideal case? (He 50% of the time and she 50% of the time?)
Since when we translate a sentence into a language with no gendered pronouns, there is no way to retain the gender when translating it back to English, right?
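That information loss can be shown with a toy sketch. This is not a real translation API; the "genderless language" below is just English with he/she collapsed into a single made-up pronoun "o" (as in Turkish), and the reverse mapping stands in for a model that has to guess.

```python
def to_genderless(sentence: str) -> str:
    # Both "she" and "he" collapse to the same pronoun "o":
    # the gender information is destroyed at this step.
    return sentence.replace("she", "o").replace("he", "o")

def to_english(sentence: str) -> str:
    # With no gender left in the input, the translator must guess.
    # A biased model always picks "he"; an "ideal" one might flip a coin.
    return sentence.replace("o ", "he ", 1)

# "she is a doctor" and "he is a doctor" become identical mid-trip,
# so the original pronoun is unrecoverable on the way back.
print(to_genderless("she is a doctor") == to_genderless("he is a doctor"))
print(to_english(to_genderless("she is a doctor")))
```

The round trip returns "he is a doctor" regardless of the input pronoun, which is exactly the behavior being discussed: the bias lives in how the guess is made, not in the round trip itself.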
You got it. It's not that developers are mean and create biased AIs; it's that the data used to train them contains bias, and that bias is picked up by the AI.