Wow! Thank you for objectively covering this issue! I’ve seen too many people discuss this with incorrect examples that were merely interpreted as bias but weren’t inherently so. Sadly, too many people now try to make everything biased or racist when it isn’t.
What you present is the real problem: training on inherently biased input data teaches algorithms what we want, thereby automating prejudice.