Research has long shown that humans are susceptible to "social identity bias"—favoring their own group, whether a political party, a religion, or an ethnicity, and disparaging "outgroups." A new study by a team of scientists finds that AI systems are prone to the same kind of bias, revealing fundamental group prejudices that reach beyond those tied to gender, race, or religion.