Efforts at automatic gender recognition – using algorithms to guess a person’s gender based on images, video or audio – raise significant social and ethical concerns that are not yet fully explored. Most current research on automatic gender recognition technologies focuses instead on technological details.
Our recent research found that people with diverse gender identities, including those identifying as transgender or gender nonbinary, are particularly concerned that these systems could miscategorize them. People who express their gender differently from stereotypical male and female norms already experience discrimination and harm as a result of being miscategorized or misunderstood. Ideally, technology designers should develop systems to make these problems less common, not more so.
As digital technologies become more powerful and sophisticated, their designers are trying to use them to identify and categorize complex human characteristics, such as sexual orientation, gender and ethnicity. The idea is that with enough training on abundant user data, algorithms can learn to analyze people’s appearance and behavior – and perhaps one day characterize people as well as, or even better than, other humans do.
Gender is a hard topic for people to handle. It’s a complex concept that plays important roles both as a cultural construct and as a core aspect of an individual’s identity. Researchers, scholars and activists are increasingly revealing the diverse, fluid and multifaceted aspects of gender. In the process, they are finding that ignoring this diversity can lead to both harmful experiences and social injustice. For example, according to the 2016 National Transgender Survey, 47 percent of transgender participants said they had experienced some form of discrimination in the workplace because of their gender identity. More than half of transgender people who were harassed, assaulted or expelled because of their gender identity had attempted suicide…(More)”.