
Outdated stereotypes mean facial recognition tech has a gender problem for trans and non-binary individuals



From unlocking your smartphone to being trialled on the streets around us, facial recognition software is increasingly becoming part of our lives. 

However, like all AI technologies, it has its problems when it comes to accurate identification. In the past, research has shown that the tech is most accurate when assessing the gender of white men, but can misidentify women of colour up to one-third of the time. 

But what about people who don’t subscribe to a binary gender, or people who are trans? How accurate is the tech then?

Researchers at the University of Colorado Boulder set out to investigate how facial analysis services perform with transgender people and those who identify as gender non-binary. “We knew there were inherent biases in these systems around race and ethnicity and we suspected there would also be problems around gender. We set out to test this in the real world,” said the study’s senior author, Jed Brubaker, an assistant professor of Information Science.

To conduct the research, the team collected 2,450 images of faces from Instagram which had been labelled by the owner with a hashtag indicating their gender identity, including #women, #men, #transwoman, #transman, #agender, #agenderqueer and #nonbinary. The images were then analysed by four of the largest providers of facial analysis services: IBM, Amazon, Microsoft and Clarifai. 
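To see what “gender classification” looks like in practice, the sketch below shows how a single photo might be submitted to one of the services named above (Amazon’s, via its Rekognition API and the boto3 Python SDK). This is a minimal illustration rather than the study’s own code: the file name and AWS region are placeholders, and error handling is omitted. Note that the response only ever offers a binary Male/Female label.

```python
# Minimal sketch: asking Amazon Rekognition for its binary gender estimate
# of a single photo. Illustrative only; not the study's methodology.
import boto3


def estimate_gender(image_path: str) -> tuple[str, float]:
    """Return Rekognition's gender label and confidence for the first face found."""
    client = boto3.client("rekognition", region_name="us-east-1")  # placeholder region

    with open(image_path, "rb") as f:
        image_bytes = f.read()

    # Attributes=["ALL"] requests the full attribute set, which includes
    # the service's Male/Female gender estimate.
    response = client.detect_faces(
        Image={"Bytes": image_bytes},
        Attributes=["ALL"],
    )

    face = response["FaceDetails"][0]   # first detected face
    gender = face["Gender"]             # e.g. {"Value": "Female", "Confidence": 99.1}
    return gender["Value"], gender["Confidence"]


if __name__ == "__main__":
    label, confidence = estimate_gender("portrait.jpg")  # placeholder file name
    print(f"Predicted: {label} ({confidence:.1f}% confidence)")
```

Because the API returns only “Male” or “Female”, any non-binary identity is guaranteed to be misclassified, which is exactly the limitation the study highlights.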

It turns out these systems were most accurate with photos of cisgender women (those who were born female and identify as female), who were correctly identified 98.3 per cent of the time on average, while cisgender men were correctly identified 97.6 per cent of the time. 

However, when it came to people who are trans or non-binary, the systems were wildly off the mark. Trans men were wrongly identified 38 per cent of the time, and those who are non-binary were mischaracterised 100 per cent of the time. 

In particular, the study suggests that the facial recognition services identify gender based on outdated stereotypes. For instance, Morgan Klaus Scheuerman, a male PhD student in the university’s Information Science department who has long hair, was characterised as female. “These systems run the risk of reinforcing stereotypes of what you should look like if you want to be recognised as a man or a woman. And that impacts everyone,” he said. 

The authors are concerned that facial recognition tech’s tendency to misgender certain populations could have grave consequences. Scheuerman points to how this could pose issues for someone getting through airport security if there is a mismatch between what the facial recognition programme sees and the passport they carry. 

The research will be presented in November at the ACM Conference on Computer Supported Cooperative Work in Austin, Texas. The hope is that the results of the study will encourage the companies to move away from gender classification. Instead, the authors suggest using more specific labels such as “long hair” or “makeup” when assessing images. 

“Our vision and cultural understanding of what gender is has evolved. The algorithms driving our technological future have not. That’s deeply problematic,” added Brubaker. 
