Biometric bias in the transgender and non-binary community

Facial recognition software is a booming industry. We use it today to prove who we are to our banks, insurers, entertainment providers, healthcare providers, the government and many more organisations.

However, when it comes to people who identify as transgender or non-binary, human-computer interfaces are almost never built with these communities in mind. As a result, they reinforce existing biases.

According to the National Transgender Discrimination Survey, only one fifth (21%) of transgender people who have transitioned have been able to update all of their IDs and records with their new gender, while one third (33%) have not updated any of their IDs or records.

With so many individuals lacking formal documents that properly identify their gender, there is very little they can do to challenge false identifications made by algorithms.

According to the World Economic Forum, creating new technologies carries an inherent moral responsibility, because those technologies “shape how the people using them can realise their potential, identities, relationships, and goals”.

However, if we continue on our current course and at our current speed, software companies will unintentionally reinforce biases against minority groups.

Equal access to services is a right

You can’t ‘pick and choose’ identities: we all need digital access. Today’s expectation is that technology solutions are unbiased, yet racial bias remains a widely discussed issue in this field, and one which must be urgently addressed.

Facial recognition systems are under scrutiny and in some cases have already been shown to make racially biased decisions.

With more Britons than ever before identifying as LGBTQ+, it’s important that the broader ecosystem pays attention to minority communities. We could start by better understanding the challenges they face and working as a community to offer alternative digital solutions.

Biometrics are an essential factor in proving a person’s identity, but facial biometrics may not be the right first layer of verification for the transgender community. An alternative method might be to use eye scanning technology or fingerprints, for example.

Why is biometric equality important?

Inclusion means equal access, but we aren’t there yet. Facial recognition determines a user’s gender by scanning their face and assigning a label of male or female based on patterns in previously analysed data.

Superficial features, such as the amount of makeup on the face or the shape of the jawline and cheekbones, may place a person’s gender into a binary category. As a result, these systems are unable to properly identify non-binary and trans people.

Consequently, biometric facial recognition technologies cannot recognise minority subgroups based on their gender expression. At the same time, the user interfaces that allow people to add their gender information often lack an adequate range of gender options.
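To make the failure mode concrete, here is a minimal, hypothetical sketch of the classification step. The function and label names are illustrative, not any vendor’s real API; the point is simply that a two-class model has no way to output anything other than “male” or “female”:

    import numpy as np

    # The label set is hard-coded as binary: there is no non-binary
    # or "unknown" option for the classifier to choose.
    LABELS = ["male", "female"]

    def classify_gender(logits: np.ndarray) -> str:
        """logits: the model's raw scores for the two classes."""
        # Softmax converts the scores into probabilities that sum to 1,
        # so all of the model's confidence is split across just two bins.
        probs = np.exp(logits) / np.exp(logits).sum()
        return LABELS[int(np.argmax(probs))]

    # Even a near-50/50 output is still forced into a hard binary label:
    print(classify_gender(np.array([0.01, 0.0])))  # -> "male"

However confident or uncertain the underlying model is, an interface like this can only ever return one of two answers.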

Biometrics are not biased; the data is biased

It’s important to note that biometrics themselves are not actually biased, as they do not make decisions based on human values. Bias and inequality in biometric technologies stem from a lack of diverse demographic data, and from bugs and inconsistencies in the algorithms.

For example, if the training data primarily includes information related to just one demographic, the learning models will disproportionately focus on the characteristics of that demographic.
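A small synthetic experiment illustrates this. The sketch below is illustrative only, using made-up numeric “features” rather than real biometric data: it trains a simple classifier on data dominated by one group and then scores each group separately, and the under-represented group’s accuracy typically lands near chance while the majority group’s stays high:

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    def make_group(n_per_class, cx, cy):
        # Two classes per group, with class 1 shifted 2 units along x;
        # (cx, cy) moves the whole group, standing in for demographic
        # differences in how features are distributed.
        X = np.vstack([rng.normal([cx, cy], 1.0, (n_per_class, 2)),
                       rng.normal([cx + 2, cy], 1.0, (n_per_class, 2))])
        y = np.array([0] * n_per_class + [1] * n_per_class)
        return X, y

    X_maj, y_maj = make_group(1000, cx=0, cy=0)  # well-represented group
    X_min, y_min = make_group(20, cx=3, cy=4)    # under-represented group

    model = LogisticRegression().fit(np.vstack([X_maj, X_min]),
                                     np.hstack([y_maj, y_min]))

    # A single overall accuracy figure would hide the gap;
    # scoring each group separately reveals it.
    print("majority-group accuracy:", model.score(X_maj, y_maj))
    print("minority-group accuracy:", model.score(X_min, y_min))

Because the minority group contributes so little to the training loss, the model’s decision boundary is placed almost entirely to suit the majority group.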

The inability to identify people within these groups has real-world consequences. The inaccuracy of these technologies can lead to people being mistreated, from being denied financial products and services to facing issues with the government or police due to misidentification.

People who aren’t represented lose the ability to be acknowledged and fight for their freedoms and rights.

Inclusivity starts with listening

Organisations have a responsibility to recognise bias in their technologies, and work to adapt models to acknowledge the differences that make us who we are. This could involve diversifying the types of biometric technologies that are used to identify users, retraining systems which are misgendering people, changing the way systems classify by gender and, most importantly, listening to customers.
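As a sketch of what the “retraining” remedy can look like in practice, the toy experiment above can be rerun with sample weights that give both groups equal total influence on the training loss. This is one simple rebalancing technique among many, shown here purely as an illustration:

    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.utils.class_weight import compute_sample_weight

    rng = np.random.default_rng(0)

    def make_group(n_per_class, cx, cy):
        # Same synthetic setup as the previous sketch.
        X = np.vstack([rng.normal([cx, cy], 1.0, (n_per_class, 2)),
                       rng.normal([cx + 2, cy], 1.0, (n_per_class, 2))])
        return X, np.array([0] * n_per_class + [1] * n_per_class)

    X_maj, y_maj = make_group(1000, cx=0, cy=0)
    X_min, y_min = make_group(20, cx=3, cy=4)
    X, y = np.vstack([X_maj, X_min]), np.hstack([y_maj, y_min])

    # Weight samples so each group carries equal total weight in training.
    group = np.array(["majority"] * len(y_maj) + ["minority"] * len(y_min))
    weights = compute_sample_weight("balanced", group)

    model = LogisticRegression().fit(X, y, sample_weight=weights)
    print("majority-group accuracy:", model.score(X_maj, y_maj))
    print("minority-group accuracy:", model.score(X_min, y_min))

In this toy setup the minority group’s accuracy rises well above chance, usually at a modest cost to the majority group: a reminder that fixing bias involves deliberate trade-offs, not just more of the same training.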

Transgender and non-binary experiences must be heard and understood by real people who are able to make a change, before the technologies can accurately and inclusively recognise them. It starts with being willing to listen. We must be ready to do whatever it takes to allow equal access for everyone.


About the author

Cindy White is chief marketing officer at identity verification technology firm Mitek.

Cindy was previously vice president of marketing at FICO and spent 12 years working at Microsoft.

Source: https://www.fintechfutures.com/2021/11/biometric-bias-in-the-transgender-and-non-binary-community/
