By Finn See

FaceApp recently gained popularity for its supposedly fun gender-swapping filter, which uses artificial intelligence (AI) to ‘swap’ a person’s face into a female or male one. This has caught the attention of religious authorities in Malaysia, who have instigated violence against gender minorities both in the past and in the present. In response, they have forbidden the sharing of gender swap images, claiming the act of “impersonating opposite genders” to be un-Islamic.

Gender swap AIs have impacted trans and non-binary people in a variety of negative ways. In this article, these consequences are explored with the input of trans and non-binary people, most of whom are Malaysians.

Providing some definitions for context:

  • Cisgender refers to a person whose assigned sex at birth aligns with their true gender. This describes most people, e.g. a person assigned female at birth identifying as a woman;
  • Transgender refers to a person whose gender does not align with their assigned sex at birth, e.g. a person assigned male at birth identifying as a woman;
  • Non-binary refers to people whose genders are not strictly and always male or female, e.g. agender (people with no gender) or genderfluid (people whose gender changes over time).

How Gender Swap AIs Hurt Trans and Non-Binary People

Gender swap AIs have been used to degrade trans people and make light of their struggles. This mockery is often subtle, but not always. Minisha, a trans woman based in Kuala Lumpur, spoke about seeing cis men on her Facebook feed appropriating the word nyah when using the AI. “Nyah is a special word used within the trans women community in Malaysia, but suddenly cis men are using it among themselves. This is an example of mockery. You’re taking terms used by the trans community and making it into a joke. That’s not funny.”

Hua, a Malaysian non-binary person, described their frustration with the app’s strict gender categories. “They (these AIs) were not taught to factor in genders beyond the typical ‘male-female’ binary. The app is a horrible reminder of how society wishes to package my being into these rigid boxes, and eventually […] (using the app) became a violent experience of facial mutilation and consequent dysphoria.” By enforcing narrow gender categories (male vs female), gender swap AIs erase and deny non-binary people’s existence, encourage misgendering and make it even harder for them to pass as their true gender.

Trans and non-binary people are often burdened with the consequences of cis people’s actions. This will continue so long as cis people do not confront their privilege and hold each other accountable for their actions. Minisha drew a parallel with a friend who was offended by blackface filters, but saw no issue with using gender swap filters. “He made a live video saying that he didn’t like it (the blackface filter) because he is dark-skinned and Indian. You called that out for the (blackface) app, but you’re okay when it comes to gender swap simply because you are a more privileged cisgender man.”

It is a prime example of how cis privilege and trans oppression parallel other oppressive power dynamics like racism. “It’s not that I’m oversensitive,” Minisha continued. “If you’re comfortable with it, it’s fine. But what about those of us who are not comfortable with it?”

Miles*, a Malaysian trans man, spoke about this. “I think what cis people don’t realise is that our genders are not a costume. […] People like us have to live our whole lives questioning our gender identities, and so much time and energy have been used to navigate around that. When we finally do decide to live our authentic lives, we have to fight very hard for the body that we want.

“So when people just snap a photo and make fun of the outcome, it trivialises the transgender experience. People need to remember while you’re having fun with this app, there are actual trans people fighting every day in their lives to fit into a society that treats them like a joke.”

Against a backdrop of institutional violence against the trans community, gender swap AIs perpetuate the existing climate of intolerance against trans people. When institutions of power publicly denounce experiences that mimic trans and non-binary people’s lived realities, they actively stir transphobia, putting Malaysian trans people at further risk of violence. In Malaysia, at least four murders of trans women were reported between November 2018 and October 2019. It is also a frustrating reminder of how trans and non-binary bodies are used as a site of conflict without their consent.

Do Gender Swap AIs Hurt Gender Dysphoric People?

One may argue that gender swap AIs provide some trans people temporary relief from gender dysphoria, the significant distress caused by the conflict between a person’s assigned sex and their true gender. “As a trans person I have mixed feelings about such technologies,” commented Charlie, a seasoned web developer and a trans person. “They are incredibly useful to binary people who wish to see what they might look like ‘swapped’. They are vital tools for people to assess their own gender identity and to see if it’s something that they are comfortable with. On the other hand, they can cause intense dysphoria in those who cannot access the medicines and surgery necessary to attain the look that the app suggests for them.”

There are also trans people who claimed gender swap AIs could have alleviated their insecurities prior to transitioning. “Perhaps it affirms what they already suspect they were meant to be, but there isn’t enough diversity on this app to be inclusive of such exercises,” Miles said. “Apps like these need to stop focusing on hyper feminine and masculine physical traits if the goal is really about exploring gender.”

While it may be wise to refrain from judging trans and non-binary people who use the app to alleviate dysphoria, this use is also concerning. These AIs were not built to improve the wellbeing of trans people. Considering that messages about gender expression in society are dominated, dictated and validated by people with cis privilege, transgender expressions are often pushed aside and made invisible. These AIs could increase distress, worsen dysphoria and harm overall mental health in the long run.

What Can Be Done Moving Forward?

Gender swap AIs are harmful. Their inception, framework and usage contribute to the oppression of trans and non-binary people. Is it possible to move forward with tech products that promote equity, empathy and health? “What is needed, assuming that these apps could even be decoupled from the problematic data collection and capitalism issues, is a sense of customisation,” Charlie said.

“Imagine having such an app that allowed you to alter things as you would on an Instagram post. You could apply individual filters for eyebrows, beard, skin texture, hairline. You could alter lip size, apply makeup. You could go further and see how you’d age as a particular combination of attributes! Such customisation would allow people to explore their physical and social identities in a way that is not tied to the binary, and which is free, as much as it is possible to be free, of biases and assumptions.”

Hua also believes in the importance of having more non-binary and trans people involved in tech development and leading development teams. “Not enough is done to diversify the way artificial intelligence understands the complexity of human beings. In many cases, such technology […] reflect the biases of the people who create them, i.e. people with privilege and power. I would urge everyone, particularly those in tech dev, to imagine what role AI can play in the realm of social justice, equity and inclusivity.”

In the meantime, it is important that everyone learns to be mindful of how trans and non-binary people may feel when they see their friends supporting, using and sharing images that hurt them across their social media timelines. Self-education, listening and reflection on how the tools we use impact marginalised communities could mean the difference between affirming and reducing the humanity of trans and non-binary people.

*Miles is a pseudonym

About the author

Finn See is a Malaysian agender person who uses they/them pronouns. They would like to thank Nine for acting as the initial editor of this work.