People demand justice for Jennifer Laude and the removal of U.S. troops from the Philippines, 2016. Image: Eugenio Loreto.
Of Trans & Tech
Written by Jean Linis-Dinco
Time Magazine’s Transgender Tipping Point issue, which featured trans star Laverne Cox in a navy blue dress on the cover, was released in May 2014. It was a game changer for a group of people who had endured centuries of prejudice and hatred for defying normative gender conventions. For the first time, trans folks could see themselves portrayed in popular culture with respect, rather than through the usual caricature and comic relief. Eight years after Time’s tipping point, we might have expected increased global awareness to hasten trans progress in the right direction. Instead, the heightened visibility of transgender issues has contributed to a worldwide rise in the securitisation of, surveillance of, and violence against trans bodies. Contemporary technologies have fuelled this trend, and the battle for trans liberation has entered the realm of machine learning and, in particular, gender recognition technology.
One of the fundamental tenets of trans activism is the concept of self-identification, wherein a person’s internal experience is the central authority in determining who they are. [1] Self-identification, however, crumbles in the face of cutting-edge technology that uses physical characteristics to identify the gender of individuals. The two most common feature-extraction methods used in Automated Gender Recognition (AGR) are appearance-based and geometric-based extraction. The former treats the face photo as a one-dimensional vector, taking the whole face region as the input data. The latter, by contrast, focuses on fixed points in the image, such as the nose, eyebrows, eyes, and even the nasolabial fold. Both techniques rely on superficial features such as the shape of the jaw, hairline, and cheekbones to decide whether a person is male or female. AGR is predicated on the incorrect and outmoded notion that a person’s gender can be determined solely by observing their outward appearance.
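To make those two approaches concrete, the sketch below is a minimal, hypothetical illustration in Python with NumPy. The landmark names, ratios, and toy data are assumptions chosen for demonstration and do not reproduce any particular AGR system; the point is only the difference in what each method takes as input: the whole image flattened into a pixel vector versus a handful of distances between fixed facial points.

```python
import numpy as np

def appearance_based_features(face_image: np.ndarray) -> np.ndarray:
    """Appearance-based extraction: flatten the whole face region into a
    single one-dimensional vector of pixel intensities."""
    return face_image.astype(np.float32).ravel()

def geometric_based_features(landmarks: dict) -> np.ndarray:
    """Geometric extraction: derive distances and ratios between fixed
    facial points (eyes, nose, jaw, cheekbones). Landmark names here are
    illustrative assumptions, not a standard."""
    def dist(a, b):
        return float(np.linalg.norm(np.subtract(landmarks[a], landmarks[b])))

    eye_span = dist("left_eye", "right_eye")  # used as a normalising factor
    return np.array([
        dist("nose_tip", "chin") / eye_span,           # face-length ratio
        dist("left_jaw", "right_jaw") / eye_span,      # jaw-width ratio
        dist("left_cheek", "right_cheek") / eye_span,  # cheekbone-width ratio
    ])

# Toy usage: a synthetic 64x64 greyscale "face" and hand-placed landmark points.
rng = np.random.default_rng(0)
face = rng.integers(0, 256, size=(64, 64))
landmarks = {
    "left_eye": (20, 24), "right_eye": (44, 24),
    "nose_tip": (32, 36), "chin": (32, 60),
    "left_jaw": (12, 50), "right_jaw": (52, 50),
    "left_cheek": (16, 34), "right_cheek": (48, 34),
}

print(appearance_based_features(face).shape)  # (4096,): one long pixel vector
print(geometric_based_features(landmarks))    # a handful of shape ratios
```

Either vector is then handed to a classifier trained to output ‘male’ or ‘female’, which is precisely where the binary assumption criticised above gets baked into the system.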
The idea of determining someone’s sexuality and gender from data is already unsettling. For trans and non-binary people, it not only presents a novel challenge with societal and political implications, but also risks exacerbating existing prejudices, potentially with fatal consequences. The tragic case of Jennifer Laude, a transgender Filipina killed in 2014 by a US Marine after he discovered her trans identity, serves as a stark reminder of how life-threatening the disclosure of a transgender person’s identity can be.
The invasive nature of automated gender detection systems, often integrated into social media platforms and other forms of surveillance systems, insidiously undermines the safety and wellbeing of transgender and non-binary individuals. For instance, these systems can be deployed in nations where being trans is criminalised, or even punishable by death, and they can be used to ‘out’ trans persons or to identify them in random images found online. Beyond harassment, transgender persons risk losing their jobs, their means of subsistence, or even their lives if they present themselves in a way that is inconsistent with their legal documentation. This is already occurring in many nations without the aid of machine learning. For example, a newspaper in Malaysia recently published a story on how to detect queer people based on their physical features...
The full version of this article is available in Instruments of Surveillance for sale at the National Communication Museum until March 2025.
References
[1] Zimman, L. 'Trans self-identification and the language of neoliberal selfhood: Agency, power, and the limits of monologic discourse'. International Journal of the Sociology of Language (2019): 147–175.
About the Author
Jean Linis-Dinco
Jean Linis-Dinco is an activist working at the intersection of human rights and technology. Dr Dinco's projects have consistently aimed at bridging the gap between technological advancement and human rights, ensuring that digital tools serve as a force for good. Her advocacy extends beyond academia into active policy reform and public speaking through her leadership at the ASEAN Regional Coalition to #StopDigitalDictatorship, where she addresses global forums on the human rights implications of so-called 'AI' and digital surveillance.