What makes a great Pride parade? Over the past year and a half, organizers around the world have been asking themselves this question, as the pandemic forced hundreds of Pride events globally to either move online or cancel. Many responded with innovative digital programming: digital drag shows, online LGBTQ+ film festivals, and even virtual reality Pride experiences. Others organized unofficial or grassroots alternatives, like the Brooklyn Liberation March, where 15,000 masked protesters took to the streets last June to show solidarity with Black Trans Lives Matter – a crowd galvanized largely through Twitter and Instagram.
From digital Pride events to social media community building, then, LGBTQ+ activism in the modern age has a lot to thank technology for. In fact, as The Washington Post recently declared: “TikTok has become the LGBTQ+ soul of the internet” – explaining how young people especially are finding support via the thousands of creators posting heartfelt or hilarious videos about their queer experiences. However, while technology is instrumental in creating progress for LGBTQ+ communities, it can only do so if it is built with equality in mind from the beginning. It is well documented that algorithms run the risk of replicating human biases, and in some areas of technology this is already happening, with AI excluding or even posing a risk to LGBTQ+ communities. The good news is that there are designers, digital activists, and other experts safeguarding the future of technology to ensure that it is equitable for everyone.
“AI learns from us,” meaning “we’ll need to teach it that humans are all different, instead of staples to be registered,” explained anthropologist Mary L. Gray in a recent interview with Forbes on the subject of AI and LGBTQ+ equality. The problem, she says, is that we tend to base AI on “the norm” or “the typical,” adding that, by design, “AI excludes and pushes to the margins anything that doesn’t have a robust example.” Thinking about how we design it differently is “a chance to reflect on our choices, to ask when we are exclusionary,” she adds. “Think about your time on a dating app and ask what assumptions it’s making about who you might want to see based on what you do on the internet and online shopping. Why does it keep showing me the same white faces, what’s that about? What does it say about my social networks? We can reflect on any and all decision-making system. And we can reject that.”
One big area where LGBTQ+ activists are influencing tech right now is automated gender recognition (AGR) technology and AI systems that claim to predict sexual orientation. The conversation around this tech reached an apex in 2017, when it was reported that ad companies were categorizing users by sexuality based on as few as three Facebook “Likes.” News also broke that researchers at Stanford had devised AI that could supposedly work out a person’s sexuality from their facial structure. The story raised concerns about how this AI could be abused by some of the 69 governments around the world that still criminalize homosexuality. When the research was later replicated, the neural network was found to be inaccurate.
Flawed results aside, the deeper issue with these technologies is that they decide who we are before we get to decide for ourselves, and they negate the fact that gender and sexuality often sit on a spectrum. Automated gender recognition, in particular, reinforces a binary: it places us in the category of “male” or “female” according to a strict set of criteria, leaving no room for variation within genders, for gender-nonconforming people, or for the many nonbinary people who identify with neither category, or with no gender at all.
The organization Access Now reports on how this tech affects some groups more than others: “In their groundbreaking study, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, Joy Buolamwini and Timnit Gebru found that the AGR systems deployed by prominent companies had higher error rates for women than men, and the failure rate was even higher for dark-skinned women. Unfortunately, making facial recognition more accurate would not diminish its discriminatory impact in many other contexts — and it’s highly unlikely any adjustments would make it any less harmful for trans and non-binary people.”
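Audits like the one described above work by comparing error rates per demographic subgroup rather than reporting a single aggregate accuracy, which can mask large disparities. A minimal sketch of that idea in Python, using entirely hypothetical data (not figures from the Gender Shades study):

```python
# Per-subgroup error rates: an aggregate accuracy number can hide
# large disparities between groups. All data below is hypothetical.
from collections import defaultdict

def error_rates_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples.

    Returns a dict mapping each group to its error rate.
    """
    errors = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, pred in records:
        totals[group] += 1
        if truth != pred:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

# Hypothetical audit data: overall accuracy looks decent (80.5%),
# but one subgroup bears almost all of the errors.
records = (
      [("lighter-skinned men", "M", "M")] * 96
    + [("lighter-skinned men", "M", "F")] * 4
    + [("darker-skinned women", "F", "F")] * 65
    + [("darker-skinned women", "F", "M")] * 35
)

rates = error_rates_by_group(records)
# rates["lighter-skinned men"] is 0.04; rates["darker-skinned women"] is 0.35
```

The point of disaggregating this way is that a system can look accurate on average while failing badly for a specific group – which is exactly the pattern the study exposed.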
In other words, the tech needs an update – or even a hard reset. But what does a better approach look like? According to the LGBTQ+ organization All Out, AGR and AI-based “detection” of sexual orientation should be banned altogether – and it is running a campaign to make that happen. Tighter privacy regulation is also vital to ensure that data about sexuality is collected and handled in a way that puts the safety of LGBTQ+ people first. Meanwhile, social media platforms and dating apps can continue to empower their users by letting them categorize themselves – with as many gender and sexuality options as possible.
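In data-model terms, letting users categorize themselves means storing gender as self-described input rather than inferring it or forcing a binary enum. A hypothetical sketch of such a profile schema (field names are illustrative, not any specific platform’s):

```python
# Self-identification over classification: gender is stored as the
# user's own words, never inferred or normalized to "M"/"F".
# All names and fields here are illustrative assumptions.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Profile:
    username: str
    gender: Optional[str] = None    # free text, supplied by the user; may be blank
    pronouns: Optional[str] = None  # e.g. "they/them"; also user-supplied

    def display_gender(self) -> str:
        # Show exactly what the user wrote; declining is a first-class choice.
        return self.gender or "Prefer not to say"

alex = Profile(username="alex", gender="nonbinary", pronouns="they/them")
sam = Profile(username="sam")  # chose not to specify
```

The design choice here is the free-text field: an enum, however long, still forces users into categories someone else defined, while free text keeps the description in the user’s own hands.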
For AI to be free from bias and prejudice, it is critical that the machine learning algorithms driving AI decision-making are trained on diverse data sets – but it is just as crucial that the people doing the programming reflect the true diversity of society. More queer people and people from diverse racial backgrounds working in AI will mean more builders who can draw on their own experiences to create systems that are as inclusive as possible – systems that reflect the myriad ways we experience our identities.
So far, technology has helped make all the colors of the rainbow more visible, but just as the LGBTQ+ rights movement requires people power to push it in the right direction, so does technology. As the computer scientist and activist Deborah Raji puts it, “AI doesn’t work until it works for all of us.”