Implications of Artificial Intelligence Algorithms on LGBTQ Users' Visibility, Representation, and Safety
Social media platforms have become an integral part of modern life, enabling billions of people to connect and share their experiences online.
Many marginalized communities face unique challenges when it comes to representing themselves and finding support through these platforms. One such group is the LGBTQ community, which has historically been subjected to discrimination and violence on the basis of sexual orientation and gender identity. This article explores how artificial intelligence algorithms can affect the visibility, representation, and safety of LGBTQ users on social media.
When it comes to visibility, some argue that algorithms are biased towards heterosexual and cisgender individuals because they favor popular content that aligns with dominant cultural norms.
For example, Facebook's News Feed algorithm prioritizes posts from friends and family members, while Instagram's Explore page promotes photos of celebrities and influencers. As a result, LGBTQ individuals may struggle to find their voices in these spaces, limiting their ability to seek out information or build relationships with others who share similar identities.
Algorithms can reinforce stereotypes and perpetuate harmful representations of LGBTQ individuals.
TikTok's algorithm frequently suggests videos featuring trans women wearing traditionally feminine clothing, implying that this is the only acceptable way for them to present themselves. Similarly, YouTube's recommendation engine often promotes "coming out" videos, framing queer identity primarily as something that must be disclosed and explained to others. Such content may normalize narrow or negative attitudes toward non-heteronormative individuals and create an environment where harassment and hate speech flourish.
Algorithms can also unintentionally expose LGBTQ users to dangerous situations.
Dating apps like Grindr and Scruff have been linked to increased rates of violence against queer people, as predators use location data to identify and target vulnerable victims.
If algorithms were designed to protect user privacy by obscuring specific locations, this risk could be mitigated.
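One common approach to obscuring location is to reduce coordinate precision before it ever leaves the device, so that only a coarse area is reported rather than an exact position. The sketch below is a minimal, hypothetical illustration of that idea (the function name and cell size are assumptions, not any app's actual API): it snaps latitude and longitude to the center of a grid cell roughly a kilometer wide.

```python
import math

def obscure_location(lat: float, lon: float, cell_deg: float = 0.01) -> tuple:
    """Snap coordinates to the center of a coarse grid cell.

    A 0.01-degree cell is roughly 1.1 km of latitude, so every user inside
    the same cell reports an identical position and exact locations are
    never exposed. Hypothetical sketch, not any platform's real method.
    """
    snapped_lat = (math.floor(lat / cell_deg) + 0.5) * cell_deg
    snapped_lon = (math.floor(lon / cell_deg) + 0.5) * cell_deg
    # Round away floating-point noise from the multiplication.
    return round(snapped_lat, 6), round(snapped_lon, 6)
```

Because nearby points collapse into the same cell, distance-based trilateration (repeatedly querying reported distances from several positions to pinpoint a user) becomes far less precise.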
In sum, artificial intelligence algorithms play a significant role in shaping social media experiences for LGBTQ users. While they can enhance visibility and foster connection, they can also limit representation, promote harmful stereotypes, and put individuals at risk. Developers must therefore prioritize building inclusive and equitable algorithms that support all communities, including those marginalized because of their sexual orientation or gender identity.