The internet has revolutionized how people communicate, socialize, express themselves, and interact. It's now an essential part of everyday life for billions of users globally.
Like any other medium, however, it is subject to biases that shape how certain groups are represented, including LGBTQ+ communities. This paper explores how algorithms, recommendation systems, and content moderation influence the visibility and marginalization of queer voices online.
In the context of search, algorithms are sets of instructions that rank websites based on keywords, backlinks, and other signals. They determine which results appear when someone types in a query, and they often prioritize mainstream sites, which can push LGBTQ+ content out of view.
Google's PageRank algorithm scores pages by link popularity, while classic Boolean retrieval, used by early engines such as Yahoo! Search, matches documents in an index against logical combinations of query terms. While these methods improve relevance for the average user, they can also leave minority perspectives underrepresented.
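The link-popularity idea can be made concrete with a minimal sketch of PageRank's power iteration. The toy web graph, damping factor, and iteration count below are illustrative assumptions, not a description of Google's production system; the point is only that heavily linked "mainstream" pages dominate the ranking regardless of content.

```python
# Minimal PageRank sketch: a page's score depends on the scores of the
# pages linking to it. Toy data only; not Google's actual system.

def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: distribute its rank evenly.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# Hypothetical web: the "mainstream" hub receives most inbound links,
# so it outranks the "niche" site even with equally relevant content.
toy_web = {
    "mainstream": ["niche"],
    "blog_a": ["mainstream"],
    "blog_b": ["mainstream"],
    "niche": ["mainstream"],
}
ranks = pagerank(toy_web)
```

In this toy graph, `mainstream` ends up with a much higher score than `niche`, illustrating how link-based ranking amplifies already-popular sites.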
Recommendation systems are software programs that suggest items related to a user's preferences or behavior. They can be found in eCommerce platforms, streaming services, news feeds, and social media. These algorithms analyze user data such as likes, clicks, shares, and browsing history to personalize the user experience.
By optimizing for past behavior, they can reinforce stereotypes about sexuality, gender identity, and relationships while excluding queer perspectives. Facebook's News Feed, for example, surfaces posts that match a user's inferred interests, which can create filter bubbles that crowd out alternative viewpoints.
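The filter-bubble mechanism can be sketched with a tiny collaborative-filtering recommender. The user names, interaction data, and similarity threshold here are invented for illustration; real platforms use far richer signals, but the narrowing effect is the same: a user is only shown items liked by users who already resemble them.

```python
# Hedged sketch of like-based collaborative filtering: users who behaved
# similarly in the past are shown more of the same content.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two sparse interaction vectors (dicts)."""
    dot = sum(a[k] * b.get(k, 0) for k in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def recommend(user, histories, top_n=2):
    """Suggest items liked by the users most similar to `user`."""
    target = histories[user]
    scores = {}
    for other, hist in histories.items():
        if other == user:
            continue
        sim = cosine(target, hist)
        if sim <= 0:
            continue  # dissimilar users never influence the feed
        for item, liked in hist.items():
            if liked and item not in target:
                scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Illustrative engagement data (1 = liked/clicked).
histories = {
    "alice": {"sports": 1, "news": 1},
    "bob":   {"sports": 1, "news": 1, "cooking": 1},
    "carol": {"queer_film": 1, "poetry": 1},
}
print(recommend("alice", histories))  # only bob's items are suggested
```

Because `alice` shares no history with `carol`, nothing `carol` engages with is ever recommended to her: content outside a user's existing cluster simply never surfaces.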
Content moderation refers to the process of reviewing, removing, and regulating user-generated content, particularly images, videos, and comments. It aims to prevent hate speech, harassment, violence, and illegal activities, but it can also suppress LGBTQ+ expression. Moderators use automated tools such as machine learning models to identify offensive language and nudity, and these tools can misclassify and remove legitimate queer content. Platforms have also been reported to down-rank or restrict content containing terms like "trans" or "gay," limiting representation.
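The misclassification failure mode can be shown with a deliberately naive keyword filter. The blocklist below is hypothetical (real systems use trained classifiers rather than raw word lists), but it captures the core problem: identity terms that appear in abusive posts also appear in ordinary community speech, so blunt matching removes benign queer content alongside actual abuse.

```python
# Sketch of naive blocklist moderation. The word list is an invented
# assumption that wrongly treats identity terms as offensive, which is
# exactly the over-removal pattern the text describes.
BLOCKLIST = {"gay", "trans"}  # hypothetical, overly broad blocklist

def moderate(post):
    """Return 'removed' if any blocklisted word appears, else 'kept'."""
    words = {w.strip(".,!?").lower() for w in post.split()}
    return "removed" if words & BLOCKLIST else "kept"

posts = [
    "Proud to attend my first gay pride parade!",  # benign, yet flagged
    "Here is my pasta recipe.",                    # passes untouched
]
print([moderate(p) for p in posts])  # ['removed', 'kept']
```

A celebratory post is removed while unrelated content passes, showing how context-blind filtering disproportionately silences the very communities the terms describe.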
Algorithms, recommendation systems, and content moderation together shape the visibility and marginalization of queer voices online. They can exclude diverse perspectives by prioritizing mainstream sites, reinforcing stereotypes, and suppressing LGBTQ+ expression. Improving these systems requires collaboration between developers, researchers, policymakers, and affected communities to promote inclusivity and diversity on the internet.