
Critical Considerations When Addressing Online Hate: Anonymity and Content Moderation

Published on 20/04/2023 by Jolin Joseph, Manager – Research, Education and Knowledge Mobilization, YWCA Canada

We need to talk about the corrosive impact of online hate on youth, their psychological and physical safety, and public engagement. In 2022, YWCA Canada released the #BlockHate National Research Report and Polling Data, centering the voices, lived experiences and strategies shared by young women and gender diverse youth aged 16-30 around ending online hate and creating safer digital spaces. As part of these consultations, two issues emerged as requiring additional consideration: anonymity online and content moderation. On both topics, focus group participants highlighted the complexities and tensions of regulating in these areas. Jolin Joseph, Manager – Research, Education and Knowledge Mobilization at YWCA Canada, shares further insights from the community-based research.

One reason online hate continues undeterred is that there are seldom reputational or punitive risks involved. As one participant said, "[i]t's much easier to commit online hate and violence when you conceal your identity and hide behind a screen." They stressed that "aggressors thrive on anonymity." However, they also shared that women, gender diverse youth, racialized and other equity-deserving groups find shelter in remaining anonymous. This tension emerged as a clear paradox in participants' views.

Online communities have long been a sanctuary for youth who feel unsafe in public spaces. For historically and systematically marginalized groups, the option to remain private offers protection from online hate and related harms. However, the same cloak of anonymity that extends much-needed refuge to marginalized groups enables others to commit digital violence with impunity. A few focus group participants suggested platforms could move to a verified account registration model with confirmed user information (whether or not that information is displayed). Others were deeply wary of any insistence on accounts linked to user identity, raising valid concerns around data privacy, confidentiality, and surveillance.

The twin truths emerging from our discussions are that anonymity creates safety and space to engage freely while also abetting angry and abusive online behaviours, allowing some users to become 'agents of hate'. Throughout the conversations and research process, we held space for this duality. In online environments (such as multiplayer games and image- and video-sharing platforms) where identifiers related to gender, race, ethnicity, sexual orientation, Indigeneity, and body type are frequently the reason users are targeted, navigating online ecosystems anonymously means they can participate without drawing undue attention. Youth engaged in the study found it difficult to reconcile the need to protect their online privacy with the need to enforce consequences for online hate. This is an area where clear recommendations did not emerge, and it therefore requires further attention and nuance.

Another tension flagged by participants was the mandate for swift and expedient platform action to remove hateful content. While many felt it necessary that platforms deploy technological tools to ensure timely takedown, others feared this would extend existing bias against marginalized users and content creators. When platforms are obliged to address online hate with speed and at scale, they turn to Artificial Intelligence (AI) methods that pre-emptively screen or automatically filter potentially hate-based content. Some participants worried that AI-powered rapid removal would risk flattening context and censoring their views. Others were uncertain that these methods could detect and regulate the coded tactics, such as rhymes, alliteration, and seemingly benign emojis and hashtags, used to avoid detection and further hateful agendas. Efforts need to be responsive to the evolving nature of online hate speech.

Even when online hate disappeared and posts were removed, the impact remained. Participants highlighted that hate in ephemeral content (temporary photo or video posts that disappear, like stories on Facebook, Instagram, TikTok, and Snapchat) may appear short-lived, but such content can be screenshotted, saved or re-shared, and therefore has an afterlife where fear and harm linger. Further, the speed at which online hate evolves and spreads necessitates the right balance of technical intervention and human moderation. Participants called on platforms to increase their reliance on trained human moderators to review flagged content and make nuanced decisions. Many were, however, mindful of the toll on human moderators and the grueling working conditions that lead to moderator burnout and vicarious trauma. It is clear that sector-wide change is necessary to ensure all content moderators are supported and trained to parse online hate, and that AI decisions are subject to ongoing review and updates. Most critically, the perspectives of survivors, youth and equity-deserving groups that encounter online hate must guide the way forward for platform regulation and content moderation.

Through research with young women and gender diverse youth across the country, YWCA Canada has outlined critical proposals in the National Report for consideration in the development of legal and regulatory regimes around online hate and related harms. These efforts are necessary to ensure the free expression and full participation of young women and gender diverse youth on online platforms and in public life.

About Block Hate  

Block Hate: Building Resilience Against Online Hate Speech is a four-year research and knowledge mobilization project to address online hate in Canada funded by Public Safety Canada’s Community Resilience Fund. The overall objective of the project is to improve community resilience by developing tools to prevent, address and report online hate speech through community-based research. 
