A recent peer-reviewed study published in Frontiers in Psychology by Anglia Ruskin University’s International Policing and Public Protection Research Institute (IPPPRI) examines how extremist groups leverage gaming and gaming-adjacent platforms to radicalize and recruit users, particularly younger audiences.
Conducted by Dr. William Allchorn and Dr. Elisa Orofino, the research draws on interviews with platform moderators, tech industry experts, and counter-extremism professionals to trace both the mechanisms of radicalization and the challenges of moderating extremist content in these digital spaces.
The study identifies gaming-adjacent platforms—such as chat and livestreaming services linked to gaming—as under-regulated “digital playgrounds” exploited by extremists, particularly far-right groups. These platforms are used to funnel users from mainstream social media to less-moderated environments, facilitating the dissemination of ideologies including white supremacy, neo-Nazism, anti-Semitism, misogyny, racism, homophobia, and conspiracy theories (e.g., QAnon).
While far-right extremism dominates, Islamist extremism and “extremist-adjacent” content, such as glorification of school shootings, also appear, often evading detection due to lax moderation.
The research underscores the appeal of hyper-masculine gaming genres, such as first-person shooters, which draw impressionable users into environments where extremists build rapport through shared interests. Initial in-game interactions often migrate to gaming-adjacent platforms, where propaganda is disseminated and subtle recruitment takes place. Interviewees noted that matchmaking features enable rapid relationship-building, which extremists exploit to steer users toward less-regulated spaces.
Key challenges include inconsistent moderation policies and the sheer volume of harmful content. Moderators struggle to detect coded language, hidden symbols, and contextually ambiguous phrases: "I'm going to kill you" is routine trash talk during a match but a credible threat elsewhere. While AI tools assist, they falter with nuanced content such as memes or sarcasm, necessitating stronger human oversight. The study also highlights the vulnerability of younger users to extremist influencers who blend gaming streams with radical narratives, and it emphasizes the need for awareness among parents, educators, and young people themselves.
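To make that moderation dilemma concrete, here is a minimal, purely illustrative Python sketch. It is not drawn from the study, and every phrase list, cue word, and function name in it is hypothetical; it simply shows why a naive keyword filter misreads gameplay chat and why keyword hits need human review rather than automatic removal.

```python
# Illustrative only: why keyword filtering misfires on gameplay chat.
# All phrases, cues, and names below are hypothetical examples.

FLAGGED_PHRASES = {"going to kill you", "kill you all"}

# Signals that a message is probably ordinary competitive banter.
GAMEPLAY_CUES = {"gg", "respawn", "1v1", "rematch", "lag"}


def naive_flag(message: str) -> bool:
    """Flag any message containing a listed phrase, ignoring all context."""
    text = message.lower()
    return any(phrase in text for phrase in FLAGGED_PHRASES)


def needs_human_review(message: str, channel: str) -> bool:
    """Escalate keyword hits to a human moderator instead of auto-removing
    them when the surrounding context suggests in-match trash talk."""
    if not naive_flag(message):
        return False
    words = set(message.lower().split())
    in_match_context = channel == "match" or bool(words & GAMEPLAY_CUES)
    return in_match_context  # ambiguous: a human must judge intent


if __name__ == "__main__":
    # Out of context, the phrase reads as a threat...
    print(naive_flag("I'm going to kill you"))  # True
    # ...but in match chat it is usually banter, so a blunt filter
    # would either miss real threats or drown moderators in false positives.
    print(needs_human_review("I'm going to kill you, rematch?", "match"))  # True
```

This is exactly the gap the interviewees describe: automated tools can catch the literal string, but not the intent behind it, so human moderators must absorb the ambiguity.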
The research critiques the lack of effective detection and reporting mechanisms, noting user frustration with platforms’ inadequate responses to reported content. It calls for enhanced moderation systems, updated platform policies to address “lawful but harmful” material, and greater law enforcement understanding of gaming subcultures. Dr. Allchorn emphasizes that these platforms, often overlooked compared to mainstream social media, are critical vectors for extremist recruitment, requiring urgent regulatory and technological interventions to curb radicalization.
Press release: Anglia Ruskin University, UK