While U.S. policymakers and media headlines have historically focused on platforms like Instagram, Facebook, YouTube, TikTok, and X, a significant shift is occurring as young people increasingly use gaming and gaming-adjacent platforms for social interaction. These conversations, happening on services such as Discord, Roblox, and Steam, are typically anonymous and largely hidden from public view.
These platforms, originally designed to connect gamers, have evolved into primary hubs for social discourse and authentic interaction, particularly as mainstream applications prioritize content engineered for virality. This evolution has brought new scrutiny, with a focus on how these closed forums can harbor hate and exploitation, problems that often remain concealed until they manifest in real-world harm.
Mainstream social media apps are structured to elevate and amplify content publicly, which makes them effective tools for spreading ideologies, rumors, or disinformation. However, many of these ideas first take root in smaller, more private forums on gaming platforms. Extremist groups that have been removed from mainstream services have consequently found new homes in these spaces.
Unlike on public-facing apps, users of gaming platforms are accustomed to operating under pseudonyms, a practice that facilitates the anonymous exchange of radical and taboo ideas.
Mariana Olaizola Rosenblat, a policy advisor on tech and law at NYU Stern, explains that the very architecture of these gaming-focused platforms is a key factor in the proliferation of dangerous content. “Extremists and predators go to these gaming spaces to find highly engaged, susceptible young people, many of whom are yearning for connection,” she says.
The smaller, private chat rooms where harmful conversations often develop are typically sealed off from outside observers. “Most researchers are basically blind to all of this. You can’t enter these rooms,” Rosenblat notes. Users also leverage the gaming context, employing “gamespeak” to disguise extremist or dangerous concepts, which blurs the distinction between role-playing and real-world intent. While the platforms themselves have technical access to this content, the sheer volume makes monitoring a significant challenge. Rosenblat observes that most have not invested sufficiently in the safeguards or moderation resources needed to protect young users.
It is important to note that the majority of conversations on these platforms are ordinary. They range from discussions about gaming and participation in study groups to sports fandoms and neighborhood community forums. Nevertheless, these same environments have also become fertile ground for radicalization and exploitation, with a growing number of incidents bringing the issue to light.
Discord is facing renewed scrutiny after the suspect in the murder of Charlie Kirk appeared to confess in a Discord chat. The platform was also used by organizers of the 2017 Unite the Right rally in Charlottesville to coordinate logistics, including carpools and lodging. In a separate incident, the shooter who killed 10 people in a Black neighborhood of Buffalo in 2022 documented months of planning in a private Discord chat. Additionally, a 2018 investigation by The Daily Beast uncovered hundreds of instances of revenge porn being shared across Discord servers.
Roblox, a platform marketed directly to children, has drawn sharp criticism for the sexual, predatory, and extremist content that appears on its service. The company is currently facing multiple lawsuits. One lawsuit, filed by the state of Louisiana, alleges that Roblox failed to protect children. Another was filed by an Iowa family after their 13-year-old daughter was kidnapped, trafficked, and raped by a predator she met on the platform.
A Roblox spokesperson provided a statement regarding its safety practices:
“While we cannot comment on claims raised in litigation, at Roblox, we strive to hold ourselves to the highest safety standards. We invest significant resources in advanced safety technology, including a combination of machine learning and human moderation teams working 24/7 to detect and address inappropriate content and behavior.”
The 2022 Buffalo shooting was livestreamed on Twitch, the Amazon-owned gaming platform. Twitch condemned the attack, removed the video quickly, and stated that it was working closely with law enforcement to investigate the incident.
In 2021, researchers found that Steam had become a networking hub for the far right, hosting groups that promoted neo-Nazi organizations.
Following the killing of Charlie Kirk, gaming platforms are receiving increased attention from U.S. policymakers. House Oversight Chair James Comer (R-Ky.) has asked the CEOs of Discord, Steam, Twitch, and Reddit to testify before Congress on October 8 about the issue of user radicalization on their platforms.