Roblox, one of today’s most popular online gaming platforms, has announced sweeping safety updates aimed at protecting its youngest users, children under 13. A significant share of the platform’s user base, roughly 40 percent or about 78 million players, falls into that age group, and the company is imposing stricter age restrictions to shield them from inappropriate content and dangerous interactions. The changes are part of a broader push to make the platform a safer virtual environment, an effort that has intensified in recent months as law enforcement and media investigations have highlighted safety issues and critics have argued the platform has become a venue for the online exploitation of children.
Starting November 18, younger players will no longer have access to social hangout games and spaces built around open-ended communication, such as free-form chat experiences or those focused on socializing and community. These areas, sometimes described as “vibe games” or virtual clubs, let players interact however they choose, with behavior that can resemble social media. Some players treat these games as social outlets, but Roblox’s safety team has flagged them as risky spaces for underage users. In addition, starting this month, preteens will not be able to search for or access these kinds of experiences. The update does not affect character-based roleplaying games or real-life simulations, where interactions take place in a controlled, fictional setting rather than as oneself.
Roblox introduces new age limits and parental tools to protect young users
Roblox will also soon enforce a new rating system designed to keep inappropriate content away from younger players, limiting preteens to games that have been reviewed and rated. Game creators must complete detailed questionnaires describing their experiences so Roblox can determine whether a game is appropriate for children under 13. After December 3, any game that does not meet the new rating requirements will automatically be marked as “unrated” and made inaccessible to players under 13. Roblox hopes this approach will better filter out content containing violence, mature themes, or other elements parents may not want their kids to see.
New parental controls also give parents a larger role in their children’s Roblox experience. With these controls, parents can decide which games are accessible and adjust chat settings based on the child’s age. A Roblox spokesperson said the new tools will give parents a clearer view of what their kids are doing on the platform and let them shape the experience as they see fit.
A timely response to mounting safety concerns
Roblox’s new safety policies arrive at a critical time. In recent years, lawsuits and increased scrutiny of the company’s practices have followed incidents in which predators used the platform to target young users, and investigations have grown more alarming. In one recent case, two parents sued Roblox over its alleged role in enabling virtual casinos that facilitate gambling. Although Roblox sought to dismiss the case, arguing it was shielded from liability under Section 230 of the Communications Decency Act, a California judge allowed it to proceed, pointing to the particular dangers posed by platforms that cater so heavily to children.
Roblox has promised to prioritize child safety as its user base grows. The company says the new measures are intended to build a more secure environment that matches its enormous popularity with young audiences and reassures parents, regulators, and the wider community.
Image credits: Roblox