Roblox, the wildly popular online gaming platform, is taking significant steps to enhance child safety on its platform. In a bid to address concerns about potential risks to younger users, the company is introducing new age-based restrictions, a comprehensive rating system, and stricter content moderation policies. These changes come after a period of intense scrutiny from regulators and reports of inappropriate interactions on the platform.
Starting December 3, game creators will be required to indicate whether their games are suitable for users under 13. Games that fail to meet this requirement will automatically be restricted for players aged 12 and younger, a move aimed at ensuring children only encounter content appropriate for their age.
Additionally, beginning November 18, preteen users will no longer be able to access certain social areas known as “social hangouts.” These virtual spaces allow players to engage in real-time text and voice chat, which can pose risks to younger users. To further safeguard younger players, Roblox will also prohibit children under 13 from accessing games with “free-form 2D user creation.” This feature encompasses tools like virtual chalkboards or whiteboards, where players can draw or write freely without prior moderation. The company believes that restricting this feature will reduce the risk of offensive messages or images appearing unfiltered on the platform.
Content creators who wish to make their experiences accessible to younger users must meet specific age-appropriate requirements. This includes completing a detailed questionnaire and ensuring full compliance with Roblox’s stringent guidelines.
These new safety measures are a direct response to the increasing scrutiny Roblox has faced over child safety. Earlier this year, an Ofcom report identified Roblox as the most popular game among UK children aged 8 to 12. But the platform has also drawn complaints about risks to younger players, including reports of inappropriate interactions. Turkey, for instance, blocked Roblox entirely in August over child-safety concerns.
In October, Roblox introduced new parental controls to further enhance the safety of its younger users. These give parents greater oversight of their children’s activities and add a more manageable account type for monitoring their kids’ online experiences.
Juliet Chaitin-Lefcourt, a Roblox spokesperson, told The Verge that the company has already implemented over 30 safety improvements this year, with more in the pipeline. She emphasized the importance of transparency with the developer community as the company rolls out these updates.
Full enforcement of the new standards will continue into next year as Roblox solidifies its age-based access and rating systems. These measures aim to create a safer and more enjoyable online environment for all users, particularly its youngest players.
Despite these ongoing efforts, Roblox was the subject of two bearish reports in October, from The Bear Cave and Hindenburg Research, alleging that the company was inflating key user metrics, including daily active users (DAUs) and engagement hours.
However, Roblox continues to post strong financial results. In the third quarter of 2024, the company reported a 34% increase in bookings to $1.13 billion. DAUs reached 88.9 million, up 27%, and hours engaged rose 29% to 20.7 billion.
The company’s stock (RBLX) is currently trading up 0.88% at $53.37. While the recent short-seller reports have cast a shadow, Roblox remains committed to providing a safe and engaging platform for its users while sustaining growth in its business.