Children as young as five were exposed to explicit content amid inadequate protective measures, the report found

Key takeaways:
- Kids as young as 5 encountered explicit content and unsupervised contact with adults
- Safety controls found to be weak despite Roblox's recent updates
- Platform calls for government action and industry collaboration to protect young users
A deeply disturbing investigation has raised serious concerns about the safety of children on the massively popular gaming platform Roblox, where kids as young as five were found to be exposed to graphic content and interactions with adults, despite platform safeguards.
The findings, reported by The Guardian, come from a study by digital behavior research firm Revealing Reality and point to a stark disconnect between Roblox's child-friendly branding and the reality of what young users experience on the platform. The report documents multiple incidents of age-inappropriate environments, sexual content, and grooming risks faced by young users.
"Safety controls that exist are limited in their effectiveness and there are still significant risks for children on the platform," the report concludes.
Avatars as young as 5 accessed suggestive content
To conduct the investigation, researchers created multiple fake Roblox accounts registered to fictional users aged 5, 9, 10, 13, and 40+. These accounts only interacted with each other, ensuring results were not influenced by real users.
They found that even very young avatars could access highly suggestive environments, including virtual hotel rooms with sexualized roleplay and fetish gear, and bathroom settings where avatars mimicked inappropriate behavior. Audio chat also exposed users to sexually explicit sounds and conversations, despite Roblox's claim that its voice features are AI-moderated and restricted to verified users aged 13 and older.
In one alarming example, an adult test account was able to request a five-year-old avatar's Snapchat handle using thinly veiled language, highlighting how easily built-in filters and moderation systems can be bypassed, The Guardian reported.
Roblox responds, but pressure mounts
In response, Roblox Corporation acknowledged the presence of harmful content and bad actors on its platform and said it is working to strengthen safeguards. However, the company stressed that "tens of millions of people have a positive, enriching, and safe experience on Roblox every day."
It also called for government intervention and broader industry efforts to help address these systemic issues.
"We deeply sympathize with any parent whose child has had a harmful experience," a Roblox spokesperson said.
Despite recent updates, including new parental controls and restrictions for users under 13, Revealing Reality found these measures to be inconsistently effective and easily circumvented.
Parents voice alarm
The findings echo growing concern among parents over children's screen time, online addiction, and exposure to danger on open digital platforms. Many have voiced frustration that current moderation tools fail to keep pace with evolving online threats, particularly in user-generated environments like Roblox.
With more than 85 million daily users, 40% of whom are estimated to be under 13, Roblox's influence on young audiences is profound, and experts say these revelations should serve as a wake-up call for both the company and regulators.
"We must stop assuming that a child-friendly appearance equals a child-safe experience," the report warns.
As the debate over online child safety intensifies, the Roblox case may become a litmus test for digital accountability in an era dominated by user-created content and AI-driven moderation.
Posted: 2025-04-15 17:49:30