In the digital age, where the internet is a playground for the young and curious, gaming apps have become a staple in the everyday lives of many children. Among the myriad of options available, Roblox remains a frontrunner, captivating young audiences with its interactive and imaginative platform. However, beneath the surface of its colourful and engaging interface, new research reveals a dark underbelly that cannot be ignored.
The Roblox Metaverse
Roblox is an expansive digital universe that allows users to create their own games and experiences. Digital Native describes it as a ‘place where YouTube meets LEGO meets Facebook meets Epic Games - an ecosystem for game-making and hosting’.
Roblox boasts 79.5 million daily active users from more than 190 countries, engaging with its 5.3 million "active experiences"—a term the platform uses to refer to games, social spaces, and events like online concerts, sports, and fashion shows. Various reports indicate that 60% of Roblox users are under the age of 16.
On its own website, Roblox says:
'We believe in building a safe, civil, and diverse community—one that inspires and fosters creativity and positive relationships between people around the world.'
Under the Spotlight
Roblox claims to have a zero-tolerance policy for the exploitation of minors, including predatory behaviour, sexualising minors in any way, engaging minors in sexual conversation, and the coercion or enticement of a minor.
Yet, a scathing new report by Hindenburg Research paints a very different picture, labelling the supposedly child-friendly gaming platform as ‘A Pedophile Hellscape For Kids’.
This isn’t the first report to shed light on the exploitation rife throughout the Roblox metaverse.
Numerous allegations of grooming, kidnapping, rape and the trading of sexual content involving children aged 8 to 14 have continued to surface since 2019.
Earlier this year Bloomberg Businessweek published a report detailing the risk of children being exposed to and groomed by pedophiles on the Roblox platform.
Source: Bloomberg Businessweek
For the second year in a row Roblox has been placed on the National Center on Sexual Exploitation’s ‘Dirty Dozen List’ featuring 12 mainstream contributors to sexual exploitation. The closely watched list urges Roblox to take action to improve control and moderation measures to protect children on the gaming platform.
In a 2023 report released by Parents Together, Roblox was named a top five platform for child exposure to inappropriate sexual content.
Building on years of documentation of the exploitation occurring on the platform, Hindenburg Research recently performed its own checks to see whether Roblox had tightened its child safety moderation.
Here’s what they found:
As an initial test, the research team attempted to set up an account under the name ‘Jeffrey Epstein’, only to find the name was already taken, along with more than 900 variations.
‘Many were Jeffrey Epstein fan accounts, including “JeffEpsteinSupporter” which had earned multiple badges for spending time in kid’s games. Other Jeff Epstein accounts had the usernames “@igruum_minors” [I groom minors], and “@RavpeTinyK1dsJE” [rape tiny kids].’
They then attempted to set up a Roblox account under the name of another notorious pedophile, Earl Brian Bradley, who was indicted on 471 charges of molesting, raping and exploiting 103 children, to see whether Roblox had any up-front pedophile screening. That username was also taken, along with multiple variants such as ‘earlbrianbradley69’.
Roblox Hosts Child Sexual Exploitation Material
Once a username was selected, the research team listed their age as “under 13” to investigate whether children are being exposed to adult content. By simply entering ‘adult’ into the Roblox search bar, they discovered a group called “Adult Studios” with 3,334 members openly trading *child pornography and soliciting sexual acts from minors.
By tracking some of the members of “Adult Studios” they found:
- 38 Roblox groups – one with 103,000 members – openly soliciting sexual favors and trading *child pornography.
- The chatrooms trading in *child pornography had no age restrictions.
- Roblox reports that 21% of its users are under the age of 9, a figure that is likely an underestimate given that Roblox has no age verification except for users seeking 17+ experiences.
Moderation Failures
Registered as a child, the Hindenburg research team were also able to access games like “Escape to Epstein Island” and “Diddy Party”. They found over 600 “Diddy” games, including “Survive Diddy” and “Run From Diddy Simulator”.
Source: Roblox
The researchers identified 12,400 erotic roleplay accounts on Roblox, which included everything from “rape/forceful sex fetishes” to underage users “willing to do anything for Robux”.
Hindenburg Research put together a condemning video compilation of Roblox moderation failures:
Source: Hindenburg Research. **Note: The video above was previously published on YouTube, where it was flagged as inappropriate within minutes. The same content YouTube flagged almost immediately is regularly available to children on Roblox. The link now points to where the video can be found on ‘X’ (formerly Twitter).
Online and Offline Offending
Researchers ran a search to see whether and how pedophiles were using Roblox to groom and abuse children offline. While the search was limited to the U.S. and was by no means exhaustive, a sample of what they uncovered highlighted nine separate arrests, including:
- The arrest of a registered sex offender for enticing an 8-year-old girl into sending him sexual content.
- A man who kidnapped and raped an 11-year-old girl he communicated with on Roblox, resulting in felony counts relating to kidnapping and unlawful sexual contact.
- The arrest and charging of a man for enticing children aged 10-12 into sending naked photos in exchange for the virtual gaming currency Robux.
As recently as August this year, a man was arrested after flying from Chile to California to meet a 14-year-old girl he had met on Roblox, whom police said he intended to sexually assault.
Users Regularly Described Lewd Sex Acts While Others Used Hateful Slurs
Users seeking sexual experiences on Roblox are so pervasive that there are thousands of Roblox sex videos on porn sites, inviting users of unknown ages to make explicit content on the platform.
‘We tested out Roblox’s experiences to see what else kids were being exposed to. We quickly encountered images of male genitalia and hate speech in Roblox’s “school simulator” game, which had registered 28.9 million visits with no age restrictions.’
Some pornographic videos had links to Roblox usernames, with descriptions and comments inviting viewers to engage in sexual acts in-game with other users of unknown ages.
Researchers also found:
- When researchers posed as a child in Roblox’s “therapy” experience, their “therapist” introduced himself as a “rapper with only one p”. He advised them to run away from home, saying he would come and pick them up so they could move into his basement and pay rent with their body.
- Roblox players identifying as 9 years old and up have the option of playing games titled “Throw Bricks at Homeless People” and “Beat Up the Pregnant”, in which players compete to murder as many pregnant women as possible in a Walmart parking lot.
- Other Roblox users have identified issues of simulated rape, naked users, and rampant in-game sexual harassment.
Roblox Is Compromising The Safety Of Children On The Platform In Order To Report Growth To Investors
Roblox management faces a dilemma of choosing between better metrics or improved child safety, with one former employee stating:
“You’re supposed to make sure that your users are safe but then the downside is that, if you’re limiting users’ engagement, it’s hurting your metrics. It’s hurting the [daily] active users, the time spent on the platform, and in a lot of cases, the leadership doesn’t want that.”
Despite Roblox’s claims of “best in the world” content moderation, interviews with moderators revealed safety was largely outsourced to Asian call centers.
“Moderators described being paid $12 a day to review countless instances of child grooming and bullying with a limited ability to keep perpetrators off the platform permanently.”
After extensive exploration of Roblox’s metaverse, Hindenburg Research found the platform to be:
“An X-rated pedophile hellscape, replete with users attempting to groom our avatars, groups openly trading child pornography, widely accessible sex games, violent content and extremely abusive speech—all of which is open to young children and all while Roblox has cut content moderation spending to appease Wall Street and boost earnings.”
Read the full report here.
The Core Issues
- Roblox’s social media features allow pedophiles to efficiently target hundreds of children, with no up-front screening to prevent them from joining the platform.
- Explicit content in games available to children of all ages includes pictures of genitalia, hate speech, and violence.
- Numerous games seemingly designed for sexual play are available to children.
- Chat groups, available to children of all ages, openly solicit sexual content.
- There appears to be no up-front moderation of experiences available to children on the platform.
- Growth and profit are prioritised over user safety.
Despite the numerous documented instances of children encountering harm on the platform, Roblox has yet to implement the necessary changes. With such a young user demographic, it is critical that Roblox prioritise child safety, ensuring its platform is secure by default and intentionally designed for protection.
Join our global partners at the National Center on Sexual Exploitation today by urging Roblox to take meaningful action to safeguard our children.
**Note on use of term *child pornography: In contemporary Australian legislation, the term ‘child pornography’ has been replaced with the term ‘child abuse material’. This is to avoid any suggestion of compliance on the part of the victim, or legality on the part of the sex offender. The term ‘child pornography’ does not accurately reflect the serious nature of this material (Australian Centre to Counter Child Exploitation).
See also:
Prevent harm, prioritise children: Social Media and Online Safety Committee responds to inquiry
Hold social media platforms to account: MTR addresses Fed inquiry