Lawsuit alleges that San Mateo-based Roblox fails to safeguard children from predators and explicit content
Quisha Smith, in a lawsuit against San Mateo-based gaming platform Roblox, claims that after she let her young daughter create an account on the popular platform, someone in a role-playing game asked the girl to remove her pants while in a virtual bathroom. Roblox boasts more than 66 million daily users worldwide, with 17-to-24-year-olds its fastest-growing age group, and estimated in 2020 that three-quarters of U.S. kids aged 9 to 12 had accounts. Yet Smith alleges in her lawsuit that the platform exposes children to “rampant sexual content” and to child predators posing behind innocent-looking avatars.
While Roblox promotes itself as a safe and welcoming environment, the lawsuit argues that children of all ages, including those barely old enough for elementary school, are exposed to inappropriate content. Smith, a California parent of a 12-year-old daughter and a 7-year-old son, seeks class-action status to include other parents who may have been similarly misled into letting their young children sign up for Roblox. According to the lawsuit, Smith saw a Roblox social media advertisement in 2021 that portrayed the platform as suitable for children, and she let her daughter open accounts for herself and her brother. The incident in the virtual bathroom took place last year, the lawsuit says.
Roblox disputes the allegations, saying it has a dedicated team of thousands working around the clock on moderation and safety. The company says it swiftly blocks inappropriate content or behavior, including sexual content that violates its Community Standards, and points to its collaboration with 20 organizations focused on child and online safety.
“We have implemented various features with the specific intention of ensuring the safety of children, such as employing text chat filtering to block inappropriate content or personal information. Additionally, we provide parental controls and features that allow the limitation or disabling of chat functionalities. Our ongoing investment includes the development of tools to grant parents insight into their children’s activities,” the company said.
The lawsuit noted that Roblox, alongside platforms such as Instagram, OnlyFans and the Apple App Store, was named in the “Dirty Dozen” list by the National Center on Sexual Exploitation. The center’s June report alleged that children on Roblox not only encounter highly sexualized content but that numerous children have been sexually abused and exploited by predators they met on the platform. The lawsuit’s central concern reflects a broader challenge facing major Silicon Valley social media companies: how to effectively screen and filter users and content at massive scale. Roblox contends that, like other large platforms, it relies on a combination of automated and human moderation.
Legal actions against social media companies over child safety are not uncommon. In a recent case, New Mexico sued Meta, accusing it of providing a “breeding ground” for child predators on its Facebook and Instagram apps and asserting that young users are exposed to sexual content and to contact from adult users.
The lawsuit, filed Wednesday against Roblox in San Mateo County Superior Court, cited specific incidents, including the arrest of a 40-year-old California man who traveled across the U.S. to sexually abuse a 14-year-old girl he had met on Roblox, and the sexual assault of another 14-year-old girl by a man posing on the platform as a 17-year-old boy.
“In 2022, a 13-year-old girl in Topeka was rescued from a man she met on Roblox who was sex trafficking her,” the lawsuit claimed, referencing the center’s report. “In 2022, an 8-year-old girl in North Carolina was targeted by an online predator on Roblox who asked her to send him ‘hot videos.’ The girl’s mother said she had parental controls on all the devices her kids used.”
According to both the lawsuit and the center, Roblox allows adult strangers to send direct messages, chat, and become friends with children, and these settings are enabled by default when any child creates an account.
Citing a 2020 regulatory filing by Roblox, the lawsuit points out that the company acknowledged allegations that criminal offenders have used its service to identify and communicate with children, potentially enticing them to interact off the platform, beyond the reach of its chat filters, content blockers, and other safety measures. Despite dedicating “considerable resources” to preventing such misuse, Roblox said in the filing that it could not stop all such interactions.
The lawsuit alleges that the company failed to disclose this information to parents. It also cites an April warning from Canadian police that predators were contacting children on Roblox and then directing them to apps such as Facebook, Instagram or Snapchat to manipulate them into sending explicit photos or engaging in sexual activity.
Smith, who is seeking a court order barring Roblox from the alleged illegal conduct and false advertising, said in the lawsuit that she stopped her children from using the platform after the incident in the virtual bathroom. She remains open to digital entertainment for her children, however, and would consider allowing them back on Roblox if the company accurately promoted its platform and improved its moderation and protective systems.