California Mother Sues Roblox and Discord After Son’s Death
- Sagar Mankar

- Sep 16
- 3 min read

A new lawsuit has been filed against Roblox and Discord, alleging that failures in safety protections contributed to the exploitation and suicide of a 15-year-old California boy.
Lawsuit Details
The case was filed by Rebecca Dallas in San Francisco County Superior Court, following the death of her son, Ethan Dallas, in April 2024. The lawsuit accuses Roblox and Discord of "wrongful death, fraudulent concealment, negligent misrepresentation, and strict liability."
According to the complaint (via NBC), Ethan was groomed by an adult predator who posed as a child on Roblox. What began as casual in-game interactions escalated into explicit exchanges.
Attorneys say the predator convinced Ethan to disable parental controls and later moved their conversations to Discord, where he demanded explicit images and threatened to expose him if he refused.
“Tragically, Ethan was permanently harmed and haunted by these experiences, and he died by suicide at the age of 15,” the complaint states.
His family learned afterward that the man who groomed him was later arrested in Florida for sexually exploiting other minors through the same apps.
Claims Against Roblox and Discord
The lawsuit argues that both platforms failed to implement adequate screening, age verification, or parental safeguards that could have prevented Ethan’s interactions with the predator. It accuses the companies of "marketing themselves as safe for kids" while knowingly creating "environments where predators could thrive."
Roblox allows free account creation without strict age checks. While parental controls exist, the lawsuit notes that children can bypass them by inputting false birthdates. Discord also lacks age or identity verification and permits children to create accounts without oversight.
According to Dallas’ attorneys, these design choices make both platforms “easy prey for pedophiles.”
Company Responses
Roblox expressed sympathy over Ethan’s death but declined to comment on specific litigation. A spokesperson told NBC that the company strives to maintain “the highest safety standard” and, as in previous statements, highlighted the 100 new safety features introduced recently, including parental visibility tools, 24/7 moderation, and partnerships with law enforcement and child safety organizations.
Discord also avoided commenting directly on the case but emphasized that it requires all users to be at least 13 years old. The company said it uses automated scanning and trained safety teams to remove harmful content and prevent grooming.
Wider Pattern of Allegations
This lawsuit is not an isolated case. Law firm Anapol Weiss, which represents Dallas, said this marks the ninth lawsuit it has filed related to child exploitation on Roblox or Discord.
In 2024, the National Center on Sexual Exploitation placed both platforms on its “Dirty Dozen” list, claiming Roblox exposed minors to sex-themed games and Discord enabled grooming and the sharing of explicit material.
Roblox has also faced mounting political pressure. In August, Louisiana’s attorney general sued the company over child safety failures. Around the same time, Congressman Ro Khanna opened a petition demanding stronger protections for minors on the platform.
In response to rising scrutiny, Roblox has recently introduced an age-estimation program using facial recognition, ID verification, and parental consent features. The company has also partnered with the International Age Rating Coalition to apply ESRB-style ratings (“E,” “T,” “M”) to experiences, giving parents clearer guidance.
However, critics argue these measures remain reactive. Roblox reportedly has 3,000 moderators overseeing more than 50,000 chat messages per second, a ratio that critics say reflects a focus on "growth over safety." By comparison, TikTok, which has roughly three times Roblox’s daily users, employs more than 13 times as many moderators.
Dallas’ lawsuit seeks a jury trial and compensatory damages, aiming to hold both Roblox and Discord accountable.