Roblox And Discord Singled Out In New Child Safety Lawsuit

A month after California Governor Gavin Newsom signed a bill restricting data collection from minors, Roblox, Discord, Facebook, and Snapchat are being sued for allegedly harming children and teens due to product features that inherently promote addictive behaviors and illegal content. 

The Social Media Victims Law Center filed the suit in a California state court on behalf of an underage girl, identified as S.U., who was contacted on Roblox at the age of 10 by an adult man who persuaded her to drink alcohol and take prescription drugs, according to The Seattle Times.

S.U. allegedly met more men through Roblox and Discord’s direct messaging services, who then encouraged her to open Instagram and Snapchat accounts. At 13, she became addicted to her devices and fell victim to a 22-year-old Roblox user from Missouri who convinced her to send sexually explicit images, which he then allegedly sold online. 

Snapchat’s “My Eyes Only” feature allowed S.U. to hide what was happening from her mother, who tried to monitor her daughter’s social media use. 

In 2020, the girl allegedly attempted suicide twice. Her parents were over $10,000 in debt in 2021 due to expenses related to her mental health crisis, according to the complaint.

“These men sexually and financially exploited her,” the Social Media Victims Law Center said. “They also introduced her to the social media platforms Instagram and Snapchat, to which she became addicted.”

The lawsuit includes details leaked from Meta's internal research, specifically concerning the negative effect that Facebook and Instagram have on teenagers' self-esteem and the ease with which younger users can access harmful content. 

Discord and Roblox are coming under increasing scrutiny and criticism for their inability to stop adults from messaging children without supervision. 

The girl’s family aims to hold the social-media companies financially responsible, while seeking a court order that would force the platforms to make their products safer.  

Just this year, there has been a steady stream of lawsuits filed against Meta, Snap, TikTok, and Google over harms done to adolescents and young adults who have suffered anxiety, depression, eating disorders, and sleeplessness after becoming addicted to social media. Some children have allegedly died by suicide because of it. 

Although Roblox has rarely been included in these prior lawsuits, half of the children in the U.S. were active on the platform in 2020. 

In 2022, Discord was cited in around six cases involving child sex abuse material or grooming children, according to a Bloomberg search of Justice Department records. 
