Teens and their parents are urging a federal judge to allow them to proceed with a sweeping class-action complaint alleging that Facebook, Instagram, YouTube, TikTok and Snap harm users' mental health.
“Defendants design social media apps that target kids but are not safe for them to use,” lawyers for the families say in papers filed Thursday with U.S. District Court Judge Yvonne Gonzalez Rogers. “Each product at issue here exploits the way young brains work, causing compulsive use and addiction and leading to dramatic increases in rates of anxiety, depression, self-harm, eating disorders, and suicide among kids.”
Rogers is presiding over litigation brought by dozens of parents, teens and others who claim Facebook, Instagram, TikTok and other social media platforms have harmed minors. The complaint largely centers on the theory that the social media companies designed their services to be addictive, then served minors potentially harmful content that had been posted by other users.
Among other claims, the families allege that the social media companies are responsible under “products liability” theories -- meaning their products were dangerously defective, due to design features that made the services addictive -- and that the companies acted negligently.
The social platforms recently urged Rogers to dismiss the complaint on the grounds that the allegations, even if true, wouldn't support claims for negligence or products liability.
The tech companies argued they can't be found liable for negligence based on publishing content, because publishers “have no duty to protect their audience from the effects of content publication or consumption.”
The companies also argued that their platforms, which enable communications, aren't comparable to cars or other types of physical merchandise that can be defective.
Lawyers for the teens and families are now urging Rogers to reject those arguments.
“A defective item does not need to satisfy a rigid 'tangibility' requirement in order to be treated as a 'product,'” counsel writes.
Counsel also contends that the claims aren't about “content,” but about the way the services are designed.
“Plaintiffs do not demand that defendants change the content they host,” the attorneys write. “Instead, plaintiffs claim specifically that they would not have been harmed by the content on Defendants’ platforms but for the alleged design defects.”
The tech companies plan to argue in a separate motion that they're protected by Section 230 of the Communications Decency Act -- which immunizes web companies from liability based on content posted by users -- and the First Amendment. The platforms are expected to bring that motion by June 27.