Free-speech principles as well as a 27-year-old media law prevent teens from proceeding with a highly publicized lawsuit alleging they were harmed by social media, the major social platforms
say.
The claims in the sweeping class-action complaint “are based on the creation and dissemination of content by others -- from videos about viral pranks and online challenges, to
images that allegedly invite personal comparisons, to communications with other users,” Meta, Snapchat, TikTok, and YouTube argue in a motion urging U.S. District Court Judge Yvonne Gonzalez
Rogers in the Northern District of California to throw out the lawsuit at an early stage.
The platforms' argument comes in a class-action lawsuit brought by dozens of parents, teens
and others who claim Facebook, Instagram, TikTok, Snapchat and YouTube harmed minors' mental health.
The complaint is largely based on the theory that tech platforms designed their services to
be addictive, and then served minors with potentially harmful content that had been posted by other users.
Among other claims, the families allege that social-media companies are responsible
under “products liability” theories -- meaning that their products were dangerously defective due to their addictive design features.
Meta and the other companies now argue that the
lawsuit is barred by both Section 230 of the Communications Decency Act -- which immunizes web services from lawsuits over material posted by users -- and the First Amendment. (The platforms
previously urged Rogers to dismiss the case on the grounds that the allegations, even if true, wouldn't support the claims in the lawsuit. That request is still pending.)
The platforms say in
their new papers, filed Wednesday, that even though the complaint references supposedly addictive features of social media -- including recommendation algorithms -- any alleged injuries are tied to
content posted by other users.
“Section 230 provides immunity for interactive computer service providers against claims ... that seek to hold them liable for third-party content on their
services and the manner in which that content is presented,” the platforms argue. “Labeling such claims as challenging design 'defects' does not remove them from Section 230 --
particularly where, as here, the alleged defects are inescapably linked to the publication of third-party content and plaintiffs’ alleged harms necessarily depend on the content they
viewed.”
The companies add that the First Amendment separately protects them from lawsuits over other users' speech.
“Courts have long eschewed efforts to impose liability
on publishers of movies, television shows, music, and video games based on their role in broadcasting speech that allegedly harms its audience,” they write.