Lawyers for teens and families who are suing social media platforms are pressing to proceed with claims that the platforms injured minors by designing their services to be addictive, then serving teens with potentially harmful content originally posted by other users.
In papers filed late Tuesday, counsel argues that the platforms aren't protected by either the First Amendment or Section 230 of the Communications Decency Act -- a 27-year-old media law that immunizes web companies from liability for users' posts.
“This case is about conduct, not content -- defects, not speech,” the attorneys write. “Defendants designed defective social media products that harm kids.”
The new filing comes in a sprawling class-action complaint brought by dozens of parents, teens and others against Facebook, Instagram, TikTok, Snapchat and YouTube.
The litigation dates to last June, when users in nine states alleged that Meta designed Facebook and Instagram in a way that posed a risk to young users' health. Numerous other users later brought similar allegations against Meta and the other platforms.
The complaints initially drew on former Facebook executive Frances Haugen's accusations that the company chose profits over safety by designing its services in ways that can harm users -- such as by promoting material associated with eating disorders to teen girls.
One of the original plaintiffs alleged that she began using Instagram when she was 11, and that her social media use “coincided with a steady, but severe, decline in her mental health.”
That plaintiff said she was later hospitalized for an eating disorder.
Last month, the platforms asked U.S. District Court Judge Yvonne Gonzalez Rogers in the Northern District of California to throw out the lawsuit at an early stage. Among other arguments, the companies said the lawsuit is barred by both Section 230 and the First Amendment.
In general, the First Amendment protects companies from lawsuits over lawful speech (including content that could be considered harmful, such as posts about eating disorders or drug use), while Section 230 specifically immunizes web companies from liability over material posted by users, including speech that's defamatory or otherwise unlawful.
The platforms argued that even though the complaint references recommendation algorithms and other supposedly addictive features of social media, any alleged injuries are tied to the content itself, including “videos about viral pranks and online challenges,” and “images that allegedly invite personal comparisons.”
Counsel for the families essentially counter that their claims aren't about content, but about how the platforms attempt to keep teens engaged.
“Plaintiffs take issue with how defendants’ algorithms feed on metrics derived from users’ interactions with content, rather than the information conveyed therein,” lawyers write in their new motion. “Such algorithms are not designed to curate information; they are designed to maximize the duration and intensity of children’s usage, untethered from that to which their attention is directed.”
Whether the claims will be viewed as centering on speech or conduct likely will be key to the outcome of the case, according to Santa Clara University law professor Eric Goldman, who has written extensively about Section 230.
“If the court characterizes the plaintiffs' claims as fundamentally about speech, I think they're going to lose,” he tells MediaPost. “If the court characterizes the claims as about conduct, I think the platforms are in deep trouble.”
Goldman says he views the case as centering on content, and for that reason believes the platforms should be protected by Section 230.
“Section 230 is the proper exit for the case, because so many of the claims are about trying to hold social media companies liable for third-party content,” he says.
He adds that the core argument that platforms are addictive is linked to their content, given that what users allegedly are addicted to is other people's speech.