
Meta Platforms and TikTok are asking an appellate court to
dismiss claims that social media platforms harm teens, arguing that Section 230 of the Communications Decency Act prevents the claims from moving forward.
In papers filed this week with the
9th Circuit Court of Appeals, Meta argues that the “plain text” of Section 230, as well as prior court decisions, makes clear that the sprawling litigation shouldn't move forward.
“As this Court has long recognized, Congress’s intent in enacting Section 230 was to promote the continued development of the Internet by removing the chilling effect that even the
threat of suit would have on the publication of third-party content,” Meta writes in a brief joined by TikTok. “To effectuate its policy objectives -- which are set forth expressly in the
statute itself -- Section 230 provides interactive computer service providers with both immunity from liability and immunity from suit arising from their publication of third-party content.”
Section 230, which dates to 1996, provides that web companies are generally immune from liability for posts by users -- even when those posts could cause harm, such as by encouraging eating
disorders. The statute itself -- which law professor Jeff Kosseff famously called “The Twenty-Six Words That Created the Internet” -- reads:
“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content
provider.”
(The First Amendment separately protects speech that's potentially harmful but not in itself illegal.)
Meta's new papers come in legal proceedings dating to June
2022, when teens and their families sued the company for allegedly designing Facebook and Instagram to be “addictive,” and then serving teens with potentially harmful content, such as
filtered photos that promote unrealistic aesthetic standards.
Additional families later sued Meta and other platforms, including Google, X, TikTok and Snap. State attorneys general, school districts and local towns subsequently brought similar suits.
The plaintiffs raised numerous theories, including that the platforms' recommendation algorithms were dangerous, that the platforms failed to warn users about the risk of addiction, and that they failed to
notify users when photos had been filtered.
U.S. District Court Judge Yvonne Gonzalez Rogers in the Northern District of California ruled that Section 230 and the First Amendment barred some claims, but not all
of them. For instance, she said Section 230 precluded claims over the platforms' use of recommendation algorithms, but not claims that the platforms failed to warn users about the risks of addiction,
or that some photos had been filtered.
Meta now argues to the 9th Circuit that those “failure to warn” claims should also be dismissed, contending they're “functionally
identical” to ones Rogers threw out.
“Where Section 230 bars claims that would hold Meta liable for third-party content based on certain of its publishing tools, duty to warn
claims challenging the same publication tools are also barred,” the company writes.
“Any other result would allow plaintiffs to simply plead around Section 230 even where their
claims directly target publishing activity and functionally identical claims are barred.”
The plaintiffs are expected to respond to Meta's arguments later this spring.