Siding with Google and TikTok, a federal judge has dismissed a complaint alleging that the platforms failed to remove videos promoting dangerous activity.
The ruling, issued Monday by U.S. District Court Judge Virginia DeMarchi in San Jose, California, comes in a lawsuit brought by parents who alleged their children were harmed as a result of content on the platforms, and by the Becca Schmill Foundation -- created by the family of 18-year-old Rebecca Mann Schmill, who died of a fentanyl overdose after using social media to obtain drugs.
Among other allegations, the plaintiffs -- who described themselves as “modern-day champions and vigilantes” -- said they searched for and reported “choking videos and other harmful videos” to Google's YouTube and TikTok, but that their efforts were “unheeded, ignored, and arbitrarily dismissed” by the platforms.
Their complaint includes claims that Google and TikTok misrepresented that they remove content that violates their content policies.
The suit also includes claims that the companies' products -- meaning their social-media platforms -- are dangerously defective, and that the companies were negligent for failing to protect users from an “unreasonable risk of harm.”
Google and TikTok urged DeMarchi to dismiss the matter for several reasons, including that Section 230 of the Communications Decency Act protects companies from liability for material created by third parties.
“The essence of plaintiffs’ claims is their disagreement with defendants’ decisions about whether to publish third-party content: the supposed ‘defect’ or negligent act is failing to remove content,” the companies argued in papers filed last year.
The companies also contended that they couldn't be liable for allegedly offering dangerously defective “products,” arguing that such “products liability” claims can only be brought against companies that offer physical goods, such as cars or pharmaceuticals.
“Plaintiffs here do not challenge a tangible ‘product,’” the companies wrote. “Instead, they target defendants’ processes for reviewing and taking down reported videos. That is a quintessential service.”
DeMarchi gave several reasons for dismissing the claims.
Among other findings, she said the allegations, even if proven true, would not show that Google or TikTok offered a defective product.
“The crux of plaintiffs’ allegations is that the defendants’ reporting systems are defective because plaintiffs’ reports do not produce the outcomes that plaintiffs believe they should -- i.e. removal of the reported videos,” the judge wrote. “Such allegations fail to state a claim under products liability law.”
She also ruled that the products liability claims were barred by Section 230, noting that they were premised on the idea that Google and TikTok should have removed videos posted by third parties.
DeMarchi also ruled that the allegations in the complaint weren't specific enough to support a finding that Google and TikTok misrepresented their content-moderation policies.
“The complaint does not identify any specific video that contained prohibited conduct and was not removed once it was determined to violate a defendant’s guidelines,” she wrote.
The dismissal was without prejudice, meaning that the plaintiffs can reformulate their claims and bring them again.