Google, TikTok Prevail In Battle Over Potentially Harmful Videos

Siding with Google and TikTok, a federal judge has thrown out a lawsuit alleging that the platforms failed to take down videos promoting dangerous activity.

The ruling, issued this week by U.S. District Court Judge Virginia DeMarchi in San Jose, California, came in a lawsuit brought by parents who alleged their children were harmed as a result of content on the platforms, and by the Becca Schmill Foundation -- created by the family of 18-year-old Rebecca Mann Schmill, who died of a fentanyl overdose after using social media to obtain drugs.

Among other allegations, the plaintiffs -- who described themselves as “modern-day champions and vigilantes” -- said they searched for and reported “choking videos and other harmful videos” to Google's YouTube and TikTok, but that their efforts were “unheeded, ignored, and arbitrarily dismissed” by the platforms.

Their complaint includes claims that Google and TikTok misrepresented that they remove content that violates their content policies. The plaintiffs also claim that the social media companies' platforms are dangerously defective, and that the companies negligently failed to protect users from an “unreasonable risk of harm.”


In February, DeMarchi dismissed a prior version of the complaint without prejudice -- meaning she allowed the plaintiffs to revise their claims and bring them again.

She ruled at the time that the allegations, even if proven true, wouldn't show that Google or TikTok offered a "defective" product.

“The crux of plaintiffs’ allegations is that the defendants’ reporting systems are defective because plaintiffs’ reports do not produce the outcomes that plaintiffs believe they should -- i.e. removal of the reported videos,” DeMarchi wrote in February. “Such allegations fail to state a claim under products liability law.”

She also ruled that the products liability claims were barred by Section 230, noting that they were premised on the idea that Google and TikTok should have removed videos posted by third parties.

DeMarchi additionally said in the earlier ruling that the allegations in the complaint weren't specific enough to support a finding that Google and TikTok misrepresented their content-moderation policies.

“The complaint does not identify any specific video that contained prohibited conduct and was not removed once it was determined to violate a defendant’s guidelines,” she wrote.

The plaintiffs filed an amended complaint in March, but DeMarchi ruled this week that the amended complaint still lacked the kinds of allegations that would warrant further proceedings.

She said the most significant change was that the new complaint alleged that Google and TikTok relied on "automated reporting tools" to review content flagged by users, and that those tools weren't capable of actually reviewing the flagged content against the companies' guidelines.

"Plaintiffs allege that the automated reporting tools are defective because they are 'not capable of conducting a review of the harmful content against the defendants’ community guidelines,'" she wrote.

But she added that even with those new allegations, the plaintiffs' case boiled down to the argument that Google's and TikTok's tools are defective because the companies don't always remove videos that have been reported for alleged violations of content moderation policies.

"As the court concluded in its prior order, the alleged 'defect' is not 'content-agnostic,' but instead reflects a disagreement about 'ideas, content, and free expression upon which products liability claims cannot be based,'" DeMarchi wrote, quoting from a separate opinion regarding social media companies.

The dismissal was with prejudice, meaning the plaintiffs can't attempt to move forward again with their claims unless an appellate court intervenes.
