
Siding with social platform TikTok, a federal judge has dismissed a lawsuit against the company brought by the mother of a 10-year-old girl who died after attempting a “blackout challenge” she saw on the service.
In a decision issued Tuesday,
U.S. District Court Judge Paul S. Diamond in the Eastern District of Pennsylvania said TikTok was protected by Section 230 of the Communications Decency Act. That law immunizes web platforms from
liability for publishing content created by users.
Diamond wrote that despite the tragic circumstances, he was “compelled to rule” that TikTok is protected by Section 230, given that the lawsuit sought to hold the company responsible for content created by others.
The decision stems from a lawsuit filed in May by Nylah Anderson's mother, Tawainna Anderson, against TikTok
and parent company ByteDance.
She alleged that Nylah died in December after imitating the “blackout challenge,” which involved self-strangulation. The challenge was circulating in TikTok videos at the time and was allegedly promoted to Nylah by the app's algorithm.
Anderson raised several claims, including that TikTok's service is “dangerously defective” due to its algorithm.
“The TikTok defendants’ app and algorithm are intentionally designed to maximize user engagement and dependence and powerfully encourage children to engage in a
repetitive and dopamine-driven feedback loop by watching, sharing, and attempting viral challenges and other videos,” she alleged. “TikTok is programming children for the sake of corporate
profits and promoting addiction.”
Diamond said in his ruling that Anderson's claims about TikTok were “inextricably linked” to its role as a publisher of third-party
content.
“Nylah Anderson’s death was caused by her attempt to take up the ‘Blackout Challenge,’” he wrote. “Defendants did not create the challenge; rather, they made it readily available on their site. Defendants’ algorithm was a way to bring the challenge to the attention of those likely to be most interested in it.”
He added that TikTok acted as a publisher by promoting the videos, and was therefore protected by Section 230.
“The wisdom of conferring such immunity is something properly taken up with Congress, not the
courts,” he added.
Several years ago, the 2nd Circuit Court of Appeals likewise ruled that web companies don't lose Section 230 immunity by
algorithmically promoting content.
In that matter, the appellate court said Facebook was immune from liability for allegedly using algorithms to recommend content that facilitated terror
networks.
The appellate judges said at the time that websites typically recommend third-party content to users, such as by prominently featuring it on a home page or displaying English-language articles to users who speak English. The use of algorithms to make those recommendations doesn't make Facebook liable for the posts, the judges said.
Diamond's ruling in favor of TikTok comes as Facebook and Instagram face lawsuits alleging that they designed their services to be addictive and served potentially harmful content to teens and children.
Meta Platforms is expected to argue in those cases that it is protected by Section
230.