Court Urged To Revive Claims Against TikTok Over 'Blackout Challenge' Death

The mother of a 10-year-old girl who died after attempting the so-called “TikTok Blackout Challenge” on Monday asked a federal appellate court to revive a lawsuit against TikTok and its parent company, ByteDance.

Tawainna Anderson alleges that her daughter, Nylah Anderson, asphyxiated herself in December 2021 after imitating a video of the “Blackout Challenge.” That challenge, which was then circulating on TikTok, showed people strangling themselves.

Anderson says the social media platform algorithmically promoted the video to Nylah based on data about her, including her interests, “likes,” and other information.

TikTok “knew that Nylah Anderson was an impressionable 10-year-old minority female living in a working-class neighborhood who tended to watch challenge videos that were put in front of her and that she would then attempt to mimic the challenges and record it,” lawyers for her mother write in papers filed Monday with the 3rd Circuit Court of Appeals. “Rather than employ this knowledge to screen this 10-year-old from dangerous content, TikTok harnessed this knowledge against Nylah with the intention of keeping her engaged on the app.”

U.S. District Court Judge Paul Diamond in the Eastern District of Pennsylvania dismissed Anderson's complaint last year, ruling that Section 230 of the Communications Decency Act immunized TikTok from civil lawsuits based on content created by users.

Diamond said in his ruling that Anderson's claims about TikTok were “inextricably linked” to its role as a publisher of third-party content.

“Defendants did not create the challenge; rather, they made it readily available on their site,” Diamond wrote. “Defendants’ algorithm was a way to bring the challenge to the attention of those likely to be most interested in it.”

Anderson's lawyer is asking the 3rd Circuit to reverse that ruling, arguing that the judge interpreted Section 230 too broadly.

“Defendants’ recommendation algorithms are not the technology Congress intended to protect and foster through the [Communications Decency Act],” Anderson's lawyer argues.

The Supreme Court recently agreed to review two related disputes over whether Section 230 of the Communications Decency Act protects tech platforms for allegedly algorithmically promoting content linked to terrorism. The court is expected to hear arguments in those cases on February 21 and 22.
