Ruling That Narrowed Section 230 'Threatens Chaos,' TikTok Argues

TikTok on Tuesday asked a federal appellate court to reconsider its recent decision that Section 230 of the Communications Decency Act didn't protect the company from liability for allegedly serving “blackout challenge” videos to a 10-year-old girl.

That ruling, issued last month, “undercuts a central building block of the modern internet and threatens chaos across the industry,” TikTok says in papers filed with the 3rd Circuit Court of Appeals.

The company, now represented by former U.S. Solicitor General Paul Clement, is asking the 3rd Circuit to reconsider its decision to revive a lawsuit brought by Tawainna Anderson, mother of Nylah Anderson, who died in December 2021 after attempting the “blackout challenge.” That challenge, which was then circulating on TikTok, showed people strangling themselves.

Tawainna Anderson raised several claims in a 2022 complaint against TikTok, including that its service is “dangerously defective” due to its algorithm.

U.S. District Court Judge Paul Diamond in the Eastern District of Pennsylvania dismissed the suit, ruling that TikTok was protected by Section 230 -- a 1996 law that immunizes web platforms from liability for publishing content created by users. 

Diamond wrote that Anderson's claims about TikTok were “inextricably linked” to its role as a publisher of content created by users.

Anderson then appealed to the 3rd Circuit, which revived the case in an opinion written by Circuit Judge Patty Shwartz and joined by Circuit Judge Peter Phipps. Circuit Judge Paul Matey partially concurred in a separate opinion.

The majority held that TikTok's algorithmic curation of users' speech is TikTok's own “expressive activity,” and therefore not protected by Section 230.

Some industry observers and legal experts said at the time that the ruling eviscerates Section 230 -- at least in the 3rd Circuit.

“The majority says that if the First Amendment protects algorithms -- which it does -- and a service uses algorithms to 'curate' third-party content -- which it must -- then the service qualifies for First Amendment protection but not Section 230 protection because there is no longer any 'third-party content.' Ergo, Section 230 never will work in the Third Circuit,” Santa Clara University law professor Eric Goldman wrote about the opinion.

TikTok argues in its request for a new hearing that the panel's September decision is inconsistent with rulings by other appellate courts.

“For the better part of three decades, every federal court of appeals to consider the issue held that Section 230 bars lawsuits that seek to hold websites liable for their decisions about which third-party content to display to their users, including decisions about how to organize that content and which materials to prioritize or recommend,” the company writes.

“That universal understanding of Section 230 has been pivotal to the development of the modern internet,” TikTok adds.

TikTok cites several examples of contradictory rulings, including one issued several years ago by the 2nd Circuit Court of Appeals, which held that web companies don't lose Section 230 immunity by algorithmically promoting content. The appellate court said in that matter that Facebook was immune from liability for allegedly using algorithms to recommend content that facilitated terror networks.
