Court Narrows Section 230, Revives Suit Against TikTok Over Algorithms

In a ruling that could have a broad impact on social platforms, a federal appellate court said Section 230 of the Communications Decency Act didn't protect TikTok from liability for allegedly serving “blackout challenge” videos to a 10-year-old girl.

The decision, issued last week by a three-judge panel of the 3rd Circuit Court of Appeals, revives a lawsuit brought against TikTok by Tawainna Anderson, mother of Nylah Anderson, who died in December 2021 after attempting the “blackout challenge.” That challenge, which was then circulating on TikTok, showed people strangling themselves.

Tawainna Anderson raised several claims in a 2022 complaint against TikTok, including that its service is “dangerously defective” due to its algorithm.

U.S. District Court Judge Paul Diamond in the Eastern District of Pennsylvania dismissed the suit, ruling that TikTok was protected by Section 230 of the Communications Decency Act -- a 1996 law that immunizes web platforms from liability for publishing content created by users. 

Diamond wrote that Anderson's claims about TikTok were “inextricably linked” to its role as a publisher of content created by users.

“The wisdom of conferring such immunity is something properly taken up with Congress, not the courts,” he wrote.

Anderson then appealed to the 3rd Circuit, which revived the case in an opinion written by Circuit Judge Patty Shwartz and joined by Circuit Judge Peter Phipps. Circuit Judge Paul Matey concurred in part in a separate opinion.

The majority held that TikTok's algorithmic curation of users' speech is TikTok's own “expressive activity,” and therefore not protected by Section 230.

Some industry observers and legal experts say the ruling eviscerates Section 230 -- at least in the 3rd Circuit.

“The majority says that if the First Amendment protects algorithms -- which it does -- and a service uses algorithms to 'curate' third-party content -- which it must -- then the service qualifies for First Amendment protection but not Section 230 protection because there is no longer any 'third-party content.' Ergo, Section 230 never will work in the Third Circuit,” Santa Clara University law professor Eric Goldman wrote of the opinion.

Daphne Keller of Stanford's Cyber Policy Center adds that Section 230 “would be a nullity” if it didn't apply to platforms' “constitutionally protected exercise of editorial freedoms.”

Techdirt's Mike Masnick likewise says the ruling “takes a wrecking ball to 230.”

“The implications are staggering if this ruling stands,” he wrote.

The 3rd Circuit's decision comes as TikTok, Meta and other social platforms face a class action brought by hundreds of teens and their families over the companies' algorithms. Among other claims, the plaintiffs allege that the platforms design their services to be addictive, and then serve minors potentially harmful material -- such as filtered photos that promote unrealistic beauty standards.

The new ruling conflicts with Force v. Facebook, a 2019 decision by a different appellate court -- the 2nd Circuit Court of Appeals -- which said web companies don't lose Section 230 immunity by algorithmically promoting content.

In that matter, the appellate court said Facebook was immune from liability for allegedly using algorithms to recommend content that facilitated terror networks.

The appellate judges said at the time that websites typically recommend third-party content to users -- such as by prominently featuring it on a home page, or displaying English-language articles to users who speak English. The use of algorithms to make those recommendations doesn't make Facebook liable for the posts, the judges said.
