IAB, Others Urge Court To Reconsider Ruling That Curbed Section 230

The Interactive Advertising Bureau and other business organizations are urging a federal appellate court to reconsider a recent ruling that significantly curbed the protections of Section 230 of the Communications Decency Act.

That law, dating to 1996, generally immunizes web companies from liability for distributing user-generated content.

In August, a three-judge panel of the 3rd Circuit Court of Appeals said in a surprise decision that Section 230 didn't protect TikTok from liability for allegedly showing user-created “blackout challenge” videos to a 10-year-old girl. That decision revived a lawsuit brought by Tawainna Anderson, mother of Nylah Anderson, who died in December 2021 after attempting the challenge, which involved people strangling themselves.

That ruling against TikTok “destabilizes established law in ways that threaten profound consequences for countless websites across the Internet -- and their users,” the IAB and others, including the tech industry organizations NetChoice and Chamber of Progress, argue in a friend-of-the-court brief filed Tuesday with the 3rd Circuit Court of Appeals.


“If allowed to stand, the panel’s decision would create substantial uncertainty for every website on the Internet that disseminates user-created speech,” the organizations write.

The legal battle dates to 2022, when Tawainna Anderson sued TikTok, alleging that its service is “dangerously defective” due to its algorithm. U.S. District Court Judge Paul Diamond in the Eastern District of Pennsylvania dismissed the suit, ruling that TikTok was protected by Section 230.

Anderson then appealed to the 3rd Circuit, which revived the case in an opinion written by Circuit Judge Patty Shwartz and joined by Circuit Judge Peter Phipps. Circuit Judge Paul Matey partially concurred in a separate opinion.

The majority held that TikTok's algorithmic curation of users' speech is TikTok's own “expressive activity,” and therefore not protected by Section 230.

Some industry observers and legal experts said at the time that the ruling eviscerates Section 230 -- at least in jurisdictions within the 3rd Circuit, which covers Pennsylvania, Delaware and New Jersey.

TikTok recently urged the court to reconsider the decision.

The IAB and other business groups are now backing that request. They argue that the appellate panel misinterpreted Section 230 by holding that it doesn't apply when companies exercise their First Amendment right to curate user-generated content.

“At bottom, the First Amendment and Section 230 are complementary, not mutually exclusive,” the groups write. “Websites’ First Amendment rights do not somehow negate Section 230’s statutory protections.”

Six digital rights groups including the Electronic Frontier Foundation, Center for Democracy & Technology and Public Knowledge are separately urging the court to reconsider its ruling. They also argue that the panel incorrectly concluded that First Amendment protections and Section 230 protections are mutually exclusive.

“Online platforms like TikTok may have both Section 230(c)(1) immunity against claims based on harmful user-generated content -- here, the Blackout Challenge videos posted by TikTok users -- and First Amendment protection for their editorial decisions around whether and how to display that user-generated content, including recommendations (whether effectuated by algorithm or otherwise),” the groups write. “The panel erred in suggesting that these two protections are mutually exclusive.”

“TikTok had nothing to do with creating the Blackout Challenge videos -- they were wholly produced and posted by TikTok users, and it was the content of the videos that led to the tragic death of plaintiff’s daughter in this case,” the groups write.

They add that depriving platforms of Section 230 protections for recommendations would give them an incentive to stop curating user-generated content “in any helpful way, beyond perhaps listing it in reverse chronological order.”

“The end of niche curation would harm internet users as recommendations help people find content relevant to their interests and connect with others in what is otherwise a virtually unnavigable sea of content and people,” the organizations argue.

Many of the same arguments came up in last year's battle at the Supreme Court over Google's potential liability for allegedly recommending terrorist videos.

In that matter, the family of Nohemi Gonzalez, a California State University student who was killed in a terror attack in Paris, sued Google over the alleged recommendations.

The 9th Circuit ruled that Section 230 immunizes Google from liability for users' content, as well as for recommending users' posts. The family appealed to the Supreme Court, which heard arguments on the issue, but ultimately declined to rule on the extent of Section 230's protections for algorithmic recommendations.
