Court Won't Revive Claims Against Meta Over South Carolina Mass Shooting

Siding with Meta Platforms, a federal appeals court refused to revive a lawsuit that sought to hold the company responsible for the 2015 racially motivated mass shooting by Dylann Roof in Charleston, South Carolina.

In an opinion issued Wednesday, a panel of the 4th Circuit Court of Appeals ruled 2-1 that Section 230 of the Communications Decency Act protected Meta from claims that Facebook's algorithmic recommendations radicalized Roof by steering him toward content that encouraged his racist and violent views. That law immunizes web companies from liability for publishing content posted by users.

The panel also said that even without Section 230, the allegations against Facebook, if proven true, wouldn't show that the platform caused the shooting.

The ruling comes in a lawsuit brought in 2022 by Jennifer Pinckney on behalf of her daughter, identified in court papers as “M.P.”

M.P.'s father, Clementa Pinckney, a state senator and pastor at Mother Emanuel church, was among the nine people killed by Roof.

Pinckney alleged that Roof “was shown so much white supremacist propaganda that he believed the heinous act he ultimately committed at Mother Emanuel was necessary to spark a race war and save the white race.”

Her complaint included claims that Facebook was defectively designed and that the company was negligent. (She also sued other defendants based in Russia, including the propaganda shop Internet Research Agency; those claims are still pending.)

U.S. District Court Judge Richard Gergel dismissed the claims against Meta in September 2023, ruling that the company was protected by Section 230.

Pinckney then appealed to the 4th Circuit, arguing that Section 230 doesn't immunize web companies from liability for recommendations.

“This action seeks to hold Meta liable for its own conduct in designing a platform to elicit 'emotional contagion' because 'emotional contagion' is good for Meta’s bottom line,” she argued in a brief filed in January 2024.

The appellate judges rejected that argument, writing that the claims in the complaint “are inextricably intertwined with Facebook’s role as a publisher of third-party content.”

“While there is widespread concern about Facebook’s use of its algorithm to arrange and sort racist and hate-driven content, acts of arranging and sorting content are integral to the function of publishing,” Circuit Judge Barbara Milano Keenan wrote in an opinion joined by Judge Albert Diaz.

“For instance, newspaper editors choose what articles merit inclusion on their front page and what opinion pieces to place opposite the editorial page,” Keenan added. “These decisions, like Facebook’s decision to recommend certain third-party content to specific users, have as a goal increasing consumer engagement.”

“Decisions about whether and how to display certain information provided by third parties are traditional editorial functions of publishers,” she added.

Circuit Judge Allison Jones Rushing partially dissented, noting that one of the allegations against Facebook was that it recommended Roof join extremist groups.

“Recommending that a user join a group, connect with another user, or attend an event is Facebook’s own speech, for which it can be held liable,” she wrote, adding that she would have returned the matter to the district court for further proceedings.

Keenan countered in a footnote that the complaint didn't allege that Facebook recommended that Roof “join a specific hate group” or that Roof joined a hate group based on a “Facebook algorithm referral.”

In 2019, another appellate court, the 2nd Circuit Court of Appeals, also ruled that Section 230 protected Facebook from liability over algorithmic recommendations.

In that matter, Facebook was sued by family members of people killed in Israel by terrorists, along with one survivor of an attack in that country. The plaintiffs alleged that Facebook wrongly allowed its platform to be used by terrorist organizations that sought to organize and recruit new members, and that the company's algorithmic recommendations “helped introduce thousands of terrorists to one another, facilitating the development of global terror networks.”

A split 2nd Circuit panel sided with Facebook, ruling 2-1 that the use of algorithms to recommend third-party content doesn't strip companies of Section 230 protections.

But last year, a different court, the 3rd Circuit Court of Appeals, ruled that Section 230 didn't protect TikTok from liability over recommendations. The judges in that case said TikTok's algorithmic curation of users' speech is TikTok's own “expressive activity,” and therefore not protected by Section 230.

TikTok plans to ask the Supreme Court to review that decision.