Shooting Victim's Family Asks SCOTUS To Revive Claims Against Meta

Family members of Clementa Pinckney, who was killed in a racially motivated mass shooting by Dylann Roof, want the Supreme Court to revive a lawsuit against Meta Platforms, which allegedly recommended material to Roof that encouraged his racist and violent views.

Earlier this year, the 4th Circuit Court of Appeals said in a 2-1 ruling that Section 230 of the Communications Decency Act protected Meta from liability for Facebook recommendations. That law immunizes web companies from liability for publishing content posted by users.

The panel also said that even without Section 230, the allegations against Facebook, if proven true, wouldn't show that the platform caused the shooting.

Pinckney's family argues in a petition quietly filed last week with the Supreme Court that Section 230 doesn't protect social platforms from lawsuits over recommendations.

“Neither Section 230’s text nor its history suggests that Meta should be immune ... for its own choices to manipulate users by recommending the most damaging content possible,” lawyers for the family argue. “And neither Section 230’s text nor its history provides immunity for suggesting that a person join a group of white supremacists.”

The lower court's ruling came in a 2022 lawsuit brought by Jennifer Pinckney, widow of Clementa Pinckney, on behalf of her daughter, identified in court papers as “M.P.” The girl's father -- a South Carolina state senator and pastor at Mother Emanuel church -- was one of nine people killed by Roof on June 17, 2015.

The complaint alleged that Roof “was shown so much white supremacist propaganda that he believed the heinous act he ultimately committed at Mother Emanuel was necessary to spark a race war and save the white race.”

Pinckney alleged that Facebook was defectively designed and that the company was negligent.

U.S. District Court Judge Richard Gergel dismissed the case in September 2023, ruling that the claims were barred by Section 230.

Pinckney then appealed to the 4th Circuit, arguing that Section 230 shouldn't immunize web companies from liability in this situation.

The appellate judges rejected that argument, writing that the claims in the complaint “are inextricably intertwined with Facebook’s role as a publisher of third-party content.”

“While there is widespread concern about Facebook’s use of its algorithm to arrange and sort racist and hate-driven content, acts of arranging and sorting content are integral to the function of publishing,” Circuit Judge Barbara Milano Keenan wrote in an opinion joined by Judge Albert Diaz.

Circuit Judge Allison Jones Rushing partially dissented, noting that one of the allegations against Facebook was that it recommended Roof join extremist groups.

“Recommending that a user join a group, connect with another user, or attend an event is Facebook’s own speech, for which it can be held liable,” she wrote, adding that she would have returned the matter to the district court for further proceedings.

Keenan countered in the majority opinion that the complaint didn't allege that Facebook recommended that Roof “join a specific hate group” or that Roof joined a hate group based on a “Facebook algorithm referral.”

The 4th Circuit isn't the only appellate court to clear Meta of liability over recommendations. In 2019, the 2nd Circuit Court of Appeals also ruled that Section 230 protected Facebook from liability over algorithmic recommendations.

In that matter, Facebook was sued by family members of people killed in Israel by terrorists, along with one survivor of an attack in that country. The plaintiffs alleged that Facebook wrongly allowed its platform to be used by terrorist organizations that sought to organize and recruit new members, and that the company's algorithmic recommendations “helped introduce thousands of terrorists to one another, facilitating the development of global terror networks.”

A split 2nd Circuit panel sided with Facebook, ruling 2-1 that the use of algorithms to recommend third-party content doesn't deprive companies of Section 230 protections.

But last year, the 3rd Circuit Court of Appeals said Section 230 didn't protect TikTok from liability over recommendations. The judges in that case said TikTok's algorithmic curation of users' speech is TikTok's own “expressive activity,” and therefore not protected by Section 230.
