Meta Platforms is urging the Supreme Court to leave in place a lower court ruling that dismissed a lawsuit by family members of Clementa Pinckney, who was murdered by Dylann Roof in
a racially motivated mass shooting at Mother Emanuel church in Charleston, South Carolina.
Pinckney's family is seeking to hold Meta responsible for the shooting, arguing that
its algorithms served Roof with racist and violent material.
Meta counters in papers filed Friday that a trial judge and the 4th Circuit Court of Appeals correctly determined that
Section 230 of the Communications Decency Act precludes the family's claims. That law immunizes web platforms from liability for publishing content posted by users.
"Section
230 bars any claim that seeks to impose liability based on a publisher’s traditional editorial functions, including how to arrange and sort third-party content," Meta argues.
"That rule applies with equal force when those functions are accomplished by an algorithm," the company adds.
Meta's Supreme Court filing comes in a 2022 lawsuit by Jennifer Pinckney, widow of Clementa Pinckney,
on behalf of her daughter. Clementa Pinckney -- a South Carolina state senator and pastor at Mother Emanuel -- was one of nine people killed by Roof on June 17, 2015.
The
family's complaint alleged that Roof “was shown so much white supremacist propaganda that he believed the heinous act he ultimately committed at Mother Emanuel was necessary to spark a race war
and save the white race.”
The complaint included claims that Facebook was defectively designed, and that the company was negligent.
U.S. District Court Judge Richard Gergel dismissed the case in September 2023, ruling that the claims were barred by Section 230.
A majority of the 4th Circuit panel upheld that ruling, writing that
the family's claims “are inextricably intertwined with Facebook’s role as a publisher of third-party content.”
“While there is widespread concern about
Facebook’s use of its algorithm to arrange and sort racist and hate-driven content, acts of arranging and sorting content are integral to the function of publishing,” Circuit Judge Barbara
Milano Keenan wrote in an opinion joined by Judge Albert Diaz.
The judges also said that even without Section 230, the family's allegations, if proven true, wouldn't show that the platform caused the shooting.
Pinckney's family recently asked the Supreme Court to review that ruling.
Meta is opposing the
request, arguing in papers filed Friday that the 4th Circuit ruling "represents an unremarkable application of settled Section 230 principles."
The tech platform also says that
any ruling regarding Section 230 wouldn't affect the outcome of the case, because the 4th Circuit also held that the allegations failed to show Meta caused the shooting.
At least one other federal
appellate court -- the 2nd Circuit Court of Appeals -- has ruled that Section
230 protected Facebook from liability over algorithmic recommendations.
In that matter, Facebook was sued by family members of people killed in Israel by terrorists, and one
survivor of an attack in that country. The plaintiffs alleged that Facebook wrongly allowed its platform to be used by terrorist organizations that sought to organize and recruit new members, and that
the company's algorithmic recommendations “helped introduce thousands of terrorists to one another, facilitating the development of global terror networks.”
A split 2nd Circuit panel sided with Facebook in 2019, ruling 2-1 that the use of algorithms to recommend third-party content doesn't deprive companies of Section 230 protections.
But last year, the 3rd Circuit Court of Appeals said Section 230 didn't protect TikTok from liability for its recommendations. The judges in that case said TikTok's algorithmic curation of users' speech is TikTok's own “expressive activity,” and therefore not protected by the statute.