The Supreme Court this week declined to review a lower court ruling that dismissed a lawsuit by family members of Clementa Pinckney, who was murdered by Dylann Roof in a racially
motivated mass shooting at Mother Emanuel church in Charleston, South Carolina.
As is customary, the court didn't give a reason for its refusal to hear the case.
The move leaves in place the 4th Circuit Court of Appeals' decision that Section 230 of the Communications Decency Act protected Meta from the family's claims. That law immunizes web
platforms from liability for publishing content posted by users.
The order denying review brings an end to a lawsuit brought in 2022 by Jennifer Pinckney, widow of Clementa
Pinckney, on behalf of her daughter.
Clementa Pinckney -- a South Carolina state senator and pastor at Mother Emanuel -- was one of nine people killed by Roof on June 17,
2015.
The family's complaint alleged that Roof “was shown so much white supremacist propaganda that he believed the heinous act he ultimately committed at Mother Emanuel
was necessary to spark a race war and save the white race.”
The complaint included claims that Facebook's platform was defectively designed, and that the company was negligent.
U.S. District Court Judge Richard Mark Gergel dismissed the case in September 2023, ruling that the claims were barred by Section 230.
A majority of the 4th
Circuit panel upheld that ruling, writing that the family's claims “are inextricably intertwined with Facebook’s role as a publisher of third-party content.”
“While there is widespread concern about Facebook’s use of its algorithm to arrange and sort racist and hate-driven content, acts of arranging and sorting content are
integral to the function of publishing,” Circuit Judge Barbara Milano Keenan wrote in an opinion joined by Judge Albert Diaz.
The judges also said that even without
Section 230, the allegations against Facebook, if proven true, wouldn't show that the platform caused the shooting.
At least one other federal appellate court -- the 2nd
Circuit Court of Appeals -- has also ruled that Section 230 protected
Facebook from liability over algorithmic recommendations.
In that matter, Facebook was sued by family members of people killed in Israel by terrorists, and one survivor of an
attack in that country. The plaintiffs alleged that Facebook wrongly allowed its platform to be used by terrorist organizations who sought to organize and recruit new members, and that the company's
algorithmic recommendations “helped introduce thousands of terrorists to one another, facilitating the development of global terror networks.”
A split 2nd Circuit
panel sided with Facebook in 2019, ruling 2-1 that the use of algorithms to recommend third-party content doesn't deprive companies of Section 230 protections.
But last year, the
3rd Circuit Court of Appeals said Section 230 didn't protect TikTok
from liability for recommendations. The judges in that case said TikTok's algorithmic curation of users' speech is TikTok's own “expressive activity,” and therefore not protected by
Section 230.