Platforms Defeat Suit Over Buffalo Mass Shooting

Siding with tech platforms, a New York state appeals court has dismissed a lawsuit by victims of a racially motivated mass shooting in Buffalo who alleged that YouTube, Reddit, Meta and other companies indoctrinated the shooter by showing him posts about the “replacement theory.”

In a 3-2 decision issued Friday, the appellate judges said social platforms are protected from liability by both the First Amendment, which gives platforms the right to wield editorial control over their content, and Section 230 of the Communications Decency Act, which immunizes platforms from lawsuits over speech posted by users.

"Section 230 immunity and First Amendment protection are not mutually exclusive, and in our view the social media defendants are protected by both," Justice Stephen Lindley of the 4th Department of the New York Supreme Court wrote in the majority opinion.

"The Internet is the modern public square ... and section 230 is the scaffolding upon which the Internet is built," Lindley added.

The new ruling, which reverses a decision handed down last year by Erie County Supreme Court Justice Paula Feroleto, grows out of litigation over the May 2022 mass shooting at Tops Friendly Markets in Buffalo by teenage white supremacist Payton Gendron.

Victims and family members alleged in a 2023 complaint that social platforms including YouTube and Reddit were “instrumental” in the shooting because they allegedly radicalized Gendron, and allowed him to learn about weaponry.

“Their unreasonably dangerous and negligent design choices resulted in the shooter’s addiction to their products, and caused him to develop the mentality required to target and kill Black people who were innocently shopping at their local market,” victims alleged in the complaint. “In addition to facilitating the shooter’s radicalization, the design of these social media platforms provided the shooter with knowledge regarding the tools, products, and skills he needed to commit the mass shooting at Tops.”

The plaintiffs added that Gendron's “near-constant use of social media” led him to believe in the racist “great replacement” conspiracy theory -- which holds that white people in the United States are being replaced by non-white immigrants.

Victims also alleged that Gendron became a “problematic user” of social media due to “dangerously defective and unreasonably dangerous algorithms powering Instagram, YouTube, and Snapchat.”

The tech companies urged Feroleto to throw out the case at an early stage for numerous reasons -- including that they were shielded from liability by the First Amendment and Section 230.

Feroleto ruled against the platforms, writing that the companies allegedly are "designed to be addictive to young users" and "specifically directed Gendron to further platforms or postings that indoctrinated him with 'white replacement theory.'"

The tech companies then appealed to the 4th Department -- the second-highest court in New York state.

Lindley wrote that Feroleto's opinion, if left in place, "would gut the immunity provisions of section 230 and result in the end of the Internet as we know it."

"This is so because internet service providers who use algorithms on their platforms would be subject to liability for all tort causes of action, including defamation," Lindley wrote. "Because social media companies that sort and display content would be subject to liability for every untruthful statement made on their platforms, the Internet would over time devolve into mere message boards."

He noted in the opinion that federal judges recently sided with Meta in a lawsuit over the racially motivated shooting in Charleston, South Carolina. In that case, family members of a victim sought to hold Meta responsible for algorithmically recommending content to shooter Dylann Roof that allegedly encouraged his racist and violent views.

Justices Tracey Bannister and Henry Nowak dissented. Among other reasons, they took the position that algorithmic recommendations are not protected by Section 230.

"We conclude that the targeted dissemination of particular information to individual end users does not amount to a traditional editorial or publishing decision that would fall within the ambit of section 230," they wrote.
