Court Won't Revive Lawsuit Against Meta Over Rohingya Genocide

Siding with Meta Platforms, a federal appellate court on Tuesday refused to revive a lawsuit alleging that the company helped fuel the Rohingya genocide in Myanmar.

The ruling, issued by a three-judge panel of the 9th Circuit Court of Appeals, grew out of a class-action complaint brought in 2021 by a refugee who fled Myanmar after her father was detained by the military in 2012. Proceeding anonymously, she alleged that Meta's algorithms spread anti-Rohingya sentiment by promoting extremist content, including hate speech about the Rohingya ethnic minority. A second anonymous plaintiff later joined the complaint.

The plaintiffs sued around three years after a Facebook-commissioned report concluded the company didn't do enough to prevent people from using the platform to incite violence in Myanmar.

The complaint included claims that Facebook introduced a "defective product" -- meaning its social networking service and recommendation algorithms -- into the marketplace, and that the platform was negligent.

U.S. District Court Judge Yvonne Gonzalez Rogers in the Northern District of California dismissed the complaint, ruling it fell outside a two-year statute of limitations.

The plaintiffs appealed to the 9th Circuit, which upheld the dismissal but for a different reason. The appellate judges said Meta was protected by Section 230 of the Communications Decency Act, which immunizes web companies from liability for publishing content posted by users.

Circuit Judge Ryan Nelson wrote for the panel that the allegations against Meta centered on the "content of third-party posts."

He added that claims regarding Meta's recommendations boiled down to allegations that the company algorithmically promoted posts because people engaged with them.

Meta's alleged use of that type of algorithm is an activity associated with publishing, and therefore protected by Section 230, Nelson wrote.

"The alleged defects relate to Facebook’s core design as a publishing platform, particularly how Facebook promoted or downplayed third-party posts using algorithms," Nelson wrote in an opinion joined by Judges William Fletcher and Marsha Berzon.

"Under our case law, matching users with content is publishing conduct, even when the user has not requested the content," Nelson added.

While the panel decision was unanimous, two of the three judges indicated they disagreed with the outcome and only sided with Meta due to prior 9th Circuit decisions.

"If not bound by Circuit precedent, I would hold that section 230 does not bar the claims raised against Meta in this case because 'websites’ use of machine-generated algorithms to recommend content and contacts are not within the publishing role immunized under section 230,'" Berzon wrote in a concurrence joined by Fletcher.

They called for the 9th Circuit to reconsider "en banc" -- meaning by a majority of the circuit's judges -- whether Section 230 immunity should cover recommendations.

Nelson also wrote a separate concurrence in which he urged fellow 9th Circuit judges to revisit whether Section 230 protects web companies that use a "personalized recommendation algorithm."

He opined that Section 230 should not apply when platforms use "modern" recommendation algorithms -- as opposed to the one Facebook allegedly used around 15 years ago in Myanmar.

"Modern recommendation algorithms are opaque, esoteric, and -- particularly when artificial intelligence enters the fray -- incomprehensible, sometimes even to their own designers," he wrote.

"Much of the matchmaking and network creation that modern algorithms engage in does not fit within any fair definition of publishing conduct," he added.

Jay Edelson, who represents the plaintiffs, suggested they could seek further review.

"We are reviewing the court decision and, of course, paid close attention to the court's invitation to see en banc review," he said. "We will make a decision in the coming weeks."