Supreme Court Won't Revive Terror Victims' Lawsuit Against Facebook

The Supreme Court on Monday turned away victims of terrorist attacks in Israel who wanted to revive a lawsuit accusing Facebook of allowing its platform to be used by members of Hamas to communicate, organize and recruit new members.

The order denying review, issued without comment, leaves in place a lower-court ruling that Section 230 of the Communications Decency Act protects Facebook from liability for activity on the platform by users, including ones linked to terrorist groups.

The battle stemmed from a lawsuit brought by the estates and family members of four Americans killed in terrorist attacks in Israel between 2014 and 2016 (Stuart Force, Yaakov Naftali Fraenkel, Chaya Zissel Braun, Richard Lakin) and one survivor (Menachem Mendel Rivkin).

Their lawsuit is one of numerous cases brought by terrorism victims against social media companies. The victims have tended to argue that the web companies assist terrorists by giving them a communications platform, which they use to recruit people and plan attacks.

To date, the courts have ruled in favor of social media companies. In 2018, the 9th Circuit Court of Appeals sided with Twitter in a lawsuit brought by family members of men killed in a 2015 shooting in Amman, Jordan. In that case, the judges said there was no evidence that Twitter caused the shooting.

Last April, the 6th Circuit Court of Appeals refused to allow victims of the 2016 shooting at the Pulse nightclub in Orlando, Florida, to proceed with a lawsuit against Google, Twitter and Facebook. The judges in that matter said there was no proof that the social media companies caused the shooting.

Several similar lawsuits remain pending in the 9th Circuit Court of Appeals.

In the case rejected Monday, attorneys for the victims of the Hamas attacks had argued that Facebook's use of algorithms to make recommendations should strip the company of immunity.

A majority of the 2nd Circuit appellate panel disagreed. In a ruling issued last July, the appellate judges said websites typically recommend third-party content to users -- such as by prominently featuring it on a homepage, or displaying English-language articles to users who speak English.

The use of algorithms to make those recommendations doesn't make Facebook liable for the posts, Circuit Judge Christopher Droney wrote in an opinion joined by Judge Richard Sullivan.

The 2nd Circuit's chief judge, Robert Allen Katzmann, dissented.

“Shielding internet companies that bring terrorists together using algorithms could leave dangerous activity unchecked,” he wrote at the time. “Hamas is far from alone -- Hezbollah, Boko Haram, the Revolutionary Armed Forces of Colombia, and many other designated terrorist organizations use Facebook to recruit and rouse supporters.”
