Handing Facebook a victory, a federal appellate court ruled the social media platform isn't responsible for terrorist attacks by Hamas.
In a 2-1 decision issued this week, the 2nd Circuit Court of Appeals said Section 230 of the Communications Decency Act protects Facebook from liability for posts by users, including people linked to terrorist groups.
This decision marks the third time a federal appellate court has ruled social media platforms aren't responsible for terrorist attacks. Last year, the 9th Circuit Court of Appeals sided with Twitter in a lawsuit brought by family members of men killed in a 2015 shooting in Amman, Jordan. In that case, the judges said there was no evidence that Twitter did anything to cause the shooting.
And in April, the 6th Circuit Court of Appeals refused to allow victims of the 2016 shooting at the Pulse nightclub in Orlando, Florida, to proceed with a lawsuit against Google, Twitter, and Facebook. The judges in that matter said there was no proof the social media companies caused the shooting.
The decision issued this week stemmed from a lawsuit by family members of four people killed by terrorists in Israel, and one survivor of an attack in that country. The attacks all occurred between 2014 and 2016. The people who sued alleged that Facebook wrongly allowed its platform to be used by terrorist organizations that sought to organize and recruit new members.
U.S. District Court Judge Nicholas Garaufis in the Eastern District of New York threw out the case two years ago, ruling that Facebook is immune from liability for activity by users.
The victims then appealed to the 2nd Circuit. Among other arguments, they said Facebook's use of algorithms for recommendations stripped the company of the protections of Section 230. They argued that the algorithms “helped introduce thousands of terrorists to one another, facilitating the development of global terror networks.”
A majority of the appellate panel disagreed that Facebook's alleged use of algorithms deprived the company of immunity. The judges said websites typically recommend third-party content to users -- even if only by prominently featuring it on a homepage, or displaying English-language articles to users who speak English. The use of algorithms -- as opposed to human editors -- to make those recommendations doesn't make Facebook liable for the posts, the judges said.
“Services have always decided, for example, where on their sites (or other digital property) particular third-party content should reside and to whom it should be shown. Placing certain third-party content on a homepage, for example, tends to recommend that content to users more than if it were located elsewhere on a website,” Circuit Judge Christopher Droney wrote in an opinion joined by Judge Richard Sullivan.
“Seen in this context, plaintiffs’ argument that Facebook’s algorithms uniquely form ‘connections’ or ‘matchmake’ is wrong,” the judges added. “That, again, has been a fundamental result of publishing third-party content on the Internet since its beginning.”
Chief Judge Robert Allen Katzmann disagreed with the majority.
“As is so often the case with new technologies, the very qualities that drive social media’s success -- its ease of use, open access, and ability to connect the world -- have also spawned its demons,” he wrote in a sweeping dissent that takes aim at social media companies broadly.
“Shielding internet companies that bring terrorists together using algorithms could leave dangerous activity unchecked,” he wrote. “Hamas is far from alone -- Hezbollah, Boko Haram, the Revolutionary Armed Forces of Colombia, and many other designated terrorist organizations use Facebook to recruit and rouse supporters.”
Katzmann added that a “hands-off approach to social media” has far-reaching consequences.
“Social media can be used by foreign governments to interfere in American elections,” he wrote, citing reports of Russian meddling in the 2016 presidential election. “Widening the aperture further, malefactors at home and abroad can manipulate social media to promote extremism.”
Katzmann's lengthy dissent could encourage plaintiffs in other cases, but whether it will persuade other judges is a different story.
Santa Clara University law professor Eric Goldman, who has written extensively about Section 230, says Katzmann's opinion is an outlier.
“The dissent leaves bread crumbs for plaintiffs to follow,” he tells MediaPost.
But he adds that he doesn't anticipate Katzmann's view gaining traction with other judges.
“My expectation is that the dissent will just be forgotten over time, that it won't really shift the discussion,” he says.
The large social media platforms have recently taken steps to combat posts by terrorists. In 2017, Facebook, Microsoft, Twitter, and YouTube launched the Global Internet Forum to Counter Terrorism, which aims to both remove extremist speech and counter terrorist propaganda with other points of view.