In the last four years, victims of terrorist attacks have brought numerous lawsuits against Facebook, Google and Twitter for allegedly enabling terrorists by giving them a means of spreading propaganda, communicating with each other and organizing.
So far, the tech platforms have been winning in court. Trial judges have dismissed 12 of the lawsuits, and three different appellate courts have upheld those decisions. In some of the cases, judges have ruled that the tech platforms didn't cause the attacks; in others, judges have said the companies are protected from suit by Section 230 of the Communications Decency Act, which provides that online platforms aren't responsible for illegal material posted by users.
Facebook, like the other major platforms, has stepped up efforts to combat posts by terrorists in recent years.
But those efforts are largely voluntary, at least in the U.S. What's more, they don't resolve the lawsuits. Instead, questions surrounding the platforms' legal responsibility will likely only be settled for good if the Supreme Court weighs in. None of the battles have yet reached the country's highest court, but that could soon change.
Earlier this year, victims of terrorist attacks in Israel asked the Supreme Court to revive a lawsuit claiming that Facebook allowed its platform to be used by members of Hamas to communicate, organize and recruit new members. A trial judge dismissed the lawsuit, ruling that Facebook was immune from liability for crimes committed by users. Last July, a divided panel of the 2nd Circuit Court of Appeals upheld that decision, ruling that Section 230 immunizes Facebook.
This week, Facebook asked the Supreme Court to leave the prior ruling in place. Facebook says the 2nd Circuit decision “aligns with outcomes in every other circuit, which have in numerous cases rejected efforts to hold online service providers liable for the allegedly harmful effects of third-party content.”
This particular fight stems from a complaint brought by the estates and family members of four Americans killed in terrorist attacks in Israel -- Stuart Force, Yaakov Naftali Fraenkel, Chaya Zissel Braun and Richard Lakin -- and one survivor, Menachem Mendel Rivkin. All of the attacks occurred between 2014 and 2016.
The lawyers representing Force and the other Hamas victims previously argued that Facebook's use of algorithms for recommendations should strip the company of immunity. Specifically, the lawyers argued that the algorithms “helped introduce thousands of terrorists to one another, facilitating the development of global terror networks.”
Two judges on the 2nd Circuit's appellate panel disagreed. They said websites typically recommend third-party content to users -- such as by prominently featuring it on a home page, or displaying English-language articles to users who speak English. Using algorithms to make those recommendations doesn't subject Facebook to liability, the judges said.
A third judge, Robert Allen Katzmann, dissented. “Shielding internet companies that bring terrorists together using algorithms could leave dangerous activity unchecked,” he wrote. “Hamas is far from alone -- Hezbollah, Boko Haram, the Revolutionary Armed Forces of Colombia, and many other designated terrorist organizations use Facebook to recruit and rouse supporters.”
Katzmann also urged Congress to reconsider Section 230, in order “to better calibrate the circumstances where such immunization is appropriate and inappropriate.”
In its new papers, Facebook also suggests that terrorism victims would need an act of Congress in order to sue successfully.
“The decision below was correct and consistent with the decisions of every other court of appeals,” the social networking platform writes. “To the extent petitioners raise concerns about the policy consequences of Section 230, those concerns are better addressed to Congress.”