Google, Twitter and Facebook are asking an appellate court to reconsider a recent decision to revive claims that they aided and abetted the growth of ISIS by allowing terrorists to use the platforms.
“The panel’s decision is not just wrong. It threatens major harm to ordinary businesses providing standardized goods or services to the general public,” the companies argue in papers filed with the 9th Circuit Court of Appeals.
The papers come in a dispute dating to 2017, when the family of Jordanian citizen Nawras Alassaf -- who was killed in a terrorist bombing in Istanbul -- sued the companies for allegedly aiding and abetting terrorism.
Among other claims, the family members said the companies violated the Anti-Terrorism Act, which enables people harmed by international terrorism to sue anyone who knowingly provides “substantial assistance” to foreign terrorist organizations.
A trial judge dismissed the lawsuit, but a three-judge panel of the 9th Circuit revived the case in June.
Those judges noted in the ruling that the family members specifically alleged that the companies provided services “that were central to ISIS’s growth and expansion.”
“Not every transaction with a designated terrorist organization will sufficiently state a claim for aiding-and-abetting liability,” the opinion states. “But given the facts alleged here, we conclude the ... plaintiffs adequately state a claim for aiding-and-abetting liability.”
Google, Twitter and Facebook now argue that the June ruling marks an “unprecedented approach” that goes against traditional principles regarding the meaning of “aiding and abetting.”
The companies add: “As a practical matter, the legal uncertainty it creates threatens ordinary business activities across numerous economic sectors.”
The tech companies also say that judges have rejected similar claims in at least a dozen other lawsuits.
“Until now, no court has allowed an [Anti-Terrorism Act] claim to proceed where the defendant provided only standardized services, common to billions of users, that supporters of a terrorist organization allegedly used to benefit that organization,” they wrote in papers filed Tuesday.
The companies are seeking either a rehearing by the same three-judge panel, or a rehearing before a larger, 11-judge panel of the circuit court.
The same three judges who revived the claims stemming from the Istanbul bombing also refused to reinstate lawsuits brought by victims and family members of victims of two other terrorist attacks -- one in Paris and the other in San Bernardino, California. The lawsuit stemming from the Paris attack was brought against Google only, while the one stemming from the San Bernardino shooting was brought against all three.
The victims in those cases are also urging the 9th Circuit to reconsider those decisions.
“The gravamen of the complaint is that YouTube knowingly aided ISIS by providing it with a method of privately communicating with current or potential terrorists or supporters,” the family members of Nohemi Gonzalez, who was killed in the Paris attack, argue. “Doing so through an on-line scheme is no different in principle than if YouTube gave untraceable burner cell phones to ISIS fighters.”
The 9th Circuit on Wednesday directed Google to respond to the argument.
In the past, tech companies have prevailed in lawsuits by victims of terrorist attacks. For instance, in 2018, the 9th Circuit Court of Appeals sided with Twitter in a lawsuit brought by family members of men killed in a 2015 shooting in Amman, Jordan. In that case, the judges said there was no evidence that Twitter caused the shooting.
The following year, the 6th Circuit Court of Appeals refused to allow victims of the 2016 shooting at the Pulse nightclub in Orlando, Florida, to proceed with a lawsuit against Google, Twitter and Facebook.
The 2nd Circuit Court of Appeals also sided with Facebook in a lawsuit brought by victims (and their families) of terrorist attacks that occurred between 2014 and 2016 in Israel.
In that matter, the appellate court ruled that Section 230 of the Communications Decency Act protects Facebook from liability for activity on the platform by users, including ones linked to terrorist groups. Last May, the Supreme Court declined to take up that case.