In November of 2015, 23-year-old Nohemi Gonzalez, a California State University student who was studying abroad in France, was killed in a terrorist attack in Paris.
Seven months later, her family sued Google, Facebook and Twitter, claiming that the companies enabled the growth of the terrorist organization ISIS. The family later dropped claims against Facebook and Twitter, but not Google.
In an amended lawsuit filed in November of 2017, the family alleged that Google not only allowed ISIS to post videos to YouTube, but also “assists ISIS in spreading its message” by recommending its videos to users.
Google countered that it was protected by Section 230 of the Communications Decency Act, which immunizes web companies from liability for material posted by users.
A trial judge dismissed the lawsuit, and the 9th Circuit Court of Appeals refused to revive the family's case.
The family is now asking the Supreme Court to breathe new life into the lawsuit.
In papers filed this week, attorneys for Gonzalez's family argue that Section 230 shouldn't protect companies from claims stemming from recommendations made to social media users.
“The defendants are alleged to have recommended that users view inflammatory videos created by ISIS, videos which played a key role in recruiting fighters to join ISIS in its subjugation of a large area of the Middle East, and to commit terrorist acts in their home countries,” lawyers for the family write in their petition to the Supreme Court.
“Application of section 230 to such recommendations removes all civil liability incentives for interactive computer services to eschew recommending such harmful materials, and denies redress to victims who could have shown that those recommendations had caused their injuries, or the deaths of their loved ones," the petition continues.
This request for a hearing comes two years after the Supreme Court refused to intervene in a similar dispute involving Facebook.
In that matter, attorneys for victims of terrorist attacks in Israel argued that Facebook's alleged use of algorithms to recommend content perceived as encouraging terrorism should strip the company of immunity.
The 2nd Circuit Court of Appeals sided with Facebook.
The appellate judges noted in their opinion that web publishers typically recommend third-party content to users, such as by prominently featuring it on a homepage or displaying English-language articles to users who speak English.
The use of algorithms to make those recommendations doesn't make Facebook liable for the posts, Circuit Judge Christopher Droney wrote in an opinion joined by Judge Richard Sullivan.
Circuit Judge Robert Allen Katzmann dissented.
“Shielding internet companies that bring terrorists together using algorithms could leave dangerous activity unchecked,” Katzmann wrote at the time.
Since then, Supreme Court Justice Clarence Thomas has suggested, in October of 2020 and again last month, that judges across the country have interpreted Section 230 too broadly. On both occasions, he said the court should rule on the scope of that law "in an appropriate case."