In the last six years, terrorism victims and their families brought numerous lawsuits against Google, Twitter and Facebook for allegedly enabling ISIS and other terrorist groups to communicate, organize and spread propaganda to new recruits.
Almost all were dismissed. At least two were thrown out on the grounds that there was no evidence the social platforms had caused the terror attacks.
In other cases, judges said Section 230 of the Communications Decency Act protects websites from liability for posts by users.
Until recently, the Supreme Court had declined to weigh in on those lawsuits. Earlier this year, however, the court agreed to consider whether Section 230 protects Google from liability for allegedly recommending ISIS videos to users.
That dispute involves a lawsuit brought by the family of Nohemi Gonzalez, a California State University student who was studying abroad in France when she was killed in a terrorist attack in Paris.
Gonzalez's family sued Google, arguing that the company not only allowed ISIS to post videos to YouTube, but also “assists ISIS in spreading its message” by recommending its videos.
A trial judge and appellate court sided with Google, ruling that it was protected by Section 230.
Gonzalez's family is now asking the Supreme Court to rule that Section 230 doesn't apply when companies make algorithmically driven recommendations.
The family essentially argues that YouTube's recommendations come from YouTube itself, not from the ISIS members who posted the videos, and should therefore be treated as YouTube's own speech.
“YouTube provides content to users, not in response to a specific request from the user, but 'based upon' what YouTube thinks the user would be interested in,” lawyers for the Gonzalez family write in a brief filed Wednesday.
Already, the dispute is drawing a huge amount of outside interest -- which is not surprising, given that just about every social-media site makes recommendations to its users.
On Thursday, the National Police Association weighed in against Google, asserting in a friend-of-the-court brief that social media is fueling an “epidemic” of anti-law enforcement “hatred and attacks.”
The organization contends that recommendations on social media “encourage and enable terrorism and other serious harm,” including against law enforcement.
On the other side, the tech-industry-funded policy group Chamber of Progress and the sexual health organization Advocates for Youth warn that a ruling against Google could threaten web users' ability to access information about abortion.
“Without Section 230, online services might be compelled to limit access to reproductive resources, for fear of violating various state anti-abortion laws,” the groups write in a letter urging Attorney General Merrick Garland to file a friend-of-the-court brief siding with Google.
“Should the Court curb Section 230’s protections for algorithmic curation, online services would face extreme threats of liability for promoting life-saving reproductive health information, otherwise criminalized by state anti-abortion laws,” the letter states. “This would be a devastating reality for women seeking reproductive resources in states where they are unavailable.”
The Gonzalez family isn't the first to argue that the use of algorithms should strip companies of Section 230 protections. Several years ago, victims of Hamas attacks sued Facebook for allegedly recommending terrorist-related content to users.
Facebook prevailed in that dispute when a panel of the 2nd Circuit Court of Appeals ruled that Section 230's protections apply even if sites recommend material to users. Those judges noted that websites typically recommend third-party content to users -- such as by featuring it on a home page. Using algorithms to make those recommendations doesn't subject Facebook to liability, the judges said.
Google is expected to file papers with the Supreme Court in January.