Google is urging the Supreme Court to rule that Section 230 of the Communications Decency Act, which the company calls “the central building block of the internet,” protects platforms from liability not only for hosting terrorism-related material created by users, but also for recommending that material to others.
A contrary interpretation “could have devastating spillover
effects,” Google writes in papers filed
with the court on Thursday.
“Recommendation algorithms are what make it possible to find the needles in humanity’s largest haystack,” Google writes.
“Websites
like Google and Etsy depend on algorithms to sift through mountains of user-created content and display content likely relevant to each user,” the company adds. “If plaintiffs could evade Section
230... by targeting how websites sort content or trying to hold users liable for liking or sharing articles, the internet would devolve into a disorganized mess and a litigation
minefield.”
The company's papers come in a lawsuit brought by the family of Nohemi Gonzalez, who was killed at age 23 in a November 2015 terrorist attack in Paris.
Her family sued
Google, alleging that the company not only allowed ISIS to post videos to YouTube, but also helped ISIS spread terrorist propaganda by recommending its videos to other users.
A trial judge
dismissed the lawsuit, and the 9th Circuit Court of Appeals refused to revive the family's case.
The 9th Circuit said in its ruling that Section 230, a 1996 law that immunizes websites from
lawsuits over material created by third parties, protected Google.
The Supreme Court agreed in October to take up the matter, along with a related lawsuit over terrorist content on social
media services. The court will hear arguments in both cases in February.
Last month, the Gonzalez family urged the Supreme Court to rule that Section 230 doesn't protect web companies from liability for
recommending clips. The family essentially argues that YouTube's recommendations came from the company, not the ISIS members who posted the videos, and should therefore be treated as YouTube's
speech.
“YouTube provides content to users, not in response to a specific request from the user, but 'based upon' what YouTube thinks the user would be interested in,” lawyers for
the Gonzalez family wrote.
But Google counters that recommendation algorithms are essential to the modern internet.
“Travel websites like Expedia use algorithms to recommend
cheap flights by examining all possible routes, airlines, prices, and layovers,” Google adds. “Streaming services like Spotify and Netflix use algorithms to recommend songs, movies, and TV
shows based on users’ listening or watch histories, their ratings of other content, and similar users’ preferences.”
The battle has drawn interest from a wide range of
outside groups and individuals, including some who suggested in friend-of-the-court briefs that Section 230 shouldn't protect websites from liability for distributing content they have reason to know
could be harmful.
Google argues that these friend-of-the-court interpretations of Section 230 would “upend the internet” by encouraging websites either to remove anything that
could be considered objectionable, or to disable any filtering tools in order to “take the see-no-evil approach.”
Three years ago, the Supreme Court refused to hear an appeal in a
similar dispute involving Facebook. In that case, attorneys for victims of terrorist attacks in Israel argued that Facebook's alleged use of algorithms to recommend content should strip the company of
immunity.
A panel of the 2nd Circuit Court of Appeals voted 2-1 in favor of Facebook.
Circuit Judge Christopher Droney, who authored the majority opinion, wrote that web publishers
typically recommend third-party content to users -- such as by prominently featuring it on a homepage, or displaying English-language articles to users who speak English. Basing those recommendations
on algorithms didn't make Facebook liable, Droney wrote.
Since then, Supreme Court Justice Clarence Thomas has repeatedly suggested that lower court judges have interpreted Section 230 too
broadly.