A teenage boy who says he encountered racy material on Snapchat has filed a potential class-action lawsuit against the company.
"Snapchat is currently engaged in an insidious
pattern and practice of intentionally exposing minors to harmful, offensive, prurient, and sexually offensive content, without warning minors or their parents that they would be exposed to such
explicit content," the complaint alleges. The case was brought by a 14-year-old boy Los Angeles resident identified as "John Doe" in the complaint. He's seeking class-action status on behalf of 150
million users.
The lawsuit focuses on Snapchat Discover, which allows publishing partners like Buzzfeed and Fusion to create and distribute content on the platform. The complaint alleges that
some of that content -- like the post titled "People Share Their Secret Rules for Sex" -- is too racy for children.
"Without warning, minors swiping through the Discover Page are
being introduced to offensive adult-rated content that parents would likely prohibit if they knew their children were being given unrestricted access to the content by Snapchat," the complaint
says.
The boy who is suing specifically alleges that he encountered Buzzfeed posts of Disney characters, but with "pornographic text and innuendo next to the photographs."
Snapchat
said in a statement that it hasn't yet been served with the complaint, but is "sorry if people were offended."
The company added: "Our Discover partners have editorial independence, which is
something that we support."
The lawsuit accuses Snapchat of violating the Communications Decency Act -- a 20-year-old law aimed at protecting children from online pornography. The Supreme
Court struck down some key provisions of that law in 1997, on the ground that they violated the First Amendment.
But one section of the Communications Decency Act that wasn't directly
addressed by the Supreme Court requires Web service providers to notify parents about services that can block offensive material.
That notification requirement doesn't appear to have ever been
litigated, according to Santa Clara University law professor Eric Goldman.
But he says that many other lawsuits that have attempted to hold Web platforms liable for content produced by third
parties have failed, due to both the First Amendment and a separate Communications Decency Act provision that immunizes Web platforms from liability for content created by other companies.
"Overall, this lawsuit is like a 1990s throwback when Congress and the states tried to make websites screen minors from offensive content," he says in an email to MediaPost.
Internet legal
expert Venkat Balasubramani adds that it's not clear whether the Communications Decency Act provides for private lawsuits. He also notes it's uncertain whether a 14-year-old boy will be able to show
that he was harmed by encountering risqué posts on the service.
"I would be really surprised to see this succeed," Balasubramani says.