
Snapchat has settled a lawsuit brought by a teen who
alleged she was harmed by allegedly addictive design features on its service.
The teen, identified in court papers only as K.G.M., is also suing TikTok, Meta
and YouTube. Her complaint against those three companies -- along with complaints brought by other teens -- is expected to go to trial next week in Los Angeles County Superior Court.
Details of the settlement with Snapchat have not been disclosed, and a company representative said only that the parties "are pleased to have been able to resolve this matter in an
amicable manner."
K.G.M. is just one of thousands of plaintiffs -- including teens, school districts and state attorneys general -- who have sued social platforms for
allegedly harming adolescents. Other cases are pending in federal court in the Northern District of California, as well as various state courts.
Snapchat's settlement comes
around two months after Superior Court Judge Carolyn Kuhl rejected the company's bid to rule in its favor without holding a trial.
Snapchat had argued to Kuhl that there was no
evidence showing its design features injured K.G.M., but Kuhl said in a November 5 ruling that an expert witness for K.G.M. opined that Snapchat features -- including ephemeral messaging, infinite
scroll and filters that can change users' appearances -- increased the risk of anxiety, eating disorders, compulsive use and other harms.
Snapchat also argued it's protected
from liability by the First Amendment as well as Section 230 of the Communications Decency Act. In general, the First Amendment shields companies from lawsuits over lawful speech (including content
that could be considered harmful, such as posts about eating disorders or drug use), while Section 230 immunizes web companies from liability over material posted by users, including speech that's
defamatory or otherwise unlawful.
Kuhl said in her recent ruling that she rejected that argument in October 2023, when she refused to dismiss sprawling litigation, including
K.G.M.'s complaint, against the four platforms.
The companies had specifically argued that Section 230 protects them because any harm suffered by the teens would have been
caused by user-created content, such as pro-dieting posts.
But Kuhl effectively drew a distinction between the services' content and design.
“Plaintiffs allege they were injured by features of defendants' platforms that were designed to, and did in fact, maximize use of the platforms in ways leading to minors'
addiction and resulting health consequences,” she wrote in October 2023.
“The features themselves allegedly operate to addict and harm minor users of the platforms
regardless of the particular third-party content viewed by the minor user,” she added at the time.
U.S. District Court Judge Yvonne Gonzalez Rogers, who presides over a
class action against the platforms in federal court, also allowed the lawsuits to move forward -- but ruled that Section 230 and the First Amendment foreclosed some claims.
For
instance, Rogers said Section 230 protected the platforms from allegations that algorithmic recommendations of third-party content promoted addiction, but not from claims that the platforms failed to
warn users about potential harms.
Meta appealed that ruling to the 9th Circuit Court of Appeals, which heard arguments earlier this month.