New York Attorney General Letitia James can proceed with a lawsuit alleging that TikTok harms young users' mental health, a state court judge in Manhattan has ruled.
Justice Anar Rathod Patel delivered the ruling from the bench on Wednesday and has not yet issued a written opinion.
Her order came in a lawsuit filed in October, in which the attorney general alleged that TikTok's design harms teens and that the company violates state consumer protection laws by promoting itself as safe for young people. The complaint took aim at several TikTok design features -- including automatically playing videos, push notifications, beauty filters, and the “for you” feed, which allegedly addicts young users by displaying algorithmically recommended videos in a continuous scroll.
“The 'For You' feed is one of the numerous features designed to exploit the human body’s natural reaction to the receipt of small rewards through the release of the pleasure-creating neurotransmitter dopamine, and in turn promote addictive behavior,” James's office alleged.
“Beauty filters can exacerbate eating disorders as the filters create an impossible standard for teens who are forming opinions of themselves,” the attorney general added.
TikTok urged Patel to dismiss the case at an early stage, arguing in a motion filed in January that Section 230 of the federal Communications Decency Act barred the claims.
Section 230 protects providers of interactive computer services from liability for publishing user-generated content. TikTok contended that the design features cited in the complaint -- such as algorithmic recommendations -- are covered by Section 230 because they are integral to publishing.
“A key function of an online platform is to decide what content to select, how to organize it, and how to present it to users,” the company argued in a motion seeking dismissal.
Design features “are the means by which user-generated content is selected, organized, and presented -- i.e., published -- on the platform,” TikTok added.
The company also noted that the 2nd Circuit Court of Appeals, which covers New York, previously found Meta Platforms immune from liability for algorithmic recommendations.
TikTok additionally argued that the First Amendment separately protects the company from New York's claims.
“The First Amendment’s protections for editorial judgments made by traditional publishing entities apply equally to online platforms that disseminate and make expressive choices about how to display third-party content,” the company argued, noting that the Supreme Court said last year that social media platforms, like other publishers, have First Amendment rights to determine their own editorial policies.
The First Amendment “does not go on leave when social media are involved,” Justice Elena Kagan wrote in a ruling handed down last July.
New York officials urged Patel to reject TikTok's arguments, writing that the complaint focused on TikTok's “own conduct without regard to content created by third parties.”
The state also argued that “the use of algorithms and other tools that are harmful to teenagers and children” is not protected by the First Amendment.
James wasn't the only law enforcement official to sue TikTok last year. On the same date that she filed, attorneys general brought similar lawsuits in the District of Columbia and 12 states, including South Carolina and California.
A judge in South Carolina also recently ruled that the attorney general there could proceed with that state's complaint.