Attorneys general in 13 states and the District of Columbia brought separate lawsuits against TikTok on Tuesday, claiming that the app harms young users' mental health and violates consumer protection laws by touting itself as appropriate for young people.
"TikTok consistently tells users the platform is 'safe,' 'appropriate for teens' and that safety is a 'top priority,'" New York Attorney General Letitia James alleges in that state's complaint, filed in New York County Supreme Court.
She claims these statements are deceptive, alleging that TikTok designed its service to be addictive, and is aware that “compulsive use” of the platform is “wreaking havoc on the mental health of millions of American children and teenagers.”
That complaint takes aim at several TikTok features, including automatically playing videos, push notifications, beauty filters, and the “For You” feed, which allegedly hooks young users by displaying algorithmically recommended videos in a continuous scroll.
“The 'For You' feed is one of the numerous features designed to exploit the human body’s natural reaction to the receipt of small rewards through the release of the pleasure-creating neurotransmitter dopamine, and in turn promote addictive behavior,” the complaint alleges.
TikTok filters, which let users digitally retouch their photos, “encourage unhealthy, negative social comparison, body image issues, and related mental and physical health disorders,” the complaint alleges.
“In particular, beauty filters can exacerbate eating disorders as the filters create an impossible standard for teens who are forming opinions of themselves,” the attorney general adds.
California Attorney General Rob Bonta's complaint, brought in Santa Clara County state court, makes similar allegations.
The suits also claim TikTok violates the federal Children's Online Privacy Protection Act by collecting data from children under 13 without parental consent.
A TikTok spokesperson says the company strongly disagrees with the states' claims, and believes many are “inaccurate and misleading.”
“We provide robust safeguards, proactively remove suspected underage users, and have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16,” the spokesperson stated. “We've endeavored to work with the attorneys general for over two years, and it is incredibly disappointing they have taken this step rather than work with us on constructive solutions to industrywide challenges.”
The new cases add to TikTok's growing legal troubles, including the prospect of a nationwide ban.
Earlier this year, Congress passed the Protecting Americans From Foreign Adversary Controlled Applications Act, which will prohibit web-hosting services and app marketplaces from distributing TikTok unless its owner, China-based ByteDance, divests the app by next April at the latest.
Separately, the Department of Justice sued TikTok in August for allegedly violating the Children's Online Privacy Protection Act by collecting “extensive personal information” from children younger than 13.
Additionally, Texas sued TikTok last week for allegedly violating a new state law by sharing teens' personal data without first obtaining verifiable parental consent.
The company also faces two lawsuits brought in 2022 by Indiana's attorney general, who claimed the company threatens users' privacy and harms teens by promoting “inappropriate” content. A state court judge dismissed the complaints in those cases, but an appellate court reinstated them late last month.