A California law prohibiting social platforms from algorithmically recommending posts to minors without their parents' consent violates the First Amendment, Meta, Google and TikTok
argue in separate lawsuits filed Thursday.
The new suits come after the 9th Circuit Court of Appeals rejected the tech industry organization NetChoice's challenge to
the restrictions on recommendations. The 9th Circuit ruled in September that NetChoice was not the appropriate entity to bring suit. Instead, according to the appellate judges, only individual social
media companies could sue over the constraints.
The platforms are now seeking injunctions prohibiting California from enforcing the Protecting Our Kids From Social Media
Addiction Act (SB976), which was passed last year.
Meta argues in its complaint that
the statute unconstitutionally restricts it from curating content to teen users of Facebook, Instagram and Threads.
"The state may not dictate how Meta disseminates its
'expressive products,' ... by demanding that Meta select third-party content for display without regard to the individualized interests, preferences, or characteristics of Meta’s users," the
company argues.
"Surely, the state could not require museums or bookstores to eliminate their category classifications and place their works of art or books in random order
without confronting the First Amendment -- even if, in theory, the works remain available to their patrons," the company adds.
Meta adds that the restrictions on
recommendations would hinder its ability "to enhance teen safety and well-being" on its platforms.
"For example, the Act inhibits Meta’s ability to de-prioritize content
based on its own decisions restricting, or teens’ settings indicating a desire to restrict, content that some users may find upsetting but that does not violate Meta’s content moderation
policies," the company argues.
TikTok says in its complaint, also brought in the Northern District of California, that the ability to curate content is "critical to the TikTok
experience," given the volume of material on the platform.
The company says that this year alone, U.S. TikTok users uploaded more than 7 billion videos, which were viewed more
than 17 trillion times worldwide.
"No single person could possibly sift through all of that content, much less identify content that they would enjoy," TikTok writes. "So
TikTok curates that content for them."
TikTok adds that the law "fundamentally alters the TikTok experience."
"Part of what makes TikTok TikTok is that
it introduces people to content that it believes will interest them by creators to whom the person may have no pre-existing social connection," the company writes. "A seventeen-year-old who goes to
TikTok to learn new things, encounter new communities, and find new voices will be deprived of the primary way TikTok facilitates that discovery, unless and until he or she gets parental consent to do
so. That, simply put, is unconstitutional."
TikTok also notes that the Supreme Court in 2011 struck down a California law that would have prohibited the sale of violent video
games to minors without parental consent.
"By conditioning teens’ ability to access any personalized feeds on verifiable parental consent, and by prohibiting TikTok,
absent such consent, from considering teens’ expressed preferences or other information provided by them or associated with their devices when curating content, the personalized-feed provisions
... impose unconstitutional burdens on First Amendment-protected activity by TikTok and the people on its platform," the company argues.