A federal appellate court has rejected the tech group NetChoice's request to block enforcement of a California law banning social media companies from algorithmically recommending
posts to minors without their parents' consent.
NetChoice claimed that the Protecting Our Kids From Social Media Addiction Act (SB976) violated the First Amendment, arguing that web publishers have a First Amendment right to
recommend content, and teens have a constitutional right to access content.
In a 40-page
ruling issued Tuesday, a three-judge panel of the 9th Circuit Court of Appeals essentially said that NetChoice -- which counts dozens of tech companies including Google, Meta and Snap as members
-- wasn't in a position to make that argument. Instead, according to the appellate judges, individual NetChoice members would have to directly participate in legal proceedings.
But the judges also strongly suggested that even if social media companies themselves sued over the law, they might not prevail. That's because, the panel reasoned,
businesses might not have a First Amendment right to recommend content to users based on their activity and inferred preferences.
"NetChoice acknowledges that each of its
members is unique," 9th Circuit Judge Ryan Nelson wrote in an opinion joined by Judges Michael Daly Hawkins and William A. Fletcher.
"That matters because the unique design of
each platform and its algorithm affects whether the algorithm at issue is expressive," Nelson added. The First Amendment, which prohibits censorship by the government, typically protects
expressive activity.
An algorithm "that promotes a platform’s own message to users is likely to be protected speech," but one that selects content based on users' online
activity "probably is not expressive," the judge wrote.
"Personalized algorithms might express a platform’s unique message to the world, or they might reflect
users’ revealed preferences to them. Knowing where each NetChoice member’s algorithm falls on that spectrum reasonably requires some individual platforms’ participation," Nelson
wrote.
Nelson referenced Supreme Court Justice Amy Coney Barrett's concurrence in a dispute about laws in Texas and Florida that would have restricted companies' ability to engage in content
moderation. While the Supreme Court ultimately returned both cases to lower courts for additional analysis, Justice Elena Kagan wrote for the majority that social media platforms have First Amendment
rights to wield control over content on their platforms.
Barrett said in a concurring opinion that the
First Amendment implications of laws might be different depending on how a platform's algorithm works.
"Assume that human beings decide to remove posts promoting a particular
political candidate or advocating some position on a public-health issue. If they create an algorithm to help them identify and delete that content, the First Amendment protects their exercise of
editorial judgment," she wrote.
"But what if a platform’s algorithm just presents automatically to each user whatever the algorithm thinks the user will like -- e.g.,
content similar to posts with which the user previously engaged? ... The First Amendment implications of the Florida and Texas laws might be different for that kind of algorithm," Barrett wrote.
The 9th Circuit ruling issued Tuesday largely upheld a decision handed down by U.S. District Court Judge Edward Davila in the Northern District of California. He said provisions of
the law likely violated the First Amendment -- including restrictions on push notifications from social media companies to minors between midnight and 6 a.m. -- but not the restrictions on
recommendations.