A federal judge in Utah ruled Monday that restrictions on social media platforms' ability to serve content to minors don't conflict with Section 230 of the Communications Decency Act, a 28-year-old media law that protects web companies from liability over posts by users.
The ruling doesn't address whether the Utah law violates the First Amendment, leaving open the possibility that the statute ultimately will be blocked as unconstitutional.
The Utah Minor Protection in Social Media Act, passed earlier this year, requires platforms to limit the ability of minors under 18 to communicate with users who aren't “connected” to the minor -- roughly, users outside the minor's network. Only parents can lift that restriction.
The new law, slated to take effect October 1, also requires platforms to disable push notifications and refrain from automatically playing content on minors' accounts.
The statute repealed and replaced a 2023 measure that would have barred social media companies from allowing minors under 18 to hold accounts without parental permission and from serving ads to minors.
The tech group NetChoice sued to block the new law, arguing that it is unconstitutional on several grounds.
The group's main argument is that the measure violates the First Amendment rights of both minors and tech companies, but the organization also said the restrictions are inconsistent with Section 230.
Utah officials asked U.S. District Court Judge Robert Shelby to dismiss NetChoice's claim regarding Section 230, arguing that the group was interpreting that law too broadly.
NetChoice countered that judges across the country have ruled that Section 230 not only protects web companies from liability over illegal speech (such as defamatory posts), but also shields them from liability for editorial decisions relating to that speech (such as whether to recommend posts to particular users).
For instance, the tech group called Shelby's attention to a 2019 decision by the 2nd Circuit Court of Appeals in a lawsuit brought against Meta by victims of terrorism. The victims in that case argued that Meta wrongly allowed its platform to be used by terrorist organizations that sought to organize and recruit new members. Specifically, the plaintiffs contended that Meta's use of algorithms to recommend content stripped the company of Section 230's protections.
The 2nd Circuit disagreed, ruling that Section 230 covered Meta's alleged use of algorithms to recommend third-party content.
Shelby sided against NetChoice, essentially ruling that Section 230 doesn't apply to “design features,” such as videos that play automatically.
“The Act’s prohibitions focus solely on the conduct of the covered website -- the website’s use of certain design features on minors’ accounts -- and impose liability irrespective of the content those design features may be used to disseminate,” he wrote. “In other words, the prohibitions do not impose liability on NetChoice members based on their role as a publisher of third-party content because the potential liability has no connection to that content. Accordingly, the challenged provisions fall outside the scope of Section 230’s protections and are not inconsistent with it.”
Shelby is expected to decide before October whether the law likely violates the First Amendment.