A federal appeals court has blocked the bulk of a Florida law that sweepingly restricts the ability of YouTube, Twitter, Facebook and other large social media companies to moderate content.
In a ruling handed down this week, a three-judge panel of the 11th Circuit Court of Appeals said social media companies have a First Amendment right to decide what material to allow on their platforms.
“We hold that it is substantially likely that social-media companies -- even the biggest ones -- are 'private actors' whose rights the First Amendment protects ... that their so-called 'content-moderation' decisions constitute protected exercises of editorial judgment, and that the provisions of the new Florida law that restrict large platforms’ ability to engage in content moderation unconstitutionally burden that prerogative,” Circuit Judge Kevin Newsom, a Trump appointee, wrote in an opinion joined by Circuit Judges Gerald Tjoflat and Ed Carnes.
The ruling largely upheld an injunction issued last year by U.S. District Court Judge Robert Hinkle in Tallahassee. He blocked the law at the request of tech industry groups NetChoice and Computer & Communications Industry Association, which challenged the measure in court.
The decision comes less than two weeks after a different federal appellate court, the 5th Circuit, lifted a block on a similar law in Texas. That state's measure prohibits Twitter, YouTube, Facebook and large tech platforms from suppressing lawful posts based on viewpoint -- including posts that are racist, sexist, misleading, or otherwise objectionable.
The tech industry filed an emergency appeal with Supreme Court Justice Samuel Alito, asking him to reinstate a trial judge's injunction that blocked enforcement of the Texas law. Alito hasn't yet ruled on that request. On Monday, the tech industry alerted Alito to the 11th Circuit's decision to block the Florida law.
Among other specifics, the Florida law (SB 7072) would have subjected social media companies to fines of $250,000 per day for “deplatforming” candidates for statewide office, and $25,000 per day for other offices. (The bill defines deplatforming as banning a user for more than 14 days, or permanently deleting the user's account.)
The measure also would have prohibited social media companies from “censoring,” “deplatforming” or “shadow banning” journalistic enterprises, based on content.
Florida lawmakers passed the law last year, soon after the state's Republican governor, Ron DeSantis, called for a crackdown on tech companies over their supposed anti-conservative bias.
“Silicon Valley is acting as a council of censors,” he said when he signed the bill, adding that tech companies “use shadow banning and secret algorithms to shape debates and control the flow of information.”
Florida argued to the 11th Circuit that the law merely extends “common carrier” rules -- including rules that have long prohibited telephone companies from denying service to people based on their opinions -- to social media platforms.
The appellate judges flatly rejected that comparison.
“In point of fact, social-media platforms are not -- in the nature of things, so to speak -- common carriers,” Newsom wrote.
He elaborated that social media companies like Facebook -- unlike telephone companies -- have always had rules about the type of content they would allow users to transmit.
Newsom added that state governments can't simply transform social media companies into common carriers by tagging them with that label.
“Neither law nor logic recognizes government authority to strip an entity of its First Amendment rights merely by labeling it a common carrier,” he wrote. “In short, because social-media platforms exercise -- and have historically exercised -- inherently expressive editorial judgment, they aren’t common carriers.”
The Florida law also has provisions requiring tech companies to disclose information about their content moderation practices. The 11th Circuit blocked enforcement of a provision requiring companies to give users detailed explanations of why their posts were removed, stating that requirement was “unduly burdensome and likely to chill platforms’ protected speech,” but allowed other disclosure provisions -- including one requiring platforms to post their standards -- to take effect.