Commentary

Texas-Led State Coalition Seeks To Revive Florida Curbs On Social Media Platforms

Attorneys general from Texas and nine other states are supporting Florida in its quest to revive a state law that aims to restrict social media companies' ability to moderate content on their platforms.

The Florida law (SB 7072), which was struck down by a federal judge earlier this year, would have subjected social media companies to fines of $250,000 per day for “deplatforming” candidates for statewide office, and $25,000 per day for candidates for other offices. (The bill defines deplatforming as banning a user for more than 14 days, or permanently deleting the user's account.)

Another provision would have prohibited social media companies from “censoring,” “deplatforming” or “shadow banning” journalistic enterprises, based on content.

Other key parts of the bill would have required social platforms to apply content moderation standards “in a consistent manner,” and would have required the platforms to publish their standards and notify users who are “censored,” “deplatformed,” or “shadow banned.”

The industry groups NetChoice and Computer & Communications Industry Association sued to block enforcement, arguing that the law violates companies' First Amendment rights to exercise editorial control over the material they publish. The groups also argued the law is preempted by Section 230 of the Communications Decency Act, which protects online companies' ability to engage in content moderation.

In June, U.S. District Court Judge Robert Hinkle in Tallahassee said the law likely violates the First Amendment, and prohibited enforcement.

“The legislation now at issue was an effort to rein in social-media providers deemed too large and too liberal,” he wrote. “Balancing the exchange of ideas among private speakers is not a legitimate governmental interest.”

He specifically noted in his ruling that even though the statute required companies to apply their standards consistently, it didn't define that term. What's more, he wrote, that requirement was contradicted by the portions of the law prohibiting the platforms from suppressing posts by political candidates or news organizations.

“The statute does not address what a social media platform should do when the statute itself prohibits consistent application of the platform’s standards -- for example, when a candidate engages in conduct that would appropriately lead to deplatforming any other person ... or when a 'journalistic enterprise' posts content that would otherwise be censored,” Hinkle wrote.

Despite Hinkle's ruling, lawmakers in Texas recently passed a bill that prohibits large social media platforms from suppressing speech based on viewpoint. That measure will almost certainly face a court challenge.

For its part, Florida recently asked the 11th Circuit Court of Appeals to reinstate the bulk of the law.

This week, Texas Attorney General Ken Paxton -- along with attorneys general from Alabama, Alaska, Arizona, Arkansas, Kentucky, Mississippi, Missouri, Montana and South Carolina -- filed a friend-of-the-court brief urging the 11th Circuit to partially lift the block on enforcement.

Specifically, they're urging the appellate court to allow enforcement of provisions requiring platforms to publish their content moderation standards and apply them uniformly.

Among other arguments, those states say that Florida has a “compelling interest” in “ensuring that its citizens enjoy access to the free flow of information and ideas, unencumbered by arbitrary and erratic censorship, deplatforming, and shadow banning.”

It seems unlikely that a court will agree, given that any requirement about how to apply editorial standards is not only problematic from a First Amendment perspective, but also hopelessly vague.

Earlier this year, organizations including the industry-funded think tank Chamber of Progress pointed out the unworkability of those provisions.

“The Act ... gives no guidance on what kinds of content are equivalent so that the same kind of moderation could be applied without risk of liability,” the organizations wrote in a friend-of-the-court brief filed with Hinkle.

“Would it be consistent if a provider takes down a post with [child sexual abuse] material but not a post doxxing a private citizen?” the groups asked, referring to the practice of exposing users' private information.

"Would it be consistent to ban a user who repeatedly attempts to incite violence but not a user who posts a single ill-considered comment?" the organizations added. "These are issues and questions on which reasonable people can come to different conclusions.”
