Editorial decisions about what content is permissible on social media platforms are best left to the platforms themselves -- even if their standards appear less than clear -- the digital rights group Electronic Frontier Foundation argues in new court papers.
“Inconsistent and opaque private content moderation is a problem for users. But it is one best addressed through self-regulation,” the organization writes in a friend-of-the-court brief filed Friday with U.S. District Court Judge Robert Pitman in Austin.
The organization is urging Pitman to block a new Texas law (HB 20) that prohibits Facebook, Twitter, YouTube and other social platforms with more than 50 million monthly users from suppressing posts based on viewpoint.
The law, which was passed last month, is slated to take effect in December.
The measure allows tech platforms to suppress illegal content, but apparently requires companies to host a wide variety of objectionable speech -- including posts that contain incorrect information about vaccines, as well as posts that deny the Holocaust.
The tech industry groups NetChoice and the Computer & Communications Industry Association sued last month to block enforcement, arguing that the law violates tech companies' First Amendment rights to set editorial policies.
The Electronic Frontier Foundation, like several other advocacy groups, is supporting the industry organizations' request.
The organization argues that the Texas law violates the First Amendment as well as Section 230 of the Communications Decency Act.
“Every court that has considered the issue has rightfully found that private entities that operate online platforms for user speech enjoy a First Amendment right to curate that speech,” the Electronic Frontier Foundation writes.
Section 230 also specifically protects tech companies' ability to moderate speech, and can give companies a path to speedy dismissals of lawsuits over content policies.
The group notes that many companies have been moderating online speech for at least 20 years.
“Social media platforms like Facebook, YouTube, Twitter, and others targeted by HB 20, are not the first online services to moderate -- or edit, or curate -- the user speech that appears on their sites,” the Electronic Frontier Foundation writes.
The organization adds that some social media companies are currently structured around particular ideologies. For instance, the organization writes, the platform ProAmericaOnline promotes itself as “social media for conservatives,” while The Democratic Hub is aimed at liberals.
A law requiring those sites “to adopt neutral viewpoint policies as they became popular is nonsensical and contrary to the interests of internet users,” the digital rights group writes.
“HB 20 forces platforms to defend their specialized moderation practices in court, and the prospect of the costs of repeated litigation will chill their exercise of editorial discretion,” the Electronic Frontier Foundation adds. “That will ultimately harm users by limiting the availability of online services that cater to their particular interests, communities, or political viewpoints, and which seek to protect their users from abuse and harassment.”