Zuckerberg Proposes Congress Should Limit Web Companies' Legal Protections

Facebook CEO Mark Zuckerberg will suggest that Congress revamp Section 230 of the Communications Decency Act by tying web companies' legal protections to their efforts to remove unlawful content.

“We believe Congress should consider making platforms’ intermediary liability protection for certain types of unlawful content conditional on companies’ ability to meet best practices to combat the spread of this content,” Zuckerberg says in written testimony submitted to Congress on the eve of a House hearing about social media.

Google CEO Sundar Pichai and Twitter CEO Jack Dorsey will also testify at Thursday's hearing, titled "Disinformation Nation: Social Media's Role in Promoting Extremism and Misinformation."

Lawmakers at the hearing are expected to discuss a variety of proposals regarding Section 230, ranging from calls to abolish the 25-year-old law to more modest tweaks.

That law broadly immunizes web publishers from liability for users' posts, including ones that are unlawful because they are defamatory. (There are some exceptions to that immunity, including for content that infringes intellectual property rights, violates federal criminal laws, or violates laws against sex trafficking.)

Zuckerberg suggests that Congress replace Section 230's broad immunity with a provision requiring platforms “to demonstrate that they have systems in place for identifying unlawful content and removing it.”

He adds: “Platforms should not be held liable if a particular piece of content evades its detection -- that would be impractical for platforms with billions of posts per day -- but they should be required to have adequate systems in place to address unlawful content.”

That proposal, if adopted by Congress, would effectively force companies to develop policies to deal with defamatory speech -- which could encompass a broad array of posts, ranging from Yelp reviews to people's comments on Facebook about their neighbors.

But questions about when particular posts are defamatory are often complicated, and it's not clear from Zuckerberg's proposal how web platforms would be expected to decide those questions.

Zuckerberg isn't suggesting that web platforms be required to revise their content-moderation policies to take down posts that are legal but "harmful" -- such as "hate speech" or "fake news."

But he says Congress should “bring more transparency, accountability, and oversight to the processes by which companies make and enforce their rules about content that is harmful but legal.”
