Joe Biden Takes On Section 230: Web Freedom Vs. Accountability

Take a look at this number: 230. Do you know what it means? If not, today is a good day to learn about it, because the future of the internet as we know it may be riding on the preservation or renovation of Section 230 of the 1996 Communications Decency Act (CDA).

Section 230 is what the Electronic Frontier Foundation calls "one of the most valuable tools for protecting freedom of expression and innovation on the Internet."

Section 230 says that “No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

Simply put: Platforms are NOT publishers, and therefore can’t be held responsible for what is published on their sites by their users.

The alternative, free speech advocates argued, would be holding Microsoft and its Word program responsible for what people write, or AT&T responsible for what people say on their phones.



It’s been 24 years since the so-called “safe harbor provisions” in the CDA were passed, and that law has let sites like YouTube and Vimeo allow user-generated content to be uploaded. It created the environment for user-generated reviews on Yelp and Amazon to be published. It allowed Craigslist to host user-posted classified ads -- and, perhaps most importantly, it allowed Facebook and Twitter to provide the frictionless publishing of comments, images, and often controversial posts from hundreds of millions of internet users.

Section 230 was the legal backbone of what we now think of as the open and free internet. But it wasn’t simply a free pass for platforms. Under the separate safe-harbor provisions of the Digital Millennium Copyright Act, they still had to provide a mechanism for copyright holders to flag infringing material and a way for that content to be taken down.

But platforms couldn’t be sued for defamation over user content, because they weren’t publishers and didn’t have to review and take responsibility for what their users posted.

Flash-forward to today.

These companies are now massive, powerful, and editorially engaged. They are, by any reasonable measure, publishers. They invite content, and they manage speech that is hateful, threatening, and often defamatory. And increasingly, their advertising is being purchased, targeted, and gamed to influence the outcome of what should be free and fair elections.

There’s a growing number of voices calling for changes in the way CDA 230 is framed. Some want it updated, others want it struck down.

In an interview with The New York Times, former Vice President Joe Biden called for Section 230 to be “revoked, immediately.” And he didn’t stop there, noting that “for Zuckerberg and other platforms,” 230 “should be revoked because it is not merely an internet company. It is propagating falsehoods they know to be false.”

Republicans have made similar charges. Both Sen. Josh Hawley (R-MO) and Sen. Ted Cruz (R-TX) have suggested that the way to address what they see as platform censorship is to make changes to 230.

Yael Eisenstat, a former CIA officer and ex-global head of Facebook's elections integrity ops unit, says 230 is “basically a subsidy that says internet platforms will not be responsible for the content they host.”

Eisenstat is not alone in her concern that platforms' current business model, without any responsibility for the content they serve, is poised to have a dramatic impact on the upcoming presidential election.

Ellen Weintraub, the chair of the Federal Election Commission, has said publicly that Facebook’s election safety plan “virtually invites Congress to re-examine Facebook's Section 230 exemptions.”

Eisenstat, Weintraub, and the UN’s David Kaye will debate the future of Section 230 in a session I’m moderating at this year's South by Southwest conference in March.

Kaye is the UN's Special Rapporteur on the promotion and protection of the right to freedom of opinion and expression. He is in favor of protecting Section 230.

So it’s a smart group, with some pretty divergent opinions. If you want to be part of a serious debate about the future of the internet, you can read about it here.

2 comments about "Joe Biden Takes On Section 230: Web Freedom Vs. Accountability".
  1. Richard Reisman from Teleshuttle Corporation, January 20, 2020 at 1:52 p.m.

    Calls to revoke rather than refine Section 230 are oversimplifications -- the task is to apply an understanding of its proper limits, including that it should not apply to the algorithmically filtered (moderated) distribution of social media. Some excellent recent think tank analyses explain why.

    If disinformation or harmful content falls in a forest… but appears in no one’s feed, does it disinform or do harm? The main confusion is the failure to distinguish the posting of items (not unlike posting on Web sites) from the distribution of items in feeds. Instead of revoking Section 230, it should be made clear that algorithmic filtering is just a new kind of moderation. There is no Section 230 safe harbor if an “interactive computer service” does moderation. "Free speech, not free reach."

    For some sophisticated analysis of Section 230 and how it should be interpreted to thread the needle that our platforms and new social media present, there are several excellent reports from think tank experts. I recently reviewed and commented on some of them in Regulating our Platforms -- A Deeper Vision. Two in particular are worth studying to understand the issues underlying Section 230: The Case for the Digital Platform Act: Market Structure and Regulation of Digital Platforms, and the Stigler Committee on Digital Platforms Final Report. Chapter V of the first report is especially illuminating as background on the constitutional and legislative history as it applies to our modern platforms. Everyone who cares about these issues should read that chapter.

    These studies address the broader issue of the need for a new regulatory agency that has the multidisciplinary expertise and ongoing responsibility to manage regulatory policy now, and as our platform services continue to evolve. We seem to be thrashing because the legislators, agencies, courts, press, tech world, and public are not applying the wide range of skills and deep focus needed to make sense of the social and economic issues we now face.

    I hope your session can illuminate these important issues.

  2. Tom Tyler from GCTVTexas, January 27, 2020 at 7:05 p.m.

    We need to force them to be platforms, and make it illegal for them to censor content.
