
In the earliest days of the internet, a law was passed that effectively gave what were then tiny or nonexistent web platforms the right to act as technology rather than as publishers. That meant no one could hold YouTube, Facebook
or any of the other platforms responsible for what they shared.
Back then the term social media didn’t exist, and the rule -- known as Section 230 of the Communications Decency
Act of 1996 -- would come to be heralded as the critical piece of legislation that allowed the web to flourish.
Flash-forward to 2019, before the pandemic, and already there was a battle heating up to
redraw or remove Section 230, giving politicians the power to legislate the internet, and individuals who felt wronged the ability to sue platforms.
So, as platforms battle to provide
factual, accurate COVID-19 information, and both the right and the left agree that Section 230 needs to be changed -- or maybe even repealed -- what are the likely outcomes?
SXSW convened
a panel of industry leaders to explore this conversation. The panel was produced in virtual space and can be viewed here.
But some of the questions and conclusions jumped off the page.
"There's a lot of truth in the idea that Section 230 created the space for the companies to develop [and] create their own
rules of what's acceptable and what's not on their platforms because they didn't have the shadow of liability hanging over them,” said David Kaye, the United Nations Special Rapporteur on
Freedom of Expression, and a law professor at the University of California, Irvine.
"I do not want the government to be the ones to decide what should or shouldn't be posted
online,” said Yaël Eisenstat, a visiting fellow at Cornell Tech. "I want to think about this in a new way that actually applies in 2020, as opposed to in 1996 when this was
written.
“So a platform like Facebook, for example, they don't just host content. They're not just a neutral intermediary. Their algorithms are deciding how they are curating
content.”
And not only does Section 230 currently protect platforms from liability, but it also incentivizes them to micro-target salacious content based on what many think should be
private data.
"The platforms are just sucking up all of this information about us,” said Ellen Weintraub, commissioner at the Federal Election Commission. "In some cases, not only is the
person next door not getting the same advertising that you're getting, but the person sitting across the table from you in your own home might be getting different advertising than you are
getting because it is micro-targeted to that degree based on data that nobody really voluntarily gives up."
So wouldn’t that fact increase the accountability of the platforms and lead to the revocation of Section 230? Kaye says not so fast. “It's not like you can just wave a wand and suddenly… there’s new legislation that is bipartisan and everybody agrees
to.”
Section 230 means different things to different people. Joe Biden, the presumptive Democratic nominee, says he wants to revoke 230. Democrats are looking for new tools to
confront fake news, sex trafficking and other objectionable posting. Conservatives are looking to be able to fight what they say are the platforms' liberal biases.
Kaye said there’s
danger ahead from such partisanship. "Some of the proposals that have been made in Congress... particularly from the right, but not only from the right, have been pushing [for] liability for
the companies for some of the decisions they make around political speech, which I think is actually quite dangerous."
Thinking about Facebook, Eisenstat said “They're not a neutral
platform and they're not a publisher. I don't want The New York Times to be regulated the exact same way I want Facebook to be regulated, because the New York Times actually has
a fact check. They have a whole journalistic process. Facebook, it's a free-for-all. Why don't we actually create a new category that's neither publisher nor platform -- that's like
a digital curator?”
Eisenstat says the reworked 230 would mean accountability not for what is published, but for how platforms use algorithms to amplify content. "How do you make
their algorithmic decision-making more transparent, so that we can say, it's not that you're responsible for the fact that this disinformation is on your platform, you're responsible for the
fact that your algorithm decided to amplify it to two billion people, whereas before that person would have gotten no [exposure]?”
It’s going to be some time before Congress
gets back to the meat and potatoes work of legislating, and even then changes to Section 230 may seem less relevant in the new economic times that lie ahead.
But as one of the few issues with bipartisan motivation behind it, changing how the law sees the responsibilities and freedoms of web platforms is sure to make its way back onto the legislative agenda in the not-too-distant
future. Stay tuned, because however this plays out, there are changes in the air for platforms and their role as content publishers.