Former Facebook employee Frances Haugen blasted the company's practices at a Senate Commerce Committee hearing on Tuesday, stating that the social networking service repeatedly chose profits over users' safety.
“The result has been a system that amplifies division, extremism, and polarization -- and undermining societies around the world,” she stated in prepared remarks. “In some cases, this dangerous online talk has led to actual violence that harms and even kills people. In other cases, their profit optimizing machine is generating self-harm and self-hate -- especially for vulnerable groups, like teenage girls.”
Haugen, a former Facebook product manager who recently exposed a trove of damaging information about the company -- including that its internal research showed Instagram was harmful to many teen girls -- suggested at one point that company leaders should declare “moral bankruptcy.”
At the hearing, several senators touted legislation that would broadly affect the tech industry.
Senator John Thune (R-South Dakota) noted that he previously proposed the Filter Bubble Transparency Act, which would have required companies to offer people the ability to opt out of receiving some forms of personalized content, and to offer a version of their services that doesn't return results based on personal factors, such as web-browsing history.
Other lawmakers, including Senators Marsha Blackburn (R-Tennessee) and Maria Cantwell (D-Washington), suggested that privacy bills could address some of the issues that Haugen raised.
Haugen herself expressed support for revising Section 230 of the Communications Decency Act, a 25-year-old law that protects companies from lawsuits over content-moderation decisions, as well as from lawsuits over posts by users.
Specifically, she appeared to urge Congress to revise the law in a way that would allow companies to be sued for algorithmically promoting content. (It's not clear whether that type of bill would survive a First Amendment challenge, given that companies often program their algorithms to make traditional editorial decisions -- such as placing particular articles on the home page.)
She also said Facebook should be required to allow researchers to study how the company handles potentially harmful content.
"The core of the issue is that no one can understand Facebook’s destructive choices better than Facebook, because only Facebook gets to look under the hood," she stated. "A critical starting point for effective regulation is transparency: full access to data for research not directed by Facebook."
Facebook on Tuesday responded to criticism of its service by urging lawmakers to “create standard rules for the internet.”
“It's been 25 years since the rules for the internet have been updated,” a company spokesperson stated Tuesday, apparently referring to Section 230. “Instead of expecting the industry to make societal decisions that belong to legislators, it is time for Congress to act,” the spokesperson added.
Facebook CEO Mark Zuckerberg previously proposed that Congress revise Section 230 by requiring companies to be more transparent about content moderation decisions.