Facebook was quick to respond to the latest body blow to its reputation: Sunday night’s “60 Minutes” segment with a whistleblower who accused the platform of engaging in “a betrayal of democracy.”
The whistleblower, Frances Haugen, a former product manager on Facebook’s Civic Integrity team, revealed her identity publicly for the first time during the segment.
Haugen leaked tens of thousands of pages of internal Facebook documents that were the basis for the explosive Wall Street Journal “Facebook Files” series last week, the Journal confirmed after she revealed her identity on television.
About a month ago, Haugen filed multiple complaints with the Securities and Exchange Commission alleging that Facebook is withholding important research from the public and investors, and requesting whistleblower protection. She is scheduled to testify before Congress on Tuesday.
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook, and Facebook over and over again chose to optimize for its own interests, like making more money,” Haugen told “60 Minutes” interviewer Scott Pelley.
“I’ve seen a bunch of social networks and it was substantially worse at Facebook than anything I’d seen before,” she said.
“I knew what my future looked like if I continued to stay inside of Facebook, which is person after person after person has tackled this inside of Facebook and ground themselves to the ground,” she added. “At some point in 2021, I realized, ‘Okay, I’m gonna have to do this in a systemic way, and I have to get out enough that no one can question this is real.’”
Haugen spent two years at Facebook, resigning in April and leaving in May. Before that, she served as a product manager at Pinterest, Yelp and Google, among other tech positions.
Haugen said she decided to make public the tens of thousands of pages of Facebook documents, which include research reports, presentations and discussions of internal research on the effects of its platforms on individuals and societies around the world.
She told the Journal that she took most of the documents off Facebook’s internal employee forum, which is open to most Facebook employees.
The Journal series documented Facebook’s practice of exempting high-profile users from its posting rules; its knowledge that the algorithm changes it made in 2018 fostered divisiveness that drives engagement, and that Instagram has negative effects on some teens’ mental health; and how drug cartels and human traffickers use Facebook’s services openly.
On “60 Minutes,” Pelley quoted one of the leaked internal Facebook documents as stating: “We have evidence from a variety of sources that hate speech, divisive political speech and misinformation on Facebook and the family of apps are affecting societies around the world.”
One 2021 Facebook study concluded that the platform may be taking action on as little as 3% to 5% of hate posts, and under 1% of violence and incitement posts, despite being “the best in the world” at such intervention, according to “60 Minutes.”
“It’s one of these unfortunate consequences, right?” said Haugen, who says she wants to fix Facebook, not destroy it. “No one at Facebook is malevolent, but the incentives are misaligned, right? Facebook makes more money when you consume more content. People enjoy engaging with things that elicit an emotional reaction. And the more anger that they get exposed to, the more they interact and the more they consume.”
Haugen said she took the job at Facebook, in which she was charged with helping to protect the platform from being used for election interference and misinformation, because she had lost a friend as a result of online conspiracy theories.
But she claims that after the 2020 election, before the Jan. 6 insurrection at the U.S. Capitol, Facebook dissolved the Civic Integrity team.
Facebook says it distributed the Civic Integrity team’s functions to other units, and that the measures still needed to protect elections remained in place as of February 2021. The company quickly issued a response to the “60 Minutes” segment, asserting that it used select company materials to present a “misleading story about the research we do to improve our products.”
Excerpts from the lengthy statement from Lena Pietsch, Facebook’s director of policy communications:
“Every day our teams have to balance protecting the right of billions of people to express themselves openly with the need to keep our platform a safe and positive place.
“We’ve invested heavily in people and technology to keep our platform safe, and have made fighting misinformation and providing authoritative information a priority. If any research had identified an exact solution to these complex challenges, the tech industry, governments and society would have solved them a long time ago. We have a strong track record of using our research—as well as external research and close collaboration with experts and organizations—to inform changes to our apps.”
“Hosting hateful or harmful content is bad for our community, bad for advertisers, and ultimately, bad for our business. Our incentive is to provide a safe, positive experience for the billions of people who use Facebook. That’s why we’ve invested so heavily in safety and security.”
Pietsch says that the “Meaningful Social Interactions” algorithm change in 2018, rather than fostering hatred and divisiveness, actually improved users’ experience by prioritizing personal conversations and deprioritizing public content. She stresses that polarization in the U.S. had been growing for decades prior to the advent of social media, and claims that polarization has actually decreased in other countries where internet and Facebook use have increased.