Facebook Papers Show Company Questioning Its Core Features

The Facebook Papers, a series of internal documents recently leaked by whistleblower Frances Haugen, show that the social networking giant has been running internal tests that question the core elements of its platform.

Those core elements include the News Feed, the Like button, the Share button, Groups and Pages, and the automatically generated recommendation system, all of which have been linked to the dangerous spread of misinformation.

A New York Times article published Monday describes in detail Facebook’s years-long struggle over what to do with the Like and Share buttons:

“When we talk about the Like button, the share button, the News Feed and their power, we’re essentially talking about the infrastructure that the network is built on top of,” said Jane Lytvynenko, a senior fellow at the Harvard Kennedy Shorenstein Center, who studies social networks and misinformation. “The crux of the problem here is the infrastructure itself.”

A section of the documents titled “Project Daisy Mark Review” offers internal findings on the Like button. “Teens feel a real ‘pressure to be perfect’ on Instagram,” it reads. “We hope making like counts private will help reduce that pressure.”

Facebook nixed the idea after researchers found that hiding like counts did little to ease social anxiety among young users and, more important to the company, led users to share and interact less with posts and ads.

A recent Atlantic article, “People Aren’t Meant to Talk This Much,” makes the point that social media users, “including Donald Trump and your neighbors,” expect to be able to accrue and manage a massive following:

“The more posts, the more followers, the more likes, the more reach, the better. This is how bad information spreads...This isn’t a side effect of social media’s misuse, but the expected outcome of its use.”

An internal memo from August 2019 shows several researchers concluding that it was this infrastructure, Facebook’s “core mechanics,” that “let misinformation and hate speech flourish on the site.”

Here is an excerpt from a leaked document titled “What Is Collateral Damage” that shows internal researchers commenting on the platform’s bias:

“If integrity takes a hands-off stance for these problems, whether for technical (precision) or philosophical reasons, then the net result is that Facebook, taken as a whole, will be actively (if not necessarily consciously) promoting these types of activities. The mechanics of our platform are not neutral.”

The internal documents also showed that Facebook Groups, which have been repeatedly linked to hate speech and sometimes even violence, were often targeted by “invite whales”: people who sent invitations to others to join a private group.

Invite whales could attract thousands of users to new groups, “so that the communities ballooned almost overnight,” the study said. They could then “spam the groups with posts promoting ethnic violence or other harmful content,” The Times reported.

In addition, the Share button has obvious flaws. The button was “designed to attract attention and encourage engagement,” a September 2020 study read, but left unchecked, the feature will “serve to amplify bad content and sources.” The researcher cited “bullying” and “borderline nudity posts” as examples.

The Times reported a common thread throughout the Facebook Papers: employees argued for changes to how the platform operates but were shut down by executives.

Referencing QAnon and COVID-19 conspiracy theories, one researcher was quoted as saying: “Out of fears over potential public and policy stakeholder responses, we are knowingly exposing users to risks of integrity harms...During the time that we’ve hesitated, I’ve seen folks from my hometown go further and further down the rabbit hole.”

Unfortunately, thousands of pages of findings have not prompted Facebook to change much of anything. 
