Report: Social Media Companies Must Revamp Inadequate Content Policies

Facebook and other social giants need to completely rethink their content moderation policies, according to a new report from NYU’s Stern Center for Business and Human Rights.

The Stern Center’s leadership takes particular issue with the widespread practice of relying on third-party vendors for content review, which they believe amounts to an outsourcing of responsibility for the safety of billions of users.

“The results of this practice range from moderators not receiving the caliber of mental-health care they deserve, to real-world violence in some countries where insufficient moderation leads to hateful content remaining online,” Paul Barrett, deputy director of the Center, argues in the new report.

Taking direct aim at Facebook, YouTube and Twitter, the researchers contend that such outsourcing creates a marginalized class of reviewers who are seen as “second-class citizens.”

The report finds that the peripheral status of moderators has contributed to inadequate attention to incendiary content spreading in developing countries, which has sometimes led to violence offline.

Outsourcing has also contributed to the provision of subpar mental-health support for content moderators who spend their days staring at disturbing material.

For years, Facebook and other platforms have relied on moderators to determine the suitability of content for mainstream consumption. For many of these workers, the job has meant prolonged exposure to some of the most vile and inhumane content imaginable.

In more recent years, a number of high-profile lawsuits and news stories have shed light on the harm that content moderation can do to the people performing it.

In early 2019, The Verge published a report detailing the psychological damage Facebook’s army of reviewers had to endure in exchange for annual salaries of less than $30,000.

At the time, Facebook seemed to suggest the issue was one big misunderstanding.

“There are a lot of questions, misunderstandings and accusations around Facebook’s content review practices … including how we as a company care for and compensate the people behind this important work,” Justin Osofsky, vice president of global operations at Facebook, said at the time.

Last month, however, Facebook agreed to pay $52 million to past and present content moderators.

Per a preliminary settlement filed with the San Mateo Superior Court, the company said it would pay more than 11,000 moderators at least $1,000 for their pain and suffering.

To truly remedy this issue, Barrett and his colleagues are proposing eight recommendations for social-media companies, including putting an end to the process of outsourcing content moderation and bringing more moderators in-house.

They are also recommending that the social giants hire content moderation czars to oversee all moderation operations; expand moderation in at-risk countries where online-fueled violence is likely; provide moderators with better medical care; and sponsor research into the health effects of content moderation.