Commentary

Is Content Moderation Facebook's Achilles Heel?

It’s no secret that Facebook has struggled to properly define, detect, and then delete “inappropriate” content.

Back in April, word got out that the social giant was quietly looking for a new leader to cure its growing content issues. Then, earlier this month, the company decided to hire another 3,000 human content monitors to police its network.

Now, the Guardian has learned that Facebook’s content guidelines are spread across more than 100 training manuals, spreadsheets and flowcharts.

The Guardian portrays the strategy as bumbling, suggesting that the guidelines “illustrate difficulties faced by executives scrabbling to react to new challenges such as ‘revenge porn’ -- and the challenges for moderators, who say they are overwhelmed by the volume of work.”

Facebook’s content strategy is evolving in response to user behavior, if not quickly enough for critics.

When it became clear that Facebook needed more than the 4,500 people already working to achieve the mission outlined by its community operations team, it decided to hire another 3,000.

The tech titan has also been working with local community groups and law enforcement, who are well positioned to act on threats or acts of violence when they are shared by Facebook users.

Facebook is also building what CEO Mark Zuckerberg has insisted are “better tools” to keep community members safe.

Despite these efforts, Facebook has recently served as a platform for teens streaming their own suicides, the broadcast of a young woman being raped, and the torture of a young man with special needs.

In April, Facebook also unwittingly made it possible for an Ohio man to distribute video footage of a murder.
