Among other challenges, it isn’t easy to balance user safety and free expression, says Monika Bickert, head of global policy management at the social giant.
For Facebook’s thousands of human content reviewers, it’s all about “understanding context,” Bickert explains in a new blog post. “It’s hard to judge the intent behind one post, or the risk implied in another,” she writes.
In some cases, content that many users find offensive might benefit the broader Facebook community.
For instance: “Experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats,” Bickert explains.
Specifically, “when a girl in Georgia … attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time,” she recounts.
Other ongoing challenges include distinguishing art from pornography and handling specific calls for harm against named individuals.
Adds Bickert: “These tensions -- between raising awareness of violence and promoting it, between freedom of expression and freedom from fear, between bearing witness to something and gawking at it -- are complicated, and there are rarely universal legal standards to provide clarity.”
Going forward, Bickert says, Facebook will continue to face criticism both from people who want more censorship and from people who want less.
While she welcomes the feedback, the open question is whether brand partners will exercise patience as the company works through this immense challenge.
This column was previously published in Moblog on May 23, 2017.