Cutting to the heart of the brand safety issue, The Guardian obtained Facebook's guidelines for moderators, which lay out how to evaluate whether content is offensive and hate-filled and therefore subject
to removal. The newspaper said Facebook permits users to livestream video of people attempting to injure themselves because it “doesn’t want to censor or punish people in distress who are attempting suicide.” By contrast, according to the documents obtained by The Guardian, Facebook will remove the footage once it determines there is no longer an opportunity to help the person. Monika Bickert, Facebook’s head of global policy management, told the paper: “No matter where you draw the line there are always going to be some gray areas.”
Read the whole story at The Guardian »