Facebook Alters Policies On Suicide, Self-Harm Content

Trying to position itself as a force for good, Facebook is making several changes to the way it handles suicide- and self-harm-related content.

The social giant says that, to avoid unintentionally promoting or triggering self-harm, it will no longer allow graphic images of cutting on its platform.

Facebook also committed to cracking down on the uploading and sharing of such content on Instagram, particularly in its Explore feed.

“We’ve also taken steps to address the complex issue of eating-disorder content on our apps by tightening our policy to prohibit additional content that may promote eating disorders,” stated Antigone Davis, global head of safety at Facebook.

Moving forward, Facebook says it also plans to display a sensitivity screen over images of healed self-harm cuts.

Facebook is in the process of hiring what it calls a “health and well-being expert,” who will join the company’s safety-policy team.

“This person will focus exclusively on the health and well-being impacts of our apps and policies and will explore new ways to improve support for our community, including on topics related to suicide and self-injury,” according to Davis.

Additionally, Davis said her team is investigating ways to share public data on how Facebook users talk about suicide, starting with providing academic researchers access to the social-media monitoring tool, CrowdTangle.

To date, CrowdTangle has been available primarily to help newsrooms and media publishers understand what is happening on Facebook.

Facebook’s efforts to address suicide- and self-injury-related content date back to 2006.