While trying to clarify its lenient position on Holocaust deniers and other conspiracy theorists, the social giant was forced to address its role in the facilitation of physical violence.
In the coming months, Facebook plans to begin removing content that it believes contributes to imminent violence.
“There are certain forms of misinformation that contribute to physical harm, and we are making a policy change which will enable us to take that type of content down,” a Facebook spokesperson said on Thursday.
In Sri Lanka, for example, claims by Facebook users that Muslims were poisoning Buddhists recently led to incidents of mob violence.
Yet, the tech titan apparently believes that removing all misinformation from its platform represents an abuse of power.
“We believe that reducing the distribution of misinformation -- rather than removing it outright -- strikes the right balance between free expression and a safe and authentic community,” the spokesperson said.
That position is consistent with comments Facebook CEO Mark Zuckerberg made earlier in the week, which drew broad criticism.
In an interview with Recode, Zuckerberg tried to explain why Facebook doesn’t shun conspiracy theorists.
“At the end of the day, I don’t believe that our platform should take that down because I think there are things that different people get wrong,” Zuckerberg told Recode. “I don't think that they’re intentionally getting it wrong.”
Following an immediate backlash, Zuckerberg tried to clarify his comments.
“I personally find Holocaust denial deeply offensive, and I absolutely didn’t intend to defend the intent of people who deny it,” he explained in an email to Recode.
This week’s controversy comes on the heels of Facebook’s decision not to nix Infowars -- a site that specializes in wild and egregious conspiracy theories.
Per its new policy, Facebook said it will soon remove content that has been flagged, “escalated,” and confirmed by third parties as false and potentially contributing to imminent violence.