
In the first 24
hours after the New Zealand mosque attack last Friday, Facebook says it removed approximately 1.5 million videos of the massacre.
More than 1.2 million of those videos were blocked by
Facebook’s system immediately upon being uploaded, according to the company.
For Facebook, releasing the figures appears to be an implicit response to widespread criticism over its
failure to block footage of the Christchurch mosque shootings.
Brenton Harrison Tarrant, the gunman who attacked the two mosques, successfully live-streamed the attacks on Facebook for 17
minutes.
Over the weekend, critics continued to express disappointment with Facebook's response.
“How did the LIVE VIDEO OF A MASS MURDER ever make it to your platform
to begin with??” one user tweeted at Facebook.
“Too little too late,”
tweeted another user.
Consumers are not the only ones looking for more accountability from Facebook. New Zealand Prime Minister Jacinda Ardern, among other prominent figures, said she plans to
speak with Facebook leadership about the platform's failure to block such violent content.
The newly released figures were not Facebook's only response to the mosque attacks.
“Police alerted us to a video on Facebook shortly after the live stream commenced, and we quickly removed both the shooter’s Facebook and Instagram accounts and the
video,” the company tweeted late last week. “We’re also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
Facebook's battle with inappropriate content isn't a new one. The tech titan says it has spent the past several years scaling up its investment in safety and security, including dramatically growing its content review teams.
By Sunday, the death toll from the Christchurch mosque shootings had climbed to 50.