On Friday, a brutal massacre at a New Zealand mosque left dozens of people dead. In a grotesque twist, the attack was allegedly live-streamed on Facebook, with the gunman apparently wearing a body camera.
Hours after the massacre, copies of the video were still being shared on YouTube, Facebook, Twitter and other digital platforms. While all of the companies said they were actively removing the videos, they appeared to be playing catch-up.
“Police alerted us to a video on Facebook shortly after the live stream commenced, and we quickly removed both the shooter’s Facebook and Instagram accounts and the video,” Facebook stated on its Twitter account. “We're also removing any praise or support for the crime and the shooter or shooters as soon as we’re aware.”
“Our hearts are broken over today’s terrible tragedy in New Zealand,” YouTube said in a statement posted to its Twitter account. “Please know we are working vigilantly to remove any violent footage.”
The shooting underscored a new reality: just as digital video platforms allow for new ways of connecting, they can also serve to propagate violence and fringe ideas.
Last year, a live-streamed video game tournament was cut short after a shooter entered the esports arena and opened fire. Audio and video from the shooting were broadcast live to thousands of viewers. In 2015, a local news crew in Virginia was shot dead by a gunman, who subsequently posted his own video of the shooting to Twitter.
Making the matter even more complicated, the New Zealand gunman also appeared well-versed in the social-video zeitgeist, urging his viewers to subscribe to the popular YouTuber PewDiePie before he started shooting. PewDiePie, whose real name is Felix Kjellberg, has encouraged his fans to promote his channel as part of a long-running feud with another channel, T-Series.
“Just heard news of the devastating reports from New Zealand Christchurch,” he wrote on his Twitter account. “I feel absolutely sickened having my name uttered by this person. My heart and thoughts go out to the victims, families and everyone affected by this tragedy.”
For digital platforms, the challenge of policing a live video is immense. As the response to the shooting shows, however, even removing copies of the original video is proving to be a formidable challenge.
Algorithms and human moderators can pull videos after they are uploaded. But at the moment, they remain ill-equipped to remove every version of the video, which has been edited and sliced into hundreds of variants that evade detection.