The meeting comes in response to the horrific terror attack that killed 50 people last month in Christchurch. Its ambition is for the participants to sign onto the Christchurch Call, a pledge to prevent violent extremism and terrorism online.
And while the details of the pledge aren’t yet determined, its explicit target is the sharing of violent content. In her announcement, Ardern reaffirmed the right to freedom of expression, but said: "I don't think anyone would argue that the terrorist had the right to livestream the murder of 50 people. That is what this call is specifically focused on."

We don’t yet have technology that can automatically distinguish between breast-feeding and pedophilia. We don’t yet have technology that can automatically identify a live shooting or a terror attack.
The task is difficult, but not insurmountable.
We could embed ID information into every video that gets uploaded, so that taking down one version takes down all versions. We could build delays into livestreams until the user’s account is verified in some way. We could require Facebook to spend the $3 billion to $5 billion it has set aside for anticipated fines on improving its ability to detect such postings in real time.
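The first idea above, fingerprinting uploads so one takedown blocks every copy, can be sketched in a few lines. This is a simplified illustration, not any platform's actual system: the `blocklist`, `fingerprint`, and `is_blocked` names are hypothetical, and a byte-level SHA-256 hash only catches exact copies, whereas real-world efforts (such as the GIFCT hash-sharing database) rely on perceptual hashes that survive re-encoding and cropping.

```python
import hashlib

# Hypothetical shared blocklist of fingerprints of removed videos.
blocklist: set = set()

def fingerprint(video_bytes: bytes) -> str:
    """Exact-match fingerprint; real systems use perceptual hashing."""
    return hashlib.sha256(video_bytes).hexdigest()

def flag_for_takedown(video_bytes: bytes) -> None:
    """Once one copy is removed, its fingerprint blocks re-uploads."""
    blocklist.add(fingerprint(video_bytes))

def is_blocked(video_bytes: bytes) -> bool:
    return fingerprint(video_bytes) in blocklist

# Removing one copy blocks identical re-uploads...
original = b"...video payload..."
flag_for_takedown(original)
print(is_blocked(original))         # True: exact copy is caught
# ...but a single changed byte evades an exact-match hash,
# which is why perceptual hashing matters in practice.
print(is_blocked(original + b"x"))  # False
```

The gap between the two `print` results is the crux of the engineering problem: byte-exact matching is trivial, while matching re-encoded or slightly edited copies is what platforms actually struggle with.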
Here's what won't appear on the table:
Ardern said the Paris meeting is about how online platforms were used “during” the terror attack. But the issues above were in play well before it, and they created an environment ideal for incubating dangerous ideology.
For the May meeting, let's focus on concrete outcomes. We get them by being specific and focused: by picking the thing we’re least likely to disagree on ("people shouldn’t get to livestream murder") and working on a solution.