This week, French President Emmanuel Macron and New Zealand Prime Minister Jacinda Ardern announced they will co-host a meeting in Paris in May, where they expect to be joined by other world leaders and top
executives from Facebook, Twitter, Google and YouTube.
The meeting comes in response to the horrific terror attack that killed 50 people
last month in Christchurch. Its ambition is for the participants to sign on to the Christchurch Call, a pledge to prevent violent extremism and terrorism online.
And while the details of the
pledge aren’t yet determined, its explicit target is the sharing of violent content. In her announcement, Ardern reaffirmed the right
to freedom of expression, but said: "I don't think anyone would argue that the terrorist had the right to livestream the murder of 50 people. That is what this call is specifically focused on."
We don’t yet have technology that can automatically distinguish between breast-feeding and pedophilia. We don’t yet have technology that can automatically identify a live shooting or a
terror attack.
The task is difficult, but not insurmountable.
We could embed identifying fingerprints into every video that gets uploaded, so taking down one version takes down all copies. We could build
delays into livestreams until the user’s account is verified in some way. We could require Facebook to spend the $3 billion to $5 billion it has set aside for expected fines on improving its ability to detect such
postings in real time.
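The first of those ideas, fingerprinting uploads so one takedown propagates to every copy, can be sketched in a few lines. This is a toy illustration, not how any platform actually implements it: real systems use perceptual fingerprints that survive re-encoding and cropping, whereas the plain SHA-256 hash below only catches byte-identical re-uploads. The `UploadScreen` class and its method names are hypothetical.

```python
import hashlib


class UploadScreen:
    """Toy sketch: block re-uploads of content already taken down.

    A real platform would use a perceptual fingerprint (robust to
    re-encoding, cropping, mirroring); SHA-256 only matches exact copies.
    """

    def __init__(self):
        self.blocked_hashes = set()

    @staticmethod
    def fingerprint(data: bytes) -> str:
        # Stand-in for a perceptual hash of the video content.
        return hashlib.sha256(data).hexdigest()

    def take_down(self, data: bytes) -> None:
        # Removing one copy registers its fingerprint, so every
        # future upload of the same content is rejected automatically.
        self.blocked_hashes.add(self.fingerprint(data))

    def allow_upload(self, data: bytes) -> bool:
        return self.fingerprint(data) not in self.blocked_hashes


screen = UploadScreen()
video = b"...raw video bytes..."
screen.take_down(video)
print(screen.allow_upload(video))         # exact re-upload is rejected: False
print(screen.allow_upload(b"other clip")) # unrelated content passes: True
```

The design choice being argued over in practice is the fingerprint function itself, since uploaders trivially defeat exact hashing by re-encoding the file.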
Here's what won't be on the table:
- The way filter bubbles show us only content that confirms our existing beliefs, creating a feedback loop that hardens views in every direction.
- The way those filter bubbles create a breeding ground where extremism incubates long before it culminates in a public event.
- The way social-media platforms are rewarded for provoking shock and outrage. Their incentives are aligned against what Ardern and Macron are trying to accomplish.
Ardern said the Paris meeting is about how online platforms were used “during” the terror attack. But the issues above were in play well before it, creating
environmental conditions ideal for dangerous ideology.
For the May meeting, let's focus on concrete outcomes. We get that by being specific and focused, by picking the thing we’re least
likely to disagree on — “people shouldn’t get to livestream murder” — and working on a solution.