So just how good are the algorithms behind Facebook’s analysis of human emotions? Pretty, pretty, pretty good.
In fact, it turns out they’re better at flagging users who may be
contemplating self-harm than other people are.
According to BuzzFeed, Facebook is rolling out a new suicide prevention program powered by its massive artificial intelligence resources.
Facebook product manager Vanessa Callison-Burch tells BuzzFeed that, on average, the platform now surfaces more reports of potential self-harm identified by Facebook’s AI than it receives from
other users.
And it’s not hard to see why this might be the case: among other things, the AI analyzes posts and related content for emotions and sentiments that may indicate self-harm,
comparing them against an archive of posts that led to intervention in the past.
However good the intuition of an individual’s family and friends may be, most people have no
prior experience with even one suicidal person, let alone access to thousands of cases.
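Facebook hasn’t disclosed how its system works, but the approach described above (learning from an archive of posts that previously led to intervention and scoring new posts against those patterns) resembles a standard supervised text classifier. The sketch below is a minimal, hypothetical illustration of that idea in Python using scikit-learn, with invented example posts; it is not Facebook’s implementation.

```python
# Illustrative sketch only: Facebook has not published its self-harm classifier.
# This generic example shows the idea of learning from past flagged posts and
# scoring new ones. All data here is invented.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical training archive: past posts labeled by whether they
# led to a suicide-prevention intervention (1) or not (0).
archive_posts = [
    "I can't take this anymore, nothing matters",
    "Saying goodbye to everyone tonight",
    "Great hike this weekend, feeling refreshed",
    "Excited about the new job, see you all Monday",
]
led_to_intervention = [1, 1, 0, 0]

# Simple text classifier: TF-IDF features plus logistic regression.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(archive_posts, led_to_intervention)

# Score a new post; a high score would flag it for human review,
# not trigger any automatic action.
new_post = ["I don't see the point in going on"]
risk_score = model.predict_proba(new_post)[0][1]
print(f"Estimated self-harm risk score: {risk_score:.2f}")
```

In practice, any such score would be only one signal routed to human reviewers; as the article notes, Facebook’s own community team handles the most urgent cases.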
The move was prompted in part by a spate of highly publicized cases in which users live-streamed their suicides on Facebook,
triggering fears of copycat actions as well as questions about the extent of Facebook’s responsibility in these kinds of situations.
Although Facebook is
unlikely to be held legally liable, Mark Zuckerberg indicated he feels a sense of personal responsibility, telling BuzzFeed: “It’s hard to be running this company and feel like, okay, well
we didn’t do anything because no one reported it to us.”
Outreach efforts include screens showing contacts for suicide prevention resources, messaging with suicide prevention
organizations, and suggestions to contact friends.
Meanwhile, users in the individual’s social network may also see options for reporting the potential case of self-harm to the relevant
authorities.
In cases where there is a clear and present danger of suicide, Facebook will alert its own community team.
As a last resort, Facebook Live itself may also serve as a suicide
prevention tool, by allowing family and friends to contact law enforcement or intervene directly.
In the case of Naika Venant, a 14-year-old who killed herself in Miami in January, a friend
contacted the police after seeing Venant’s preparations on a live stream, but the police were first sent to the wrong address and arrived too late.