CANNES, FRANCE -- Google's Jigsaw is countering the provocative, or toxic (depending on your point of view), environment created by trolls with a filter designed to flag abusive online comments.
Three in four Internet users have witnessed harassment, and 47% have been targets of abuse, said Jigsaw CEO Jared Cohen during his presentation at the Cannes Lions today. One in three self-censor in fear of what people might do or say.
"I worry about my kids," says Cohen. "Is this going to be their frame of reference in how people talk to each other? Is this what they are going to come to expect? It doesn't have to be this way."
Jigsaw is the Google incubator (formerly known as Google Ideas) that devotes time and effort to building better cybersecurity tools. One such tool, called “Perspective,” uses machine learning to rate text comments on a “toxicity” scale from 0 to 100; the score is generated by comparing a comment against a data set of hundreds of thousands of already-catalogued comments.
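The scoring described above is exposed through Jigsaw's public Perspective API. As a rough illustration only — the endpoint, field names, and 0-to-1 score range below are assumptions drawn from the API's published examples, not details from this article — a client sends the comment text and asks for a toxicity score:

```python
import json

# Sketch of a Perspective-style analyze request. The URL, request
# shape, and response shape are assumptions based on the public API
# docs, not this article; API_KEY is a placeholder.
ANALYZE_URL = (
    "https://commentanalyzer.googleapis.com/v1alpha1/"
    "comments:analyze?key=API_KEY"
)

def build_analyze_request(comment_text: str) -> str:
    """Build the JSON body asking the service to score one comment."""
    body = {
        "comment": {"text": comment_text},
        "requestedAttributes": {"TOXICITY": {}},
    }
    return json.dumps(body)

def toxicity_score(response_body: str) -> float:
    """Pull the summary toxicity score (0.0-1.0) out of a response."""
    data = json.loads(response_body)
    return data["attributeScores"]["TOXICITY"]["summaryScore"]["value"]
```

A real client would POST the built body to `ANALYZE_URL` and feed the HTTP response text to `toxicity_score`; multiplying by 100 gives the 0-to-100 scale the article describes.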
Many of the flagged abusers aren't intentionally bad. "A lot of toxic comments just lack social awareness or are having a bad day," he says.
Publishers do the actual censoring (if they choose to) by setting their own accepted toxicity threshold. For instance, a comment like "climate change is bullshit" is flagged by Perspective, since 84% of similar comments in the tool’s data set were deemed toxic. Publishers receive notification of comments that reach their toxicity threshold and can decide what to do with them.
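The publisher workflow described here — set a threshold, get notified of comments that reach it — amounts to a simple filter over scored comments. A minimal sketch, with hypothetical function names and sample scores:

```python
def comments_to_review(scored_comments, threshold):
    """Return comments whose toxicity score meets the publisher's
    threshold; the publisher, not the tool, decides what happens next.

    scored_comments: iterable of (comment_text, score) pairs, where
    score is the 0-100 toxicity rating. Names and values here are
    illustrative, not from Jigsaw's implementation.
    """
    return [(text, score) for text, score in scored_comments
            if score >= threshold]

# A publisher with a strict threshold of 80 would be notified about
# the 84-rated comment but not the milder ones.
sample = [
    ("climate change is bullshit", 84),
    ("I disagree with this article", 12),
    ("Great reporting, thanks", 3),
]
flagged = comments_to_review(sample, threshold=80)
```

The key design point is that the tool only surfaces candidates; deletion, moderation, or publication remains a per-publisher policy choice.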
Google disputes charges that Perspective is limiting free speech by censoring comments. Technically, Perspective simply provides publishers with a method of doing so if they choose to. Right now, when trolls enter a conversation, the default position is increasingly to shut down comments entirely and sensible people are fleeing the conversation, says Cohen. "You are just getting one toxic person talking to another toxic person."
The API tool launched in February with clients Wikipedia and The New York Times, whose battles against trolls include 10 full-time employees monitoring 10,000 comments a day, says Cohen. Even so, comments are open on only 10% of the articles the paper publishes each day. Now, Perspective is enabling the newspaper to expand its comment forums to all top stories and opinion pieces.
The tool has also expanded so that users can set a specific toxicity alert to monitor their own personal messages. "Who hasn't fired off an unnecessarily obnoxious email while in a bad mood?" asks Cohen. "You never have to worry about the mood you are in."
Jigsaw is experimenting with other monitoring models, including obscenities, attacks on other commenters, and being off-topic. "Just because a conversation is civil doesn't mean everyone will participate," says Cohen. "We want to make sure it is coherent."