To better detect different kinds of clickbait headlines, Facebook is now separately identifying signals that withhold or exaggerate information.
For the social giant, this is just the latest effort to cut down on crappy content in users’ News Feeds.
Over the past year, in fact, the company has categorized hundreds of thousands of headlines as clickbait -- or not clickbait -- by considering whether a headline exaggerates the details of a story and, separately, whether it withholds information.
From there, Facebook identifies which phrases are commonly used in clickbait headlines but are not used in other headlines.
“This is similar to how many email spam filters work,” Facebook engineers Arun Babu, Annie Liu, and Jordan Zhang note in a new blog post.
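To make that analogy concrete, here is a minimal sketch of how a spam-filter-style phrase scorer could work, assuming a simple bag-of-phrases log-likelihood approach. The function names, thresholds, and example headlines below are illustrative assumptions, not Facebook's actual system.

```python
# Illustrative sketch only: score headlines by how much more often their
# phrases appear in known clickbait than in ordinary headlines.
from collections import Counter
from math import log

def phrase_counts(headlines, n=2):
    """Count word n-grams (phrases) across a set of headlines."""
    counts = Counter()
    for h in headlines:
        words = h.lower().split()
        for i in range(len(words) - n + 1):
            counts[" ".join(words[i:i + n])] += 1
    return counts

def clickbait_score(headline, clickbait_counts, normal_counts, n=2):
    """Sum smoothed log-likelihood ratios of the headline's phrases."""
    words = headline.lower().split()
    score = 0.0
    for i in range(len(words) - n + 1):
        phrase = " ".join(words[i:i + n])
        p_click = (clickbait_counts[phrase] + 1) / (sum(clickbait_counts.values()) + 1)
        p_normal = (normal_counts[phrase] + 1) / (sum(normal_counts.values()) + 1)
        score += log(p_click / p_normal)
    return score  # higher means more clickbait-like

# Hypothetical labeled examples
clickbait = ["you won't believe what happened next", "this one trick will shock you"]
normal = ["city council approves new budget", "quarterly earnings beat expectations"]
score = clickbait_score("you won't believe this trick",
                        phrase_counts(clickbait), phrase_counts(normal))
```

In this toy version, a headline whose phrases show up far more often in the labeled clickbait set gets a higher score, much as a spam filter penalizes messages full of spam-associated phrases.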
Of course, posts determined to have clickbait headlines appear lower in News Feed.
“Publishers that rely on clickbait headlines should expect their distribution to decrease,” Babu, Liu, and Zhang warn. “If a Page stops posting clickbait and sensational headlines, their posts will stop being impacted by this change.”
By its own estimate, Facebook is getting better at spotting spam, bogus accounts, fake news, con jobs, and other types of misinformation.
Specifically, it has made improvements in recognizing inauthentic accounts by identifying patterns of activity. For Facebook, red flags include repeated posting of the same content and a spike in messages sent.
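As a rough illustration of that kind of pattern check (not Facebook's actual logic), an account that posts identical content many times could be flagged with something like the following; the threshold and the shape of the post records are assumptions.

```python
# Illustrative only: flag accounts that repeatedly post identical content.
from collections import Counter

def flag_repeat_posters(posts, max_repeats=5):
    """posts: iterable of (account_id, post_text) tuples."""
    repeats = Counter((account, text) for account, text in posts)
    return {account for (account, _), count in repeats.items() if count > max_repeats}
```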
In partnership with top third-party fact-checking organizations, Facebook recently launched a full-frontal attack on “fake news.”
Facebook executives know they increasingly serve as gatekeepers between publishers and readers. The social giant has slowly realized it is partially responsible for policing the veracity of the content that flows through its platform.
Facing mounting criticism for failing to curb phony news, Facebook CEO Mark Zuckerberg recently called for patience and understanding. “We take misinformation seriously,” Zuckerberg asserted in a blog post. “We've made significant progress, but there is more work to be done.”
The young mogul said Facebook has historically relied on its community of users to point out inaccurate news content, but admitted that the task has become increasingly “complex, both technically and philosophically.”