Commentary

Facebook Says Spammers Are Losing Ground

By its own reckoning, Facebook is getting better at spotting spam, bogus accounts, fake news, con jobs, and other types of misinformation. “We’ve made improvements to recognize these inauthentic accounts more easily by identifying patterns of activity … without assessing the content itself,” Shabnam Shaik, a technical program manager on Facebook’s Protect and Care Team, notes in a new blog post.

For Facebook, red flags include repeated posting of the same content and a sudden increase in the volume of messages sent.

In tests, this evolving strategy is already showing results, according to Shaik. “In France, for example, these improvements have enabled us to take action against over 30,000 fake accounts,” he reports.

Going forward, Shaik says he and his team are focused on sidelining the biggest and most prolific offenders. “Our priority … is to remove the accounts with the largest footprint, with a high amount of activity and a broad reach,” he notes.

In partnership with top third-party fact-checking organizations, Facebook recently launched a full-frontal attack on “fake news.”

Facebook executives know they increasingly serve as gatekeepers between publishers and readers. The social giant has slowly realized it is partially responsible for policing the veracity of the content that flows through its platform.

Facing mounting criticism for failing to curb phony news, Facebook CEO Mark Zuckerberg recently called for patience and understanding. “We take misinformation seriously,” Zuckerberg asserted in a blog post. “We've made significant progress, but there is more work to be done.”

The young mogul said Facebook has historically relied on its community of users to point out inaccurate news content, but admitted that the task has become increasingly “complex, both technically and philosophically.”

“We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible,” Zuckerberg explained. “We need to be careful not to discourage sharing of opinions or to mistakenly restrict accurate content.

“We do not want to be arbiters of truth ourselves, but instead rely on our community and trusted third parties.” Further deflecting blame, Zuckerberg insisted the percentage of misinformation on Facebook remains “relatively small.”
