Expanding its war on “false news,” Facebook just released a broad plan to stop bad actors from spreading misinformation on its platform.
“We wanted to share what we are doing to help ensure our community remains a safe and secure environment for authentic communication,” Alex Stamos, chief security officer at Facebook, notes in a new blog post.
It’s all included in a new white paper, in which Facebook explains: “We have had to expand our security focus from traditional abusive behavior, such as account hacking, malware, spam and financial scams, to include more subtle and insidious forms of misuse, including attempts to manipulate civic discourse and deceive people.”
Rather than “fake news” -- a term Facebook believes has been rendered meaningless through overuse -- its security team is now focused on what it’s calling “information operations,” which it defines as “actions taken by organized actors (governments or non-state actors) to distort domestic or foreign political sentiment, most frequently to achieve a strategic and/or geopolitical outcome.”
Such operations can use a combination of methods, including “false news,” “disinformation,” and “networks of fake accounts aimed at manipulating public opinion,” which Facebook refers to as “false amplifiers.”
Without mentioning Russia by name, the social giant is clearly alluding to that country’s continued efforts to manipulate foreign elections by spreading false and misleading news reports.
“Information operations” take different forms, according to Facebook.
They include targeted data collection, with the goal of stealing, and often exposing, non-public information, which the tech titan believes can provide “unique opportunities” for controlling public discourse.
They also encompass content creation -- false or real -- either directly by the information operator, or by seeding stories to journalists and other third parties, including via fake online personas.
Then, there’s what Facebook calls “false amplification,” which it defines as coordinated activity by inauthentic accounts with the intent of manipulating political discussion.
The company says it can detect this type of activity by analyzing an account’s authenticity and behavior, rather than the content the account is publishing.
In addition to the white paper, Facebook is reportedly looking for a new leader to spearhead its war on “information operations.”
The company is apparently scouring the technology and media industries for the right executive, but the search is proving more difficult than expected.
By its own reckoning, Facebook is getting better at spotting spam, bogus accounts, fake news, con jobs, and other types of misinformation.
“We’ve made improvements to recognize these inauthentic accounts more easily by identifying patterns of activity … without assessing the content itself,” Shabnam Shaik, a technical program manager on Facebook’s Protect and Care Team, recently noted.
For Facebook, red flags include repeated posting of the same content, and an increase in messages sent.
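The two red flags Facebook cites -- repeated posting of the same content and a spike in messages sent -- can be sketched as a toy heuristic. To be clear, this is an illustrative example only; the function name, thresholds, and logic below are assumptions for demonstration, not Facebook’s actual detection system:

```python
from collections import Counter

def flag_suspicious(posts, messages_sent, baseline_rate,
                    dup_threshold=5, rate_multiplier=3.0):
    """Hypothetical heuristic: flag an account that repeatedly posts
    identical content, or whose message volume spikes well above its
    historical baseline (the two red flags described above)."""
    # Red flag 1: the same content posted many times.
    most_common = Counter(posts).most_common(1)
    repeated = bool(most_common) and most_common[0][1] >= dup_threshold

    # Red flag 2: message volume far above the account's usual rate.
    spiking = messages_sent > baseline_rate * rate_multiplier

    return repeated or spiking
```

The key point, mirrored in the sketch, is that neither check reads the meaning of the content itself: both rely purely on patterns of activity.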
In tests, this evolving strategy is already showing results, said Shaik. “In France, for example, these improvements have enabled us to take action against over 30,000 fake accounts,” he noted.
Going forward, Shaik said he and his team were focused on sidelining the biggest and most prolific offenders. “Our priority … is to remove the accounts with the largest footprint, with a high amount of activity and a broad reach,” he said.
In partnership with top third-party fact-checking organizations, Facebook recently launched a full-frontal attack on bogus news.
Not everyone has applauded Facebook’s approach to fighting fake news, however.
Among other critics, Mike Caulfield, director of blended and networked learning at Washington State University Vancouver, recently took to Medium to air his grievances.
Caulfield argued that Facebook’s proposed process takes too long, deals only with surface issues, targets fake news but not slanted claims, and doesn’t effectively make use of the company’s own powerful network.