Facebook is pushing back against reports it is assigning “trustworthiness” scores to individual users.
According to The Washington Post, flagging an accurate news story as “false” is among the factors that can lower a user’s reputation score.
“The idea that we have a centralized ‘reputation’ score for people that use Facebook is just plain wrong,” a company spokeswoman insisted on Tuesday.
As part of a larger effort to clean up its content, Facebook admits that it is taking measures to determine whether users are accurately reporting fake news.
“We developed a process to protect against people indiscriminately flagging news as fake and attempting to game the system,” the spokeswoman said. “The reason we do this is to make sure our fight against misinformation is as effective as possible.”
Whatever you call it, Facebook’s “process” does include factoring users’ trustworthiness into its algorithm for filtering out fake news.
Still, the spokeswoman said it is misleading to interpret that process as assigning a unified user score comparable to a broadly applicable credit rating.
Facebook says ranking users’ trustworthiness is specific to its crackdown on misinformation.
The distinction will likely be lost on Facebook’s critics, who accuse the company of letting political bias shape its content moderation.
On Monday, President Trump told Reuters that it is “very dangerous” for Facebook and other social networks to possess broad censorship powers. The comments came on the heels of Facebook and other networks banning Alex Jones and his conspiracy theory factory Infowars.