Facebook Lets Users Track Updates, Reduces Ability To Spread Misinformation

As part of a broader effort to cut down on offensive and abusive content, Facebook is updating its existing “remove, reduce, and inform” strategy.

“This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information, so they can choose what to click, read or share,” Guy Rosen, vice president of integrity, and Tessa Lyons, head of News Feed Integrity at Facebook, note in a new blog post.

Among other changes, the tech titan is rolling out a new section on its Community Standards site where users can track related updates.

Facebook is also updating the enforcement policy for groups and launching a new “group quality” feature.

To reduce the spread of what it calls “problematic” content, Facebook is consulting outside experts and expanding the range of content the Associated Press will review as a third-party fact-checker.

The company also plans to reduce the reach of Facebook Groups that repeatedly share misinformation and incorporate a “Click-Gap” signal into News Feed rankings so users are exposed to less low-quality content.

As for better informing users, Facebook plans to add “trust indicators” to the News Feed context button and more information to the “page quality” tab.

Also, Facebook will now let users remove their own posts and comments from a Group after they decide to leave it.

Expanding its efforts to curb impersonations, Facebook plans to add Verified Badges to Messenger, along with new messaging settings and an updated “block” feature.

To help prevent the spread of misinformation, Facebook has added a “forward indicator” and “context button” to Messenger.