Facebook on Thursday introduced a series of new rules designed to penalize those who violate its Community Standards and cut down on the spread of misinformation through private communities on Facebook Groups.
Artificial intelligence can detect a violation even when no one reports it to Facebook, writes Tom Alison, Facebook's vice president of engineering.
“When it comes to groups themselves, we will take an entire group down if it repeatedly breaks our rules or if it was set up with the intent to violate our standards,” he wrote. “Over the last year, we took down more than 1 million groups for violating these policies.”
To stop misinformation in groups, Facebook will remove groups that share content violating its Community Standards, reduce the distribution of groups that share misinformation, and inform people when they encounter misinformation.
The idea is not only to take down groups that violate Community Standards, but to stop the people involved in those groups from starting new ones.
Members with posting violations in a group will now require approval for their posts for the next 30 days. This stops their posts from being seen by others until an admin or moderator approves them.
If admins or moderators repeatedly approve posts that violate Facebook's Community Standards, Alison explains, the company will remove the entire group.
In the coming weeks, the company will begin to archive groups that lack an active admin, when the admin leaves the group and no one else assumes the role.
Group admins will be held responsible for fostering the purpose and culture of their groups. Along with the announcement came suggestions from Facebook on how admins can run their groups.
Facebook will also take a stronger stand on health groups. People can still invite friends to health groups or search for them, but the social network will no longer surface health groups in its recommendations.