Facebook Improves Enforcement Against Bad Content

Facebook now estimates that 5% of monthly active accounts are fake, according to the social giant’s latest Community Enforcement Report.

In March, Facebook reported 2.38 billion monthly active users -- 5% of which is about 119 million.

In its latest enforcement report, Facebook also estimated that for every 10,000 times users viewed content, 11 to 14 views contained content that violated the network’s adult nudity and sexual activity policy, while 25 views contained content that violated its violence and graphic content policy.

For every 10,000 times users viewed content, fewer than three views contained content that violated its policies for global terrorism, child nudity and sexual exploitation, according to Facebook.

Guy Rosen, vice president, integrity at Facebook, said the company arrives at these figures by periodically sampling content viewed on its network, and then reviewing it to see what percent violates its standards.
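The sampling approach Rosen describes can be illustrated with a minimal sketch. This is not Facebook's actual system; the function name, labeling callback, and toy data below are all assumptions, shown only to clarify how sampling viewed content yields a views-per-10,000 prevalence estimate.

```python
import random

def estimate_violations_per_10k(views, is_violating, sample_size=1000, seed=0):
    """Illustrative only: sample viewed content, label each item with the
    provided `is_violating` callback, and scale the violating fraction to
    a per-10,000-views prevalence figure."""
    rng = random.Random(seed)
    # Draw a random sample of content views (capped at the population size).
    sample = rng.sample(views, min(sample_size, len(views)))
    # Count how many sampled views were labeled as violating.
    violating = sum(1 for v in sample if is_violating(v))
    # Scale the observed fraction to "views per 10,000".
    return 10_000 * violating / len(sample)

# Toy population where roughly 0.12% of views are violating -- about the
# 11-to-14-per-10,000 range Facebook reports for adult nudity.
views = ["violating"] * 12 + ["ok"] * 9988
rate = estimate_violations_per_10k(views, lambda v: v == "violating")
```

In practice the labeling step would be done by human reviewers rather than a callback, and the estimate would carry a sampling-error margin.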

For fake accounts, Rosen said the number of accounts that Facebook took action against increased, due to large-scale automated attacks by bad actors.

From the fourth quarter of 2018 to the first quarter of 2019, Rosen said that the number of accounts that Facebook disabled increased from 1.2 billion to 2.19 billion.

“We’ll continue to find more ways to counter attempts to violate our policies,” Rosen vowed in the new report.

In six of the policy areas Facebook included in its report, the company proactively detected over 95% of the content it took action against before users reported it.

For hate speech, Facebook now proactively detects 65% of the content it removes -- up from 24% just over a year ago.

In the first quarter of 2019, Facebook took down 4 million hate speech posts, according to Rosen.