
Facing growing criticism for its failure to stop the
spread of offensive video, Facebook is promising to hire another 3,000 human content monitors.
“Over the next year, we'll be adding 3,000 people to our community operations team around
the world,” Facebook CEO Mark Zuckerberg promises in a new blog post.
That adds to the 4,500 or so people who already make up Facebook’s community operations team.
The
team’s foremost responsibility is to “review the millions of reports we get every week, and improve the process for doing it quickly,” Zuckerberg said.
For Facebook, the
hiring spree is part of a larger effort to curb the spread of videos depicting murders, rapes, and similarly disturbing acts.
The tech titan is working with local community groups and law enforcement, who are
well positioned to act on threats or acts of violence. Facebook is also, in Zuckerberg's words, "building better tools" to keep community members safe.
“We’re going to make it simpler to report problems to us, faster for our reviewers to determine which posts violate our standards and easier for them to contact law enforcement if
someone needs help,” according to Zuckerberg.
Zuckerberg said that, by working with local law enforcement, Facebook was recently able to prevent one community member from acting on his expressed
suicidal intentions. "In other cases, we weren't so fortunate," he admitted.
Indeed, Facebook has recently served as a platform for teens streaming their own suicides, for the broadcast of a young woman being raped, and for the torture of a young man with special needs.
Just
last month, Facebook also unwittingly made it possible for an Ohio man to distribute video footage of a murder.
Facebook stated that while the assailant did use its Live service, he did not
actually broadcast the act of violence live. Still, questions linger about why video of the crime remained on Facebook's platform for part of the day.