Commentary

PBS' 'The Cleaners' Uncovers Secret World Of Social Media's Content Moderators

Facebook, Twitter and YouTube have come under much criticism for posts containing controversial sexual, violent or political images, videos and text.

But who decides what should be deleted and what should be kept?

Not the tech geeks in Silicon Valley, but thousands of “content moderators” working for third-party subcontractors in the Philippines. They are “The Cleaners,” described as the “unknown people making major editorial decisions” in a 90-minute documentary debuting tonight on PBS’ “Independent Lens.”

“The Cleaners” reveals this secret world largely through interviews with actual moderators in Manila, plus attempted explanations from former social media executives in the U.S.

The Manila moderators steal the show.

On one level, you wonder why these Filipinos should be the ones deciding what the world sees on social media.


A woman who looks at sexual posts all day is shown visiting religious sites around Manila while her voiceover explains, “You need to prevent sin from being out there on social media.”

A man who decides what political content you should see is an avowed lover of Rodrigo Duterte, the Philippine president who has said he would like to emulate Adolf Hitler’s killing of the Jews by killing all drug dealers and criminals. “Just like our president is doing everything he can to keep the Philippines safe, I’m doing just the same in my job,” says this content moderator. “I protect the users who use the app.”

In the end, however, the “cleaners” are just following rules laid down by others.

One content moderator comes across the iconic 1972 photo showing a naked girl fleeing a napalm attack in Vietnam. This image helped galvanize the U.S. anti-war movement, and he realizes the photo’s significance -- but “based on our guidelines, [with] genitals and minors involved, delete it.”

In a haunting moment, repeated over and over again, content moderators sit in front of computer screens, pausing at image after image, sometimes for long periods, before hitting either “delete” or “ignore.”

And you feel for them.

“You are not allowed to commit one mistake,” one says. “It could trigger war…. Our job is not easy.”

Indeed not.

Some quotes:

“I’ve seen hundreds of beheadings, pictures and videos.”

“I have to identify terrorism.”

“I have to stop cyberbullying.”

And that woman out to combat sin?

When she began her job, which involves looking at sexual images all the time, she needed to learn what words like “tits” meant. After a while, she began dreaming of “penises everywhere,” admitting that “in the end, it was my guilty pleasure.”

And now? “I’m different from what I was before. It’s like a virus in me, slowly penetrating in my brain… I need to stop. There’s something wrong happening.”

Imagine if your sole job was to watch live streams in which troubled souls threaten suicide onscreen while viewers egg them on.

“As long as he actually hasn’t committed suicide, we’re not allowed to stop his live stream,” says a content moderator. “If we stop it, we’re the ones who get in trouble.”

Of course, if they don’t, even worse things can happen. Spoiler alert: An email message from a moderator to the film’s producers reads, “My teammate hanged himself. He specialized in self-harm live videos.” This worker had previously asked to be transferred three times.

I haven’t quoted any of the former social media executives who are heard in the documentary, probably because MediaPost readers, if anyone, have heard their arguments before.

Watch “The Cleaners,” instead, for what goes on, as the doc says, “in the shadows”: The secret world of the previously unknown people who decide what you see on social media.
