Ut won the Pulitzer Prize for the photo in 1973, which is credited with turning the tide of public opinion and leading to the end of the Vietnam War.
This picture -- this image of unfathomably vast historical importance, of undeniably significant social commentary -- was removed from Facebook last week, when the social network’s algorithms detected the nudity and deleted it from the account of the Norwegian newspaper Aftenposten, which had posted the image.
Understandably, people were upset. Half the ministers in the Norwegian parliament reposted the picture, including Prime Minister Erna Solberg. Facebook then deleted the post from their pages.
Aftenposten’s editor in chief, Espen Egil Hansen, wrote an open letter to Mark Zuckerberg: “[D]ear Mark, you are the world’s most powerful editor… However, even though I am editor-in-chief of Norway’s largest newspaper, I have to realize that you are restricting my room for exercising my editorial responsibility... I think you are abusing your power, and I find it hard to believe that you have thought it through thoroughly.”
Paul Carr at Pando was even more livid, saying that Facebook had lost its mind: “Facebook and Zuckerberg are fine with censoring the most important war photograph of all time, and with censoring the Norwegian prime minister when she objects. But don’t you dare upset Trump or Modi or there’ll be hell to pay. (Idea: Maybe Donald Trump could repost the Napalm Girl image and call her ‘low energy’ and ‘disgusting.’ Then FB will likely promote it.)”
I read Carr’s piece, and immediately joined him in the soothing embrace of righteous outrage.
But then I read the account by Horst Faas and Marianne Fulton about how the original picture reached the world. “[A]n editor at the AP rejected the photo of Kim Phuc running down the road without clothing because it showed frontal nudity. Pictures of nudes of all ages and sexes, and especially frontal views were an absolute no-no at the Associated Press in 1972… [Then head of the Saigon photo department] Horst Faas argued by telex with the New York head-office that an exception must be made, with the compromise that no close-up of the girl Kim Phuc alone would be transmitted. The New York photo editor, Hal Buell, agreed that the news value of the photograph overrode any reservations about nudity.”
At that time, of course, there were no algorithms deciding what was appropriate and what wasn’t. It was only people -- and they almost got it wrong, too.
To be clear, the fact that an AP editor censored the Napalm Girl 40 years ago doesn’t make it OK for Facebook to censor her today. But it does demonstrate that the task of distinguishing news from trash, of weighing context and determining social value, isn’t merely a challenge in our algorithmically driven media age. It’s a problem we’ve been having for decades -- centuries -- and every time it comes up, it gives us the opportunity to become better.
Better how? Aftenposten’s Hansen has a simple suggestion to start with: “Facebook should distinguish between editors and other Facebook-users. Editors cannot live with you, Mark, as a master editor.”
Surely this would be straightforward to implement. News organizations that meet established editorial standards could be authorized to determine context and newsworthiness for their own posts. Giving actual editors actual editorial control would also bolster Facebook’s argument that it’s a neutral carrier, not a news outlet.
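To see how simple the core of Hansen’s suggestion could be, here is a minimal sketch of a trust check ahead of automatic removal. Every name in it is invented for illustration; Facebook’s real moderation pipeline is vastly more complex and is not public.

```python
# Hypothetical sketch: let verified editorial accounts override an
# automated nudity takedown. Names and structure are invented for
# illustration, not drawn from any real Facebook API.

VERIFIED_EDITORS = {"aftenposten", "ap_news"}  # vetted news organizations


def should_remove(account: str, flagged_by_classifier: bool) -> bool:
    """Remove a post only if the classifier flagged it AND the poster
    is not a verified editorial account."""
    if not flagged_by_classifier:
        return False
    if account in VERIFIED_EDITORS:
        # Defer to the outlet's editorial judgment; a real system might
        # instead queue the post for human review rather than delete it.
        return False
    return True
```

Under this scheme, the Aftenposten post would never have been auto-deleted, while the same image from an unverified account would still be held for review.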
In this instance, the uproar eventually reached some real humans at Facebook, and the photo was reinstated. Now it’s up to Facebook to adapt its systems so this doesn’t happen again.