Commentary

It Was A Quiet Week In The Evil Empire

Found something Sunday on my Facebook feed from my cousin Leah. It was a clip from a 1988 BBC special about Sir Nicholas Winton, who had saved 669 children, most of them Jewish, from Czechoslovakia in 1938 and 1939. The then 79-year-old humanitarian is seen in an audience among hundreds of others, and when the host asks if anyone in the theater owes his or her life to Sir Nicholas, they all rise. Taken by surprise, the old man comes to tears.

Me too.

Two years ago, I might not have seen that post. At the time, Facebook was conducting its now infamous experiment in mood manipulation, filtering out certain content that was emotionally positive or negative in nature. The test on 689,003 unwitting users employed sentiment-analysis software to locate emotionally telling words -- “tears,” for example -- and to create a baseline against which to measure activity around emotion-triggering content.
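
For the curious, the mechanics were nothing exotic. The published study is reported to have leaned on simple word-count sentiment scoring (the LIWC word lists); the sketch below is a hypothetical stand-in, with invented word lists and no connection to Facebook's actual code, meant only to show how little machinery it takes to tag a post as "emotional."

    # Hypothetical illustration only: naive word-list sentiment tagging in the
    # spirit of LIWC-style counting. The word lists and example are invented,
    # not taken from the study or from Facebook.
    POSITIVE = {"love", "happy", "wonderful", "great"}
    NEGATIVE = {"tears", "sad", "awful", "hate"}

    def classify(post):
        """Label a post 'positive', 'negative', or 'neutral' by word counts."""
        words = [w.strip(".,!?") for w in post.lower().split()]
        pos = sum(w in POSITIVE for w in words)
        neg = sum(w in NEGATIVE for w in words)
        if pos > neg:
            return "positive"
        if neg > pos:
            return "negative"
        return "neutral"

    # A feed filter built on such a tagger could suppress some share of posts
    # carrying one label and then measure what the affected users write next.
    print(classify("Taken by surprise, the old man came to tears."))  # negative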

The goal was not to help society come to grips with its emotions. The goal was to learn how to increase activity on Facebook. And the oblivious 689,003 were the guinea pigs. Naturally, a number of Facebook users/emotion havers have freaked out at this revelation.

Now, filtering out heartwarming and infuriating social-media messages isn’t exactly Tuskegee. No harm came to anyone, unless lost sniffles rise to the level of harm. And there’s no evidence that the subjects were chosen based on powerlessness, marginalization or any other criteria bespeaking special vulnerability. But that misses the point.

The point is that they were subjects of a scientific experiment without their informed consent -- which, simply put, is unethical.

Facebook first responded to the outrage, predictably, by saying, essentially: “Nuh uh! Using user data for research is covered in the terms of service!” -- which, simply put, was a lie. A double lie. When the experiment was conducted, that language had not yet been unilaterally and ex post facto inserted into the terms of service. Also, there is a great difference between crunching data based on organic user experience and surreptitiously altering user experience to see what happens.

One of the researchers also had the gall to observe that the 689,003 were anonymous to the researchers -- as if this were a privacy matter. It is not a privacy matter. It is a tyranny matter. But it gets worse. COO Sheryl Sandberg, who has spent the past three years circling the globe telling women the right way to live, leaned back and dismissed the controversy as an endeavor “poorly communicated.”

Uh huh -- perhaps in the way Pearl Harbor was poorly communicated. The thing is, when one is doing something sneaky and objectionable, the essence of the thing is, ahem, zero communication. In other words, simply put, she is full of shit.

And entirely in character for her and the company she and Mark Zuckerberg lead. Again and again, the company makes unilateral decisions that affect users and advertisers, often obscuring them in terms-of-service boilerplate and owning up to them only when they are discovered, whereupon it rescinds the decision and congratulates itself for its heroic ability to take criticism to heart.

Of course, we only know about the stuff they get caught at. We don’t know what lurks undetected, do we? You are, no doubt, familiar with the truism “It is better to ask for forgiveness than permission.” The Facebook motto might be “What can we get away with?” Or, perhaps, “Users? Fuck them.”

Pretty ironic, actually. Throughout the corporate universe, and among all other public-facing institutions, the stewards are learning that they must be transparent lest they face the wrath of the public in social media. Yet Facebook -- the social-media behemoth -- operates with a mentality of impunity that is as retrograde as it is breathtaking.

Because it can. So far.

History records the calamities that befall the world when too much power is concentrated into few hands: Nazi Germany, Standard Oil, Kevin Costner.

We should be afraid of a company that serves 15% of mankind and treats them like lab rats...or worse. The Wall Street Journal reported that, concurrent with the emotions experiment two years ago, Facebook -- in designing anti-fraud measures -- informed thousands of users that they would be blocked from the site unless they could prove they were humans, not bots -- this despite knowing all along that the users were indeed human.

Yes -- the researchers intentionally alarmed and frightened their subjects. I mean, I guess those people were alarmed and frightened. If they sent out posts to their friends describing those emotions, the content might have been blocked altogether.

 

5 comments about "It Was A Quiet Week In The Evil Empire".
  1. Henry Blaufox from Dragon360, July 7, 2014 at 11:13 a.m.

    If you find Facebook's corporate behavior so objectionable, stop using the service.

  2. Paul Ratzky from Olson, July 7, 2014 at 11:16 a.m.

    To paraphrase John Oliver in his recent Net neutrality rant, "If you want to do evil, wrap it in boring. Most of us would 'approve' the entire contents of Mein Kampf if it was wrapped within the iTunes user agreement."

  3. Alex Goulder from Gaiam, July 7, 2014 at 11:24 a.m.

    I hate FB as much as the next guy, but as marketers, aren't we all engaged daily in similar experiments, just on a smaller scale? I don't really see much difference between what FB did and A/B testing button placement or offer terms--stuff we do as a matter of course.

  4. Andrew Boer from MovableMedia, July 7, 2014 at 2:22 p.m.

    All online marketing, as Alex G. points out, is at some level an experiment in social psychology. The weird irony here is that if you conduct experiments like this in the name of profit, it is fine, but if you use that information in the name of furthering science, you are being unethical.

    "The Proceedings of the National Academy of Sciences concluded that the decision to manipulate the content appearing on the Facebook pages of about 700,000 people without their prior consent may have violated some principles of academic research.
    The journal also pointed out that, as a for-profit company governed by its own terms of service, Facebook had no obligation to adhere to those scientific principles."

  5. J S from Ideal Living Media, July 7, 2014 at 4:38 p.m.

    If they wanted more use of Facebook they would have, oh I don't know, produced content. Or stopped charging users who are producing content for them -- for free -- to share it with others. Instead, they are charging to limit the spread (and thus, usefulness) of free content shared on their site.
