Facebook's social experiment on 700,000 unsuspecting users constitutes a deceptive and unfair practice, the Electronic Privacy Information Center says in a complaint filed on Thursday with the Federal Trade Commission.
The privacy watchdog is asking the FTC to investigate Facebook's “unlawful manipulation” of users' news feeds. “The company purposefully messed with people’s minds,” EPIC asserts in its papers.
The group also wants the FTC to order Facebook to disclose its news feed algorithm.
EPIC filed its complaint in response to this week's widely publicized reports that Facebook tinkered with the news feeds of nearly 700,000 people in order to test whether their moods would be influenced by friends' posts. For the study, published earlier this month in the Proceedings of the National Academy of Sciences, Facebook manipulated users' news feeds to deliberately filter out some positive or negative posts.
Researchers observed people's reactions and concluded that mood was “contagious,” with users' responses matching the tone of the posts they saw. In other words, people shown more negative posts themselves began posting more negative material, while those shown more positive comments responded in kind.
Facebook gathered the data, which was analyzed by researchers from Facebook, the University of California, San Francisco, and Cornell.
Numerous commentators have questioned the ethics of running psychological tests on people without first obtaining their informed consent.
Facebook's Chief Operating Officer Sheryl Sandberg responded to the uproar this week by saying the company “poorly communicated” about the study -- but she didn't apologize for having conducted the research.
But EPIC says the project wasn't merely unethical -- it was unlawful. The advocacy group says that Facebook violated its promises to users, as well as a consent decree the company previously entered into with the FTC.
Facebook's data use policy currently states that the company can draw on information about users for a broad array of purposes, including “research.” But in January of 2012, when Facebook ran its test, the company's policy reportedly didn't list “research” as one of the permissible uses of data.
EPIC says that the omission was unfair and deceptive. “Users could not reasonably have guessed that use of their Facebook account might subject them to behavioral testing,” the group writes.
The privacy organization also says that Facebook violated a consent decree stemming from an earlier FTC case. In that matter, the FTC accused Facebook of repeatedly sharing users' data more broadly than they authorized. The best-known example probably occurred in December of 2009, when Facebook reclassified a host of data about users as “public” -- including people's names, photos and friend lists.
But that agreement wasn't finalized until August of 2012 -- seven months after Facebook conducted its mood-contagion experiment. Given the timing, it doesn't seem likely that the FTC can charge the company with contempt.
Even so, EPIC executive director Marc Rotenberg points out that Facebook agreed to the settlement's terms in late 2011 -- before conducting its test. Rotenberg tells MediaPost that a violation of the terms would show “bad faith” on Facebook's part, even if it occurred before the deal was finalized.