“We’re committed to doing research to make Facebook better, but we want to do it in the most responsible way,” Chief Technology Officer Mike Schroepfer said today in a blog post.
His post outlines some of the new limits on research, but only in the vaguest possible terms. “If proposed work is focused on studying particular groups or populations (such as people of a certain age) or if it relates to content that may be considered deeply personal (such as emotions) it will go through an enhanced review process before research can begin,” he wrote.
He added that the panel conducting the review will include “senior subject-area researchers, along with people from our engineering, research, legal, privacy and policy teams.”
But what Schroepfer omits from his post is at least as telling as what he says. He doesn't say what standards the review panel will apply, or how -- if at all -- the panel will disclose research to consumers, let alone whether it will seek their consent to participate.
Facebook's new policy marks the latest fallout from the news that the company manipulated the news feeds of 700,000 users in order to test whether their moods would be influenced by friends' posts. For the study, Facebook tinkered with the feeds by deliberately filtering out some positive or negative posts.
Researchers then examined those users' responses and concluded that mood was “contagious”: people shown more negative posts began posting more negative material, while those shown more positive posts themselves wrote in a more positive tone.
When news of the tests came to light, many observers criticized the company for failing to obtain users' informed consent. Two law professors, the University of Maryland's James Grimmelmann and Leslie Meltzer Henry, have gone further, arguing that Facebook might have deceived its users by running secret psychological experiments.
“The failure to disclose research is an omission that a reasonable consumer would consider material in deciding whether or not to use a service,” they said this summer in a letter urging the Federal Trade Commission to protect consumers from future research projects.
More recently, they accused Facebook of violating a 2002 Maryland law requiring researchers to obtain people's informed consent before experimenting on them. That measure, the law professors say, requires researchers to describe their experiments to subjects, disclose the risks, and allow people to opt out.
If Grimmelmann and Henry are correct, Facebook's new policy will only protect the company if it plans to fully inform potential subjects about experiments, and allow them to avoid participating.