Searching For Ethics
After the Department of Justice subpoenaed search engine records last year and AOL posted search histories just this summer, New York University Assistant Professor and privacy expert Helen Nissenbaum and her colleague Daniel Howe constructed an antidote to search tracking. Their TrackMeNot plug-in for the Firefox browser thwarts reliable search history tracking by masking your real search history with a torrent of random and nonsensical queries. Nissenbaum is also working on a book about privacy, so we asked her to explore why search behaviors represent a special case when it comes to online tracking.
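The obfuscation idea behind TrackMeNot can be sketched in a few lines. This is an illustrative toy, not TrackMeNot's actual code: the word list, function names, and the decoys-per-query ratio are all invented here to show the principle of hiding a real query in a stream of machine-generated noise.

```python
import random

# Illustrative sketch (not TrackMeNot's implementation): hide a real
# search history inside a stream of randomly generated decoy queries,
# so an observer logging the stream cannot separate signal from noise.

DECOY_WORDS = ["weather", "recipe", "history", "music", "garden",
               "travel", "photo", "novel", "cinema", "market"]

def make_decoy_query(rng, min_words=1, max_words=3):
    """Build one nonsense query from a few random words."""
    n = rng.randint(min_words, max_words)
    return " ".join(rng.choice(DECOY_WORDS) for _ in range(n))

def obfuscate(real_queries, decoys_per_real=5, seed=None):
    """Interleave each real query with several decoys in random order."""
    rng = random.Random(seed)
    stream = list(real_queries)
    stream += [make_decoy_query(rng)
               for _ in range(decoys_per_real * len(real_queries))]
    rng.shuffle(stream)
    return stream

stream = obfuscate(["symptoms of flu"], decoys_per_real=5, seed=42)
# The real query is now one of six entries in a shuffled stream.
```

The real plug-in goes further, issuing the decoys to the search engine over time so the logged history itself is polluted, but the core design choice is the same: rather than blocking data collection, it degrades the collected data's reliability.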
Behavioral Insider: What sort of consumer controls should there be on what search engines keep?
Nissenbaum: As users, we all need to figure out what policies to set for the proper use of this information--[ones] in the interests of people situated in many different positions in the chains, and also consistent with the values to which society subscribes. That's why I think that, particularly in the case of Web searches, this is information which should be protected. It shouldn't be open to just anybody who wants to see what your or my searches were.
BI: How is search history different from credit cards and phones that leave trails? Is there something particular about the search query that changes the game?
Nissenbaum: I think that credit card records are also sensitive. This is my whole philosophy of privacy; there is no information to which norms don't apply. There is no information where I can say anything goes. But there are certain types of information that touch on certain issues and should be controlled to a greater extent. I do think search histories are in that category because I think about searches the way I think about lifestyle choices.
These are the kind of activities that a liberal society believes should be under the control of the individual unless we have concrete evidence that what the person is doing may be harmful to others.... I think there are areas of life [where we should be free to] indulge in open inquiry.
BI: Do you think that search is a technology that crept up on us and involves a privacy we didn't realize we hold dear?
Nissenbaum: I think that a lot of it is going to depend on how you think about the activity of Web search. It wasn't until relatively recently that search became such an important method for inquiring and doing research or finding critical information that could be really important to our lives and health and relationships.... There might have been an implication in what you said that therefore we have to give up our privacy. But I don't think that that follows. I think we as a society should think about what sort of policies are the best to govern what search companies or anybody could be doing with this information.
I don't have any direct contact with search companies. I only go by the statements they make in the popular media. When Google was asked whether it would re-assess its policies in response to this embarrassing episode with AOL, it just answered 'no.' Not even 'maybe.' There is something important there, and we do need to rethink our response.
BI: Do we know how much data Google and Yahoo actually retain, and for how long?
Nissenbaum: I don't know for sure because this is not well publicized. I do have a Ph.D. student in my department doing a dissertation [that's attempting to] create a picture of exactly what Google keeps. Just speaking secondhand, they keep everything. If you create a profile in which you identify yourself, or you have a Gmail account and do searches while signed in, then your searches can be identified without your realizing it.
BI: Is search becoming such a fundamental part of our lives and related to things we do protect, like health records, so that some legal protection for consumer search history is needed?
Nissenbaum: I'm hard-pressed to say when, exactly, we need federal legislation. The people in the area of law have helped me think about when regulation is important. I learned that when you look at a situation in which companies are almost compelled to violate people's privacy in order to maximize profit--then if you don't [regulate], any company that respects privacy is at a competitive disadvantage. I think that's the moment when regulation is not only good for the people whose privacy is being violated, but it's also good for the company. It allows them to play on a level playing field. So it may be that in this instance, when there is the potential for exploitation, that is the answer. Although what exactly you do in that law, I don't know.
BI: Are Americans as upset about online privacy as the press and some academics are?
Nissenbaum: There have been some good studies by the Pew Trust and Annenberg that show that people really do care about online privacy. Skeptics will say people say they care about privacy, but when you look at their behavior, they obviously don't care. I am very skeptical about those arguments. How are you measuring their behavior? You can't only tell by what people do....
I didn't fully realize that my searches were being logged to this extent until I saw the Department of Justice subpoena. Then it really hit home. And I consider myself an expert. Second, people aren't often given choices. They are stressed out and paying attention to a million things, and so this is just something that they go along with.
BI: Marketers and search providers would argue that using things like search histories helps them deliver more relevant content to users and make users' lives more efficient. Doesn't it seem that there is now a level of intimacy between private companies and consumers that wasn't there ten years ago? Are we seeing a fundamental shift in the way media and business relate to consumers?
Nissenbaum: On the one hand, we live in a much more impersonal world, and so this may bring back some of the personal relationships we had with professional people in the past. I don't know if all of that can be proved. I don't know whether we are just bringing back something that we've had, or whether there is greater intimacy in that relationship between individuals and institutions.
I do think that a lot of the personalized interaction is good. There is this weird assumption that says, well, you must take the bad with the good: if you expect personalized service, then you really have no choice but to accept all the other things that come with it, like the data aggregation and data mining and the profiling. I am not sure that that really follows. Why can't we have personalized relationships and reasonable policies governing the flow of information about individuals?