These are among the issues considered by Joseph Turow, professor at Annenberg School for Communication, University of Pennsylvania, as he studies the social implications of BT on consumers. He recently shared some of his thoughts with MediaPost. Excerpts from the interview follow.
MediaPost: What are the social implications of behavioral targeting?
Turow: Most people talk about harm. Often they talk about someone finding out about prescription drugs in a way that could affect medical insurance. I think it's important to be concerned about issues of harm and issues of invasion of privacy, but we underestimate the importance of social discrimination in behavioral targeting. Not just behavioral targeting, but related forms, such as psychographic and demographic targeting, and social relations implied from activity on social sites.
We're creating a situation where people will receive not only targeted advertisements and discounts, but also targeted news and entertainment. The advertisements we see are based on ideas about us that we may not realize exist.
I'm not objecting to target marketing. That horse has left the barn. I'm concerned about consumers not being able to monitor or have control of the information companies have about them. Or, not knowing what to do about it once they discover the companies have the information.
MediaPost: How can companies make consumers more aware and provide directions on how to change or stop the flow of information, other than opting out?
Turow: Many times companies use the word 'transparency.' While that's important, simply knowing what happens really doesn't help much. It's what I call the tough-luck approach to information. Many sites have a clause that effectively reads: here's what we do, and tough luck.
The answer is 'false.' But typically between 20% and 22% say it's false, between 58% and 62% say it's true, and the remainder don't know. These are intelligent people who think this.
We don't know what companies know about us, and we don't know how to access that information. There is information about you creating what I call reputation silos, which will determine the ads, discounts and, eventually, the news and entertainment you receive.
MediaPost: Is it possible for the search engine to know you better than you know yourself?
Turow: It's not a matter of knowing. It's a matter of constructing. Audiences don't exist. They are constructed. Advertisers do it. Networks do it. I do it when I'm talking to an audience. We construct the people we address through the categories we have about them. The constructions made are intimately related to the value and reward systems. So when a network decides on an audience, it's thinking 18- to 24-year-olds with specific characteristics, because that's what the advertiser is interested in.
The question is how do we know you -- and is it a construction you would agree with? The profile a company has about me might be accurate, but it may be a vision about me that I don't agree with, don't think they should have, and wouldn't want to have a media environment based on it. But that's the direction we are headed in. And that's why companies need to respect the information they have about consumers.
Companies need to inform people how to change the information held about them, reject it, and either align it with their interests or have it removed from the grid. The notion that companies are creating profiles that the people profiled may not understand is inherently problematic from a social standpoint.
MediaPost: What is a privacy dashboard?
Turow: It's a notion that when a company serves a targeted ad, there should be a little icon in the corner. Clicking on the icon should reveal why you received the ad you did. It's a grid that tells you the demographics and other information the company has about you. It will give people a sense of the information companies hold. The advantage of doing it in the moment is that you can access it while viewing the ad; it's there, and then it's gone. There's no concern about someone getting into a database where this information is stored. And when companies know they need to show you where they got the information, they will treat you with more respect.
If companies know the information they use to create these stories will be accessible to you, they will be more careful about the stories they tell.
Google uses a dashboard, but it has been careful not to call it a privacy dashboard. The information Google uses about us is relatively primitive compared with what some others do. They use phrases like 'previous activities,' rather than 'we use information from your searches.' Having Google allow you to check categories and decide whether to opt in or opt out is a good beginning, but it gives the wrong impression of what's really going on in the behavioral targeting market.