The Privacy Monster Under The Internet Bed

"I think Big Foot IS blurry. That's the problem. And that's extra scary to me. 'Cause there's a large, out-of-focus monster roaming the countryside." -Mitch Hedberg

It turns out that 81% of respondents to a Consumer Reports survey want to see a universal do-not-track mechanism, reported MediaPost's Wendy Davis yesterday. Further, nearly half of us don't want customized ads. (This appears to be a significant drop from the 2009 report she references, in which two thirds of respondents didn't want customized ads.)

What do you want to bet that virtually everyone in that 81% has cookies enabled? Is on Facebook? Uses Google? What do you want to bet that they don't realize that their search results and even their News Feeds are being adjusted based on their clickstream?

At a talk I attended a few years ago, Hal Varian, Google's Chief Economist, said that we don't seem to actually care about companies having our data. We don't care that Google knows what we search for in the wee hours of the night or that Facebook has a permanent record of every indiscretion we've ever committed. What we care about is that information "getting out." We care about AOL Valdez and identity theft. The problem, he pointed out, is not a privacy one; it's a security one.

But the real problem with public opinion and Consumer Reports surveys is that privacy questions sound scary. And part of the reason they sound scary is that we don't really understand them.

The things we are most afraid of are the things that are somewhat mysterious. That is why Stephen King's books are invariably scarier than his movies: our imagination leaves the monsters sufficiently vague to allow our terror free rein. That's why, in "Monsters, Inc.," Sulley stopped being scared of Boo once he actually saw her for who she was.

Don't get me wrong here; I'm not suggesting that customer tracking and targeting are the same as an animated two-year-old girl. "Aww, look, that personalized ad is adorable! You just needed to get to know it." What I'm exploring is the fact that we all say we're concerned about privacy online, and yet we voluntarily subject ourselves to services -- Facebook and Google being the two best examples -- that regularly invade our privacy. They give us what they think we want to see. They filter results based on the data they glean from our Web habits. And we continue to surf, and to click, and to proffer said data, and to reward advertisers who use behavioral targeting with improved ROI.

Our collective actions are not consistent with our collective statements -- and that tells me that we don't really understand the issue, and that our fear is based on the fuzzy monster in the closet rather than on the actual dangers of personalization.

And, to me, that is one of the biggest actual dangers of personalization: how little we ARE aware of it. This is why MoveOn founder Eli Pariser is so concerned about filter bubbles. The problem is not that we're surrounding ourselves with a like-minded choir; it is that we are letting ourselves be surrounded, often without realizing it. (As I write this, Danny Sullivan just tweeted: "We really don't need to worry Google & Facebook cause a 'filter bubble.' People totally create their own without help.")

Do I care if you serve up an ad based on my browser history? Truth is, most of us don't care about ads at all -- not enough to want to see them, personalized or otherwise. Nobody goes online for the ads. Our preference for targeting, or lack thereof, is generally post-facto rationalization triggered by... well, a survey question.

So I'll close on my own question: what are your thoughts on Web privacy? And just how scared are you of that blurry monster?

7 comments about "The Privacy Monster Under The Internet Bed".
  1. Douglas Ferguson from College of Charleston, July 1, 2011 at 11:11 a.m.

    I wonder how Consumer Reports worded the question. 82 percent suggests that it was a "loaded" wording, because you can't get 82 percent of people to agree on ANYTHING!

  2. Rick Monihan from None, July 1, 2011 at 11:22 a.m.

    I think different people have different tolerances for the level of invasive collection that occurs. However, we all tend to want the things the Web offers without thinking twice about the overall ramifications of having data collected while we get those things.

    I clear my data regularly. I know that doesn't make all information on my computer disappear, and I know it doesn't make me invisible. But it provides a layer of comfort. I'd much prefer to block everything being collected, but then I'd have trouble accessing some of the sites I like to visit. It's a double-edged sword: I am willing to allow some data collection in order to visit sites I'm interested in, but I know that this is something I'm uncomfortable with.

    Most people who are not at all savvy are surprised to hear how I delete a lot of the stuff dropped on my computer. They want to learn to increase their privacy when they learn they can. Not out of fear, but rather out of a desire to maintain a level of self. I look at it this way: if your dream house was offered to you, but you had to keep the curtains open at all times and allow people to peek in from time to time, would you want the house?
    Probably not. But that, in effect, is what we do on the web. And we do it because we're not offered the choice. We all have the opportunity to limit the "peeks" (as I do), but many are unaware that there are ways to draw the curtains a bit.

    I'm not scared of the blurry monster at all (which is a terrific analogy, btw). I'm concerned that the blurry monster is slowly moving past peeking in my window and starting to find ways to open the windows themselves in an attempt to "help" me rearrange my life.

    Recently, I read that Esther Dyson had suggested we allow people to "opt in" their information. I agree with her. There should be a means for those people who have no qualms about sharing what they do online to allow "peeks." They should get something of value for this - perhaps improved access to certain sites. This makes more sense than the current model, which is entirely built upon "opt out" (and a very weak opt out it is).

  3. John Montgomery from GroupM Interaction, July 1, 2011 at 11:47 a.m.

    That 82% of consumers don't want to be tracked comes as no surprise.
    If you ask people whether they want to pay tax or take out the garbage they would respond in a similar way.
    Privacy advocates have to stop using research in this way - it undermines their integrity.
    Of the consumers who click on the "Ad Choices" icon (we can readily assume they are interested in or concerned about their privacy), less than 1% decide to opt out. To say this is at odds with the research mentioned in the article would be an understatement.
    I wonder how consumers would respond to the following question:
    "In exchange for free content (in fact, a largely free internet) would you be prepared to share some of your (non personally identifiable) browsing behavior with an advertiser?"
    It's all about a value exchange.

  4. Paula Lynn from Who Else Unlimited, July 1, 2011 at 12:08 p.m.

    It's not so much allowing the thumper thing to have some of your basic demographics for its own use, to present you with a blue one or a faster one. It's that the thumper thing company can sell that information to the rest of the outside world. Just because they say they won't, guess what? Who is going to stop it? And when it goes from a peek in the window to a full-time camera in the window, with that camera following you all over -- who's going to stop it? Eliminating all social networking is a start, but.......

  5. David Carlick from Carlick, July 1, 2011 at 2:26 p.m.

    People don't often do what they say they will do in a survey or focus group. That's why it is better to rely on their behavior, which, ironically, can be so carefully tracked, and which points to a giant "I don't care enough to be bothered."

  6. Rick Monihan from None, July 1, 2011 at 2:56 p.m.

    I believe if more users knew what information was being collected, they'd be more concerned.
    I agree that replying to a survey and how people act in real life are two different things. But sometimes they just don't see or realize that something is happening to them.

    Based on my interactions with folks not in our industry, many don't realize they are being followed, tracked, or having information about them collected. So of course their behavior seems to indicate they don't care. Most don't know.

    I'm certain that if they don't know this piece of information, they certainly have no idea what an "Ad Choices" icon is, and are unlikely to click on it. I work in the industry, and I asked my wife and kids if they were familiar with it. None were. Nor were my neighbors. After showing them what the page looks like after I clicked on it, I asked how they would respond. Most replied that it seems too complicated, looks as if someone were actually trying to collect more information, and said they would simply close it.

    In the industry, we would like to believe typical Web users understand these issues in a fashion close to the way we do - but most do not. They are unlikely to opt out of something whose ramifications they don't know, simply because that is the "easy way." We teach users NOT to click on things they don't know or recognize, then expect them to click on something to opt out. We send them to a page designed to help them understand the issues, but the surprise of being sent to such a page will catch many people off guard and leave some confused.

    It's well known that in order to get people to join a 401(k) plan, firms should offer it as opt-out, because most people will not opt out of anything. There are many studies documenting this behavior. Opt-in, on the other hand, is a different animal altogether, because it requires an understanding of what one is opting into, and a decision about whether opting in is "worth" it.

    In the email business, opt-in lists are far more useful and valuable than lists derived via the opt-out method, and the reasons for this are pretty obvious.

  7. Craig Mcdaniel from Sweepstakes Today LLC, July 1, 2011 at 3:52 p.m.

    Kaila, you are lacking knowledge of what "tracking" means and how it is done. Yes, cookies do track small amounts of information. My members love the way we use cookies, and it is to their benefit. We track the number of sweepstakes entries on Sweepstakes Today, when members last visited the site and the sweep, and more. WE DO NOT track personal information or sell it.

    What I see all the time is "tracking URLs." This is when a sponsor puts a tracking code within the URL address. If I cannot take out the tracking part of the URL, I simply don't run the sweep. The sponsor loses out big time when this happens.
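    Taking the tracking part out of a URL, as described above, can be done mechanically. Here is a minimal Python sketch that strips common tracking query parameters; the particular parameter names (the utm_* convention plus a few well-known click IDs) are assumptions, since which codes a given sponsor embeds will vary.

    ```python
    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    # Assumed set of tracking parameters; a real sponsor's codes may differ.
    TRACKING_PREFIXES = ("utm_",)
    TRACKING_KEYS = {"gclid", "fbclid", "ref"}

    def strip_tracking(url: str) -> str:
        """Return the URL with known tracking query parameters removed."""
        parts = urlsplit(url)
        clean = [
            (k, v) for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.startswith(TRACKING_PREFIXES) and k not in TRACKING_KEYS
        ]
        return urlunsplit(parts._replace(query=urlencode(clean)))
    ```

    For example, strip_tracking("https://example.com/sweep?id=42&utm_source=news&gclid=abc") keeps only the id parameter, leaving the destination intact while dropping the sponsor's tracking codes.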

    Make sure you know what you are writing about, and write with up-to-date information.
