"I think Big Foot IS blurry. That's the problem. And that's extra scary to me. 'Cause there's a large, out-of-focus monster roaming the countryside." -Mitch Hedberg
It turns out that 81% of
respondents to a Consumer Reports survey want to see a universal do-not-track mechanism, reported MediaPost's
Wendy Davis yesterday. Further, nearly half of us don't want customized ads. (This appears to be a significant drop from the 2009 report she references, in which two thirds of respondents didn't
want customized ads.)
What do you want to bet that virtually everyone in that 81% has cookies enabled? Is on Facebook? Uses Google? What do you want to bet that they don't realize that
their search results and even their News Feeds are being adjusted based on their clickstream?
At a
talk I attended a few years ago, Hal Varian, Google's Chief Economist, said that we don't seem to actually care about companies having our data. We don't care that Google knows what we search for
in the wee hours of the night or that Facebook has a permanent record of every indiscretion we've ever committed. What we care about is that information "getting out." We care about AOL Valdez and
identity theft. The problem, he pointed out, is not a privacy one; it's a security one.
But the real problem with public opinion and Consumer Reports surveys is that privacy questions sound scary. And part of the reason they sound scary is that we don't really understand them.
The things we are most afraid of are the things that are somewhat mysterious. That is why
Stephen King's books are inevitably scarier than his movies: our imagination leaves the monsters sufficiently vague to allow our terror free rein. That's why, in "Monsters, Inc.," Sulley stopped being scared of Boo once he actually saw her for who she was.
Don't get me wrong here; I'm not suggesting that customer tracking and targeting are the same as an animated
two-year-old girl. "Aww, look, that personalized ad is adorable! You just needed to get to know it." What I'm exploring is the fact that we all say we're concerned about privacy online and yet we
voluntarily subject ourselves to services -- Facebook and Google being the two best examples -- that regularly invade our privacy. They give us what they think we want to see. They filter results
based on the data they glean from our Web habits. And we continue to surf, and to click, and to proffer said data, and to reward advertisers who use behavioral targeting with improved ROI.
Our collective actions are not consistent with our collective statements -- and that tells me that we don't really understand the issue, and that our fear is based on the fuzzy monster in the closet
rather than on the actual dangers of personalization.
And, to me, that is one of the biggest actual dangers of personalization: how little we ARE aware of it. This is why MoveOn board president Eli Pariser is so concerned about filter bubbles. The problem is not that we're surrounding ourselves with a like-minded choir; it is that we are letting ourselves be surrounded, often without realizing it. (As I write this, Danny Sullivan just tweeted: "We really don't need to worry Google & Facebook cause a 'filter bubble.' People totally create their own without help.")
Do I care if you serve up an ad based on my browser history? Truth is, most of us don't care about ads at all -- not enough to want to see them,
personalized or otherwise. Nobody goes online for the ads. Our preference for targeting, or lack thereof, is generally post-hoc rationalization triggered by... well, a survey question.
So I'll close with a question of my own: what are your thoughts on Web privacy? And just how scared are you of that blurry monster?