Commentary

The Danger Of Personalization: A Lesson From TED

At the TED conference last week, Chris Anderson prefaced one of the speakers by saying, "Imagine if, instead of all attending the same conference, we each attended our own personalized version of TED. You could dispense with all the uncomfortable, boring stuff and only watch the stuff you wanted to watch."

My reaction was one of visceral recoil, as was the reaction of everyone in my vicinity. The whole point of a conference like TED is to open your mind to new ideas -- ideas that, by definition, you didn't know you were interested in. At our TEDxChCh event last October, for example, I learned that estrogen-mimicking molecules can be fascinating. Who knew?

As it turned out, this was exactly how Chris was hoping we'd respond. The speaker was Eli Pariser, of MoveOn fame, and his talk was on the insidious encroachment of invisible filters on the Web content we see and consume. Consider, Eli said, his own Facebook feed. He's got a number of friends who are liberal, as you'd imagine, and a number of friends who are conservative. One day, though, he noticed that his conservative friends' comments weren't showing up in his News Feed anymore. Facebook, in its infinite wisdom, had determined that, because Eli is more likely to click on a link from a liberal friend than on one from a conservative one, it would be easier for everybody concerned to only show Eli links from liberal friends. Without consulting Eli, the algorithm quietly hid all the apparently unwanted messages.
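The ranking Eli describes can be sketched as a naive engagement filter. To be clear, this is a hypothetical illustration, not Facebook's actual algorithm: the function, threshold, and data shapes below are all assumptions made for the sake of the example.

```python
# Hypothetical sketch of an engagement-based feed filter of the kind Eli
# describes: posts from friends you rarely click on are silently dropped.
# The threshold and data structures are illustrative assumptions only.

def filter_feed(posts, click_rates, threshold=0.1):
    """Keep only posts from friends whose historical click-through rate
    exceeds the threshold; everyone else quietly disappears."""
    return [post for post in posts
            if click_rates.get(post["friend"], 0.0) > threshold]

posts = [
    {"friend": "liberal_pal", "text": "Read this op-ed"},
    {"friend": "conservative_pal", "text": "Read this column"},
]

# Eli clicks his liberal friends' links far more often:
click_rates = {"liberal_pal": 0.4, "conservative_pal": 0.02}

visible = filter_feed(posts, click_rates)
# Only the liberal friend's post survives; the conservative friend's
# post is hidden without the user ever being asked.
```

The danger lies in the default: the filter runs in the background, and nothing in the interface tells the user that anything was removed.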

I get the value of personalization, I really do. I've used this very platform to expound on the incredible power of truly personal attention. But there is a big difference between personalization of advertising and personalization of content, and it comes down to intent of consumption.

Advertising, as a general rule, is pushed onto us: marketers decide what message they want to put out to the public, and consumers either respond or don't. Because we know that the sole purpose of advertising is to sell to us, we tend to discount its information -- heavily. (Incidentally, the word most closely associated with "advertising" is "false.") There is very little difference in the covenant between advertiser and consumer if the ad is personalized or not.

Content, on the other hand, is the stuff that makes us willing to put up with the advertising in the first place. We might be motivated to watch the TV show or read the magazine or browse the website by any number of reasons: because a) we want to feel good about ourselves, or b) we want to be entertained, or c) we want to broaden our horizons, or d) something else. But here's the thing: the Internet doesn't know our motivation for consuming content at any given time. And, as Eli pointed out, the algorithms being used to personalize content are so ham-handed and clumsy that they don't even consider the possibility of (c). Broaden horizons? That's so 1990s. In 2011, apparently, people only want to reconfirm what they already know.

You don't need to extrapolate too far to see the danger of this: how it polarizes our society yet further, how it precludes empathy and understanding, how it belittles us and diminishes us and leaves us impoverished of the joys of serendipity. As Donald Rumsfeld might have said, the world is full of unknown unknowns, and we mustn't allow our technological infrastructure to deprive us of the opportunity to get to know a few.

Do you think involuntary personalization of content is a dangerous thing? Or are content providers just giving us what they think we want? Looking forward to hearing from you on Twitter or in the comments. I'll read your comment even if you disagree with me -- provided you can get past the filter.

10 comments about "The Danger Of Personalization: A Lesson From TED".
  1. Andrew Walmsley from Various, March 11, 2011 at 11:26 a.m.

    http://walmsleysdigitalview.blogspot.com/2011/03/death-of-serendipity.html

    They say the Queen thinks the world smells of fresh paint, and there's the national anthem playing everywhere.

    The internet is in danger of painting us into an intellectual corner - here's a link to a piece I wrote in 2009 on exactly this topic.

  2. Jill Berardi from SnapRetail, March 11, 2011 at 11:28 a.m.

    I agree with you that the relevance of content consumption is all about the INTENT of my need at that given time - not who I am as a person. For example, say I want an alternative view from my usual persona? I want the ability to put on that hat and demand the appropriate content. Personalization can go too far when mixed with predictive modeling. A content provider can't possibly know what I am interested in at any given moment in time! However, personalization does have a role in today's advertising. Amazon's "you may like these books too" hints are helpful and compelling - but not because they guessed my current need. They're helpful because they narrowed my search among Amazon's millions of book titles - something that would be daunting for me to do myself.

  3. Roy Perry from Greater Media Philadelphia, March 11, 2011 at 11:40 a.m.

    Can the genie go back into the bottle? Guessing NOT. Thanks to fast search technology it's been possible for quite a while to avoid exploring, analyzing, internalizing, remembering or learning any information at all...even where to get a cup of coffee! Skill with gadgets is the only skill you need to get by; ask a teacher how this changes his or her daily world. Now it's also possible to avoid experiencing, learning, reacting to or even hearing about things you've made clear are of no interest to you. Good thing? Bad thing? Probably somewhere in between...

  4. Digital Marketer from .., March 11, 2011 at 1:42 p.m.

    excellent, insightful point.

    i have stopped visiting facebook since i noticed it was selectively displaying posts based on the popularity of the poster (i was never a heavy FB user anyway).

    what you're also saying by implication is that this approach to content entrenches the status quo, handicapping new ideas and new voices by favoring the ones who *used to be* popular over the ones who might someday be popular if the playing field remained level.

    is it a dangerous thing? it surely is for the content publishers who dare assert they know what i want, and get it wrong.

  5. Bruce May from Bizperity, March 11, 2011 at 2:46 p.m.

    This came up recently in the context of search engines, and I had the same reaction then as I do now. Don't put up information barriers between me and the world. If these features are presented as options, then great. I can use them when I want and ignore them or turn them off as I like. But when they work in the background without me knowing it, then I have a big problem with this.

  6. Judy Margolis from Independent, March 11, 2011 at 4:34 p.m.

    Agree wholeheartedly. It's taken close to 20 years to usher in the likes of Zite, etc., despite all the blue-skying back in the earliest Internet days about personalized newspapers delivered over your digital assistant. News junkies were not then and are not now the market for this service, and the majority of folks, most of whom aren't heavy news consumers anyway, are the least likely to exploit this functionality. So where's the market, save in academe and business?

  7. Tanya Thomas from TechNotate, March 12, 2011 at 7:57 a.m.

    "The Danger of Personalization"... this has always been my complaint re: personalized search results and even localized listings -- I do not want to have those I am familiar with returned to me... I want to discover what I have never been introduced to, am unfamiliar with, and could not possibly describe in a phrase to query via search. As the commercial goes,
    'You don't know what you don't know'.
    And even in respect to advertising, those are the results I want to have delivered to me. I don't care to have a machine match-make me in any form or fashion.

  8. Larry Allen from www.kikin.com, March 15, 2011 at 10:32 a.m.

    Giving the consumer intent-driven content that is filtered by their favorites is a great thing, but you cannot ignore the broader, deadly relevant results. kikin is delivering personalized results for content, commerce and social across all services. Give it a try, http://www.kikin.com

  9. Paula Lynn from Who Else Unlimited, April 9, 2011 at 11:05 a.m.

    We are forging a very narrow-scoped, perhaps boring and definitely mis- or under-educated generation (especially those home-schooled, under-socialized kids without trained teachers). We are all sorry now that we didn't pay enough attention or learn something we could use now. Wait until nobody even knows that. As I said before, the price we pay for all of this unsocialization and disfocusing in the cacophony will be so high we will have to bring back newspapers and mandatory school attendance with a mandatory variety of subjects just to catch up with 3rd world countries, or at least learn they exist.

  10. R.J. Lewis from e-Healthcare Solutions, LLC, July 1, 2011 at 12:46 p.m.

    Aren't we moving in the OPPOSITE direction? Historically, this is what "media" was... a channel by which a powerful few (editors) controlled what the masses read, listened to, or saw. The Internet was a great disruptor of this model by empowering anyone to become a content creator. We've democratized content and created niche verticals for every possible audience interest or fetish. At some point, and I'd argue we are already there, we will need tools (hopefully USER controlled, not Facebook/Google controlled) that help us better sort and categorize the data based on our interests. If Google and Facebook go the way of the traditional editor and try to "control" that flow, then the free market will come up with better solutions that empower the user... filtering is good, as long as we are in control of our own filters. This goes to the very nature of the way we think, by the way... it's important to remember that everything we think is already going through the "filters" we've established in our brain - and it's equally important to remember that we have control of those filters, and sometimes they get outdated and need to change. Our great "debates" on things like privacy are largely driven by the fact that we aren't all using the same filters in our brains and some have "evolved" (I'm not saying that evolution is good or bad... just different from what it was historically), and thus we have debate.
