My reaction was one of visceral recoil, as was the reaction of everyone in my vicinity. The whole point of a conference like TED is to open your mind to new ideas -- ideas that, by definition, you didn't know you were interested in. At our TEDxChCh event last October, for example, I learned that estrogen-mimicking molecules can be fascinating. Who knew?
As it turned out, this was exactly how Chris was hoping we'd respond. The speaker was Eli Pariser, of MoveOn fame, and his talk was on the insidious encroachment of invisible filters on the Web content we see and consume. Consider, Eli said, his own Facebook feed. He's got a number of friends who are liberal, as you'd imagine, and a number of friends who are conservative. One day, though, he noticed that his conservative friends' comments weren't showing up in his News Feed anymore. Facebook, in its infinite wisdom, had determined that, because Eli is more likely to click on a link from a liberal friend than on one from a conservative one, it would be easier for everybody concerned to only show Eli links from liberal friends. Without consulting Eli, the algorithm quietly hid all the apparently unwanted messages.
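The logic Eli describes can be sketched in a few lines of Python. This is a deliberately naive illustration of engagement-based filtering, not Facebook's actual system; every name and number here is invented:

```python
# Hypothetical sketch of engagement-based feed filtering -- an
# illustration of the logic Eli describes, not Facebook's real
# algorithm: rank each friend's posts by how often you have clicked
# that friend's links, then silently drop everything below the cut.

def filter_feed(posts, click_counts, shown_counts, keep=3):
    """Keep only the posts from friends with the highest historical
    click-through rate. `posts` is a list of (friend, message) pairs."""
    def ctr(friend):
        shown = shown_counts.get(friend, 0)
        if shown == 0:
            return 0.0
        return click_counts.get(friend, 0) / shown

    ranked = sorted(posts, key=lambda p: ctr(p[0]), reverse=True)
    return ranked[:keep]

# Eli clicks his liberal friends' links far more often...
clicks = {"liberal_friend": 40, "conservative_friend": 2}
shown = {"liberal_friend": 50, "conservative_friend": 50}
feed = [("conservative_friend", "budget op-ed"),
        ("liberal_friend", "climate article"),
        ("liberal_friend", "healthcare piece")]

# ...so on a two-item feed, the conservative post quietly vanishes.
print(filter_feed(feed, clicks, shown, keep=2))
```

Note what never appears in the sketch: any signal about *why* the reader clicked, or any mechanism for the reader to see, let alone override, the cut.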
I get the value of personalization, I really do. I've used this very platform to expound on the incredible power of truly personal attention. But there is a big difference between personalization of advertising and personalization of content, and it comes down to intent of consumption.
Advertising, as a general rule, is pushed onto us: marketers decide what message they want to put out to the public, and consumers either respond or don't. Because we know that the sole purpose of advertising is to sell to us, we tend to discount its information -- heavily. (Incidentally, the word most closely associated with "advertising" is "false.") The covenant between advertiser and consumer changes very little whether the ad is personalized or not.
Content, on the other hand, is the stuff that makes us willing to put up with the advertising in the first place. We might be motivated to watch the TV show or read the magazine or browse the website for any number of reasons: because a) we want to feel good about ourselves, or b) we want to be entertained, or c) we want to broaden our horizons, or d) something else. But here's the thing: the Internet doesn't know our motivation for consuming content at any given time. And, as Eli pointed out, the algorithms being used to personalize content are so ham-handed and clumsy that they don't even consider the possibility of (c). Broaden horizons? That's so 1990s. In 2011, apparently, people only want to reconfirm what they already know.
You don't need to extrapolate too far to see the danger of this: how it polarizes our society yet further, how it precludes empathy and understanding, how it belittles us and diminishes us and robs us of the joys of serendipity. As Donald Rumsfeld might have said, the world is full of unknown unknowns, and we mustn't allow our technological infrastructure to deprive us of the opportunity to get to know a few.
Do you think involuntary personalization of content is a dangerous thing? Or are content providers just giving us what they think we want? Looking forward to hearing from you on Twitter or in the comments. I'll read your comment even if you disagree with me -- provided you can get past the filter.