After 10 “likes,” Michal Kosinski knows you better than your work colleagues do. After 70, he knows you better than your partner does, including -- whether these things were explicitly referenced in your clicks or not -- your skin color, your sexual orientation, whether you’re a Democrat or a Republican, whether you smoke or do drugs… The list goes on.
Like some creepy alt-Santa, he knows how open you are. Whether you’re a perfectionist. Whether you’re considerate. Whether you’re neurotic.
Kosinski isn’t a CIA agent or a spy. He isn’t even a marketer. He’s a researcher at Stanford, and he’s worked out how to turn your clicks into psychographic profiles.
As detailed in a long and superb article on Motherboard, Kosinski’s technique is similar to the one the company Cambridge Analytica used to help the Brexit and Trump campaigns win.
Your clicks helped Cambridge Analytica develop a psychographic profile of every adult in America, and Facebook’s targeting capabilities helped them reach every adult in America with exactly the ad that would most affect them.
During a presentation at the Concordia Summit last September, Alexander Nix, Cambridge Analytica’s CEO, showed “how psychographically categorized voters can be differently addressed, based on the example of gun rights, the 2nd Amendment: ‘For a highly neurotic and conscientious audience the threat of a burglary—and the insurance policy of a gun.’ An image on the left shows the hand of an intruder smashing a window. The right side shows a man and a child standing in a field at sunset, both holding guns, clearly shooting ducks: ‘Conversely, for a closed and agreeable audience. People who care about tradition, and habits, and family.’"
Those seem like benign examples. But if you’re black, you might instead see an ad in which Hillary Clinton refers to black men as predators, or if you live in Little Haiti you might see one about “the failure of the Clinton Foundation following the earthquake in Haiti.”
Whether it’s marketing presidents or Pepsi, this technology can help -- and that should terrify you.
“But, Kaila,” I hear you say, “Advertising has always been semi-synonymous with manipulation. Why is this any different?”
In Isaac Asimov’s “Prelude to Foundation,” the robot R. Daneel Olivaw describes his reluctance to tamper with human emotions, even though he is eminently capable of doing so. “I try never to interfere except when I have no choice but to do so. And when I do interfere, it is rarely that I do more than strengthen, as little as I can, what is already there… Emotions, my dear Seldon, are a powerful engine of human action, far more powerful than human beings themselves realize, and you cannot know how much can be done with the merest touch, and how reluctant I am to do it.”
There is no clear boundary beyond which advertising becomes too manipulative. It’s not like we say, “It’s OK to manipulate people using clever copy and imagery, but it’s not OK to manipulate them using psychographic profiling and precision targeting.”
Perhaps there should be. The latter seems to be a far more powerful engine of human action than we realize, and it is precisely this power that makes it simultaneously irresistible to marketers and dangerous to society. Olivaw was programmed to show restraint; marketers are programmed for exactly the opposite.
Kosinski believes that having access to this information "could pose a threat to an individual's well-being, freedom, or even life." This might seem hyperbolic, and clearly he wasn’t only talking about advertising. But those ads are skillfully and intentionally crafted especially for you. Some of them are designed specifically and powerfully to activate your worst fears and your lowest self. And when they are effective, you will not respond as an independent, rational creature. You will dance. And they will fiddle, as society burns.