“Future developments will, however, not just depend on technology, but also, most importantly, on people being comfortable with interactive technology.”
The quote comes from Kaveh Memari, CEO of Renew Orb, who posted it on the digital out-of-home media company’s website Monday after it discontinued a test in London that served personally targeted ads on digital trash bins that could detect and identify individuals based on the MAC addresses of their smartphones. The test generated a flurry of coverage in the U.K., but nary a word in the U.S. trade press. Maybe it’s because we didn’t get wind of it, or maybe it’s because we’re so inured to the real-world possibilities of targeting people based on their personal data that we just don’t seem to care anymore. But Memari, who no doubt wants to build a business around making people feel comfortable with such technology, is right. It ultimately will depend on people -- business people, for sure, but regulators and especially consumers too. And the big question is: are we ready for “Minority Report?”
Maybe, or maybe not. The Philip K. Dick story and its movie adaptation are set in a science-fiction future of 2054, but the reality is that many of the technologies, including the ability to serve ads to individuals based on identifying their physical presence, are here today. Sure, the film utilized a person’s own biometrics -- the unique code of their retinas -- to detect, identify and serve ads to them. I don’t know for sure, but I wouldn’t be surprised if that were technically feasible right now. The bigger questions are why, when and for what purposes we would deploy it. Interestingly, the only time I ever had my retinas scanned -- as far as I know -- was when I was visiting an ad agency. It was right after the 9/11 attacks, and building security in Omnicom’s old offices on Madison Avenue had installed retinal scanners as a security measure. I have no idea what happened to my retinal data -- whether it exists in some database somewhere, or whether it will ever be used to identify me again -- but I think it’s fitting that it had a Madison Avenue connection.
The first time I was aware of someone using my MAC address to identify me personally was during a visit to Interpublic’s lab near Herald Square a couple of years ago. At one point, while the rest of the MediaPost edit team and I paused by a large screen with dots on it, the lab’s curator said, “Those dots are you.” He then explained that the technology was able to identify when we -- or the MAC addresses on the smartphones in our pockets, as a proxy for us -- entered and left the facility.
Interpublic said it was only researching that technology and had no immediate plans to deploy it, but it has already begun deploying another technology, developed by the lab and Interpublic’s Shopper Sciences unit, that could identify some personal characteristics of people looking at products on a store shelf. The technology utilized a method originally developed to help teach autistic people to detect emotions from facial expressions, and it was capable of telling Interpublic agencies and clients the emotional state of people looking at their brands. I have no idea how widely that technology has been deployed, or whether it has advanced much beyond that point. But the real point is that if you can imagine a technological application, you can pretty much figure out a way of developing and deploying it. And we’re a long way from 2054.
The real question is the one raised by Renew Orb’s Memari: what is our comfort zone?
Personally, I don’t believe that zone is fixed. If you ask me, it changes all the time. I thought nothing of letting my retinas be scanned and databased when entering a building after 9/11. I feel uncomfortable with, but accept, being technologically undressed by an airport scanner. But do I want to have ads -- and other things -- targeted at me because of those data? Well, I’m not sure I feel comfortable with that. And judging by the U.K. press’ reaction to Renew Orb’s test, neither are Londoners.

But want a tip on how to make people feel comfortable with it -- regardless of whether it’s their MAC address, their retinas, their facial codes, or some digital cookie or other code tracking and identifying the who, when, where and why of who they are? The answer is easy: simply give them control over it. And by that, I don’t mean the cryptic way the Digital Advertising Alliance, or any of its subcontractors, does it: with a series of convoluted opt-outs that often leave people wondering whether they left more of their data behind than would have been compromised in the first place. Those things are still confusing and just not transparent enough. If you really want to make people feel comfortable sharing their identity, give them real control -- the ability to turn it on or off, and to share it with whom they want, when they want. Just like they do in the real world.