The simple -- and most plausible -- answer is that we’re really not being given a choice.
As MediaPost Editor in Chief Joe Mandese noted in a very on-point comment, what's being created is a transactional marketplace where offers of value are exchanged for information: “Like any marketplace, you have to have your information represented in it to participate. If you're not 'listed,' you cannot receive bids (offers of value) based on who you are.”
Amazon is perhaps the most relevant example of this. Take Alexa and Amazon Web Services (AWS). Alexa promises to “make your life easier and more fun.” But this comes at a price. Because Alexa is voice-activated, it’s always listening -- which means the privacy of anything we say in our homes has been ceded to Amazon through its terms of service. The same is true for Google's Assistant and Apple's Siri.
But Amazon is pushing the privacy envelope even further as the company tests its new in-home delivery service, Amazon Key. In exchange for the convenience of having your parcels delivered inside your home when you’re away, you literally give Amazon the keys to your home. Your front door will have a smart door lock that can be opened via the remote servers of AWS.
Opt in, and suddenly you’ve given Amazon the right not only to listen to everything you say in your home, but also to enter your home whenever it wants.
How do you feel about that?
The key question then becomes: How do we feel about the convenience/privacy exchange?
It turns out our answer depends in large part on how that question is framed. In a 2015 study conducted by the Annenberg School for Communication at the University of Pennsylvania, researchers probed respondents' sensitivity around the trading of privacy for convenience. Here's a sampling of the results:
Here, along the spectrum of privacy pushback, we start to see what the real problem is. We’re willing to exchange private information, as long as we’re aware of all that's happening and feel in control of it.
But that concept is unrealistic, of course. We can’t control everything. And even if we could, we’d soon learn that the overhead required to do so is unmanageable. It’s why Vint Cerf said we’re going to have to learn to live with transparency.
Again, as Mandese points out, we’re really not being given a choice. Participating in the modern economy requires us to ante up personal information. If we choose to remain totally private, we cut ourselves off from a huge portion of what’s available.
And we're already at the point where the vast majority of us really can’t opt out.
We all get pissed off when we hear of a security breach à la the recent Equifax debacle. Our privacy sensitivities are heightened for a day or two, and we give lip service to outrage.
But unless we go full-out Old Order Amish, what are our choices?
We may rationalize the tradeoff by saying the private information we’re exchanging for services is not really that sensitive. But that’s where the potential threat of Big Data comes in. Gather enough seemingly innocent data, and soon companies can start predicting with startling accuracy the aspects of our lives we're sensitive about. We run headlong into the Target Pregnant Teen dilemma.
And that particular dilemma becomes thornier as the walls break down between data siloes, and our personal information becomes a commodity on an open market.
The potential risk of trading away our privacy escalates over time. It’s the frog-in-slowly-boiling-water syndrome: it starts innocently, but it ends in a scenario that would keep almost anyone up at night in a paranoiac cold sweat.
Let’s say the data is used for targeting, singling us out of the crowd for the purpose of selling stuff to us. Or, in the case of governments, seeing if we have a proclivity for terrorism. Perhaps that isn’t so scary if Big Brother is benevolent and looking out for our best interests. But what if Big Brother becomes a bully?
There's another important factor to consider here, one that may have dire unintended consequences. When our personal data is used to make our world more convenient for us, some type of algorithm has to “filter” that world, removing whatever it determines to be irrelevant or uninteresting to us.
Essentially, the entire physical world is “targeted” to us. And this can go horribly wrong, as we saw in the last presidential election. Increasingly we live in a filtered “bubble” determined by things beyond our control. Our views get trapped in an echo chamber and our perspective narrows.
But perhaps the biggest red flag is the fact that in signing away our privacy by clicking "accept," we often also sign away any potential protection when things do go wrong.
In one widely cited 2016 study, researchers invited participants to join a fictitious social network -- and nearly all of them accepted its terms of service without actually reading them. What almost no one caught were “gotcha clauses” about sharing data with the NSA and giving up your first-born child. While those clauses were fictitious, real terms of service and privacy notifications often include ones granting total control over the information gathered about you -- including giving up your right to sue if anything goes bad.
And even if you could sue, there might not be anyone left to sue. One analyst calculated that even if all the people who had their financial information stolen from Equifax -- roughly 145 million of them -- won a settlement, the payout would amount to only about $81 each.