Commentary

Report: Our Privacy Regimens Are Outdated

In a new study by the World Economic Forum and the Boston Consulting Group, researchers argue that the traditional models for acquiring users’ permission to use their data are no longer applicable or helpful in an age of big data and interconnected systems. “Unlocking the Value of Personal Data: From Collection to Usage” contends that our notions of data privacy protection are mired in 1970s computing architectures and use cases. When data lived in discrete silos and was collected for specific purposes, it was appropriate to ask the user a binary question at the point of collection: Can we use this data for this purpose?

But the modern enterprise’s use of data has evolved, and keeping permissions tied to any given use case is impossible or impractical. The sheer volume of data collected on an individual at any given point is also so comprehensive that it would be overwhelming and burdensome for the subject to be consulted at every point of collection, let alone every different use of the data.


As the WEF, whose interests lie in advancing business, is quick to point out, the social and economic benefits of wide data use are inarguable. Not surprisingly, the report reaches for humanitarian examples left and right, illustrating, for instance, how anonymized health data has helped improve hospital patient outcomes and lower costs.

The study’s authors contend that a new approach requires shifting from protecting privacy at the point of data collection to focusing on data usage. “Data itself does not create value or cause problems,” the authors put it pithily. “Its use does.”

Yeah, I can’t be the only one out there who hears the echoes of the NRA in that one.

Still, there are important points to be made here about how the context of data use is critical in assessing the risks of privacy violation. The authors suggest, for instance, that risk be associated with whether a particular use could personally affect a user. They call for a “flexible, dynamic system” in which all stakeholders agree to guiding principles and have both codes of conduct and technology to enforce them.

Exactly what shape this system takes is hard to say. The WEF and BCG point to two initiatives, in the U.K. and the U.S., where consumers have access to public data and can even manage their own data. In the U.K., an initiative is trying to assess how to give people access to the data collected about them in forms they can understand and then voluntarily pass on to third parties. A U.S. initiative gives consumers access to massive government-maintained data sets that can be of help to them. The idea here is that data isn’t just something collected from citizens but a resource that citizens themselves can leverage.

The authors argue sensibly that the range of data collection, the interconnectedness of systems and as-yet-undiscovered use cases make simple notice-and-consent options at the point of collection impractical. The practical and safe alternative, however, is still unclear. Whether and how organizations of any sort should be allowed to determine when they think the use of a person’s data will or will not directly affect that person is a very open question. Moving attention away from securing permissions at the point of collection only underscores the need for security of data throughout the system. Are we really anywhere near giving consumers assurance that bad actors can’t grab personally identifiable data regardless of the permissions granted to a more trusted party?

And the practices around informing and explaining data use to the end user remain mired in legalese aimed at maximizing some principle of “transparency” that is something altogether different from consumer “understanding.” That is a laudable distinction the authors make, I think. Disclosure, when complex and overwhelming, actually works against consumer control, because people just tick off boxes out of habit rather than understanding. The report provides some interesting examples of how Mozilla, PrivacyScore and others are experimenting with icon-based and simplified categorization schemes that communicate the nuances of data-sharing policy more clearly.

It is important to mix into these arguments the fact that consumers are no longer just data subjects. They also consume data, looking to apps, trending topics, user reviews and ratings, hospital outcome statistics and the like to make their own decisions. There is something to be said for engaging with consumers in a spirit of collaboration, in which the role of data and analytics in all of our lives helps explain how this information is used by third parties.

I am not doing the full report justice here. It raises some excellent issues, even if it does so within a framework that aims to argue for the widest business use of data. It is worth downloading in its entirety here. Still, this kind of rethinking of basic principles of data and privacy can cut both ways and help us imagine a very different, consumer-centric approach. What if the starting point were optimizing data usage for privacy and personal control? What if we started from the principle that private companies cannot use and share data in ways that can’t be easily and practically explained to users at the key usage points?

Why not put the onus on industry to erect structures of understanding as prerequisites to using a person’s data at all? The communications piece of privacy protection and control has always suffered from being the last aspect of the data infrastructure put into place, having to explain the complexity of practices that third parties built mainly to benefit their businesses. Putting the consumer first in building data practices would force companies to front-load into their programs immediate and clear benefits to their consumers, a value exchange at the point of collection rather than a check box.  

2 comments about "Report: Our Privacy Regimens Are Outdated".
  1. Ben Isaacson from Part-Time Privacy, March 4, 2013 at 2:28 p.m.

    While there will always be a front-end transparency and consent construct, there is a lot of merit in moving this debate to the back end and the intended data use. The best example is the FCRA law, where anyone dealing in eligibility decisioning (irrespective of the type of data collected, including social) has to create processes to best understand its proposed use to determine compliance. This is where privacy engineering is paramount, leaving aside the arcane decisions of whether to check or uncheck the box.

  2. Pete Austin from Fresh Relevance, March 5, 2013 at 9:31 a.m.

    Re: “Data itself does not create value or cause problems,” the authors put it pithily. “Its use does”. Utter nonsense: the mere existence of collections of personal data *is* a problem in itself, because any sufficiently valuable data will eventually be stolen and misused.
