Targeting, Discrimination And Envy

Most of the serious minds exploring the implications of behavioral tracking and consumer rights have long known that the real third rail of data-driven marketing and targeting is not privacy. Instead, discrimination is the area where this industry is most in danger of moving outside the lines of acceptable behavior, where legislators and citizens will quickly align even with the most rabid anti-commercial watchdog. When audiences can be segmented and targeted in ways that provide preferential or punitive pricing to some, or exclude some targets from opportunities, that is when data use catapults past any academic argument about whether cookies violate privacy.

“While big data is revolutionizing commerce and government for the better, it is also supercharging the potential for discrimination,” Wade Henderson, CEO of the Leadership Conference on Civil and Human Rights, told the AP in a report this weekend. The AP was previewing a report the White House plans to issue in the next week on the legislative and policy implications of big data.

Apparently, the study group, led by presidential counselor John Podesta, went into the project expecting to explore the parameters of privacy in the connected age. When meeting with various groups, Podesta and company were surprised to find an even thornier question waiting to surface: the parsing of data and application of predictive analytics in ways that put some groups at a disadvantage.

If big data shows, for instance, that employees living farther from their place of work are more likely to quit, how can an employer apply that presumption to hiring practices? To what degree can insurers tap into users’ online behaviors and then extrapolate risks? Will certain kinds of online cookie history disqualify a consumer from certain online offers, pricing, jobs, or more? As the AP article points out, the law prevents the use of some kinds of consumer data, like credit reports, for certain kinds of targeting. But the online world and big data are filled with algorithms and models based on proxies. We don’t need the prohibited data point to discriminate against a user, because ten or a thousand other data points can add up to the same profile.
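The proxy problem is easy to demonstrate. Here is a minimal, synthetic sketch (all field names and probabilities are invented for illustration, not drawn from any real system): a prohibited attribute is never touched directly, yet a crude count of loosely correlated "innocent" signals recovers it most of the time.

```python
# Hypothetical sketch: even if a prohibited field (say, a credit score tier)
# is excluded from a model, correlated proxy fields can reconstruct it.
# All data here is synthetic; the proxy names are invented for illustration.
import random

random.seed(42)

def make_user():
    # Hidden attribute the targeter is legally barred from using directly.
    low_credit = random.random() < 0.5
    # Innocuous-looking signals, each loosely correlated with it.
    proxies = {
        "prepaid_phone": random.random() < (0.7 if low_credit else 0.2),
        "payday_loan_ads_clicked": random.random() < (0.6 if low_credit else 0.1),
        "discount_store_visits": random.random() < (0.8 if low_credit else 0.3),
    }
    return low_credit, proxies

def proxy_score(proxies):
    # Naive score: just count how many proxy signals fire.
    return sum(proxies.values())

users = [make_user() for _ in range(10_000)]

# Label a user "low credit" whenever 2+ proxies fire, without ever
# reading the prohibited field itself.
correct = sum((proxy_score(p) >= 2) == low for low, p in users)
accuracy = correct / len(users)
print(f"proxy-only accuracy: {accuracy:.0%}")
```

Even this three-signal toy reconstructs the forbidden attribute far better than chance, which is the column's point: banning one data field does little when a thousand correlated fields remain.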

And how does the recipient, or non-recipient, of a digital offer even know whether she is being unfairly singled out? How do you know when the art of targeting involves your not seeing the “irrelevant” offer the other guy gets? This idea is nothing new, of course. As I said, anyone who has paid attention to the larger serious arguments about data and its social, political and ethical implications has been tracking this for a while.

An old friend of this column and of MediaPost events, Prof. Joseph Turow, wrote a book called "Niche Envy: Marketing Discrimination in the Digital Age" back in 2007. In it, Turow was especially prescient in spinning out the many layers a data-driven marketplace might form, in which consumers become aware of the data discrimination game and even game the system to get preferential offers. “They may be alarmed if they feel that certain marketers have mistaken their income bracket, their race, their gender, or their political views. They may ask themselves if the media content that friends or family members receive is better, more interesting, or more lucrative than theirs, and whether the others have lied about themselves to get better material. They may try to improve their profiles with advertisers and media firms by establishing buying patterns and lifestyle tracks that make them look good – or by taking actions to make others look bad.”

Does this sound like an unlikely data dystopia? Perhaps not, if the algorithms are determining the availability and price of critical services like healthcare, jobs, and credit. When the stakes get high and personal, people will respond in strong and self-interested ways. Politicians and regulators are always looking for signs of aggregated power and control when scrutinizing emerging platforms like big data. Turow’s seven-year-old warning reminds us that these new technologies also spin out along unintended and unanticipated paths, reshaping social relations and even self-definition.

1 comment about "Targeting, Discrimination And Envy".
  1. Robert Pettee from DigitalMouth Advertising, April 29, 2014 at 12:53 p.m.

    If my grocery store prints a coupon with my receipt because I bought over $200 worth of items, but doesn't give the coupon to the next guy who bought $30 worth - would you call that "discrimination"? I get the health care or employment points - despite the fact that those industries hinge on algorithms and data - but let's not get carried away.

    Seems like this is the modern equivalent of "I'd go to the grocery store, but then my neighbors might see me and think I'm fat..." and we all just need to get over ourselves. Data is here to stay as long as it's kept unidentifiable at a personal level, and there's not a single government body capable of reining in the use of data appropriately.
