Top 10 Privacy Lessons Mobile And TV Can Learn From The Web

While many Web advertising companies have individually done a good job of offering consumers transparency and choice, collectively the Web industry has not done a great job managing privacy protection issues. Every week, there are headlines in the trade or consumer press about some creepy Web tracking or data practice recently uncovered -- whether it's Facebook apps leaking user IDs, or companies scraping personal comments on health forums, or releasing "anonymous" search data that turned out not to be so anonymous.

Of course, the ability to deliver precision-targeted ads is one great promise of the Web -- and is a critical driver of the tens of billions of ad and marketing dollars funding it and the extraordinary, robust array of content and services it delivers to billions around the world for free. The Internet is learning its lessons the hard way: making mistakes, taking heat from the press, regulators and the public, and then retroactively fixing those mistakes.

Today, both the mobile and TV industries are just starting to embark on targeted advertising. Both could learn some valuable privacy lessons from the Web industry's experiences. Here are my top 10:

Take Privacy Protection Seriously. The protection of personal privacy cannot be ignored when companies use digital media and communication channels to capture user information or deliver tailored ads. It doesn't matter that it's happening behind the scenes. It doesn't matter that the practices are no different from those direct marketers have used for years. Digital is different. People care. What happens in offline data collection doesn't scare them as much as online data collection does.

Bake Privacy Protection in from the Beginning. I've borrowed this idea from Federal Trade Commission chair Jon Leibowitz: Make privacy protection an integral part of all targeted mobile and TV ad offerings. It's much easier than retrofitting privacy protection later.

Embrace Self-Regulation Early. The only way to prevent new legislation or regulation -- which is never a great way to solve a problem -- is to be proactive about self-regulation. The mobile and TV industries should leverage the work of the IAB, AAAA, ANA and DMA and their pioneering self-regulatory frameworks, as mobile and TV companies go down similar paths.

No Creepy. What matters most is not just avoiding what's illegal, but avoiding what's creepy. You can't have a productive and long-term advertising relationship without trust. If it seems creepy, don't do it. If nothing creeps you out, ask your mother or neighbor or child for a reaction; they are probably better thermometers. Deep-packet inspection is creepy. It doesn't matter how many Big 4 auditors say that it's legal. Enough said.

No Personal Data. You don't need data that can be related to particular individuals to deliver dramatically better ads. Also, just because the recipient is anonymous to you, your targeting might not seem anonymous to them. Billions of dollars of TV ad inventory are bought each year on not much more than demographic projections, none of it remotely personal.

Use Broad, Anonymous Segmentation. Rather than trying to thread the needle between personal and anonymous, use broad, anonymous consumer segmentations instead. You don't need one-to-one targeting to deliver ads that perform hundreds of percentage points better than the "spray and pray" method used in mass media advertising today. Mass customization against broad segmentations can do that just fine.

Limit Individual Appended Data to First-Party Uses. Lots of folks want to append third-party data to media channel behaviors at the individual level and then sell that for usage on many other third-party media and marketing channels. Once this happens, it's hard for that data to be protected. We just heard from Yahoo that they don't even know what data is being captured on their own sites. Let's not let that happen to mobile or television.

Lots of Notice. Be straight with your users. Do it early and often. I call this The Walt Mossberg Rule, since The Wall Street Journal columnist has been preaching this for years. Listen to Walt. This should be self-evident.

Don't Keep User Data Long. I know. I know. IT departments want to keep data forever. They like data. They don't like destroying it. I bet most of them are hoarders at home as well. Don't treat user data the same. Most of it does not have a long shelf life, and keeping it makes consumers uncomfortable. Similar to giving notice, destroy this data early and often.

Get to Know your Regulators and Legislators. The first time you meet officials from the FTC or Federal Communications Commission or House Consumer Protection Committee should not be when you are under investigation. They have a very good perspective on what is good and what is bad when it comes to protecting privacy. Get to know them. Tell them what you want to do. Listen to them. Follow their advice.

I am very excited about the prospect of robust new marketing services emerging on the mobile and television platforms. I am hopeful that companies building these services won't repeat the mistakes we saw on the Web. What do you think of my Top 10 Lessons? Which ones would you add?

5 comments about "Top 10 Privacy Lessons Mobile And TV Can Learn From The Web".
  1. Jerry Gibbons from A-Team Advertising Advisors, October 21, 2010 at 12:53 p.m.

    David - Really good advice. And well said. Too bad the Web did not learn these lessons earlier.
    Jerry

  2. Bennie Smith from yahoo, October 21, 2010 at 2:34 p.m.

    Dave - overall these are good starting points for discussion and evaluation, thanks.

    I do wish, however, that we (industry) stopped using the term "creepy" to describe a super-subjective, constantly shifting standard. If privacy professionals employed by companies engaged in the online ad space want their advice taken seriously by executive leadership and/or the business team, we're going to have to use more objective standards in determining where that company's risk tolerance breakpoint is.
    Saying to a biz dev person that data can't be used in a certain way or to a product mgr that a certain functionality can't be deployed because it's "creepy" isn't really going to be helpful or position that person as a resource for sound, meaningful advice or guidance.

  3. Dave Morgan from Simulmedia, October 21, 2010 at 3:37 p.m.

    Bennie, I know that it's tough to have a subjective filter like "creepy" control our world, but users' reactions are many times quite subjective and change from time to time. Therefore, with businesses that have so much public policy impact, you have to evaluate practices regularly and can't be afraid to hold yourselves to relatively subjective standards. It's not easy, but it's the way it is, I think. It is why I believe that our industry needs liberal arts graduates as much as we need software engineers. They are used to dealing with issues that are many shades of grey, not just black and white.

  4. Paula Lynn from Who Else Unlimited, October 21, 2010 at 4:32 p.m.

    Dave, just because you are a terrific person, smart and moral, doesn't mean everyone is. It only takes a few, if not just one, irresponsible marketer (I'm being nice.) to go past even the creepy stages and invoke a chaotic, personal-info brouhaha that breaches privacy -- traumatic information sent to the wrong person, or wrong information that traumatizes the sender and receiver. That action may be picked up or tracked until, well, until "Winston tastes good like a cigarette should" becomes true and alienates many citizens. Nothing is free. Freedom is not free. Self-regulation will not work 100% for 100% of the population. E.g., it took a constitutional amendment for women's right to vote - not for some women, but all women.

  5. Bennie Smith from yahoo, October 22, 2010 at 3:36 p.m.

    Dave - we may have to agree to disagree (respectfully, of course), but I'm not sure how a subjective standard can be used in any meaningful way in terms of product functionality, data collection and/or usage -- especially when the "creepy" factor can't necessarily be equally applied across a customer base. What my mom perceives as creepy is likely very different from how I would define it, but we are both consumers of social media, for example.

    For me this is particularly true when we look at the gap between what consumers say and what consumers do.
    I think perhaps the better "lesson" is for a company to have a clear understanding of its story -- why you do what you do and how it supports the value exchange. I think most companies get into trouble when they can't answer those questions and are left looking as if they are sloppy/negligent (at best) or simply deceptive (at worst).