Forgive me for harping on the same subject three weeks in a row, but it's time to talk about Facebook, and the sharing of personal information, yet again. Why? Because I'm growing increasingly
convinced that social media, as an industry, needs to step up to the plate and make it clear to consumers how their data is being used. Or else.
I'll preface the rest of this column with
some stats. Jeff Fox, Consumer Reports' technology editor, rattled off the following data on Facebook behavior during a panel (full stream here) the publication sponsored today, called "Social Insecurity: Risky Consumer Behavior During the Era of Social Networks":
· 56% of people have posted what Consumer Reports said was "risky" information.
· 42% have posted their full birth date,
including the year (actually, I'm surprised at how many of my Facebook friends who also work in social media do that).
· 7% have posted their street address.
· 26% have posted photos of their children, including their names.
· Roughly a quarter of those surveyed don't know about
Facebook's privacy controls, even after Facebook did a campaign recently to publicize how to use them.
But those stats aren't even the alarming part. To me, the bigger problem is that the
industry is being so lax in educating people about what information is dangerous to post, what isn't -- and, more importantly, how their information is being shared and disseminated.
The
reason we don't see more action from the industry is obvious, but rooted in cynicism: the theory seems to be that if we tell people more about how their information is being used, they'll start holding
back information. Then there will be less data out there, and less for those hoping to make money from social media to profit from. Maybe
telling consumers more about how their data is used will have that effect, at least initially, but that's the price the industry has to pay for long-term growth. Being deceitful isn't the answer.
Instead, I'm advocating that the industry take the long view by remembering what social media was supposed to be about in the first place: transparency. It's time to start an industry-wide
initiative to tell consumers, in language they can understand, how and where their data is being used, to make it easy for them to opt out, to tell them the pros and cons of doing so, and then move on
to continuing the innovation that makes the industry so promising and, at many times, beneficial. You'll lose some users in the process, but you'll also garner the trust social media needs to continue
to flourish.
It's true that, two weeks later, the central irony of Facebook's Open Graph announcement still hasn't escaped me: the 800-lb. gorilla of social media isn't living
by the very rules of openness and honesty on which so much social marketing is premised. (Or, it thinks it's being open and honest, but is so far down the track of technological complexity that it's
forgotten how to describe what it does in plain English.)
In his presentation today, Fox briefly walked the audience through the steps Facebook users would have to take to keep friends
from divulging information about them more publicly. Of the complex process, he said: "[It's] an interface that's so confounding and filled with hurdles that it's almost intended to discourage
people from using the controls that are out there."
The Electronic Frontier Foundation has a shorter name for it: the "evil interface" -- which, I should point out, applies to any
interface that fails the simplicity test or outright tricks users when it comes to "sharing more info about themselves than they really want to." Lee Tien, an EFF senior staff attorney who also
appeared on the panel today, explained that the problem is not just one of "evil interfaces," but also of jargon. The EFF has gone so far as to create a Facebook-to-English translator, explaining concepts like the "social plug-in" in language that most of us can
understand.
Good for it, but bad for social media. While Facebook's Open Graph announcement put a fine point on problems that have been gathering steam for some time, this problem is not
just Facebook's. As you'll note from some of what I've reported above, too much of the time it's outside parties who are taking the bull by the horns and giving guidance to consumers. If the entities
to whom you're giving lots of personal data need a platoon of watchdogs to police what they're doing, that's not good -- the signal it sends to consumers is anything but reassuring. It's time for the
industry to act.
(Editor's note: The agenda for June 17th's OMMA Social NY is shaping up. Take a look here.)