"We've heard your feedback loud and clear, and since we launched Google Buzz four days ago, we've been working around the clock to address the concerns you've raised. Today, we wanted to let you know about a number of changes we'll be making over the next few days based on all the feedback we've received.... We're very sorry for the concern we've caused and have been working hard ever since to improve things based on your feedback. We'll continue to do so."
Watching these events unfold, it struck me as ironic that the company known and revered both for its analytical superpowers and its "don't be evil" credo failed on both counts. And, in doing so, jeopardized its stellar brand.
Analytical Power Failure
Let's face it: we've all been smitten by Google's algorithms that deliver lightning-fast, relevant searches. And it's able to do so because it collects a ridiculous amount of data: it knows what sites you visit, when and how frequently, what ads you click on, what products you buy and so on. It gathers up all those bits and bytes, crunches them, and synthesizes them into a coherent story. If you use its products -- Gmail, Gchat, Chrome, search, Voice, etc. -- it knows even more.
Google leveraged some of that data to create its automated circle of friends for new Buzz users, which irritated privacy advocates. But, interestingly, it only scratched the surface of its knowledge. It didn't appear to leverage data that would have told it, for instance, that the people I most frequently email don't participate in social networks (hence, the email). Or that I don't allow social networks to go through my Gmail contacts to find friends. Data that could have created a more customized -- and safer -- environment for Buzz users.
Beyond its own data, Google's modelers could also have leveraged market research that would have made Buzz a better product from the start. Data, for instance, showing that women are more privacy-sensitive than men; that they equate privacy with safety. It's why they want control over whom they connect with online. Or data showing that consumers segment their real-life connections into different social networks, protecting their personal lives from professional scrutiny.
"Don't be Evil" Failure
Google advises its employees in its code of conduct preface:
'Don't be evil.' Googlers generally apply those words to how we serve our users. But 'Don't be evil' is much more than that. Yes, it's about providing our users unbiased access to information, focusing on their needs and giving them the best products and services that we can. But it's also about doing the right thing more generally -- following the law, acting honorably and treating each other with respect.
The code then outlines in detail what it means to not be evil. Its list specifically singles out protecting users' privacy and preserving their trust. And yet, the team that developed and tested Buzz seems to have forgotten this code. While using the data to provide a customized experience was not necessarily a violation of the code, forcing it onto users without the ability to opt in, and displaying their personal information publicly, most certainly was.
But, despite these initial failings, Google's response to the crisis has been truer to its brand than its initial product launch. And that response -- rapid and authentic -- appears to be redeeming the brand. It is as if the Google team took a page from Johnson & Johnson's handling of the 1982 Tylenol tampering crisis. Then, as now, a major, well-respected brand survived a crisis by staying true to its brand and responding rapidly and with authenticity.
Faced with a similar crisis, what would your brand have done?