Search And The Moral Imperative

Large corporations continue to compile ever-larger databases of our personal information. China's irresistible economic pull causes organizations and governments to tactfully overlook moral objections. Facebook tells my friends if I buy a pair of shoes or a porn video [though the company just announced it is changing that policy to an opt-in rather than opt-out]. When it comes to search, do organizations have any moral obligations? And, if so, what are they?

This is, of course, a loaded question and a subjective debate. My Oxford defines "moral" as "concerned with goodness or badness of character or disposition or with the distinction between right and wrong; virtuous in general conduct."

Right away, we've got terms that could be argued endlessly, before we even begin to discuss whether companies should be moral. Who defines good or bad? Who says what's right and what's wrong? What constitutes virtue?

As search becomes more and more integrated into our daily lives, and correspondingly more powerful, these issues only become thornier. Yahoo gets called "moral pygmies" for handing over information on dissidents to the Chinese government. Google indexes sites reported to be scamming consumers.

The backlash against this behavior, though, has been minimal. It seems people object only to behavior that affects them directly. Facebook didn't give people enough time to opt out of its invasive word-of-automated-mouth advertising program, and the troops rallied: over 35,000 members in MoveOn's Facebook protest group by last weekend. That's more than 1,000 times the membership of the largest FB group protesting Yahoo's actions in China.

Jim Collins and Jerry Porras argued in Built to Last that companies with a strong purpose and values have, for more than a century, done significantly better over the long term, regardless of what that purpose and those values are. The authenticity of the ideology matters far more than its content.

On the basis of that tenet, it shouldn't matter to any search company whether people think it's "working altruistically for the good of humankind." The key is whether the company behaves consistently with a core ideology it can stand behind, passionately. Was Facebook passionate about defaulting users into sharing purchase information? Or was it just chasing the American Monetization Dream?

The other issue is the balance between power and responsibility -- and, no, I'm not going to quote Spider-Man here. Nevertheless, giving our personal information to any person, company or organization is an act of trust. The more information that entity amasses, the greater its obligation of stewardship.

In the book "The Search," John Battelle quotes Karl Schoenberger:

"Even for companies with the most noble of intentions, the unwritten laws of the free market do not provide a mechanism to reconcile the true cost of social responsibility with the fundamental need to be profitable... An organization's instinct to succeed prevails over any lofty principles it might espouse."

Schoenberger's cynical view may reflect consistent behavior, but it is most certainly not an immutable law of the universe. We're not talking about a living creature's survival instinct. An organization is made up of people -- people who make decisions every day about the values they're going to live by and the priorities and motivations that drive them.

He says there is no mechanism; I say there is a simple mechanism. We decide what is and is not acceptable to us, and we live by it. We the users, we the search companies, we the stock market. We as a collective decide our moral landscape, and each of us can choose to create a society in which we're proud to live.
