Martin Kocher and his colleagues at LMU Munich set up a study in which participants watched a video of a single die roll and then reported the number that came up, with a payoff that depended on what they reported. The researchers tested both individuals and small groups whose members could chat anonymously with one another before reporting. The result: “Our findings are unequivocal: People are less likely to lie if they decide on their own.” Even individuals who had answered honestly on their own started lying once they were in a group.
The researchers called this a “dishonesty shift.” They blame it on the shifting weight placed on the norm of honesty. Norms are the patterns that guide our behaviors and beliefs. But those norms may be different when we act individually than when we’re part of a group.
“Feedback is the decisive factor. Group-based decision-making involves an exchange of views that may alter the relative weight assigned to the relevant norm,” according to the study.
Let’s look at how this factor may play out. Individually, we may default to honesty. We do so because we’re unsure of the consequences of not being honest. But when we get in a group, we start talking to others and it’s easier to rationalize not being honest: “Well, if everyone’s going to lie, I might as well too.”
Why is this important? Because marketing is done in groups, by groups, to groups. The dynamics of group-based ethics could help to explain the most egregious breaches of ethics we see becoming more and more commonplace, either in corporations or in governments.
Four seminal studies in psychology and sociology shed further light on why groups tend to shift towards dishonesty. Let’s look at them individually.
In 1955, Solomon Asch showed that even if we individually believe something to be incorrect, when enough people around us voice a different opinion, we’ll go with the group consensus rather than risk being the odd person out. In his famous study, he surrounded a subject with “plants” who, when shown cards with three black lines of obviously differing lengths on them, would insist that the three lines were equal. The subjects were then asked their opinion. In 75% of the cases, they’d go with the group rather than risk disagreement. As Asch said in his paper, quoting sociologist Gabriel Tarde, “Social man is a somnambulist.” We have about as much independent will as the average sleepwalker.
Then there's Stanley Milgram’s Obedience to Authority study, perhaps the most controversial and frightening of the group. When confronted by someone with an authoritative demeanor, a white coat and a clipboard, 63% of the subjects meekly followed directions and delivered what were supposed to be lethal levels of electrical shock to a hapless individual.
These results were so disheartening that we’ve been trying to debunk them ever since. But a follow-up study by Stanford psychology professor Philip Zimbardo -- where subjects were arbitrarily assigned roles as guards and inmates in a mock prison scenario -- was even more shocking.
We’re more likely to become monsters and abandon our personal ethics when we’re in a group than when we act alone. Whether it’s obedience to authority -- as Milgram was trying to prove -- or whether it’s social conformity taken to the extreme, we tend to do very bad things when we’re in bad company.
But how do we slip so far so quickly from our own personal ethical baseline? Here’s where the last study I’ll cite can shed a little light. Sociologist Mark Granovetter -- famous for his Strength of Weak Ties study -- also looked at the viral spreading of behaviors in groups.
I’ve talked about this in a previous column, but here’s the short version: If we have the choice between two options, each with accompanying social consequences, which option we choose may be driven by social conformity. If we see enough other people around us picking the more disruptive option (for example, starting a riot), we may follow suit. Even though we all have different thresholds -- which we do -- the nature of a crowd is such that those with the lowest thresholds will pick the disruptive option first, setting off a bandwagon that eventually tips the entire group past its thresholds.
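Granovetter’s threshold dynamic is simple enough to simulate. Here’s a minimal sketch (my own illustration, not code from his study): each agent joins the disruption once the number of people already participating meets or exceeds that agent’s personal threshold, and participation is recounted until it stops growing. The classic example shows how fragile the cascade is -- removing one low-threshold agent can stall the whole riot.

```python
def cascade(thresholds):
    """Return the final number of participants, given each agent's threshold."""
    active = 0
    while True:
        # An agent participates once current participation reaches their threshold.
        new_active = sum(1 for t in thresholds if t <= active)
        if new_active == active:   # no one new tipped over; equilibrium reached
            return active
        active = new_active

# Thresholds 0, 1, 2, ..., 99: the threshold-0 instigator acts first,
# which tips the threshold-1 agent, and so on until all 100 riot.
print(cascade(list(range(100))))           # 100

# Remove only the threshold-1 agent and the cascade stalls at one person.
print(cascade([0] + list(range(2, 100))))  # 1
```

The two runs differ by a single agent, yet one ends in a full riot and the other in a lone instigator -- which is why crowd outcomes look so unpredictable from the outside.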
These variables were all studied in isolation, because that’s how science works. But it’s when the factors combine that we get the complexity that typifies the real world -- and the real marketplace. And that’s where predictability goes out the window. The group dynamics in play can create behavioral patterns that make no sense to the average person with the average degree of morality. But it’s happened before, it’s happening now, and it’s sure to happen again.