Passions, Reason & Moral Hypocrisy
Most of us think we are morally sound. If we see an injustice, we’ll step in; if we’re given the opportunity to cheat, we won’t. Or so we say. Psychological research demonstrates that in certain situations we tend to twist our reasoning to position ourselves as morally superior to others even when we have acted otherwise.
In one experiment conducted by David DeSteno and Piercarlo Valdesolo, participants were told that they would be performing one of two tasks: the first was short and fun, while the second was long and hard. To induce a small yet significant (and later very revealing) moral dilemma, DeSteno and Valdesolo let half of the participants decide which task they would perform, knowing that the other task would be allocated to another participant. (They also had the option of letting a computer randomly choose how the tasks would be distributed.) After they finished assigning the tasks, these participants were asked to rate how fairly they had acted. Meanwhile, the participants on the receiving end of the task allocation were asked to rate how fair the allocating participants had been. It doesn’t take a lot of foresight to see where this is going.
The first thing DeSteno and Valdesolo found was in line with their previous research: only about 8 percent of participants acted altruistically – what an objective set of eyes would call “fair.” Not a great start, and it gets worse. The second thing they found was that “moral hypocrisy emerged in the control conditions; the same fairness transgression was judged to be substantially more moral when enacted by the self than when enacted by another.” In other words, participants who were in charge of allocating the tasks usually believed that they had decided fairly no matter what their decision was. In sharp contrast, the participants who had no say in the process believed that the allocating participants had been unfair. The lesson here is that we are all “moral hypocrites”: we claim to be morally sound, and when we’re not, we rationalize to improve our moral stature in the eyes of others and of ourselves. Again, not a big surprise.
What DeSteno and Valdesolo were really after was a better understanding of the dual-process model of moral judgment, which treats our moral judgments as products of both our intuitive and deliberate capacities. When it comes to assessing moral situations, we have a gut reaction immediately followed by a more deliberate line of reasoning. For example, when someone asks us if killing an innocent person is wrong, we know right away that the answer is yes, but it usually takes a few moments to think of reasons why this is true. This is not to say that these two systems (System 1 and System 2, as they are referred to in the popular literature) are neurologically separate, but it is to suggest that they are not necessarily on the same page at all times. Understanding their relationship is key to understanding how humans arrive at moral judgments.
To tease out how these two systems handle moral judgments, DeSteno and Valdesolo incorporated a twist. They replicated the experiment, but the second time around half of the participants had to make their fairness judgments under cognitive load: they had to memorize a string of digits. (The idea is that the “rational” brain would be busy holding the digits in memory, leaving the “intuitive” brain to respond on its own.) They found that under cognitive load, which made deliberate reasoning very difficult, participants rated their own transgressions and those of others identically, showing no signs of “moral hypocrisy.”
DeSteno and Valdesolo conclude:
The present study provides strong evidence that moral hypocrisy is governed by a dual-process model of moral judgment wherein a prepotent negative reaction to the thought of a fairness transgression operates in tandem with higher order processes to mediate decision making. Hypocrisy readily emerged under normal processing conditions, but disappeared under conditions of cognitive constraint. Inhibiting control prevented a tamping down or override of the intuitive aversive response to the transgression. Of import, these findings rule out the possibility that hypocrisy derives from differences in automatic affective reactions towards one’s own and others’ transgressions. Rather, when contemplating one’s own transgression, motives of rationalization and justification temper the initial negative response and lead to more lenient judgments. Motivated reasoning processes are not engaged when judging others’ violations, rendering the prepotent negative response more causally powerful and leading to harsher judgments.
So Freud had it backwards. It is our intuition – not just our rationality – that seems to have the more objective reaction to moral situations. However, the work of understanding the relationship between the passions and reason is certainly not over. If anything, at least in the context of empirical research, it has just begun. From the ancient Greek philosophers to the philosophers of the 21st century, moral debates have almost always taken place in the abstract. Now there is plenty of promising science to be excited about. Are our moral judgments simply post-hoc justifications, the rational tail of the emotional dog? Or can our conscious deliberations inform, and perhaps control, our moral intuitions? We’ll see what the data says.