Wired for Hypocrisy
Why it's so easy to justify our bad behavior.
It's not only the rich and famous who have made hypocrisy a fine art, though when advocates for the poor such as John Edwards build gargantuan homes, or family-values preachers such as Ted Haggard fess up to "sexual immorality", they sure make it seem so. But garden-variety hypocrisy is so rife—the "environmentalist" who bought a Hummer because, come to think of it, driving lots of kids to soccer practice would use much less gasoline than if every parent made the trip; the "humanitarian" who turns down the charitable appeal because, on second thought, it's much better for the poor to learn to fend for themselves—it seems as if the brain must have a special circuit for it.
That's pretty much the case, according to new research. Since actions cannot be undone, the only option when they conflict with beliefs—which produces the phenomenon called cognitive dissonance—is to alter the beliefs. When people experience cognitive dissonance, it turns out, brain activity causes them to back and fill, mentally. The result is that they change their beliefs so that those beliefs are once again aligned with their actions. Although well known in psychology—the idea that people change their attitudes to reduce the psychological pain of cognitive dissonance dates back to the 1950s—the phenomenon has been a mystery neurobiologically. That is, its brain basis has been a black hole. Which makes this first stab at an explanation particularly intriguing: the brain regions involved in resolving cognitive dissonance are so nimble at generating rationalizations like that of the Hummer-driving green that it's a wonder anyone can stick to his principles.
To investigate cognitive dissonance, neuroscientists at the University of California, Davis, led by Cameron Carter, used functional magnetic resonance imaging (fMRI) to study the brains of volunteers who were made to experience the psychological pain of clashing beliefs and actions. Specifically, the volunteers spent 45 minutes doing a boring task inside the cramped fMRI tube, after which they answered written questions indicating how they felt about the experience, which they did not enjoy. To induce cognitive dissonance, the subjects were then asked to answer the questions again, and to say this time that they enjoyed being in the scanner. Some of them were told their answers were being read by a nervous patient who needed reassurance. The other participants were told that they would get $1 each time they answered the questions as though they were enjoying the scanner, but they were not given the worried-patient cover story.
While the volunteers were faking it, two brain regions were particularly active in both groups: the dorsal anterior cingulate cortex (dACC) and the anterior insula. One of the functions of the dACC is to detect conflicts between incompatible bits of information; it is especially active when a person lies. The anterior insula has a similar job description, monitoring psychological conflicts such as a clash between stated beliefs and true ones. The scientists, writing in Nature Neuroscience, call this extra activity in the dACC and insula "the neural representation of cognitive dissonance." Basically, "the more that participants in the dissonance group 'lied' [about enjoying the fMRI], the greater was…activation" of these regions: they detected when beliefs and actions parted ways.
To me, that finding isn't particularly noteworthy, since it is just another "neural correlates" discovery—that is, a finding about which brain regions are active during which mental activity. It was the next part of the experiment that caught my attention. Debriefed later about their true attitudes toward the scanner, participants asked to fake it for the worried patient changed their beliefs more than participants who were paid $1. In fact, the greater the activity in the dACC while faking their feelings about the scanner, the more participants later said they truly enjoyed it. The brain activity that accompanied cognitive dissonance had changed their minds about the experience of being in the fMRI.
The result shows "how and why people change their attitudes," said co-author Vincent van Veen, who is now at UC Berkeley. "It shows that the phenomenon of cognitive dissonance is real and is not just a figment of the imagination of social psychologists.…[And] it shows that the degree to which people's opinions are subsequently changed depends on how active their anterior cingulate cortex was." The power of brain activity to change the mind is reminiscent of how thinking about one's thoughts differently—which is what people with depression, for instance, learn to do in cognitive-behavior therapy—alters subsequent brain activity, which then makes people feel and think differently.
Although more research is needed to pinpoint the mechanism of dissonance-induced attitude change, what seems clear is that the magnitude of activation of the dACC and anterior insula predicts attitude change. The greater the cognitive dissonance a person feels, the more likely he is to change his beliefs to accord with his actions. How convenient.