Lots has been written about why people believe things that fit their worldview even when those things have been disproved time and again (Iraqi weapons of mass destruction, anyone?). But there is a more benign form of this pathology, in which people would rather believe a “good story” than know whether it’s true or not.
This hit me when a friend emailed a film clip of what purports to be a charming music machine, supposedly built at the University of Iowa. To me, it looked computer generated, and indeed about three seconds of searching showed that it was yet another urban legend. The surprising part was how the sender, as well as the people I forwarded it to, reacted when I pointed this out.
The essence was, “Thanks a lot for bursting my bubble. I enjoyed it a lot and wanted other people to enjoy it, too. Now you’ve ruined it.”
What struck me about this was that intelligent, science-aware people want to believe cute stories, and are disappointed (even irate) if “science” (me) points out that the story isn’t true. The attitude is, “Who cares if it’s true?”
There seems to be a near-universal human need to hear (and tell, and believe) a good story, regardless of its truth. I’ll leave it to you to think how this can be exploited by, among others, prosecutors (by telling a more compelling story, they can lead juries to convict the innocent) and by cynical and manipulative political leaders.
But here’s one hint. In a 2005 study, scientists found that what you remember and believe about events during the Iraq war depends on your political views. Do you recall a suicide bomber nearing a Najaf checkpoint and blowing up U.S. soldiers? The execution of coalition POWs by Iraqis? The civilian uprising in Basra against Saddam’s Baathist party? All were initially reported by the press, but the last two were quickly retracted as products of the fog of war.
Yet Americans (especially those who supported the invasion) tend to believe that the last two events occurred even when they recall the retraction. Germans and Australians who recall the retraction, in contrast, no longer believe the misinformation. “People build mental models,” Stephan Lewandowsky, a psychology professor at the University of Western Australia, Crawley, who led the study, told me in 2005. “By the time they receive a retraction, the original misinformation has already become an integral part of that mental model, or world view, and disregarding it would leave the world view a shambles.” People therefore “continue to rely on misinformation even if they demonstrably remember and understand a subsequent retraction,” he and colleagues wrote.
The late New York Sen. Daniel Patrick Moynihan famously said that people are entitled to their own opinions, but not to their own facts. Believing in the magical music machine is obviously of less concern than believing erroneous information about the Iraq war. But the two beliefs show that people who care about facts are working against powerful elements of human nature. If a “good story” trumps the facts much of the time with many people, we are in deep trouble.