Don’t Confuse Me With the Facts!

By David L. Brown

That headline may sound like a joke, but it often seems to be exactly the reaction many people have when faced with facts that threaten their preconceived notions about how things work. A recent article in The Boston Globe (link here) sheds light on this mysterious effect.

Titled “How facts backfire,” the article focuses on voter opinions about political questions. It reaches the unsettling conclusion that most people are unlikely to change their opinions when presented with contrary facts, and may actually cling even more strongly to mistaken ideas. Here’s a take-away from the article:

Recently, a few political scientists have begun to discover a human tendency deeply discouraging to anyone with faith in the power of information. It’s this: Facts don’t necessarily have the power to change our minds. In fact, quite the opposite. In a series of studies in 2005 and 2006, researchers at the University of Michigan found that when misinformed people, particularly political partisans, were exposed to corrected facts in news stories, they rarely changed their minds. In fact, they often became even more strongly set in their beliefs. Facts, they found, were not curing misinformation. Like an underpowered antibiotic, facts could actually make misinformation even stronger.

“The general idea is that it’s absolutely threatening to admit you’re wrong,” says political scientist Brendan Nyhan, the lead researcher on the Michigan study. The phenomenon — known as “backfire” — is “a natural defense mechanism to avoid that cognitive dissonance.”

This concept helps shed some light on climate change denial, where scientists are frustrated to find that as they present more and more evidence about the dangers of global warming, large segments of the public actually seem to become more skeptical. The effect is especially visible among politicians, broadcast and print commentators, and other opinion leaders. It’s not uncommon today to hear or read statements such as “global warming is a hoax” or “climate change has been disproved,” claims that run completely contrary to the mass of evidence.

The article points out that people today are deluged with “endless rumors, misinformation, and questionable variations on the truth,” which makes it easier than ever to be wrong. It also makes people feel more certain that they are right.

Even more vexing, the most informed people are the most resistant to changing their mistaken ideas when given new information. The article describes a 2006 study by Charles Taber and Milton Lodge at Stony Brook University that “showed that politically sophisticated thinkers were even less open to new information than less sophisticated types. These people may be factually right about 90 percent of things, but their confidence makes it nearly impossible to correct the 10 percent on which they’re totally wrong.”

Hmm, that may explain a lot about some of the statements you hear coming from supposedly well-informed people on “Meet the Press” and other venues. They sound completely confident, even though to others their pronouncements sometimes seem more like delusional paranoid ravings than sound opinion.

Thanks to Google, I found the original paper by Taber and Lodge, “Motivated Skepticism in the Evaluation of Political Beliefs” (PDF here), and it adds some interesting perspective to the subject. It seems that even scientists are subject to a version of the backfire effect, as they describe in their paper:

Physicists do it. Psychologists do it. Even political scientists do it (cites withheld to protect the guilty among us). Research findings confirming a hypothesis are accepted more or less at face value, but when confronted with contrary evidence, we become “motivated skeptics,” mulling over possible reasons for the “failure,” picking apart possible flaws in the study, recoding variables, and only when all the counterarguing fails do we rethink our beliefs. Whether this systematic bias in how scientists deal with evidence is rational or not is debatable, though one negative consequence is that bad theories and weak hypotheses, like prejudices, persist longer than they should.

The backfire effect poses a serious problem for scientists, not only because their profession is fact-based but also because of their own human tendency to hold onto their notions. Arthur Schopenhauer’s observation comes to mind: “All truth passes through three stages. First, it is ridiculed. Second, it is violently opposed. Third, it is accepted as being self-evident.” The concept of backfire also resonates with the maxim of Ben Franklin, who said, “So convenient a thing is it to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do.”

When scientists find it difficult to understand why their increasingly solid data and conclusions don’t seem to change opinions among the general public, the effect described in the Globe article could be the reason. If that’s the case, what can be done about it? Since the problem appears to lie in human psychology rather than in the rational processes of logic and reason, the answer won’t be found by appealing to reason among the stubborn ranks of the misinformed. And simply piling more and more facts onto the table doesn’t work; it may even have the opposite effect. It’s a conundrum indeed.

The author of the Globe article, Joe Keohane, suggests that those who spread falsehoods might be subjected to shame, which could cause them to change their behavior. However, he concludes that the “shame-based solution” runs into the fact that “fast-talking political pundits have ascended to the realm of highly lucrative popular entertainment, while professional fact-checking operations languish in the dungeons of wonkery.”

I’m reminded of how the many magical feats of Moses (plagues of frogs, locusts, serpents, etc.) failed to convince Pharaoh to release the Hebrews, and how each feat only “hardened Pharaoh’s heart.” Climate scientists probably face a similarly impossible task in trying to convince the public of the importance of their work. In the end, Moses had to simply gather up his people and leave Egypt.

Parting the Red Sea and leaving Egypt, however, is not an option for climate scientists today. I guess we’ll have to wait for the verdict to come in on global warming and climate change, when the facts become so manifest that they can no longer be denied. Unfortunately, that will also be when it’s too late to do anything about it.
