Monday, February 4, 2019

The Psychology of Blind Faith

When someone has “blind faith,” they tend to hold onto their beliefs even when significant evidence suggests those beliefs are false. This is different from standard faith, which rests merely on an absence of evidence. So what drives people to be so unreasonable? Psychology has some interesting answers.
 
Cognitive Dissonance and Motivated Reasoning
 
When someone is faced with conflicting attitudes, beliefs, or behaviors, they experience a sensation of discomfort referred to as cognitive dissonance.[1] People experiencing cognitive dissonance are motivated to resolve the conflicting notions in a manner that allows them to maintain consistency in their beliefs and attitudes.[2] For example, let’s say a creationist happens to discover my blog and decides to thoroughly examine the resources I’ve compiled on my “Don’t Believe in Evolution?” page. They would be facing serious evidence that their deeply held beliefs are wrong, yet they would probably find some way of dismissing it all in favor of their prior convictions. Given that their creationist beliefs have been reinforced for years, have become a major part of their identity, and are likely shared by many of their family and friends, letting go of them would be a difficult task. Thus, they engage in a behavior psychologists call motivated reasoning: the unconscious tendency of individuals to fit their processing of information to conclusions that suit some end or goal.[3] As political scientist Charles Taber put it, “they retrieve thoughts that are consistent with their previous beliefs, and that will lead them to build an argument and challenge what they're hearing.”[4]


The Conflicted Brain
During the run-up to the 2004 presidential election, psychologist Drew Westen led a study in which ardent George W. Bush and John Kerry supporters were presented with statements from both candidates clearly showing each man contradicting himself.[5] To no one’s surprise, each group saw no fault in its favored candidate yet remained critical of the other.[6] During the experiment, fMRI scans showed that the reasoning centers of participants’ brains stayed quiet while areas associated with emotion and moral judgment lit up.[7] On top of that, once subjects came to a satisfactory conclusion about their candidate, an area of the brain associated with pleasure became active.[8] This likely reinforced the feeling that their rationalizations were logically sound.
 
What this and other similar studies suggest is that when people engage in motivated reasoning, they aren’t actually reasoning at all. To paraphrase an analogy made by psychologist Jonathan Haidt, ‘people may think they are being scientists, when in reality they are being more like lawyers.’[9] That is, people build a case to support a position they already feel to be true rather than weighing the relative merits of competing arguments and supporting evidence.

Extraordinary Evidence Requires Extraordinary Rationalizations
So what happens when people are faced with really compelling evidence suggesting their deeply held beliefs are wrong? Not only do they find ways of discrediting the evidence; their rationalizations can actually convince them to adhere to their beliefs more fervently.[10] Such was the case in a study by political scientists Brendan Nyhan and Jason Reifler, in which they showed participants quotes from George W. Bush asserting that Saddam Hussein had weapons of mass destruction (WMDs), followed by quotes from the Bush-commissioned Iraq Survey Group report refuting those assertions.[11] Conservative participants not only still believed Saddam Hussein had WMDs; they were even more convinced after reading the quotes from the Iraq Survey Group.[12]


The Education Effect
Surely a higher level of education dampens the effects of motivated reasoning, right? Well, not really. Studies show that the higher the education level, the higher the bias.[13] This would be why, in a 2008 Pew study, only 19% of college-educated Republicans believed humans were causing the earth to warm, compared with 31% of non-college-educated Republicans.[14] As political scientist Milton Lodge put it, “People who have a dislike of some policy—for example, abortion—if they're unsophisticated they can just reject it out of hand, but if they're sophisticated, they can go one step further and start coming up with counterarguments.”[15] This confirms a long-held observation of mine: smart people find smart reasons to believe stupid things.
 
Dunning-Kruger Effect
For those who are less educated on a subject, the Dunning-Kruger effect can still fuel blind faith and motivated reasoning. The effect stems from an illusory superiority bias, which leads ignorant and incompetent people to believe they are far more knowledgeable and competent than they actually are.[16] Conversely, those who are more knowledgeable and competent tend to underestimate their abilities.[17] I’ve seen many examples of this from creationists on the internet. One proclaimed that he knew more about evolution than anyone on the comment board, yet when I asked him about the second-best evidence for evolution after the fossil record (i.e., genetic evidence), he became agitated and belligerent. Go figure.
 

Tu Quoque
I believe the psychological phenomena behind blind faith are likely experienced by many theists who read my blog. However, some may argue that I am just as guilty of dismissing superior evidence and arguments as theists are. For many, the conviction that both sides of the culture war are equally pigheaded and dismissive leads to what I have dubbed the “head in the sand effect.” That is, they ignore what I have to say because they believe I am far too certain of my convictions to present information accurately. To tell you the truth, people would be right to be skeptical of my motives given my obvious bias. However, there are some big differences between ardent Secular Humanists such as myself and closed-minded theists:


  • We seek to believe in only that which is supported by the evidence, not merely what we wish to be true.
  • We are willing to change our minds or at least be skeptical of our views when presented with superior evidence.
  • We are very aware of the human tendency to rationalize, and actively seek to minimize its effects.

Conclusion
In the end, we are all guilty of motivated reasoning. We are human, after all, and we share the same evolutionarily driven impulses and cognitive patterns as the rest of our species. However, some are certainly more apt to give in to these impulses and ignore evidence, while others are more adept at thinking critically about their own motivations.


Resources:
 
Fantastic example of motivated reasoning (Richard Dawkins dowsing experiment)
https://www.youtube.com/watch?v=DdjYGaINLwo&list=FLrT9OWupoCp5Jf8HwhyW-4A

2 comments:

  1. Hi, I loved your article! I was wondering, have you ever considered that there could be a difference in brain function between someone who practices motivated reasoning and someone who doesn’t? I realise you say we all partake at some point, but I imagine one is for no gain, so more severe, and one is calculated, such as to gain an advantage. I'm thinking along the lines of mild psychosis. Any thoughts?

    Reply: Glad you liked it! Sorry it took so long to respond. I think it's all a natural reflex: our worldview is threatened, and our brain tries to restore balance. The question is, what is going on in the brains of true self-skeptics who are able to allow their most deeply held beliefs to be challenged? I think it's a trained response that takes discipline.
