Saturday, October 1, 2011

Motivated Reasoning

One of my students (thanks, Kate!) found this article. The authors argue for a thesis quite consistent with what I've been pressing in several recent posts:


Boudry, Maarten, and Johan Braeckman. "How Convenient! The Epistemic Rationale of Self-Validating Belief Systems." Forthcoming in Philosophical Psychology.

One passage is particularly relevant to the resurrection discussions I've been in recently:

According to cognitive dissonance theory (Festinger, Schachter, & Riecken, 1964; Aronson, 1992; Tavris & Aronson, 2008), when people are presented with new evidence that conflicts with their previously held beliefs, this results in a form of cognitive tension called “dissonance”. Importantly, the strength of this uncomfortable tension depends on the degree to which people have invested in their beliefs, for example by way of public commitment, or by the time and effort spent acting in accordance with these beliefs (Batson, 1975). If the psychological investment in a belief is high, people are more motivated to reduce dissonance by rationalizing away disconfirming data. In the refined version of dissonance theory, dissonance arises not so much because of two conflicting cognitions, but because adverse evidence conflicts with one’s self-esteem as a competent and reasonable person. This accords with our earlier observation that, when people explain away unwelcome evidence, they do so in a way that allows them to uphold an illusion of objectivity. For example, if a psychic has publicly professed his powers and risks losing his credibility, he is unlikely to be put off his balance by blatant failure. Or if a believer has spent a substantial amount of time and money on astrology consults, typically no amount of rational argumentation and debunking efforts will make him renounce his beliefs. As Nicholas Humphrey noted: “psychic phenomena can, it seems, survive almost any amount of subsequent disgrace” (Humphrey, 1996, p. 150). By contrast, if the psychological stakes are low, as in the everyday situations we mentioned above, the motivation for belief perseverance will be greatly reduced. Consider another example related to paranormal beliefs: suppose that Anna and Paul both start to suspect that they have psychic powers, but their level of confidence is not very high. While Paul hastens to tell his friends that he may be psychic and even performs some psychic readings, Anna decides to conduct an experiment on herself at an early point, when her beliefs are still privately held. All other things being equal, it is much more likely that Anna will abandon her beliefs silently when she discovers that they do not pan out (Humphrey, 1996, p. 105), while Paul will rationalize his failures because he has already made a public commitment. Thus, we would predict that people with an inquisitive and cautious mindset are more likely to put their hunches to the test early on, and are less likely to be sucked into commitment to wrong beliefs like these. By contrast, people who rush to conclusions and start spreading the news right away will more often find themselves in a situation where they obstinately refuse to abandon a false belief.
A classic illustration of cognitive dissonance can be found in the landmark study by Leon Festinger and his colleagues, who infiltrated a doomsday cult and observed the behavior of the followers when the prophesied end of the world failed to come true (Festinger et al., 1964). The followers who had resigned from their jobs, given away their material belongings, and were present at the arranged place and time with full conviction in their imminent salvation became even more ardent believers after the prophecy failed, and started to proselytize even more actively for the cult. However, those for whom the cognitive stakes were lower (e.g., those who kept their belongings and stayed home in fearful expectation of what was supposedly to come) were more likely to abandon their beliefs afterwards.