Monday, July 5, 2010

Denialism

In the latter half of the 20th century, an important development in religious epistemology changed the sort of debate that theists and atheists have been having. This much was not new: for centuries, when believers were confronted with challenges, it has been common to simply deny that evidence, reason, logic, arguments, or justifications are relevant or applicable to something as marvelous and transcendent as God. That anti-intellectual and arational trend in religious circles has been tempered by a more sober view in natural theology that God’s existence and nature can be known and understood through reason and that successful arguments for God’s existence can be given.

But for many, including people who were once in the tradition, the natural theological project is dead. The vast majority of philosophers, even ones who believe, do not think that a successful argument for God’s existence can be given. Knowledge of God, they maintain, can still be had by other, less conventional methods. Plantinga and the reformed epistemologists now claim that they know God by way of an inner voice, a sensus divinitatis, or the “witness of the Holy Spirit,” that informs them of God directly and non-inferentially, and, they say, this knowledge comes in a way that external appeals to evidence, reason, or empirical facts cannot undermine. The details need not concern us here.

What should concern all of us, theists and atheists alike, is a strong disposition in human beings to get caught up in an ideology that renders us incapable of reasoning clearly, particularly about that ideology itself. Beware of positions that, like conspiracy theories, build an answer for why they don’t appear to be true into their own essential claims. We are organic beings, with kludgey equipment that is prone to go off the rails frequently. Given our fallible cognitive faculties, it is an enormous challenge to sustain any level of intellectual freedom and cognitive integrity. We’re more prone by our natures to get it wrong, and get it wrong in a big way, than to get it right.

One of the ways that an ideology infects our minds and consumes us is by exploiting our propensity to explain away any counter-evidence in order to hang onto views we are emotionally committed to. We’ve all seen it, of course, and we’ve all felt the urge to hold onto some pet idea even when it is clear that it’s a mistake.

But the nature of this impulse is coming into focus with recent efforts in empirical psychology. Geoffrey Munro of Towson University recently showed that when we are confronted with scientific, empirical evidence that challenges a position we favor, we are more likely to reject science altogether and claim that it cannot be employed to address questions of that type at all (see “The Scientific Impotence Excuse: Discounting Belief-Threatening Scientific Abstracts”). Munro took test subjects with views about stereotypes, such as those concerning homosexuality. Subjects were tested beforehand to determine what views they held. Then they were given fake abstracts of scientific studies that purported either to confirm or to disconfirm the stereotype. So some studies indicated that homosexuals had a higher rate of mental illness, for example, while others indicated that their rate of mental illness was lower. Not surprisingly, the subjects who read abstracts that supported their preconceived views concluded that their views had been vindicated. But something remarkable happened with the subjects who had their prior views challenged. Rather than acknowledge that they were mistaken and change their minds, these subjects were much more likely to conclude that proving (or disproving) the thesis simply couldn’t be done by science. They rejected science itself, rather than give up their cherished idea.

Contradictions, counter-indications, and improbabilities bother us. They create cognitive dissonance. See The Forbidden Conclusion. Our minds need resolution to the conflict. One way (the poor way, in this case) is to simply reject the source of information that is creating the dissonance. If scientific methods themselves are suspect, then there is less strain on my belief system when I continue to hold views that science rejects.

Of course, there’s nothing wrong with seeking to have a more coherent, less contradictory worldview. Quite the contrary, we should all be doing more of that. But the model of reality that we construct in our heads should be consonant with, and responsive to, as many of the known facts as possible. One can have a highly internally consistent picture of reality that is nevertheless detached from reality. What we should be striving to do is incorporate as much of what is known into whatever worldview we adopt.

We can imagine some scheme whereby one of the subjects in the study might think their way out of the problem: “Sure, this authentic-looking and authoritative-sounding scientific study says that homosexuals have a lower rate of mental illness, but I know different from my own experience.” (Other studies have demonstrated how strong our tendency is to accept anecdotal and personal experience over abstract, scientific analyses. See Jonathan Baron's Thinking and Deciding.) But what our subject has failed to realize is how unreliable it is to reason about general epidemiological and statistical trends from personal and anecdotal evidence. Even if the scientific study is a fake, its methodology is superior to our subject’s method. So it should have led him to reject his prior view, not science itself.

The application to religious belief is obvious. Sustaining the view that there is an invisible, undetectable, almighty, all-knowing, and infinitely loving being who exists in another plane of reality from ours is incredibly difficult in an age where science has shown us so much and when naturalism has “won,” as the theologians lament. Belief in God cannot be had easily or readily given the other things we know. So for many people who believe, the answer is to simply reject the source that is telling them otherwise.

Recently, in some of the debates about the resurrection, believers confronted me with the works of N.T. Wright, a Christian historical apologist. One of Wright’s theses is that the New Testament Jews simply could not have come up with the idea of a bodily resurrection on their own. The only way they would ever have produced the idea is if Jesus himself gave it to them by actually returning from the dead. Wright presents a masterful argument filled with historical arguments and citations. As long as the historical evidence appears to be in favor of his view, he’s eager to employ its methods. But deep within his works, the truth about his commitment to historical methods and the resurrection comes out. Ultimately, when he is faced with serious historical challenges to making the case for the resurrection, Wright suggests that the real problem is the historical method itself and that we should put belief in Jesus first.

“If we attempt to argue for the historical truth of the resurrection on standard historical grounds, have we not allowed historical method, perhaps including its hidden Enlightenment roots, to become lord, to set the bounds of what we know, rather than allowing God himself, Jesus himself, and indeed the resurrection itself, to establish not only what we know but how we can know it?” (Jesus' Resurrection and Christian Origins)

That is, since we already know that the resurrection was real, we can be assured that the only acceptable historical methods are the ones that prove it was real. If our historical methods do not produce the correct conclusion, then it must be the methods, not the conclusion, that are wrong.

Wright gives us just one example of an academic scholar dressing the fallacy up to make it seem more presentable. Putting lipstick on the pig, as it were. Instances of the same mistake in religious rationalizations are countless.

The hazards of simply rejecting reality when it doesn’t suit our preferences should be equally obvious.  The Munro study gives us a stern warning about the cognitive pitfalls we are prone to, and the application to religious cases shows us how seductive a supernatural ideology can be. 

Frequently, people make the charge against atheism or science that it is some form of religious faith too. Perhaps they are thinking that if science is just as much an ideology, then there’s nothing so wrong with adopting an equally groundless religious one instead. That would be a mistake. But more importantly, what Munro and Wright show us is that there is a fundamental difference between science and religion that people are missing. The point of religiousness is to believe particular doctrinal claims. Believing is the whole point of religiousness, as Wright’s drawing of his line in the sand makes clear. Science, by contrast, is a method for acquiring beliefs that is neutral with regard to what those beliefs are. It tells us how to confirm or disconfirm what we think might be true.

Much of what religious institutions strive to do is to implant beliefs and then equip us with the means to reject anything that would conflict with them. Preachers, priests, and rabbis cultivate the believing of certain claims in their flocks. Their charges are in need of protection; they need their faith strengthened against doubts that would undermine it. Sermons, prayers, devotionals, and ceremonies serve to fortify beliefs and behaviors in them that would not be sustained otherwise. Doubt, criticism, and objection, on the other hand, are the point of the scientific method. Finding reasons to reject a hypothesis makes it possible for us to make some provisional claims about what is true. Without some methodological procedure for vetting hypotheses and separating the good from the bad, we can’t claim to have any justification for them. The method of doubting is what justifies our beliefs and keeps the floodgates of failed views closed.