Friday, June 1, 2007

Knowing Your Own Mind About God

In general, we think of ourselves as idealized reasoning agents. When we are making decisions, forming opinions about things, or sustaining a belief or behavior, we tend to think that some careful reasoning and attentiveness on our own part will help ensure that the results are rational. We also take the view that the contents of our own minds, our motives, our reasons, and our beliefs, are readily available to us: transparent, open, and incorrigible to introspection. You know your own mind better than anyone else, and you know your beliefs, your reasons, your motives, and the extent to which your decisions are reasonable.

Volumes and volumes of contemporary research in psychology and philosophy are making it clear that most, if not all, of these assumptions are grossly mistaken. A recent article in New Scientist details a long list of ways in which we all make bad, irrational decisions.

Here are just a few of the highlights:

People are very bad at anticipating how happy a choice will make them, or how bad the consequences of some feared negative outcome will really be. We tend to think that winning the lottery will make us happier than it actually will, and we tend to think that a disaster like losing a leg will make our lives much worse than it actually does.

Our emotions have a very strong impact on the outcomes of our decisions. For example, men will gamble much more and take bigger risks when they are angry.

Confirmation bias—emphasizing or selecting evidence that supports a pet belief while neglecting evidence that would refute it—affects us dramatically and makes it very hard for us to make decisions that adequately weigh all the alternatives. To make matters worse, we estimate that confirmation bias will affect other people’s decision making much more than it affects our own. Our cognitive constitution tries to latch onto examples or data that corroborate favored views that we have already made up our minds about. The tendency is very strong, and it often requires a powerful force of will to resist it and actively seek out contrary opinions, alternative explanations, and different possibilities.

The full article is here:

Now let’s consider the God question. On the classic, old-school theism model,

1a) the theist holds that a reasonable person who considers the right evidence objectively and rationally will be justified in believing that God exists,

1b) the atheist or agnostic holds that a reasonable person who considers the right evidence objectively will be justified in believing that God does not exist, or that God’s existence cannot be known, respectively.

2a) the theist holds that a person who considers that evidence and doesn’t conclude that God exists is being irrational.

2b) the atheist holds that a person who considers all the relevant evidence and doesn’t conclude that God doesn’t exist is being irrational. The agnostic holds that it is irrational not to be agnostic given the evidence.

3) knowing whether or not you believe in God, and what sort of belief it is, is simply a matter of introspecting your own thoughts, and

4) knowing what your reasons are for believing in God (or not) is also merely a matter of introspecting, and it will be clear to you what your grounds or reasons for believing are.

Today, despite many developments in what is being called post-evidentialist or post-modernist theism, probably most of the people engaged in this discussion about God either explicitly or implicitly endorse either 1a), 2a), 3), and 4), or 1b), 2b), 3), and 4).

There is a lot to comment on here. But let’s focus on 3) and 4). A number of developments in experimental psychology, cognitive research, and epistemology have made it increasingly clear that 3) and 4) are mistaken. That is, there are good reasons to think that in many cases, what you believe is not actually available to introspection, and the grounds or reasons for your beliefs are either not available to introspection, or introspection is not a reliable or accurate means of determining the grounds of your belief.

I’ll just sketch out a few of the more interesting cases and arguments that seem to support these conclusions. What people will report they believe, it turns out, is highly influenced by environmental factors, priming, context, and expectations. In a number of important experiments, it has been shown that when an image of something that test subjects find objectionable is flashed at them for an interval too short for them to be consciously aware of it (roughly less than 250 milliseconds), they will then respond differently to questions or tasks put to them than they do when they are not primed with the fast image. What this suggests is that there are cognitive gears set in motion below the conscious threshold that affect what we experience or are conscious of, yet we are completely unaware of these mechanisms. It would seem to follow, then, that your introspections of what you believe or what you experience are relatively late-stage results of processes that occur without your control, supervision, or access. And your reports about what you believe and why you believe it may or may not align with what is really going on in your head.

A couple of other examples deserve consideration. There are cases where patients, particularly some kinds of stroke victims, will report that they are in pain but that there is nothing particularly unpleasant about it. They can recognize that they are experiencing pain, but it lacks the painful affect. There are also cases, now famous in the philosophy literature, of people with blindsight. They report, and insist, that they are blind. But when asked to guess, or given visual tasks to attempt, like counting objects, they will consistently offer the correct answers. And there are cases of the reverse, where someone insists that they are not blind, but when given visual tasks it is clear that they cannot see anything. When asked why they didn’t succeed at the counting task or at navigating around objects, they will continue to say that they can see but that they were distracted or confused, or they will make some other excuse.

There is a great deal more to be said here about these cases and their implications. But an important point that I want to draw out is that our common-sense view about being able to know what we believe, and being able to know the reasons or causes that lead us to believe it, is simply not trustworthy. Your own mind is simply not as transparent or accessible to you as you thought it was. And this point is particularly important for the question of believing in God. Most people will readily admit that the existence of God is a matter of incredible emotional, psychological, and personal importance. Even without third-party neuroscience researchers to test and examine our reports about our beliefs, we all know that when it comes to God, there are powerful subconscious, or non-rational, aspects of our cognitive constitutions at work. I’ve called this deeply felt need that we have for there to be a God The Urge in previous posts.

So here’s the point: since we are not very good at knowing our own minds, and since we all seem to have The Urge, it stands to reason that the rationality of religious beliefs is prima facie suspect. And they should be suspect to you even if you have thought hard about it and it still seems to you that you have good reasons for believing and that those reasons are why you believe. Strange things happen in the recesses of the human mind/brain. And a lot of very careful research and argument is starting to suggest that there is an evolutionary, biological foundation for religious belief. I would also submit that the near-universal subscription, across cultures and across time, to beliefs about some sort of afterlife and some sort of higher, supernatural power screams out for an evolutionary, biological, or neurological explanation. In human history, we just don’t find that many people in that many cultures and eras in such deep agreement about anything. That they all believe in some kind of God or gods and the afterlife, and that they spend their time bickering about the details, suggests that the rudiments of belief belong to something much more basic than our higher, rationalistic intellects.

I am not arguing that rational autonomy is altogether impossible, although I think for many people, concerning many beliefs, particularly religious beliefs, it is. But what is becoming clear as science allows us to understand ourselves better, including the deepest, most private parts of our minds, is that achieving rational autonomy is much, much harder than we assumed for centuries. And one of the lessons here is that achieving intellectual discipline and freedom has to be a higher priority in your mental life than adherence to an ideology. Being an atheist or a theist has to come second to being a clear, objective, careful, and diligent thinker. Otherwise it is the ideology that is believing you, not you who is in charge of your own mind and beliefs.


Howaminotmyself? said...

Nice post. I think you have a sketch of a very powerful argument here. Though you need to be careful that it doesn't come back to haunt you.

There has been a lot of discussion in recent philosophy (post-1950) concerning the capacity of our rational enterprises and the fallibility of our heuristic devices when it comes to generating beliefs. Much of this talk has been borne of revelations in psychology and neuroscience and has drastically altered what is known in philosophy as normativity. Your post highlights these developments in such a way as to bring them to bear on the God discussion, and rightly so. It is high time that "the god urge" was made the direct object of our investigation.

However, by bringing into question our methods of reasoning and modes of doxastic generation, particularly reasoning about God, you run the risk of pulling the rug from beneath our feet. By pointing out the flaws in our rational capacity, you weaken our only method of evaluation. Calling our reason into question casts doubt on the question itself.

I don't think you have fallen afoul of this, and I think that you can easily isolate the methods you use to evaluate beliefs from your belief-generating processes, such that the failure of one will not entail a failure in the other. What we really need is a robust empirical inquiry into the impact that the "god urge" has on our doxastic structures. To what extent, if any, does it affect our reasoning, and to what extent does it affect our beliefs? Should it only affect the latter, then a remedy should be available. However, should it influence the former, we will find ourselves in quite the quandary.


Anonymous said...

The future of philosophy, religion, and many other disciplines is going to be determined by what we learn about the brain.

How the brain interacts in the world, and in general, is really the locus of every philosophical battle that's ever been waged.

For me, the question of human belief is the same question as human autonomy, or Free Will.

That is, do human beings have any control or deontological responsibility over their beliefs?

More and more the answer appears to be yes, and NO.

If Free Will and Belief are all results of biology, we have to recognize that not all biological systems are equal.

This is to say that A may be more able to form independent beliefs in his or her cultural context than B.

For example, Albert Einstein has more Free Will in belief formation than Charlie Manson.

The schizophrenic is less free than the computer programmer.

We live in a world of determined subjects, compatibilistic subjects, and subjects with the biological capacity for total free will (indeterministic).

Everything follows from biology.

Hence, theists are such that they become deterministically addicted to cultural stereotypes because they lack the biological ability to question experiential indoctrination.

Therefore, if human beings are evolving biologically the ability to experience more freedom, ultimately theism will die off in the gene pool.