Wednesday, May 14, 2008

Reining In the Fallacious Human Belief Machine

We are aggressive belief machines. We form beliefs at the slightest provocation, with little or no evidence, and on the basis of wild speculation. We construct elaborate causal explanations for phenomena whose existence is undersupported by the evidence. We mistake correlation for causation. And we leap to theorize about some alleged pattern of events before we have any evidence that anything has occurred.

Against the backdrop of all this promiscuous and faulty belief formation, we do a systematically bad job of inferring conclusions from the evidence. To identify just a few from the long list of fallacious inferences we regularly commit:

Confirmation bias: We have a pronounced tendency to pick out those events that seem to provide evidence for a favored hypothesis while neglecting to seek out, or simply ignoring, evidence that would disprove it.

Magical thinking: We are highly prone to attribute magical powers of causation to our thoughts. We suspect that our negative thoughts about a person are causally responsible for bad things that befall them. And we credit positive outcomes to the fervent, optimistic wishes we harbored beforehand.

Bias blindness: We readily blame the reasoning mistakes of others on biases in their thinking while failing to acknowledge our own susceptibility to the very same biases.

Mistaking changes in self for changes in the world: Upon noticing some new trend in events, we are prone to attribute the change to external causal factors instead of recognizing that changes in our own thinking and attention would produce the same appearance of change.

The failure of introspection: We are demonstrably bad at identifying the stimuli that exert an important influence on our responses and beliefs. We are frequently unaware of the stimulus-response patterns that contribute to belief formation. And on the basis of introspection, we are very poor judges of what we believe and why we believe it.
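The selective-attention mechanism behind confirmation bias can be made vivid with a toy simulation (my own illustration, not drawn from any of the studies below): joint pain and bad weather occur entirely independently, yet an observer who tallies only the "confirming" days accumulates what feels like massive support for a connection.

```python
import random

random.seed(0)
# 10,000 days; pain and bad weather are independent coin flips
days = [(random.random() < 0.5, random.random() < 0.5) for _ in range(10_000)]

# The full comparison a careful reasoner would make:
pain_bad = sum(p for p, w in days if w) / sum(1 for _, w in days if w)
pain_good = sum(p for p, w in days if not w) / sum(1 for _, w in days if not w)

# The biased observer remembers only the "hits": pain on a bad-weather day
confirming = sum(1 for p, w in days if p and w)

print(f"P(pain | bad weather)  = {pain_bad:.2f}")   # close to 0.50: no association
print(f"P(pain | good weather) = {pain_good:.2f}")  # also close to 0.50
print(f"Confirming days remembered: {confirming}")  # thousands of apparent hits
```

The two conditional probabilities come out nearly equal, so the data contain no association at all; the biased observer's pile of remembered hits is an artifact of what was counted, not of what happened.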

These fallacies and many more have been corroborated by hundreds of carefully constructed psychological studies. Nowhere are the mistakes more flagrantly manifest than in our religious beliefs. We form fallacious beliefs about miracles, prayers, blessings, punishments, God’s guiding hand in our lives, communications with God, evidence for God’s existence, attacks on non-believers and those of different religious faiths, and so on.

The single most effective and important tool that people have discovered for finding their way through the dark jungle of our faulty belief tendencies, biases, and mistakes is science. One and only one epistemological virtue governs the conduct of science, and it is this idea that represents our single greatest hope for liberation as a species: for every hypothesis we take to be true, we must do everything in our power to find the disconfirming evidence, if it is out there, that would undermine it. Only when we have repeatedly vetted an idea from every angle, sought out all the possible alternative explanations, and tried to disprove it every way we can do we provisionally accept it as supported by the evidence. And still we must remain prepared to abandon the belief if new evidence demands it. The goal in science is always to try to pull the rug out from under our own feet, to seek out new and better information that could overturn what we take to be true. Believing is easy. But resisting the temptation to believe, and exhausting every available bit of evidence in the attempt to disconfirm, takes vigilance, self-discipline, and a rejection of dogma.

Religious believing, by contrast, tends to exhibit every epistemic vice that science strives to eliminate. Believing by faith, ignoring the evidence, overcoming doubts, and refusing to change one's mind in the light of new information are actively cultivated within religion instead of being a source of embarrassment. Adherents cling to the edicts of authority instead of holding all claims up to the harsh light of blind peer review. In a thousand small acts, practicing religion trains us to be poor reasoners: we chant slogans we don't believe, we stifle questions and skeptical inquiry, and we struggle to accept what we know isn't so. Religious believing embodies our worst epistemic habits and vices; science embodies our best.

The insidious and widely popular view that religious believing and science are compatible disguises a dangerous urge in us, one that would destroy our single greatest hope as a species. Science and religion are not compatible. They represent fundamentally different approaches to the question of human knowledge. Religion would have us accept claims on authority, without analysis, suppressing our doubts. Religion would have us accept the principles of dogmatic doctrine and then make everything else we encounter conform to that unyielding and indefeasible set of claims. Science is a method for analyzing, testing, corroborating, and rejecting hypotheses. Science is never about slavishly conforming to ideas that we have resolved to hold on to no matter what evidence arises. No claim in science is immune to criticism, disproof, and analysis. And if a hypothesis cannot withstand the harsh light of disconfirmation, it must be rejected. More than anything else humans have discovered, the scientific method for understanding the world has proven successful, accurate, and beneficial.

Here’s a very short list of some of the research on the failings of human reasoning:

Boynton, David M. “Superstitious responding and frequency matching in the positive bias and gambler’s fallacy effects,” Organizational Behavior and Human Decision Processes 91 (2003) 119–127

Eibach, Richard P., Libby, Lisa K., and Gilovich, Thomas D., “When Change in the Self Is Mistaken for Change in the World”

Pronin, Emily, Lin, Daniel Y., and Ross, Lee, “The Bias Blind Spot: Perceptions of Bias in Self Versus Others,” Personality and Social Psychology Bulletin, 369-381

Fessler, Daniel M.T., Pillsworth, Elizabeth G., and Flamson, Thomas J., “Angry men and disgusted women: An evolutionary approach to the influence of emotions on risk taking,” Organizational Behavior and Human Decision Processes 95 (2004) 107–123

Gilovich, Thomas, “Biased Evaluation and Persistence in Gambling,” Journal of Personality and Social Psychology, 1983, Vol. 44, No. 6, 1110-1126

Gilovich, Thomas and Kenneth Savitzky, "Like Goes with Like: The role of representativeness in erroneous and pseudoscientific beliefs," Skeptical Inquirer, March April 1996.

Gilovich, Thomas, Vallone, Robert, and Tversky, Amos. “The Hot Hand in Basketball: On the Misperception of Random Sequences,” Cognitive Psychology, 17, 295-314 (1985)

Kermer, Deborah A., Driver-Linn, Erin, Wilson, Timothy D., and Gilbert, Daniel T., “Loss Aversion Is an Affective Forecasting Error,” Psychological Science, Volume 17, Number 8, 649-653

McCloskey, Michael, “Naïve Theory of Motion,” in D. Gentner and A. Stevens (eds.), Mental Models, Hillsdale: Erlbaum, pp. 229-324

Nisbett, Richard E. and Wilson, Timothy DeCamp, “Telling More Than We Can Know: Verbal Reports on Mental Processes,” Psychological Review, Vol. 84, No. 3, May 1977

Pronin, Emily, “Perception and misperception of bias in human judgment,” Trends in Cognitive Sciences, Vol. 11, No. 1, 37-43

Pronin, Emily, Berger, Jonah, and Molouki, Sarah, “Alone in a Crowd of Sheep: Asymmetric Perceptions of Conformity and Their Roots in an Introspection Illusion,” Journal of Personality and Social Psychology, Vol. 92, No. 4, 585-595, 2007.

Pronin, Emily, Gilovich, Thomas, and Ross, Lee, “Objectivity in the Eye of the Beholder: Divergent Perceptions of Bias in Self Versus Others,” Psychological Review, 2004, Vol. 111, No. 3, 781–799

Pronin, Wegner, McCarthy, and Rodriguez, “Everyday Magical Powers: The Role of Apparent Mental Causation in the Overestimation of Personal Influence,” Journal of Personality and Social Psychology, 2006, Vol. 91, No. 2, 218–231

Schwartz, Barry. “The Tyranny of Choice,” Scientific American, April 2004. 70-75

Tversky, Amos and Kahneman, Daniel, “Judgment under Uncertainty: Heuristics and Biases,” Science, New Series, Vol. 185, No. 4157, (Sep. 27, 1974), pp. 1124-1131

Tversky, Amos and Kahneman, Daniel, “The Framing of Decisions and the Psychology of Choice,” Science, New Series, Vol. 211, No. 4481, (Jan. 30, 1981), pp. 453-458

Wason, P.C., Shapiro, D. (1966). "Reasoning", in Foss, B. M.: New horizons in psychology. Harmondsworth: Penguin.

Wason, P.C. (1971). "Natural and contrived experience in a reasoning problem". Quarterly Journal of Experimental Psychology 23: 63–71.

Wheatley, Thalia, and Haidt, Jonathan, “Hypnotic Disgust Makes Moral Judgments More Severe,” Psychological Science, 2005, Vol. 16, No. 10, 780-784

Willis, Janine and Todorov, Alexander, “First Impressions: Making Up Your Mind After a 100-Ms Exposure to a Face,” Psychological Science, 2006, Vol. 17, No. 7


Eric Sotnak said...

An anti-theistic argument might be spawned from this -- somewhat of a counterweight to Alvin Plantinga's argument against epistemic naturalism developed in his "warrant trilogy". A simplified version of the argument might go like this:

(1) If God exists, then God wants us to believe in him.
(2) If God wants us to believe in him, then he has made us in such a way that non-rational influences on belief formation are minimal.
(3) Non-rational influences on belief formation are not minimal.
(4) Therefore, God does not exist.
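The argument's deductive skeleton is just two chained applications of modus tollens. Here is a sketch in Lean 4, where G, W, and M are my own abbreviations for the three claims:

```lean
-- G: God exists; W: God wants us to believe in him;
-- M: non-rational influences on belief formation are minimal.
-- From G → W, W → M, and ¬M, we derive ¬G.
example (G W M : Prop) (h1 : G → W) (h2 : W → M) (h3 : ¬M) : ¬G :=
  fun hG => h3 (h2 (h1 hG))
```

So the logic is airtight; all the philosophical action is in whether the two conditional premises and the empirical premise are true.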

Matt McCormick said...

As usual, Prof. Sotnak is right on the money.

I've been thinking about some related stuff while discussing reliabilism in my classes. Reliabilism is, roughly, the view that a person is justified in holding a belief when that belief was produced by reliable cognitive faculties. And reliability can be measured empirically by determining the rate at which a given method or belief-forming mechanism actually gets it right overall.
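As a toy sketch of that empirical measure (my own illustration, not a standard formalization), reliability is just the hit rate of a belief-forming mechanism over its track record:

```python
def reliability(outcomes):
    """Proportion of correct beliefs.

    outcomes: list of booleans, True where the mechanism
    produced a true belief, False where it got it wrong.
    """
    if not outcomes:
        raise ValueError("no track record to evaluate")
    return sum(outcomes) / len(outcomes)

# A mechanism that got it right on 7 of its last 10 deliverances:
print(reliability([True] * 7 + [False] * 3))  # 0.7
```

A reliabilist then needs some threshold, or some comparison class, for saying when that rate is high enough to confer justification; the studies listed above suggest that in many common domains the rate is uncomfortably low.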

The argument against naturalism that Eric alludes to here is, roughly, that we would not expect evolution, left to itself, to produce beings with reliable cognitive belief-forming faculties such as ours. But we do have reliable belief-forming faculties, so we must have been produced by a source that designed cognitive reliability into the system.

What Eric's argument and my long list of fallacies and studies make clear is that in a surprisingly large range of very common situations, humans do not reliably form true beliefs. Some form of reliabilism is probably the best game in town for explaining what it is to have justified belief and knowledge. But the empirical studies of the foibles of human reason give us good grounds for being unreliabilists. (And now we should wonder about our ability to form reliable beliefs about our unreliability.)


Anonymous said...

Only a slight tangent, but human perceptual illusions can be studied, and are being studied, to see just how we misperceive things. Besides, they're fun.

Illusion Sciences

Best visual illusions of the year contest

Anonymous said...

Awareness Test
