We are aggressive belief machines. We form beliefs at the slightest provocation, with little or no evidence, and on the basis of wild speculation. We construct elaborate causal explanations for phenomena whose very existence is undersupported by the evidence. We mistake correlation for causation. And we leap to theorize about some alleged pattern of events before we have evidence that anything has occurred at all.
Against the backdrop of all of this promiscuous and faulty belief formation, we do a systematically bad job of inferring conclusions from the evidence. To name just a few from the long list of fallacious inferences that we regularly commit:
Confirmation Bias: We have a pronounced tendency to pick out those events that seem to provide evidence for a favored hypothesis while ignoring, or neglecting to seek, evidence that would disconfirm it.
Magical Thinking: We are highly prone to attribute magical causal powers to our thoughts. We suspect that our negative thoughts about a person are causally responsible for bad things that happen to them, and we credit positive outcomes to the fervent optimistic wishes we harbored beforehand.
Bias Blindness: We readily blame the reasoning mistakes of others on biases in their thinking while failing to recognize the same biases at work in ourselves.
Mistaking Changes in Self for Changes in the World: Upon noticing some new trend in events, we are prone to attribute it to external causal factors instead of recognizing that changes in our own thinking and attention would produce the same appearance of change.
The Failure of Introspection: We are demonstrably bad at identifying the stimuli that exert an important influence on our responses and beliefs. We are frequently unaware of our own responses to those stimuli that contribute to belief. And we are very poor judges, on the basis of introspection, of what we believe and why we believe it.
These fallacies, and many more, are corroborated by hundreds of carefully constructed psychological studies. Nowhere are the mistakes more flagrant than in our religious beliefs. We form fallacious beliefs about miracles, prayers, blessings, punishments, God’s guiding hand in our lives, communications with God, evidence for God’s existence, attacks on non-believers and those of other faiths, and so on.
The single most effective and important tool that people have discovered for finding their way through the dark jungle of our faulty belief tendencies, biases, and mistakes is science. One and only one epistemological virtue governs the conduct of science, and it is this idea that represents our single greatest hope for liberation as a species: for every hypothesis that we take to be true, we must do everything in our power to find whatever disconfirming evidence is out there to undermine it. Only when we have repeatedly vetted an idea from every angle, sought out all the plausible alternative explanations, and tried to disprove it every way we can do we provisionally accept it as supported by the evidence. And still we must remain prepared to abandon the belief if new evidence demands it. The goal in science is always to try to pull the rug out from under our own feet, to find new, better information that overturns what we take to be true. Believing is easy. But resisting the temptation to believe, and exhausting every available avenue of disconfirmation, takes vigilance, self-discipline, and a rejection of dogma.
Religious believing, by contrast, tends to exhibit every epistemic vice that science strives to eliminate. Believing by faith, ignoring the evidence, overcoming doubts, and refusing to change one’s mind in the light of new information are actively cultivated within religion instead of being sources of embarrassment. Adherents cling to the edicts of authority instead of holding all claims up to the harsh light of blind peer review. With a thousand small actions, when we practice religion we train ourselves to be poor reasoners. We chant slogans that we don’t believe, we stifle questions and skeptical inquiry, and we struggle to accept what we know isn’t so. Religious believing represents our worst epistemic habits and vices; science embodies our best.
The insidious and widely popular view that religious believing and science are compatible disguises a dangerous urge in us, one that would destroy humanity’s greatest hope. Science and religion are not compatible. They represent fundamentally different approaches to the question of human knowledge. Religion would have us accept claims on authority, without analysis, suppressing our doubts. Religion would have us accept the principles of dogmatic doctrine and then make everything else we encounter conform to that unyielding and indefeasible set of claims. Science is a method for analyzing, testing, corroborating, and rejecting hypotheses. Science is never about slavishly conforming to ideas that we have resolved to hold on to no matter what evidence arises. No claim in science is immune to criticism, disproof, and analysis. And if a hypothesis cannot withstand the harsh light of disconfirmation, then it must be rejected. More than anything else that humans have discovered, the scientific method for understanding the world has proven to be successful, accurate, and beneficial.
Here’s a very short list of some of the research on the failings of human reasoning:
Boynton, David M., “Superstitious responding and frequency matching in the positive bias and gambler’s fallacy effects,” Organizational Behavior and Human Decision Processes, Vol. 91, 2003, pp. 119–127.
Eibach, Richard P., Libby, Lisa K., and Gilovich, Thomas D., “When Change in the Self Is Mistaken for Change in the World.”
Pronin, Emily, Lin, Daniel Y., and Ross, Lee, “The Bias Blind Spot: Perceptions of Bias in Self Versus Others,” Personality and Social Psychology Bulletin, pp. 369–381.
Fessler, Daniel M. T., Pillsworth, Elizabeth G., and Flamson, Thomas J., “Angry men and disgusted women: An evolutionary approach to the influence of emotions on risk taking,” Organizational Behavior and Human Decision Processes, Vol. 95, 2004, pp. 107–123.
Gilovich, Thomas, “Biased Evaluation and Persistence in Gambling,” Journal of Personality and Social Psychology, Vol. 44, No. 6, 1983, pp. 1110–1126.
Gilovich, Thomas and Savitsky, Kenneth, “Like Goes with Like: The role of representativeness in erroneous and pseudoscientific beliefs,” Skeptical Inquirer, March/April 1996.
Gilovich, Thomas, Vallone, Robert, and Tversky, Amos, “The Hot Hand in Basketball: On the Misperception of Random Sequences,” Cognitive Psychology, Vol. 17, 1985, pp. 295–314.
Kermer, Deborah A., Driver-Linn, Erin, Wilson, Timothy D., and Gilbert, Daniel T., “Loss Aversion Is an Affective Forecasting Error,” Psychological Science, Vol. 17, No. 8, pp. 649–653.
McCloskey, Michael, “Naïve Theory of Motion,” in D. Gentner and A. Stevens (eds.), Mental Models, Hillsdale: Erlbaum, pp. 229–324.
Nisbett, Richard E. and Wilson, Timothy DeCamp, “Telling More Than We Can Know: Verbal Reports on Mental Processes,” Psychological Review, Vol. 84, No. 3, May 1977.
Pronin, Emily, “Perception and misperception of bias in human judgment,” Trends in Cognitive Sciences, Vol. 11, No. 1, pp. 37–43.
Pronin, Emily, Berger, Jonah, and Molouki, Sarah, “Alone in a Crowd of Sheep: Asymmetric Perceptions of Conformity and Their Roots in an Introspection Illusion,” Journal of Personality and Social Psychology, Vol. 92, No. 4, 2007, pp. 585–595.
Pronin, Emily, Gilovich, Thomas, and Ross, Lee, “Objectivity in the Eye of the Beholder: Divergent Perceptions of Bias in Self Versus Others,” Psychological Review, Vol. 111, No. 3, 2004, pp. 781–799.
Pronin, Wegner, McCarthy, and Rodriguez, “Everyday Magical Powers: The Role of Apparent Mental Causation in the Overestimation of Personal Influence,” Journal of Personality and Social Psychology, Vol. 91, No. 2, 2006, pp. 218–231.
Schwartz, Barry, “The Tyranny of Choice,” Scientific American, April 2004, pp. 70–75.
Tversky, Amos and Kahneman, Daniel, “Judgment under Uncertainty: Heuristics and Biases,” Science, New Series, Vol. 185, No. 4157, Sep. 27, 1974, pp. 1124–1131.
Tversky, Amos and Kahneman, Daniel, “The Framing of Decisions and the Psychology of Choice,” Science, New Series, Vol. 211, No. 4481, Jan. 30, 1981, pp. 453–458.
Wason, P. C. and Shapiro, D., “Reasoning,” in Foss, B. M. (ed.), New Horizons in Psychology, Harmondsworth: Penguin, 1966.
Wason, P. C., “Natural and contrived experience in a reasoning problem,” Quarterly Journal of Experimental Psychology, Vol. 23, 1971, pp. 63–71.
Wheatley, Thalia and Haidt, Jonathan, “Hypnotic Disgust Makes Moral Judgments More Severe,” Psychological Science, Vol. 16, No. 10, 2005, pp. 780–784.