Confirmation bias refers to a type of selective thinking whereby one tends to notice and look for what confirms
one’s beliefs, and to ignore, not look for, or undervalue the relevance of what contradicts one’s beliefs. For
example, if someone who works in a hospital emergency room believes that during a full moon there is an
increase in admissions, she will take notice of admissions during a full moon. However, she will be inattentive to
the phase of the moon when accidents occur during other times of the month. A tendency to do this over time
unjustifiably strengthens one’s belief in the relationship between the full moon and emergency room admissions.
Most people don’t seem to realize how easy it is to find supportive evidence for almost any belief. By ignoring contrary evidence or by making no effort to find such evidence, one can convince oneself of almost anything. For example, when Joseph Banks
Rhine, one of the pioneers in ESP research, found that some test subjects failed to do better than would be
expected by chance, he explained the data away as being due to unconscious direction to avoid the target.
Parapsychologists have even given this alleged phenomenon a name: psi-missing. (Psi is a word used in
parapsychology to refer to any kind of paranormal phenomena.) Rhine even claimed that subjects who weren’t
performing in such a way as to support his ESP hypothesis didn't like him and were consciously guessing
incorrectly to spite him (Park 2000: 42). When skeptics could not replicate ESP experiments, parapsychologists
attributed this to the experimenter effect, which they defined to mean that believers in ESP get positive results
while non-believers get negative results, because the two groups exert different telepathic influences on the
subjects. When the laws of chance predict that over a long run a person will guess at only a chance rate,
parapsychologists take any short stretch of better-than-chance guessing as a period when ESP was working. If a psychic is caught cheating,
the defender of the paranormal will say that that doesn’t mean she always cheats and that some of her feats may
still be genuinely psychic. If you are clever enough, you will be able to rationalize away any data that seem to
contradict your belief, and even find further support for it in them.
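The point about short runs can be illustrated numerically: in a long sequence of pure chance guessing, short stretches well above the chance rate are all but guaranteed to occur, so scoring only the good stretches manufactures apparent ESP. A minimal sketch (the trial counts, window size, and five-symbol deck are illustrative assumptions, loosely modeled on Zener-card tests):

```python
import random

random.seed(1)

# Simulate 1,000 trials of guessing one of 5 card symbols.
# Chance rate is 1/5 = 20%.
guesses = [random.randrange(5) for _ in range(1000)]
targets = [random.randrange(5) for _ in range(1000)]
hits = [g == t for g, t in zip(guesses, targets)]

overall = sum(hits) / len(hits)

# Selective scoring: report only the best 25-trial window.
window = 25
best = max(sum(hits[i:i + window]) / window
           for i in range(len(hits) - window + 1))

print(f"overall hit rate:    {overall:.1%}")  # stays close to chance
print(f"best short-run rate: {best:.1%}")     # well above chance, by luck alone
```

The guesser has no information about the targets, yet the best short window always looks impressive; only the full record reveals that nothing but chance is at work.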
This tendency to give more attention and weight to data that support our preconceptions and beliefs than we do
to contrary data is especially pernicious when our beliefs are little more than prejudices. If our beliefs are firmly
established upon solid evidence and valid confirmatory experiments, the tendency to give more attention and
weight to data that fits with our beliefs should not lead us astray. Of course, if we become blinded to evidence truly
refuting a favored hypothesis, we have crossed the line from reasonableness to closed-mindedness.
Numerous studies have demonstrated that people generally give an excessive amount of value to confirmatory
information, i.e., data which is positive or which supports a position (Gilovich 1993). Thomas Gilovich speculates
that the “most likely reason for the excessive influence of confirmatory information is that it is easier to deal with
cognitively.” It is much easier to see how a piece of data supports a position than it is to see how it might count
against the position. Consider how dowsers are convinced that they have paranormal powers that guide them in
finding water with a bent stick. The belief in their power is based upon remembering the times they found water.
However, dowsers and their advocates don’t keep track of failures. When tested under controlled conditions, they
have failed to perform at anything better than a chance rate. (Controlled experiments are discussed in chapter
eight.)
This tendency to give more attention and weight to the positive and the confirmatory has been shown to
influence memory. When digging into our memories for data relevant to a position, we are more likely to recall
data that confirm the position (Gilovich 1993).
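This effect of biased recall, as with the dowsers who remember their finds but not their failures, can be sketched with a simple simulation (all rates here are purely illustrative assumptions, not measured values):

```python
import random

random.seed(7)

# Assume water happens to lie under 30% of sites, so pointing a stick
# at random "succeeds" 30% of the time.
trials = 500
found = [random.random() < 0.30 for _ in range(trials)]

# Biased memory: assume successes are recalled 90% of the time,
# failures only 20% of the time.
remembered = [f for f in found
              if random.random() < (0.90 if f else 0.20)]

true_rate = sum(found) / trials
recalled_rate = sum(remembered) / len(remembered)

print(f"true success rate:       {true_rate:.0%}")
print(f"remembered success rate: {recalled_rate:.0%}")
```

With these assumed recall rates, the dowser honestly remembers succeeding most of the time, even though the actual record is no better than the base rate of water in the ground; only a written tally of every attempt exposes the gap.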
Researchers are sometimes guilty of confirmation bias by setting up experiments or framing their data in ways
that will tend to confirm their hypotheses. They compound the problem by proceeding in ways that avoid dealing
with data that would contradict their hypotheses. For example, American anthropologists generally accept what is
known as the “Clovis model.” Some 11,000 years ago, according to this model, people from Northeast Asia entered
the Americas and spread across the Great Plains, the Southwest, and eventually to the East. These peoples are
considered to be the ancestors of today’s Native Americans. Anthropologists had no problem piling up the
evidence in support of the Clovis model. However, few bothered to look for anything older, and excavations rarely
went beneath the Clovis limit. Recently, excavations at Monte Verde in Chile and Meadowcroft Rockshelter in
Avella, Pennsylvania, have led to discoveries that may set back the time of the earliest settlers from one to several
thousand years. Even more interesting, skulls that had been assumed to be of the stock from which Native
Americans descended are being re-examined, and there is now some doubt as to the racial origins of the skulls.12
Experimenters might avoid or reduce confirmation bias by collaborating with colleagues who hold contrary
hypotheses. By jointly working on the design of an experiment and on the analysis of the resulting data, the
experimenters might keep each other from inadvertently biasing the study. For example, Peter Brugger, a
neuroscientist who is skeptical of ESP claims, joined with noted parapsychologist John Palmer to conduct a series
of studies to test, among other things, Brugger’s hypothesis that some individuals who seem to show paranormal
abilities in experiments on telepathy are actually subconsciously recognizing hidden patterns. The two-year project
was funded by the Cogito Foundation on condition that both a parapsychologist and a skeptic be involved. (Results
will not be available until after 2006.)