Confirmation bias
Confirmation bias (also confirmatory bias, myside bias,[a] or congeniality bias[2]) is the tendency to search for, interpret, favor, and recall information in a way that confirms or supports one's prior beliefs or values.[3] People display this bias when they select information that supports their views, ignoring contrary information, or when they interpret ambiguous evidence as supporting their existing attitudes. The effect is strongest for desired outcomes, for emotionally charged issues, and for deeply entrenched beliefs. Confirmation bias is insuperable for most people, but they can manage it, for example, by education and training in critical thinking skills.
Biased search for information, biased interpretation of this information, and biased memory recall have been invoked to explain four specific effects:
- attitude polarization (when a disagreement becomes more extreme even though the different parties are exposed to the same evidence)
- belief perseverance (when beliefs persist after the evidence for them is shown to be false)
- the irrational primacy effect (a greater reliance on information encountered early in a series)
- illusory correlation (when people falsely perceive an association between two events or situations).
A series of psychological experiments in the 1960s suggested that people are biased toward confirming their existing beliefs. Later work re-interpreted these results as a tendency to test ideas in a one-sided way, focusing on one possibility and ignoring alternatives. Explanations for the observed biases include wishful thinking and the limited human capacity to process information. Another proposal is that people show confirmation bias because they are pragmatically assessing the costs of being wrong, rather than investigating in a neutral, scientific way.
Definition and context
Confirmation bias, a phrase coined by English psychologist Peter Wason, is the tendency of people to favor information that confirms or strengthens their beliefs or values, and it is difficult to dislodge once affirmed.
Confirmation biases are effects in information processing. They differ from what is sometimes called the behavioral confirmation effect, commonly known as self-fulfilling prophecy, in which a person's expectations influence their own behavior, bringing about the expected result.[5]
Some psychologists restrict the term "confirmation bias" to selective collection of evidence that supports what one already believes while ignoring or rejecting evidence that supports a different conclusion. Others apply the term more broadly to the tendency to preserve one's existing beliefs when searching for evidence, interpreting it, or recalling it from memory.[6][b] Confirmation bias is a result of automatic, unintentional strategies rather than deliberate deception.[8][9]
Types
Biased search for information
Experiments have found repeatedly that people tend to test hypotheses in a one-sided way, by searching for evidence consistent with their current hypothesis. Rather than searching through all the relevant evidence, they phrase questions so that an affirmative answer supports their hypothesis, looking for the consequences they would expect if their hypothesis were true rather than what would happen if it were false. This is known as a "positive test strategy".
The preference for positive tests in itself is not a bias, since positive tests can be highly informative.[16] However, in combination with other effects, this strategy can confirm existing beliefs or assumptions, independently of whether they are true.[8] In real-world situations, evidence is often complex and mixed. For example, various contradictory ideas about someone could each be supported by concentrating on one aspect of his or her behavior.[11] Thus any search for evidence in favor of a hypothesis is likely to succeed.[8] One illustration of this is the way the phrasing of a question can significantly change the answer.[11] For example, people who are asked, "Are you happy with your social life?" report greater satisfaction than those asked, "Are you unhappy with your social life?"[17]
Even a small change in a question's wording can affect how people search through available information, and hence the conclusions they reach. This was shown using a fictional child custody case.[18] Participants read that Parent A was moderately suitable to be the guardian in multiple ways. Parent B had a mix of salient positive and negative qualities: a close relationship with the child but a job that would take them away for long periods of time. When asked, "Which parent should have custody of the child?" the majority of participants chose Parent B, looking mainly for positive attributes. However, when asked, "Which parent should be denied custody of the child?" they looked for negative attributes and the majority answered that Parent B should be denied custody, implying that Parent A should have custody.[18]
Similar studies have demonstrated how people engage in a biased search for information, but also that this phenomenon may be limited by a preference for genuine diagnostic tests. In an initial experiment, participants rated another person on the introversion–extroversion personality dimension on the basis of an interview, choosing their questions from a given list. When the interviewee was introduced as an introvert, the participants chose questions that presumed introversion, such as "What do you find unpleasant about noisy parties?" When the interviewee was described as extroverted, almost all the questions presumed extroversion. These loaded questions gave the interviewees little or no opportunity to falsify the hypothesis about them. A later version of the experiment gave the participants less presumptive questions to choose from, and participants preferred to ask these more diagnostic questions, showing only a weak bias towards positive tests.
Personality traits influence and interact with biased search processes.[21] Individuals vary in their ability to defend their attitudes from external attacks in relation to selective exposure. Selective exposure occurs when individuals search for information that is consistent, rather than inconsistent, with their personal beliefs.[22] An experiment examined the extent to which individuals could refute arguments that contradicted their personal beliefs.[21] People with high confidence levels more readily seek out information that contradicts their personal position in order to form an argument. This can take the form of oppositional news consumption, where individuals seek out opposing partisan news in order to counterargue.[23] Individuals with low confidence levels do not seek out contradictory information and prefer information that supports their personal position. People generate and evaluate evidence in arguments that are biased towards their own beliefs and opinions.[24] Heightened confidence levels decrease preference for information that supports individuals' personal beliefs.
Another experiment gave participants a complex rule-discovery task that involved moving objects simulated by a computer.[25] Objects on the computer screen followed specific laws, which the participants had to figure out by "firing" objects across the screen to test their hypotheses. Despite making many attempts over a ten-hour session, none of the participants figured out the rules of the system. They typically attempted to confirm rather than falsify their hypotheses, and were reluctant to consider alternatives. Even after seeing objective evidence that refuted their working hypotheses, they frequently continued doing the same tests. Some of the participants were taught proper hypothesis-testing, but these instructions had almost no effect.[25]
Biased interpretation of information
"Smart people believe weird things because they are skilled at defending beliefs they arrived at for non-smart reasons." (Michael Shermer)
Confirmation biases are not limited to the collection of evidence. Even if two individuals have the same information, the way they interpret it can be biased.
A team at Stanford University conducted an experiment involving participants who felt strongly about capital punishment, with half in favor and half against it. Each participant read descriptions of two studies: a comparison of U.S. states with and without the death penalty, and a comparison of murder rates in a state before and after the introduction of the death penalty. After reading a quick description of each study, the participants were asked whether their opinions had changed. Then they read a more detailed account of each study's procedure and had to rate whether the research was well-conducted and convincing. In fact, the studies were fictional: half the participants were told that one kind of study supported the deterrent effect while the other undermined it, and for the other participants the conclusions were swapped.
The participants, whether supporters or opponents, reported shifting their attitudes slightly in the direction of the first study they read. Once they read the more detailed descriptions of the two studies, they almost all returned to their original belief regardless of the evidence provided, pointing to details that supported their viewpoint and disregarding anything contrary. Participants described studies supporting their pre-existing view as superior to those that contradicted it, in detailed and specific ways.[27][29] Writing about a study that seemed to undermine the deterrence effect, a death penalty proponent wrote, "The research didn't cover a long enough period of time," while an opponent's comment on the same study said, "No strong evidence to contradict the researchers has been presented."[27] The results illustrated that people set higher standards of evidence for hypotheses that go against their current expectations. This effect, known as "disconfirmation bias", has been supported by other experiments.[30]
Another study of biased interpretation occurred during the 2004 U.S. presidential election and involved participants who reported having strong feelings about the candidates. They were shown apparently contradictory pairs of statements, either from Republican candidate George W. Bush, Democratic candidate John Kerry or a politically neutral public figure. They were also given further statements that made the apparent contradiction seem reasonable. From these three pieces of information, they had to decide whether each individual's statements were inconsistent.[31]: 1948 There were strong differences in these evaluations, with participants much more likely to interpret statements from the candidate they opposed as contradictory.[31]: 1951
In this experiment, the participants made their judgments while in a magnetic resonance imaging (MRI) scanner which monitored their brain activity. As participants evaluated contradictory statements by their favored candidate, emotional centers of their brains were aroused. This did not happen with the statements by the other figures. The experimenters inferred that the different responses to the statements were not due to passive reasoning errors. Instead, the participants were actively reducing the cognitive dissonance induced by reading about their favored candidate's irrational or hypocritical behavior.[31]: 1956
Biases in belief interpretation are persistent, regardless of intelligence level. Participants in an experiment took the SAT (a college admissions test used in the United States) to assess their intelligence levels. They then read information about safety concerns for vehicles, with the experimenters manipulating the national origin of the car. American participants indicated whether the car should be banned on a six-point scale, where one indicated "definitely yes" and six indicated "definitely no". Participants first evaluated whether they would allow a dangerous German car on American streets and a dangerous American car on German streets. Participants believed that the dangerous German car on American streets should be banned more quickly than the dangerous American car on German streets. Intelligence level made no difference in how readily participants would ban a car.[24]
Biased interpretation is not restricted to emotionally significant topics. In another experiment, participants were told a story about a theft. They had to rate the evidential importance of statements arguing either for or against a particular character being responsible. When they hypothesized that character's guilt, they rated statements supporting that hypothesis as more important than conflicting statements.[32]
Biased memory recall of information
People may remember evidence selectively to reinforce their expectations, even if they gather and interpret evidence in a neutral manner. This effect is called "selective recall", "confirmatory memory", or "access-biased memory".[33] Psychological theories differ in their predictions about selective recall. Schema theory predicts that information matching prior expectations will be more easily stored and recalled than information that does not match.[34] Some alternative approaches say that surprising information stands out and so is memorable.[34] Predictions from both these theories have been confirmed in different experimental contexts, with no theory winning outright.[35]
In one study, participants read a profile of a woman which described a mix of introverted and extroverted behaviors.[36] They later had to recall examples of her introversion and extroversion. One group was told this was to assess the woman for a job as a librarian, while a second group were told it was for a job in real estate sales. There was a significant difference between what these two groups recalled, with the "librarian" group recalling more examples of introversion and the "sales" groups recalling more extroverted behavior.[36] A selective memory effect has also been shown in experiments that manipulate the desirability of personality types.[34][37] In one of these, a group of participants were shown evidence that extroverted people are more successful than introverts. Another group were told the opposite. In a subsequent, apparently unrelated study, participants were asked to recall events from their lives in which they had been either introverted or extroverted. Each group of participants provided more memories connecting themselves with the more desirable personality type, and recalled those memories more quickly.[38]
Changes in emotional states can also influence memory recall.[39][40] Participants rated how they felt when they had first learned that O. J. Simpson had been acquitted of murder charges.[39] They described their emotional reactions and confidence regarding the verdict one week, two months, and one year after the trial. Results indicated that participants' assessments of Simpson's guilt changed over time. The more a participant's opinion of the verdict had changed, the less stable were that participant's memories of their initial emotional reactions. When participants recalled their initial emotional reactions two months and a year later, the past appraisals closely resembled their current appraisals of emotion. People demonstrate sizable myside bias when discussing their opinions on controversial topics.[24] Memory recall and construction of experiences undergo revision in relation to corresponding emotional states.
Myside bias has been shown to influence the accuracy of memory recall.[40] In an experiment, widows and widowers rated the intensity of their experienced grief six months and five years after the deaths of their spouses. Participants noted a higher experience of grief at six months rather than at five years. Yet, when the participants were asked after five years how they had felt six months after the death of their significant other, the intensity of grief participants recalled was highly correlated with their current level of grief. Individuals appear to utilize their current emotional states to analyze how they must have felt when experiencing past events.[39] Emotional memories are reconstructed by current emotional states.
One study showed how selective memory can maintain belief in extrasensory perception (ESP).[41] Believers and disbelievers were each shown descriptions of ESP experiments. Half of each group were told that the experimental results supported the existence of ESP, while the others were told they did not. In a subsequent test, participants recalled the material accurately, apart from believers who had read the non-supportive evidence. This group remembered significantly less information and some of them incorrectly remembered the results as supporting ESP.[41]
Individual differences
Myside bias was once believed to be correlated with intelligence; however, studies have shown that it is better predicted by the ability to think rationally than by level of intelligence.[24] Myside bias can cause an inability to effectively and logically evaluate the opposite side of an argument. Studies have described myside bias as an absence of "active open-mindedness", meaning the active search for why an initial idea may be wrong.[42] Typically, myside bias is operationalized in empirical studies as the quantity of evidence used in support of one's own side in comparison to the opposite side.[43]
Studies have found individual differences in myside bias. One study investigated individual differences that are acquired through learning in a cultural context and are mutable, and found important individual differences in argumentation. Such studies suggest that individual differences such as deductive reasoning ability, ability to overcome belief bias, epistemological understanding, and thinking disposition are significant predictors of reasoning and of generating arguments, counterarguments, and rebuttals.[44][45][46]
A study by Christopher Wolfe and Anne Britt also investigated how participants' views of "what makes a good argument?" can be a source of myside bias that influences the way a person formulates their own arguments.[43] The study investigated individual differences of argumentation schema and asked participants to write essays. The participants were randomly assigned to write essays either for or against their preferred side of an argument and were given research instructions that took either a balanced or an unrestricted approach. The balanced-research instructions directed participants to create a "balanced" argument, i.e., that included both pros and cons; the unrestricted-research instructions included nothing on how to create the argument.[43]
Overall, the results revealed that the balanced-research instructions significantly increased the incidence of opposing information in arguments. The data also revealed that personal belief is not in itself a source of myside bias; however, participants who believe that a good argument is one based on facts are more likely to exhibit myside bias than other participants. This evidence is consistent with the claims proposed in Baron's article: that people's opinions about what makes for good thinking can influence how arguments are generated.[43]
Discovery
Informal observations
Before psychological research on confirmation bias, the phenomenon had been observed throughout history. Beginning with the Greek historian Thucydides, who wrote of misguided reasoning in The History of the Peloponnesian War, writers have remarked on the tendency. In his Muqaddimah, the Arab historian Ibn Khaldun observed the same phenomenon:
Untruth naturally afflicts historical information. There are various reasons that make this unavoidable. One of them is partisanship for opinions and schools. ... if the soul is infected with partisanship for a particular opinion or sect, it accepts without a moment's hesitation the information that is agreeable to it. Prejudice and partisanship obscure the critical faculty and preclude critical investigation. The result is that falsehoods are accepted and transmitted.
In the Novum Organum, English philosopher and scientist Francis Bacon (1561–1626)[50] noted that biased assessment of evidence drove "all superstitions, whether in astrology, dreams, omens, divine judgments or the like".[51] He wrote:[51]
The human understanding when it has once adopted an opinion ... draws all things else to support and agree with it. And though there be a greater number and weight of instances to be found on the other side, yet these it either neglects or despises, or else by some distinction sets aside or rejects[.]
In the second volume of his The World as Will and Representation (1844), German philosopher Arthur Schopenhauer observed that "An adopted hypothesis gives us lynx-eyes for everything that confirms it and makes us blind to everything that contradicts it."[52]
In his 1897 essay What Is Art?, Russian novelist Leo Tolstoy wrote:[53]
I know that most men—not only those considered clever, but even those who are very clever, and capable of understanding most difficult scientific, mathematical, or philosophic problems—can very seldom discern even the simplest and most obvious truth if it be such as to oblige them to admit the falsity of conclusions they have formed, perhaps with much difficulty—conclusions of which they are proud, which they have taught to others, and on which they have built their lives.
In his earlier essay The Kingdom of God Is Within You (1894), Tolstoy had written:[54]
The most difficult subjects can be explained to the most slow-witted man if he has not formed any idea of them already; but the simplest thing cannot be made clear to the most intelligent man if he is firmly persuaded that he knows already, without a shadow of doubt, what is laid before him.
Hypothesis-testing (falsification) explanation (Wason)
In Peter Wason's initial experiment published in 1960 (which does not mention the term "confirmation bias"), he repeatedly challenged participants to identify a rule applying to triples of numbers. They were told that (2,4,6) fits the rule. They generated triples, and the experimenter told them whether each triple conformed to the rule.[3]: 179
The actual rule was simply "any ascending sequence", but participants had great difficulty in finding it, often announcing rules that were far more specific, such as "the middle number is the average of the first and last".[55] The participants seemed to test only positive examples—triples that obeyed their hypothesized rule. For example, if they thought the rule was, "Each number is two greater than its predecessor," they would offer a triple that fitted (confirmed) this rule, such as (11,13,15) rather than a triple that violated (falsified) it, such as (11,12,19).[56]
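The logical trap in the task can be sketched in a few lines of Python. The helper functions and triples below are illustrative, not part of Wason's materials; they follow the rule and hypothesis described above:

```python
def true_rule(triple):
    """The experimenter's actual rule: any ascending sequence."""
    a, b, c = triple
    return a < b < c

def participant_hypothesis(triple):
    """A typical overly specific guess: each number is two greater."""
    a, b, c = triple
    return b == a + 2 and c == b + 2

# Positive tests: triples chosen to FIT the hypothesis. Because the true
# rule is broader, every one earns a "yes" from the experimenter, so the
# hypothesis is never challenged.
positive_tests = [(2, 4, 6), (11, 13, 15), (20, 22, 24)]
print(all(true_rule(t) for t in positive_tests))  # True

# A negative test: a triple that VIOLATES the hypothesis. It still earns
# a "yes" (it is ascending), which falsifies the overly specific guess.
negative_test = (11, 12, 19)
print(true_rule(negative_test), participant_hypothesis(negative_test))  # True False
```

The asymmetry is the point: as long as the hypothesized rule describes a subset of the true rule, no number of confirming tests can expose the error; only a triple that breaks the hypothesis can.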
Wason interpreted his results as showing a preference for confirmation over falsification, hence he coined the term "confirmation bias".[c][58] Wason also used confirmation bias to explain the results of his selection task experiment.[59] Participants repeatedly performed badly on various forms of this test, in most cases ignoring information that could potentially refute (falsify) the specified rule.[60][61]
Hypothesis testing (positive test strategy) explanation (Klayman and Ha)
Klayman and Ha's 1987 paper argues that the Wason experiments do not actually demonstrate a bias towards confirmation, but instead a tendency to make tests consistent with the working hypothesis, which they called a "positive test strategy". They argued that this strategy is a heuristic: an imperfect but reasonably reliable shortcut, since in many realistic situations, where instances of the property of interest are rare, positive tests are highly informative. Wason's task is a special case in which the true rule is broader than the hypothesized one, so positive tests cannot falsify it.
In light of this and other critiques, the focus of research moved away from confirmation versus falsification of a hypothesis, to examining whether people test hypotheses in an informative way, or in an uninformative but positive way. The search for "true" confirmation bias led psychologists to look at a wider range of effects in how people process information.[66]
Information processing explanations
There are currently three main information processing explanations of confirmation bias, plus a recent addition.
Cognitive versus motivational
According to Robert MacCoun, most biased evidence processing occurs through a combination of "cold" (cognitive) and "hot" (motivated) mechanisms.[67]
Cognitive explanations for confirmation bias are based on limitations in people's ability to handle complex tasks, and the shortcuts, called heuristics, that they use.
Motivational explanations involve an effect of desire on belief. It is known that people prefer positive thoughts over negative ones in a number of ways, a tendency called the "Pollyanna principle". Applied to arguments or sources of evidence, this could explain why desired conclusions are more likely to be believed true.
Cost-benefit
Explanations in terms of cost-benefit analysis assume that people do not just test hypotheses in a disinterested way, but assess the costs of different errors, so that it can be adaptive to test hypotheses in a one-sided way when one kind of error is more costly than the other.
Exploratory versus confirmatory
Psychologists Jennifer Lerner and Philip Tetlock distinguish two different kinds of thinking process. Exploratory thought neutrally considers multiple points of view and tries to anticipate all possible objections to a particular position, while confirmatory thought seeks to justify a specific point of view. They argue that people mostly engage in confirmatory thought, but shift toward exploratory thought when they expect to justify their position to an audience whose views are unknown.
Make-believe
Developmental psychologist Eve Whitmore has argued that beliefs and biases involved in confirmation bias have their roots in childhood coping through make-believe, which becomes "the basis for more complex forms of self-deception and illusion into adulthood." The friction brought on by questioning as an adolescent with developing critical thinking can lead to the rationalization of false beliefs, and the habit of such rationalization can become unconscious over the years.[81]
Real-world effects
Social media
In social media, confirmation bias is amplified by the use of filter bubbles, or "algorithmic editing", which displays to individuals only information they are likely to agree with, while excluding opposing views.[82] Some have argued that confirmation bias is the reason why society can never escape from filter bubbles, because individuals are psychologically hardwired to seek information that agrees with their preexisting values and beliefs.[83] Others have further argued that the mixture of the two is degrading democracy—claiming that this "algorithmic editing" removes diverse viewpoints and information—and that unless filter bubble algorithms are removed, voters will be unable to make fully informed political decisions.[84][82]
The rise of social media has contributed greatly to the rapid spread of fake news, that is, false and misleading information that is presented as credible news from a seemingly reliable source. Confirmation bias (selecting or reinterpreting evidence to support one's beliefs) is one of three main hurdles cited as to why critical thinking goes astray in these circumstances. The other two are shortcut heuristics (when overwhelmed or short of time, people rely on simple rules such as group consensus or trusting an expert or role model) and social goals (social motivation or peer pressure can interfere with objective analysis of facts at hand).[85]
In combating the spread of fake news, social media sites have considered turning toward "digital nudging".[86] This currently takes two forms: nudging of information and nudging of presentation. Nudging of information entails social media sites providing a disclaimer or label that questions or warns users about the validity of the source, while nudging of presentation entails exposing users to new information which they may not have sought out but which could introduce them to viewpoints that may counter their own confirmation biases.[87]
Science and scientific research
A distinguishing feature of scientific thinking is the search for confirming or supportive evidence (inductive reasoning) as well as falsifying evidence (deductive reasoning).[88][89]
Many times in the history of science, scientists have resisted new discoveries by selectively interpreting or ignoring unfavorable data.[3]: 192–194 Several studies have shown that scientists rate studies that report findings consistent with their prior beliefs more favorably than studies reporting findings inconsistent with their previous beliefs.[9][90][91]
However, assuming that the research question is relevant, the experimental design adequate and the data are clearly and comprehensively described, the empirical data obtained should be important to the scientific community and should not be viewed prejudicially, regardless of whether they conform to current theoretical predictions.[91] In practice, researchers may misunderstand, misinterpret, or not read at all studies that contradict their preconceptions, or wrongly cite them anyway as if they actually supported their claims.[92]
Further, confirmation biases can sustain scientific theories or research programs in the face of inadequate or even contradictory evidence.[60][93] The discipline of parapsychology is often cited as an example.[94]
An experimenter's confirmation bias can potentially affect which data are reported. Data that conflict with the experimenter's expectations may be more readily discarded as unreliable, producing the so-called file drawer effect. To combat this tendency, scientific training teaches ways to prevent bias.[95] For example, experimental design of randomized controlled trials (coupled with their systematic review) aims to minimize sources of bias.[95][96]
The social process of peer review aims to mitigate the effect of individual scientists' biases, even though the peer review process itself may be susceptible to such biases.[97][98][91][99][100] Confirmation bias may thus be especially harmful to objective evaluations regarding nonconforming results since biased individuals may regard opposing evidence to be weak in principle and give little serious thought to revising their beliefs.[90] Scientific innovators often meet with resistance from the scientific community, and research presenting controversial results frequently receives harsh peer review.[101]
Finance
Confirmation bias can lead investors to be overconfident, ignoring evidence that their strategies will lose money.[10][102] In studies of political stock markets, investors made more profit when they resisted bias. For example, participants who interpreted a candidate's debate performance in a neutral rather than partisan way were more likely to profit.[103] To combat the effect of confirmation bias, investors can try to adopt a contrary viewpoint "for the sake of argument".[104] In one technique, they imagine that their investments have collapsed and ask themselves why this might happen.[10]
Medicine and health
Cognitive biases are important variables in clinical decision-making by medical general practitioners (GPs) and medical specialists. Two important ones are confirmation bias and the overlapping availability bias. A GP may make a diagnosis early on during an examination, and then seek confirming evidence rather than falsifying evidence. This cognitive error is partly caused by the availability of evidence about the supposed disorder being diagnosed. For example, the client may have mentioned the disorder, or the GP may have recently read a much-discussed paper about the disorder. The basis of this cognitive shortcut or heuristic (termed anchoring) is that the doctor does not consider multiple possibilities based on evidence, but prematurely latches on (or anchors to) a single cause.[105] In emergency medicine, because of time pressure, there is a high density of decision-making, and shortcuts are frequently applied. The potential failure rate of these cognitive decisions needs to be managed by education about the 30 or more cognitive biases that can occur, so as to set in place proper debiasing strategies.[106] Confirmation bias may also cause doctors to perform unnecessary medical procedures due to pressure from adamant patients.[107]
Raymond Nickerson, a psychologist, blames confirmation bias for the ineffective medical procedures that were used for centuries before the arrival of scientific medicine.[3]: 192 If a patient recovered, medical authorities counted the treatment as successful, rather than looking for alternative explanations such as that the disease had run its natural course. Biased assimilation is a factor in the modern appeal of alternative medicine, whose proponents are swayed by positive anecdotal evidence but treat scientific evidence hyper-critically.[108][109][110]
Politics, law and policing
Nickerson argues that reasoning in judicial and political contexts is sometimes subconsciously biased, favoring conclusions that judges, juries or governments have already committed to.[3]: 191–193 Since the evidence in a jury trial can be complex, and jurors often reach decisions about the verdict early on, it is reasonable to expect an attitude polarization effect. The prediction that jurors will become more extreme in their views as they see more evidence has been borne out in experiments with mock trials.[114][115] Both inquisitorial and adversarial criminal justice systems are affected by confirmation bias.[116]
Confirmation bias can be a factor in creating or extending conflicts, from emotionally charged debates to wars: by interpreting the evidence in their favor, each opposing party can become overconfident that it is in the stronger position.
A two-decade study of political pundits by Philip E. Tetlock found that, on the whole, their predictions were not much better than chance. Tetlock divided experts into "foxes" who maintained multiple hypotheses, and "hedgehogs" who were more dogmatic. In general, the hedgehogs were much less accurate. Tetlock blamed their failure on confirmation bias, and specifically on their inability to make use of new information that contradicted their existing theories.[119]
In police investigations, a detective may identify a suspect early in an investigation, but then sometimes largely seek supporting or confirming evidence, ignoring or downplaying falsifying evidence.[120]
Social psychology
Social psychologists have identified two tendencies in the way people seek or interpret information about themselves: self-verification, the drive to reinforce the existing self-image, and self-enhancement, the drive to seek positive feedback. Both are served by confirmation biases.
Mass delusions
Confirmation bias can play a key role in the propagation of mass delusions. Witch trials are frequently cited as an example.
For another example, in the Seattle windshield pitting epidemic, windshields appeared to be damaged by an unknown cause. As news of the apparent wave of damage spread, more and more people checked their windshields, discovered that their windshields too had been damaged, and thus confirmed belief in the supposed epidemic. In fact, the windshields had been damaged all along, but the damage went unnoticed until people checked their windshields as the delusion spread.[129]
Paranormal beliefs
One factor in the appeal of alleged psychic readings is that listeners apply a confirmation bias which fits the psychic's statements to their own lives.[130] By making a large number of ambiguous statements in each sitting, the psychic gives the client more opportunities to find a match. This is one of the techniques of cold reading, with which a psychic can deliver a subjectively impressive reading without any prior information about the client.[130] Investigator James Randi compared the transcript of a reading to the client's report of what the psychic had said, and found that the client showed a strong selective recall of the "hits".[131]
As a striking illustration of confirmation bias in the real world, Nickerson mentions numerological pyramidology: the practice of finding meaning in the proportions of the Egyptian pyramids.[3]: 190 There are many different length measurements that can be made of, for example, the Great Pyramid of Giza and many ways to combine or manipulate them. Hence it is almost inevitable that people who look at these numbers selectively will find superficially impressive correspondences, for example with the dimensions of the Earth.[3]: 190
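The "almost inevitable" claim can be made quantitative with a quick simulation. The sketch below uses randomly generated numbers, not real pyramid measurements, and the function name is purely illustrative: with twenty measurements there are already 380 ordered pairwise ratios, so a chance "match" within one percent of a meaningful-looking constant such as π turns up in most random data sets.

```python
import itertools
import random

def has_match(measurements, target, tolerance=0.01):
    """True if any ordered pairwise ratio falls within `tolerance` of `target`."""
    return any(abs(a / b - target) / target < tolerance
               for a, b in itertools.permutations(measurements, 2))

random.seed(0)
trials = 500
# Each trial draws 20 made-up "measurements" and asks whether any
# ratio between two of them comes within 1% of pi.
hits = sum(has_match([random.uniform(1, 1000) for _ in range(20)], 3.14159)
           for _ in range(trials))
print(hits / trials)  # most random sets contain at least one spurious "match"
```

With more measurements, or with more elaborate combinations than simple ratios, the hit rate only rises, which is Nickerson's point about selective number-hunting.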
Recruitment and selection
Unconscious cognitive biases, including confirmation bias, affect hiring decisions in job recruitment and can hinder efforts to build a diverse and inclusive workplace. A variety of unconscious biases affect recruitment decisions, but confirmation bias is one of the major ones, especially during the interview stage.[132] An interviewer will often select the candidate who confirms their existing beliefs, even though other candidates are equally or better qualified.
Associated effects and outcomes
Polarization of opinion
When people with opposing views interpret new information in a biased way, their views can move even further apart. This is called "attitude polarization".[133] The effect was demonstrated by an experiment that involved drawing a series of red and black balls from one of two concealed "bingo baskets". Participants knew that one basket contained 60 percent black and 40 percent red balls; the other, 40 percent black and 60 percent red. The experimenters looked at what happened when balls of alternating color were drawn in turn, a sequence that does not favor either basket. After each ball was drawn, participants in one group were asked to state out loud their judgments of the probability that the balls were being drawn from one or the other basket. These participants tended to grow more confident with each successive draw—whether they initially thought the basket with 60 percent black balls or the one with 60 percent red balls was the more likely source, their estimate of the probability increased. Another group of participants were asked to state probability estimates only at the end of a sequence of drawn balls, rather than after each ball. They did not show the polarization effect, suggesting that it does not necessarily occur when people simply hold opposing positions, but rather when they openly commit to them.[134]
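The basket task is a textbook Bayesian updating problem, so the rational benchmark is easy to compute. In the sketch below (function name and sequences are illustrative, not taken from the original study), an alternating black-red sequence returns the posterior to 50/50 after every pair of draws, so growing confidence in either basket has no rational basis.

```python
# Posterior probability of basket A (60% black, 40% red) versus
# basket B (40% black, 60% red), starting from a 50/50 prior.
def posterior_a(draws, p_black_a=0.6, p_black_b=0.4):
    like_a = like_b = 1.0
    for color in draws:
        like_a *= p_black_a if color == "black" else 1 - p_black_a
        like_b *= p_black_b if color == "black" else 1 - p_black_b
    return like_a / (like_a + like_b)

alternating = ["black", "red"] * 5   # a neutral ten-draw sequence
print(posterior_a(alternating))      # 0.5: the evidence favors neither basket
print(posterior_a(["black"] * 3))    # ~0.77: confidence is justified only by unbalanced draws
```

The participants who announced estimates after every draw grew steadily more confident on exactly the kind of alternating sequence for which this posterior stays at 0.5.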
A less abstract study was the Stanford biased interpretation experiment, in which participants with strong opinions about the death penalty read about mixed experimental evidence. Twenty-three percent of the participants reported that their views had become more extreme, and this self-reported shift correlated strongly with their initial attitudes.[27] In later experiments, participants also reported their opinions becoming more extreme in response to ambiguous information. However, comparisons of their attitudes before and after the new evidence showed no significant change, suggesting that the self-reported changes might not be real.[30][133][135] Based on these experiments, Deanna Kuhn and Joseph Lao concluded that polarization is a real phenomenon but far from inevitable, happening in only a small minority of cases, and that it could be prompted not only by considering mixed evidence, but by merely thinking about the topic.[133]
Charles Taber and Milton Lodge argued that the Stanford team's result had been hard to replicate because the arguments used in later experiments were too abstract or confusing to evoke an emotional response. The Taber and Lodge study used the emotionally charged topics of gun control and affirmative action.
The backfire effect is a name for the finding that, when given evidence against their beliefs, people can reject the evidence and believe even more strongly.[136][137] The phrase was coined by Brendan Nyhan and Jason Reifler in 2010.[138] However, subsequent research has failed to replicate findings supporting the backfire effect.[139] A study conducted at Ohio State University and George Washington University tested 10,100 participants on 52 issues expected to trigger a backfire effect. While the findings confirmed that individuals are reluctant to embrace facts that contradict their existing ideology, no cases of backfire were detected.[140] The backfire effect has since been noted to be a rare phenomenon rather than a common occurrence[141] (compare the boomerang effect).
Persistence of discredited beliefs
Beliefs can survive potent logical or empirical challenges. They can survive and even be bolstered by evidence that most uncommitted observers would agree logically demands some weakening of such beliefs. They can even survive the total destruction of their original evidential bases.
—Lee Ross and Craig Anderson[142]
Confirmation biases provide one plausible explanation for the persistence of beliefs when the initial evidence for them is removed or when they have been sharply contradicted.[3]: 187 This belief perseverance effect was first demonstrated experimentally by Festinger, Riecken, and Schachter. These psychologists spent time with a cult whose members were convinced that the world would end on 21 December 1954. After the prediction failed, most believers still clung to their faith. Their book describing this research is aptly named When Prophecy Fails.[143]
The term belief perseverance, however, was coined in a series of experiments using what is called the "debriefing paradigm": participants read fake evidence for a hypothesis, their attitude change is measured, then the fakery is exposed in detail. Their attitudes are then measured once more to see if their belief returns to its previous level.[142]
A common finding is that at least some of the initial belief remains even after a full debriefing.[144] In one experiment, participants had to distinguish between real and fake suicide notes. The feedback was random: some were told they had done well while others were told they had performed badly. Even after being fully debriefed, participants were still influenced by the feedback. They still thought they were better or worse than average at that kind of task, depending on what they had initially been told.[145]
In another study, participants read job performance ratings of two firefighters, along with their responses to a risk aversion test.[142] This fictional data was arranged to show either a negative or positive association: some participants were told that a risk-taking firefighter did better, while others were told the risk-taker did worse than a risk-averse colleague.[146] Even if these two case studies were true, they would have been scientifically poor evidence for a conclusion about firefighters in general. However, the participants found them subjectively persuasive.[146] When the case studies were shown to be fictional, participants' belief in a link diminished, but around half of the original effect remained.[142] Follow-up interviews established that the participants had understood the debriefing and taken it seriously. Participants seemed to trust the debriefing, but regarded the discredited information as irrelevant to their personal belief.[146]
The continued influence effect is the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning.
Preference for early information
Experiments have shown that information is weighted more strongly when it appears early in a series, even when the order is unimportant. For example, people form a more positive impression of someone described as "intelligent, industrious, impulsive, critical, stubborn, envious" than when they are given the same words in reverse order.
One demonstration of irrational primacy used colored chips supposedly drawn from two urns. Participants were told the color distributions of the urns, and had to estimate the probability of a chip being drawn from one of them.[148] In fact, the colors appeared in a prearranged order. The first thirty draws favored one urn and the next thirty favored the other.[3]: 187 The series as a whole was neutral, so rationally, the two urns were equally likely. However, after sixty draws, participants favored the urn suggested by the initial thirty.[148]
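For independent draws, a rational (Bayesian) observer's final judgment depends only on how many chips of each color appeared, not on the order in which they appeared, which is why the sixty-draw series described above should leave the two urns equally likely. A minimal sketch, with urn proportions chosen to match the 60/40 setup of the earlier basket experiment and illustrative variable names:

```python
def posterior_urn1(draws, p1=0.6, p2=0.4):
    """P(urn 1 | draws) from a 50/50 prior; draw value 1 marks urn 1's favored color."""
    like1 = like2 = 1.0
    for c in draws:
        like1 *= p1 if c == 1 else 1 - p1
        like2 *= p2 if c == 1 else 1 - p2
    return like1 / (like1 + like2)

first_then_second = [1] * 30 + [0] * 30   # thirty draws favoring each urn
second_then_first = [0] * 30 + [1] * 30   # the same evidence, reversed
print(posterior_urn1(first_then_second))  # ~0.5: order carries no information
print(posterior_urn1(second_then_first))  # ~0.5 again
```

Participants' preference for the urn suggested by the first thirty draws is irrational precisely because the likelihood of the whole series is identical under either ordering.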
Another experiment involved a slide show of a single object, seen as just a blur at first and in slightly better focus with each succeeding slide.[148] After each slide, participants had to state their best guess of what the object was. Participants whose early guesses were wrong persisted with those guesses, even when the picture was sufficiently in focus that the object was readily recognizable to other people.[3]: 187
Illusory association between events
Illusory correlation is the tendency to see non-existent correlations in a set of data. In an early demonstration, participants read a set of psychiatric case studies that included responses to the Rorschach inkblot test, and reported associations between certain inkblot responses and homosexuality even though the case studies had been constructed so that no such association existed.
Another study recorded the symptoms experienced by arthritic patients, along with weather conditions over a 15-month period. Nearly all the patients reported that their pains were correlated with weather conditions, although the real correlation was zero.[151]
Days | Rain | No rain |
---|---|---|
Arthritis | 14 | 6 |
No arthritis | 7 | 2 |
This effect is a kind of biased interpretation, in that objectively neutral or unfavorable evidence is interpreted to support existing beliefs. It is also related to biases in hypothesis-testing behavior.[152] In judging whether two events, such as illness and bad weather, are correlated, people rely heavily on the number of positive-positive cases: in this example, instances of both pain and bad weather. They pay relatively little attention to the other kinds of observation (of no pain and/or good weather).[153] This parallels the reliance on positive tests in hypothesis testing.[152] It may also reflect selective recall, in that people may have a sense that two events are correlated because it is easier to recall times when they happened together.[152]
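The rain/arthritis table above makes the point concrete. Using the standard phi coefficient for a 2×2 contingency table (the function below is a generic sketch, not taken from the cited study), the overall association is essentially zero, in fact slightly negative, despite the salient 14 "rain and pain" days:

```python
from math import sqrt

def phi(a, b, c, d):
    """Phi coefficient for a 2x2 table [[a, b], [c, d]]."""
    return (a * d - b * c) / sqrt((a + b) * (c + d) * (a + c) * (b + d))

# Cells from the table: arthritis/rain = 14, arthritis/no-rain = 6,
# no-arthritis/rain = 7, no-arthritis/no-rain = 2.
print(round(phi(14, 6, 7, 2), 3))  # -0.081: no positive association at all
```

A judgment driven only by the positive-positive cell (14) sees a strong link; the full table contradicts it.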
See also
- Apophenia
- Cherry picking
- Cognitive bias mitigation
- Denialism
- List of cognitive biases
- Observer-expectancy effect
- Selective perception
- Semmelweis reflex
Notes
- ^ David Perkins, a professor and researcher at the Harvard Graduate School of Education, coined the term "myside bias" referring to a preference for "my" side of an issue.[1]
- ^ "Assimilation bias" is another term used for biased interpretation of evidence.[7]
- ^ Wason also used the term "verification bias".[57]
References
Citations
- ^ Baron 2000, p. 195.
- PMID 19586162
- ^ a b c d e f g h i j k l m n o p Nickerson 1998, pp. 175–220
- ^ Plous 1993, p. 233
- OCLC 42823720
- ^ Risen & Gilovich 2007
- ^ Risen & Gilovich 2007, p. 113.
- ^ a b c Oswald & Grosjean 2004, pp. 82–83
- ^ a b Hergovich, Schott & Burger 2010
- ^ Wall Street Journal, archived from the original on 14 February 2015, retrieved 13 June 2010
- ^ a b c d Kunda 1999, pp. 112–115
- ^ a b Baron 2000, pp. 162–64
- ^ Kida 2006, pp. 162–65
- ISSN 1939-1315
- ISSN 1939-1315
- ^ (PDF) from the original on 1 October 2011, retrieved 14 August 2009
- , pp. 63–65
- ^ , pp. 63–65
- , p. 131
- ^ a b Kunda 1999, pp. 117–18
- ^ PMID 15536240
- S2CID 143133082
- ISBN 978-91-88212-95-5, archived from the original on 6 April 2023, retrieved 16 October 2021
- ^ S2CID 14505370
- ^ S2CID 145419628
- ^ Kida 2006, p. 157
- ^ S2CID 7465318
- ^ a b Baron 2000, pp. 201–202
- ^ Vyse 1997, p. 122
- ^ S2CID 3770487
- ^ S2CID 8625992
- ^ Gadenne, V.; Oswald, M. (1986), "Entstehung und Veränderung von Bestätigungstendenzen beim Testen von Hypothesen [Formation and alteration of confirmatory tendencies during the testing of hypotheses]", Zeitschrift für Experimentelle und Angewandte Psychologie, 33: 360–374 via Oswald & Grosjean 2004, p. 89
- OCLC 55078722
- ^ a b c Oswald & Grosjean 2004, pp. 88–89
- ^ , p. 231
- ^ Kunda 1999, pp. 225–232
- PMID 2213492
- ^ S2CID 22743423
- ^ S2CID 24729233
- ^ , p. 121
- ^ (PDF) from the original on 4 March 2016, retrieved 11 November 2014
- ^ Thucydides 4.108.4.
- ^ Alighieri, Dante. Paradiso canto XIII: 118–120. Trans. Allen Mandelbaum.
- ^ Ibn Khaldun (1958), The Muqadimmah, Princeton, NJ: Princeton University Press, p. 71.
- ^ a b Baron 2000, pp. 195–196.
- ^ a b Bacon, Francis (1620). Novum Organum. reprinted in Burtt, E. A., ed. (1939), The English philosophers from Bacon to Mill, New York: Random House, p. 36 via Nickerson 1998, p. 176.
- ^ Schopenhauer, Arthur (2011) [1844], Carus, David; Aquila, Richard E. (eds.), The World as Will and Presentation, vol. 2, New York: Routledge, p. 246.
- ^ Tolstoy, Leo (1896). What Is Art? ch. 14 p. 143 Archived 17 August 2021 at the Wayback Machine. Translated from Russian by Aylmer Maude, New York, 1904. Project Gutenberg edition Archived 7 August 2021 at the Wayback Machine released 23 March 2021. Retrieved 17 August 2021.
- ^ Tolstoy, Leo (1894). The Kingdom of God Is Within You p. 49 Archived 17 August 2021 at the Wayback Machine. Translated from Russian by Constance Garnett, New York, 1894. Project Gutenberg edition Archived 17 August 2021 at the Wayback Machine released 26 July 2013. Retrieved 17 August 2021.
- ^ Wason 1960
- ^ Lewicka 1998, p. 238
- ^ Poletiek 2001, p. 73.
- ^ Oswald & Grosjean 2004, pp. 79–96
- S2CID 1212273
- ^ OCLC 72151566
- OCLC 33832963
- ^ Oswald & Grosjean 2004, pp. 81–82, 86–87
- ^ Plous 1993, p. 233
- ^ Lewicka 1998, p. 239
- S2CID 143148831 (Experiment IV)
- ^ Oswald & Grosjean 2004, pp. 86–89
- ^ MacCoun 1998
- ^ Friedrich 1993, p. 298
- ^ Kunda 1999, p. 94
- ^ Baron 2000, p. 206
- OCLC 55124398
- S2CID 143957893
- ISSN 0022-3514
- ^ Oswald & Grosjean 2004, pp. 91–93
- ^ Friedrich 1993, pp. 299, 316–317
- OCLC 34731629 via Oswald & Grosjean 2004, pp. 91–93
- ^ (PDF) from the original on 9 September 2020, retrieved 25 September 2019
- ISBN 978-0-521-52718-7
- ISBN 978-0-14-103916-9
- ISBN 978-0-470-13749-9
- ^ American Psychological Association (2018), "Why we're susceptible to fake news – and how to defend against it", Skeptical Inquirer, 42 (6): 8–9
- ^ a b Pariser, Eli (2 May 2011), "Ted talk: Beware online "filter bubbles"", TED: Ideas Worth Spreading, archived from the original on 22 September 2017, retrieved 1 October 2017
- ^ Self, Will (28 November 2016), "Forget fake news on Facebook – the real filter bubble is you", NewStatesman, archived from the original on 11 November 2017, retrieved 24 October 2017
- ^ Pariser, Eli (7 May 2015), "Did Facebook's big study kill my filter bubble thesis?", Wired, archived from the original on 11 November 2017, retrieved 24 October 2017
- ^ Kendrick, Douglas T.; Cohen, Adam B.; Neuberg, Steven L.; Cialdini, Robert B. (2020), "The science of anti-science thinking", Scientific American, 29 (4, Fall, Special Issue): 84–89
- SSRN 2708250
- PMID 33693334
- S2CID 9703186
- JSTOR 2094423
- ^ a b Koehler 1993
- ^ a b c Mahoney 1977
- PMID 31498834
- ^ Ball, Phillip (14 May 2015), "The trouble with scientists: How one psychologist is tackling human biases in science", Nautilus, archived from the original on 7 October 2019, retrieved 6 October 2019
- OCLC 69423179,
Some of the worst examples of confirmation bias are in research on parapsychology ... Arguably, there is a whole field here with no powerful confirming data at all. But people want to believe, and so they find ways to believe.
- ^ ISBN 978-0-521-60834-3
- PMID 11440947
- PMID 16830675
- PMID 21098355
- ^ Bartlett, Steven James, "The psychology of abuse in publishing: Peer review and editorial bias," Chap. 7, pp. 147–177, in Steven James Bartlett, Normality does not equal mental health: The need to look elsewhere for standards of good psychological health. Santa Barbara, CA: Praeger, 2011.
- PMID 2304222
- OCLC 61864118
- S2CID 153379653
- OCLC 277205993
- ISBN 978-1-921215-69-8
- PMID 12414468.
- PMID 28614014
- ^ Goldacre 2008, p. 233
- ISBN 978-0-593-06129-9
- PMID 15208545
- OCLC 474568621
- OCLC 32699443
- OCLC 602015097
- , pp. 193–194
- OCLC 37180929
- SSRN 1619124,
Quote: Both adversarial and inquisitorial systems seem subject to the dangers of tunnel vision or confirmation bias.
- ^ Baron 2000, pp. 191, 195
- ^ Kida 2006, p. 155
- OCLC 56825108
- doi:10.1037/a0017881
- ^ PMID 2810025
- ^ ISSN 0022-1031
- S2CID 144945319
- ISSN 0022-0167
- ISSN 0022-3514
- PMID 1142062
- ^ Lidén, Moa (2018). "3.2.4.1" (PDF). Confirmation bias in criminal cases (Thesis). Department of Law, Uppsala University. Archived (PDF) from the original on 20 February 2020. Retrieved 20 February 2020.
- ^ Trevor-Roper, H.R. (1969). The European witch-craze of the sixteenth and seventeenth centuries and other essays. London: HarperCollins.
- ^ Chrisler, Mark (24 September 2019), "The constant: A history of getting things wrong", constantpodcast.com (Podcast), archived from the original on 20 February 2020, retrieved 19 February 2020
- ^ OCLC 319499491
- OCLC 26359284
- ^ Agarwal, Pragya (19 October 2018), "Here is how bias can affect recruitment in your organization", Forbes, archived from the original on 31 July 2019, retrieved 31 July 2019
- ^ S2CID 145659040
- ^ Baron 2000, p. 201
- S2CID 14102789
- ^ "Backfire effect", The Skeptic's Dictionary, archived from the original on 6 February 2017, retrieved 26 April 2012
- ^ Silverman, Craig (17 June 2011), "The backfire effect", Columbia Journalism Review, archived from the original on 25 April 2012, retrieved 1 May 2012,
When your deepest convictions are challenged by contradictory evidence, your beliefs get stronger.
- ^ Nyhan, B. & Reifler, J. (2010). "When corrections fail: The persistence of political misperceptions". Political Behavior, 32, 303–320
- ^ "Facts matter after all: rejecting the "backfire effect"". Oxford Education Blog. 12 March 2018. Archived from the original on 23 October 2018. Retrieved 23 October 2018.
- ISSN 1556-5068
- ^ "Fact-checking doesn't 'backfire,' new study suggests", Poynter, 2 November 2016, archived from the original on 24 October 2018, retrieved 23 October 2018
- ^ OCLC 7578020
- ^ Festinger, Leon (1956), When prophecy fails: A social and psychological study of a modern group that predicted the destruction of the world, New York: Harper Torchbooks.
- ^ Kunda 1999, p. 99
- PMID 1185517 via Kunda 1999, p. 99
- PMID 33837143, p. 4:
The CIE refers to the tendency for information that is initially presented as true, but later revealed to be false, to continue to affect memory and reasoning
- ^ a b c d e Baron 2000, pp. 197–200
- ^ a b c Fine 2006, pp. 66–70
- ^ a b Plous 1993, pp. 164–166
- PMID 8610138 via Kunda 1999, p. 127
- ^ a b c Kunda 1999, pp. 127–130
- ^ Plous 1993, pp. 162–164
Sources
- Baron, Jonathan (2000), Thinking and deciding (3rd ed.), New York: Cambridge University Press, OCLC 316403966
- OCLC 60668289
- Friedrich, James (1993), "Primary error detection and minimization (PEDMIN) strategies in social cognition: a reinterpretation of confirmation bias phenomena", Psychological Review, 100 (2): 298–319, PMID 8483985
- OCLC 259713114
- Hergovich, Andreas; Schott, Reinhard; Burger, Christoph (2010), "Biased evaluation of abstracts depending on topic and conclusion: Further evidence of a confirmation bias within scientific psychology", Current Psychology, 29 (3): 188–209, S2CID 145497196
- Kida, Thomas E. (2006), Don't believe everything you think: The 6 basic mistakes we make in thinking, Amherst, NY: OCLC 63297791
- Koehler, Jonathan J. (1993), "The influence of prior beliefs on scientific judgments of evidence quality", Organizational Behavior and Human Decision Processes, 56: 28–55,
- OCLC 40618974
- Lewicka, Maria (1998), "Confirmation bias: Cognitive error or adaptive strategy of action control?", in Kofta, Mirosław; Weary, Gifford; Sedek, Grzegorz (eds.), Personal control in action: Cognitive and motivational mechanisms, Springer, pp. 233–255, OCLC 39002877
- MacCoun, Robert J. (1998), "Biases in the interpretation and use of research results" (PDF), Annual Review of Psychology, 49: 259–287, (PDF) from the original on 9 August 2017, retrieved 10 October 2010
- Mahoney, Michael J. (1977), "Publication prejudices: An experimental study of confirmatory bias in the peer review system", Cognitive Therapy and Research, 1 (2): 161–175, S2CID 7350256
- Nickerson, Raymond S. (1998), "Confirmation bias: A ubiquitous phenomenon in many guises", Review of General Psychology, 2 (2): 175–220, S2CID 8508954
- Oswald, Margit E.; Grosjean, Stefan (2004), "Confirmation bias", in Pohl, Rüdiger F. (ed.), Cognitive illusions: A handbook on fallacies and biases in thinking, judgement and memory, Hove, UK: Psychology Press, pp. 79–96, OCLC 55124398
- OCLC 26931106
- Poletiek, Fenna (2001), Hypothesis-testing behaviour, Hove, UK: Psychology Press, OCLC 44683470
- Risen, Jane; Gilovich, Thomas (2007), "Informal logical fallacies", in Sternberg, Robert J.; Roediger III, Henry L.; Halpern, Diane F. (eds.), Critical thinking in psychology, Cambridge University Press, pp. 110–130, OCLC 69423179
- Vyse, Stuart A. (1997), Believing in magic: The psychology of superstition, New York: Oxford University Press, OCLC 35025826
- Wason, Peter C. (1960), "On the failure to eliminate hypotheses in a conceptual task", S2CID 19237642
Further reading
- Leavitt, Fred (2015), Dancing with absurdity: Your most cherished beliefs (and all your others) are probably wrong, OCLC 908685982
- Stanovich, Keith (2009), What intelligence tests miss: The psychology of rational thought (Lay), New Haven (CT): ISBN 978-0-300-12385-2
- Westen, Drew (2007), The political brain: The role of emotion in deciding the fate of the nation, OCLC 86117725
External links
- Skeptic's Dictionary: confirmation bias – Robert T. Carroll
- Teaching about confirmation bias – class handout and instructor's notes by K.H. Grobman
- Confirmation bias at You Are Not So Smart
- Confirmation bias learning object – interactive number triples exercise by Rod McFarland for Simon Fraser University
- Brief summary of the 1979 Stanford assimilation bias study – Keith Rollag, Babson College