Belief Perseverance

Also known as: Belief Tenacity, Persistence of Belief

Belief perseverance is a cognitive bias in which individuals maintain or even strengthen their original beliefs despite receiving clear, credible information that contradicts or disconfirms those beliefs. Once people have constructed an explanation or narrative that supports a belief, that mental model tends to persist and shape information processing, leading them to discount, reinterpret, or selectively forget disconfirming evidence.


Evidence: experimental


Belief Perseverance: When Beliefs Survive Contrary Evidence

Changing one’s mind is harder than it seems. Once we adopt a belief—about politics, health, other people, or ourselves—it tends to stick, even when new information shows that our original reasoning was flawed. This stubbornness is known as Belief Perseverance.

Belief perseverance occurs when people maintain or even strengthen their initial beliefs after the evidence that originally supported those beliefs has been discredited. For example, you might continue to think a particular food is unhealthy even after learning that the original study you read was retracted. Or you might keep trusting a false rumor even after someone shows you clear debunking from reliable sources.

This bias matters because it slows or blocks learning. In a complex information environment—where new data, corrections, and retractions are common—belief perseverance can lead individuals, groups, and even institutions to cling to outdated or false models of reality.

The Psychology Behind It

Several mechanisms help explain why beliefs persist:

  1. Causal Stories and Mental Models
    When we adopt a belief, we typically build a story around it—a causal explanation that makes the belief feel coherent. Even if the initial evidence is later withdrawn, the explanatory story remains. The brain is reluctant to discard coherent narratives, so the belief survives.

  2. Motivated Reasoning and Identity
    Beliefs are not just abstract propositions; they are often tied to our identity, values, and social groups. Abandoning a belief can feel like betraying our group, admitting we were wrong, or threatening our self-image. Motivated reasoning leads us to protect the belief rather than re-evaluate it.

  3. Confirmation Bias in Information Processing
    Once a belief is in place, people preferentially seek, notice, and remember information that supports it, while ignoring or downplaying contradictory evidence. This selective processing continually reinforces the belief, even in a changing evidence landscape.

  4. Cognitive Dissonance Reduction
    When evidence clashes with a cherished belief, we experience cognitive dissonance—mental discomfort from holding inconsistent ideas. One way to reduce dissonance is to dismiss or reinterpret the evidence, rather than revising the belief.

  5. Initial Impressions as Anchors
    First impressions act as anchors. Subsequent information is filtered through that anchor and adjusted relative to it, and the adjustment is often insufficient. This anchoring effect supports the persistence of early beliefs.

Together, these factors create a psychological "inertia" that keeps beliefs in place long after they should have been updated.

Real-World Examples

1. Debunked Studies and Health Myths

Even after a scientific study is retracted or strongly criticized, many people continue to believe its claims. For instance, despite overwhelming evidence and repeated corrections, some people still believe in the long-disproven link between vaccines and autism. The initial belief, once widely spread, persists.

2. Personal Impressions of People

If you initially form a negative impression of a colleague based on gossip or a single awkward interaction, that impression may persist even after many positive experiences. You may interpret their neutral actions through a negative lens, preserving the original belief.

3. Political Beliefs and Fact-Checking

In politics, fact-checking and corrections often have limited impact. Supporters of a candidate may retain favorable beliefs even when specific claims are shown to be false, telling themselves that "the main point still stands" or that the criticism is biased.

4. Stereotypes and Social Categories

Stereotypes about social groups can be extremely resistant to disconfirmation. Counterexamples are often dismissed as "exceptions" rather than evidence against the stereotype, allowing the overarching belief to remain intact.

Consequences

Belief perseverance can have serious consequences:

  • Resistance to Learning
    People fail to update beliefs based on new evidence, hindering learning in science, medicine, finance, and everyday life.

  • Polarization and Conflict
    Groups holding opposing beliefs may become more entrenched when confronted with disconfirming evidence, deepening polarization.

  • Persistence of Harmful Practices
    Ineffective or harmful policies, treatments, and practices can persist because those who endorsed them struggle to admit they were wrong.

  • Mistrust of Corrective Information
    Repeated corrections can backfire if people see them as attacks, leading to further entrenchment of false beliefs.

How to Mitigate It

Belief perseverance is strong, but not unbeatable. Several strategies can help individuals and organizations update beliefs more effectively:

  1. Emphasize Explanations, Not Just Facts
    When correcting a belief, offer an alternative explanation that accounts for the same facts. Replacing an old story with a new, coherent story is more effective than simply saying "this is wrong."

  2. Encourage Intellectual Humility
    Cultivate norms that value saying "I was wrong" and "I changed my mind" as signs of growth, not weakness. This reduces the identity threat of updating beliefs.

  3. Use Prebunking and Inoculation
    Warn people about likely misinformation before they encounter it, and explain the tactics used to mislead. This can build mental "antibodies" that make false beliefs less likely to form in the first place.

  4. Create Safe Contexts for Updating
    In groups, frame belief revision as a collective improvement rather than an individual defeat. For example: "As a team, we’ve learned from new data and are improving our approach."

  5. Practice Active Open-Mindedness
    Deliberately seek out strong counterarguments to your views and imagine scenarios in which you might be wrong. This habit makes it easier to loosen the grip of entrenched beliefs.

  6. Slow Down and Reflect
    When you encounter disconfirming evidence, pause before reacting defensively. Ask: "If I hadn’t believed this already, how would I evaluate this new information?"

Conclusion

Belief perseverance shows that letting go of ideas can be harder than acquiring them. Once beliefs are woven into our stories, identities, and social worlds, they resist change—even in the face of strong counterevidence. This bias helps explain why misinformation sticks, why stereotypes endure, and why people cling to failing strategies.

However, acknowledging belief perseverance can itself be liberating. By building cultures that reward updating, offering better explanations, and practicing intellectual humility, we can make it easier to revise our beliefs when the world proves us wrong. Doing so is essential for personal growth, scientific progress, and healthier public discourse.

Common Triggers

Strong initial evidence or vivid narratives

Beliefs tied to identity or group membership

Social reinforcement

Typical Contexts

Political and ideological debates

Public health and scientific controversies

First impressions in social and work settings

Financial markets and investment narratives

Mitigation Strategies

Offer alternative explanations: When correcting a false belief, provide a coherent alternative story that explains the facts without relying on the original, mistaken belief.

Effectiveness: high

Difficulty: moderate

Normalize belief updating: Celebrate examples of people changing their minds in light of new evidence to make revision socially acceptable.

Effectiveness: medium

Difficulty: moderate

Encourage active open-mindedness: Train individuals to regularly seek out and consider arguments against their current views.

Effectiveness: medium

Difficulty: moderate

Potential Decision Harms

Policymakers continue to support ineffective or harmful policies because they are unwilling to revise beliefs formed early in their careers.

Severity: major

Patients cling to ineffective or disproven treatments, delaying or rejecting evidence-based care.

Severity: major

Companies persist with failing strategies because leaders are attached to their original vision, leading to lost opportunities and competitive disadvantage.

Severity: major

Key Research Studies

Perseverance of Social Theories: The Role of Explanation in the Persistence of Discredited Information

Anderson, C. A., Lepper, M. R., & Ross, L. (1980), Journal of Personality and Social Psychology

Showed that beliefs can persist even after the evidence supporting them is discredited, especially when people have generated explanations for the original evidence.

Further Reading

Mistakes Were Made (But Not by Me)

by Carol Tavris & Elliot Aronson • book

Explores how cognitive dissonance and self-justification support belief perseverance and resistance to admitting error.

How Minds Change

by David McRaney • book

Discusses why people cling to beliefs and how genuine belief change can occur.


Related Biases


Loaded Language

Loaded language (also known as loaded terms or emotive language) is rhetoric used to influence an audience by using words and phrases with strong connotations.


Euphemism

A euphemism is a mild or indirect word or expression substituted for one considered to be too harsh or blunt when referring to something unpleasant or embarrassing.


Paradox of Choice


The paradox of choice is the idea that having too many options can make decisions harder, reduce satisfaction, and even lead to decision paralysis.


Choice Overload Effect


The choice overload effect occurs when having too many options makes it harder to decide, reduces satisfaction, or leads people to avoid choosing at all.


Procrastination


Procrastination is the action of unnecessarily and voluntarily delaying or postponing something despite knowing that there will be negative consequences for doing so.


Time-Saving Bias


The time-saving bias describes the tendency of people to misestimate the time that could be saved (or lost) when increasing (or decreasing) speed.
