Belief Perseverance: When Beliefs Survive Contrary Evidence
Changing one’s mind is harder than it seems. Once we adopt a belief—about politics, health, other people, or ourselves—it tends to stick, even when new information shows that our original reasoning was flawed. This stubbornness is known as belief perseverance.
Belief perseverance occurs when people maintain or even strengthen their initial beliefs after the evidence that originally supported those beliefs has been discredited. For example, you might continue to think a particular food is unhealthy even after learning that the original study you read was retracted. Or you might keep trusting a false rumor even after someone shows you clear debunking from reliable sources.
This bias matters because it slows or blocks learning. In a complex information environment—where new data, corrections, and retractions are common—belief perseverance can lead individuals, groups, and even institutions to cling to outdated or false models of reality.
The Psychology Behind It
Several mechanisms help explain why beliefs persist:
Causal Stories and Mental Models
When we adopt a belief, we typically build a story around it—a causal explanation that makes the belief feel coherent. Even if the initial evidence is later withdrawn, the explanatory story remains. The brain is reluctant to discard coherent narratives, so the belief survives.
Motivated Reasoning and Identity
Beliefs are not just abstract propositions; they are often tied to our identity, values, and social groups. Abandoning a belief can feel like betraying our group, admitting we were wrong, or threatening our self-image. Motivated reasoning leads us to protect the belief rather than re-evaluate it.
Confirmation Bias in Information Processing
Once a belief is in place, people preferentially seek, notice, and remember information that supports it, while ignoring or downplaying contradictory evidence. This selective processing continually reinforces the belief, even in a changing evidence landscape.
Cognitive Dissonance Reduction
When evidence clashes with a cherished belief, we experience cognitive dissonance—mental discomfort from holding inconsistent ideas. One way to reduce dissonance is to dismiss or reinterpret the evidence, rather than revising the belief.
Initial Impressions as Anchors
First impressions act as anchors. Subsequent information is filtered and adjusted relative to that anchor, often insufficiently. This anchoring effect supports the persistence of early beliefs.
Together, these factors create a psychological "inertia" that keeps beliefs in place long after they should have been updated.
Real-World Examples
1. Debunked Studies and Health Myths
Even after a scientific study is retracted or strongly criticized, many people continue to believe its claims. For instance, despite overwhelming evidence and repeated corrections, some still believe disproven links between vaccines and autism. The initial belief, once widely spread, persists.
2. Personal Impressions of People
If you initially form a negative impression of a colleague based on gossip or a single awkward interaction, that impression may persist even after many positive experiences. You may interpret their neutral actions through a negative lens, preserving the original belief.
3. Political Beliefs and Fact-Checking
In politics, fact-checking and corrections often have limited impact. Supporters of a candidate may retain favorable beliefs even when specific claims are shown to be false, telling themselves that "the main point still stands" or that the criticism is biased.
4. Stereotypes and Social Categories
Stereotypes about social groups can be extremely resistant to disconfirmation. Counterexamples are often dismissed as "exceptions" rather than evidence against the stereotype, allowing the overarching belief to remain intact.
Consequences
Belief perseverance can have serious consequences:
Resistance to Learning
People fail to update beliefs based on new evidence, hindering learning in science, medicine, finance, and everyday life.
Polarization and Conflict
Groups holding opposing beliefs may become more entrenched when confronted with disconfirming evidence, deepening polarization.
Persistence of Harmful Practices
Ineffective or harmful policies, treatments, and practices can persist because those who endorsed them struggle to admit they were wrong.
Mistrust of Corrective Information
Repeated corrections can backfire if people see them as attacks, leading to further entrenchment of false beliefs.
How to Mitigate It
Belief perseverance is strong, but not unbeatable. Several strategies can help individuals and organizations update beliefs more effectively:
Emphasize Explanations, Not Just Facts
When correcting a belief, offer an alternative explanation that makes sense of the world. Replacing an old story with a new, coherent story is more effective than simply saying "this is wrong."
Encourage Intellectual Humility
Cultivate norms that value saying "I was wrong" and "I changed my mind" as signs of growth, not weakness. This reduces the identity threat of updating beliefs.
Use Prebunking and Inoculation
Warn people about likely misinformation before they encounter it, and explain the tactics used to mislead. This can build mental "antibodies" that reduce belief formation in the first place.
Create Safe Contexts for Updating
In groups, frame belief revision as a collective improvement rather than an individual defeat. For example: "As a team, we’ve learned new data and are improving our approach."
Practice Active Open-Mindedness
Deliberately seek out strong counterarguments to your views and imagine scenarios in which you might be wrong. This habit makes it easier to loosen the grip of entrenched beliefs.
Slow Down and Reflect
When you encounter disconfirming evidence, pause before reacting defensively. Ask: "If I hadn’t believed this already, how would I evaluate this new information?"
Conclusion
Belief perseverance shows that letting go of ideas can be harder than acquiring them. Once beliefs are woven into our stories, identities, and social worlds, they resist change—even in the face of strong counterevidence. This bias helps explain why misinformation sticks, why stereotypes endure, and why people cling to failing strategies.
However, acknowledging belief perseverance can itself be liberating. By building cultures that reward updating, offering better explanations, and practicing intellectual humility, we can make it easier to revise our beliefs when the world proves us wrong. Doing so is essential for personal growth, scientific progress, and healthier public discourse.