Cognitive Dissonance and Belief Perseverance Contribute to the Tendency to Double Down When Confronted with Contradictory Evidence

Key Takeaways

  • When faced with evidence that contradicts their beliefs, many people double down rather than update their views, a pattern driven by cognitive dissonance and belief perseverance.
  • This tendency is stronger when beliefs are tied to personal identity or group loyalty, making change feel like a personal defeat.
  • The bias affects people across the political spectrum and intelligence levels; even highly intelligent individuals can construct sophisticated justifications to protect their original beliefs.
  • Research shows this reaction is a normal human response to mental discomfort, not a sign of stupidity or weakness.


When new evidence challenges a strongly held belief, the rational response would be to update or revise that belief. Yet in practice, many people do the opposite: they double down, defend the original position more vigorously, and sometimes become even more confident in it.

This reaction is not rare. It appears in politics, relationships, health decisions, and everyday disagreements. Research in psychology has studied this phenomenon for decades and identified two core mechanisms at work: cognitive dissonance (the discomfort of holding conflicting ideas) and belief perseverance (the tendency for beliefs to survive even after their supporting evidence is discredited).

This article explores what the studies actually show about why people double down when confronted with contradictory evidence, how identity and group loyalty intensify the effect, and what practical steps can help reduce this automatic response.

Cognitive Dissonance: The Discomfort That Drives Defense

The foundation of this behavior is cognitive dissonance theory, first proposed by Leon Festinger in 1957. When reality clashes with a strongly held belief, people experience mental discomfort. To reduce that discomfort, they often change their interpretation of reality rather than change the belief itself.

Classic experiments showed that after making a decision, people actively seek information that supports their choice and avoid or dismiss information that contradicts it. This selective processing helps restore psychological consistency.

Belief Perseverance: Beliefs Survive Even After Evidence Is Retracted

A landmark series of studies by Lee Ross and colleagues (1975) demonstrated belief perseverance. Participants formed impressions based on feedback that was, in fact, fabricated. Even after being explicitly debriefed that the feedback was false, many continued to believe the original conclusion and generated new reasons to support it.

Craig Anderson and colleagues (1980) extended this work, showing that once people construct an explanation for a belief, retracting the original evidence does not fully erase it. The explanations people generate become self-reinforcing: even when the supporting data is removed, the internally generated reasoning remains, allowing the belief to persist.

Identity and Group Loyalty Amplify the Effect

The tendency to double down becomes much stronger when beliefs are tied to personal or group identity. A 2006 fMRI study by Drew Westen and colleagues examined how committed partisans responded to contradictory information about their preferred candidates. The researchers observed increased activity in brain regions associated with emotional processing and conflict resolution, alongside reduced engagement of areas typically linked to analytical reasoning. This suggests that when beliefs are strongly tied to identity, the brain may prioritize emotional regulation over objective evaluation, helping explain why participants were able to justify inconsistencies rather than revise their views.

Brendan Nyhan and Jason Reifler (2010) tested whether factual corrections work in politics. In several experiments, they found that corrections could sometimes backfire, with some participants becoming more confident in false beliefs after being presented with corrective information. However, later research by Wood and Porter (2019) showed that such backfire effects are relatively rare: people generally accept factual corrections on most issues, but beliefs tied to political identity or group loyalty remain far more resistant to change.

A 2026 study by Oktar and Lombrozo further showed that when people learn they have underestimated societal disagreement on an issue, they often persist in their beliefs by perceiving opponents as more ignorant, biased, or immoral. This perception helps maintain the original belief despite contradictory evidence.

The “Team Sport” Effect in Politics

Politics often functions like sports fandom. People root for their side and against the other, even when facts challenge their team. This tribal dynamic makes doubling down more likely because admitting error feels like betraying your group. Studies on affective polarization show that dislike of the opposing party has grown dramatically, making it emotionally costly to change your mind.

When a politician or cause someone supports turns out to have been misleading, supporters frequently defend it more vigorously rather than admit they were wrong. The embarrassment of being deceived triggers a defensive response: instead of saying “I got this one wrong,” many people say “It’s not that bad” or “The other side is worse.”

Why This Happens to Almost Everyone

The studies show this is not a flaw of “stupid” or “uneducated” people. It happens across intelligence levels and political orientations. Smart people can be especially good at constructing sophisticated justifications for their beliefs. The bias is human, rooted in the need to reduce discomfort, protect self-image, and maintain social belonging.

Cultural and philosophical traditions often emphasize open-mindedness, humility, and truth-seeking. However, psychological research shows that in practice, social identity and group loyalty can override these ideals, especially in emotionally charged contexts.

How to Break the Pattern

Recognizing the tendency is the first step. Here are evidence-based strategies that can help:

  • Pause before reacting. Give yourself time to sit with the discomfort instead of immediately defending the original belief.
  • Separate your identity from the belief. Remind yourself: “I can be wrong about this idea and still be a good, intelligent person.”
  • Seek disconfirming evidence deliberately. Actively look for one credible source that challenges your view.
  • Talk to someone outside your bubble. Choose a calm, trusted person who disagrees and listen without trying to win the argument.
  • Focus on being accurate rather than consistent. Ask: “What is most likely to be true?” instead of “How do I protect my previous position?”
  • Accept that being wrong is normal. Everyone makes mistakes. Growth comes from updating beliefs, not defending them forever.

These steps are not about switching sides. They are about reducing the automatic defensive response so you can think more clearly.

Related Reading:

Patriotism vs. Moral Principles: Why Loyalty Usually Wins and How People Overestimate Their Own Ethics

How Self-Proclaimed Tolerant People Can Be Intolerant of Disagreement: Insights from Psychology

Recognizing Manipulation: A Psychological Guide to Identifying Cult-Like Dynamics and Echo Chambers

The Psychology of Culture Wars: How the Elite Divide and Manipulate the Masses

FAQs: Why People Double Down When They’re Wrong

Why do people double down when they realize they’ve been wrong?

Because changing their mind creates mental discomfort (cognitive dissonance), so they defend the original belief to reduce that discomfort.

Is this something only certain people do?

No, research shows it happens across the political spectrum and at all levels of education and intelligence.

What is belief perseverance?

The tendency to keep believing something even after the supporting evidence is proven false.

How does identity make doubling down worse?

When a belief becomes part of “who I am,” changing it feels like losing part of yourself.

Does this happen more in politics?

Yes, political beliefs are heavily tied to group identity, making them especially resistant to correction.

What did the fMRI study by Westen show?

When partisans saw contradictions about their candidate, emotional brain areas activated instead of reasoning areas.

Do corrections ever backfire?

Yes, some studies show that presenting facts can make strongly held false beliefs even stronger.

Why do people say “I was right all along” even when evidence says otherwise?

To protect self-image and avoid the embarrassment of being wrong.

How can I tell if I’m doubling down?

Notice if you feel angry, defensive, or eager to dismiss new information instead of considering it.

Is there a way to reduce this tendency?

Yes, pause before reacting, seek disconfirming information, and separate identity from beliefs.

What should I do when I realize I was wrong?

Admit it to yourself first, then focus on learning rather than defending the old position.

Why do people treat politics like a sports team?

Group loyalty feels safer and more rewarding than changing your mind.

Does this behavior make society more divided?

Yes, it turns disagreements into tribal battles instead of opportunities to understand.

Can smart people still double down?

Yes, intelligence can even make it worse by helping people construct better justifications.

What role does embarrassment play?

Embarrassment triggers a strong defensive response to avoid looking foolish.

How can I have better conversations with people who disagree?

Listen first, avoid “gotcha” moments, and focus on understanding rather than winning.

Is it possible to change your mind without feeling weak?

Yes, viewing it as intellectual honesty and growth makes it easier.

What should I do if I see someone else doubling down?

Don’t mock or attack. Give them space and ask curious questions instead of declaring victory.

Can this pattern be unlearned?

Yes, awareness, practice, and deliberate exposure to different views help reduce automatic defensiveness.

Why do people defend politicians or causes even after they’ve been disproven?

Loyalty to the “team” becomes more important than accuracy.

How can I avoid falling into this trap in the future?

Regularly expose yourself to good-faith opposing views and practice updating your beliefs when evidence warrants it.

Is there hope for less division?

Yes, when more people recognize this pattern in themselves, it becomes easier to have honest conversations.

What’s the most important takeaway?

Being wrong is normal. The real mistake is refusing to ever admit it.

Bottom line: How do I break the doubling-down habit?

Pause, separate your identity from the belief, seek truth over consistency, and treat disagreement as information rather than an attack. Growth starts with humility.

Final Thoughts

When people realize they’ve been misled by a politician, a cause, or a narrative, the most common reaction is not reflection but defense. Studies show this doubling-down pattern is widespread because beliefs often become tied to identity and group loyalty. Changing your mind can feel like losing face or betraying your team.

This reaction is human, not a sign of stupidity. It happens on both the left and the right, and it keeps people locked in unnecessary conflict. When politics becomes a team sport, the goal stops being truth and becomes winning. Both extremes benefit from keeping people divided and angry; the conflict distracts from shared problems and shared needs.

If you find yourself in that moment, embarrassed, defensive, or tempted to dig in deeper, remember that almost everyone has been there. The healthiest response is not to attack the other side or double down out of pride. It is to pause, separate your worth from the belief, and ask yourself honestly: “What is most likely to be true here?”

Giving people room to process without immediate mockery or “gotcha” moments makes it easier for them to step back. Listening before correcting and understanding before judging are small shifts that reduce defensiveness and open the door to genuine dialogue. Most people, regardless of where they stand, share similar core concerns: security, fairness, stability, and a desire to be heard.

When we treat every disagreement like a battle, we hand power to those who profit from division. When we treat it as an opportunity to understand, we reclaim our ability to grow. Being wrong is not the end of the world. Refusing to ever admit it is what keeps us stuck.

References

Festinger, L. (1957). A theory of cognitive dissonance. Stanford University Press.

Ross, L., Lepper, M. R., & Hubbard, M. (1975). Perseverance in self-perception and social perception: Biased attributional processes in the debriefing paradigm. Journal of Personality and Social Psychology, 32(5), 880–892. https://doi.org/10.1037/0022-3514.32.5.880

Anderson, C. A., Lepper, M. R., & Ross, L. (1980). Perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology, 39(6), 1037–1049. https://doi.org/10.1037/h0077720

Westen, D., Blagov, P. S., Harenski, K., Kilts, C., & Hamann, S. (2006). Neural bases of motivated reasoning: An fMRI study of emotional constraints on partisan political judgment in the 2004 U.S. Presidential election. Journal of Cognitive Neuroscience, 18(11), 1947–1958. https://doi.org/10.1162/jocn.2006.18.11.1947

Nyhan, B., & Reifler, J. (2010). When corrections fail: The persistence of political misperceptions. Political Behavior, 32(2), 303–330. https://doi.org/10.1007/s11109-010-9112-2

Wood, T. J., & Porter, E. (2019). The elusive backfire effect: Mass attitudes’ steadfast factual adherence. Political Behavior, 41(1), 135–163. https://doi.org/10.1007/s11109-018-9443-y

Oktar, K., & Lombrozo, T. (2026). How beliefs persist amid controversy: The paths to persistence model. Psychological Review, 133(3), 636–665. https://doi.org/10.1037/rev0000583