What Is… Belief Perseverance


In this series, I dig a little deeper into the meaning of psychology-related terms. This week’s term is belief perseverance.

Belief perseverance is the tendency to hold onto beliefs even after the information that they’re based on is shown to be false. I’m a big fan of this line from a paper by Savion (2009):

“Belief perseverance—clinging to explicitly discredited beliefs—is ubiquitous to the point of serving as the ultimate evidence of the feebleness of our mind.”

According to Oxford Bibliographies, “People’s proclivity to passionately cling to, and advocate for, beliefs or attitudes that exist in the absence of evidentiary support manifests in a range of life domains… In fact, this propensity to develop, maintain, and unwaveringly cling to one’s beliefs in the absence of sufficient evidence is one of the most well-established tendencies in the social-psychological canon.”

A common research method for studying this effect is the debriefing paradigm. Researchers give participants information at the beginning of an experiment, and then later tell them that the initial information given was in error; for example, the researchers might tell the participants that the information actually applied to something/someone else. Belief perseverance is seen when participants continue to believe the initial information even when they’ve been explicitly told it was wrong.

Types of beliefs that persevere

Evidence has been found for three types of beliefs that tend to persevere:

  • self-impressions
  • social impressions (beliefs we have about specific other people)
  • naïve theories

Self-impressions

These are the beliefs that we have about ourselves and our abilities. If you’re given feedback about whether you performed a task well or poorly, and then later told that feedback was actually about someone else’s performance, your perception of your ability at the task is still likely to be influenced by the initial feedback.

If we’re given feedback that poses a threat to our core self-concept, the motivation to maintain our self-concept may make it easier to discount the initial feedback and help to weaken the belief perseverance effect.

Naïve theories

Naïve theories are ideas we’ve developed about how the world works and what causes the various things that we observe.

From a young age, we start to develop intuitive pet theories about how the world works, creating causal connections along the way. Such theories may incorporate bits of fact, but the conclusions we draw from them often aren’t fact-based. Even when presented with information that disconfirms our pet theories, we tend to stick with them.

This category of beliefs also includes social theories, which are our beliefs about how people generally think, feel, behave, and interact with one another.

Factors that may influence belief perseverance

Anchoring

This cognitive bias means that we rely too heavily on the first piece of information we’re presented with. For example, we evaluate the reasonableness of the price we end up paying for something based on the price that was originally quoted. That means that if something with a $50 regular price is on sale for $30, but it’s really only worth $15, we’ll still tend to use that $50 as our anchor to evaluate how good a deal we’re getting.
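The price example can be put in numbers. Here’s a minimal sketch (all figures hypothetical, matching the $50/$30/$15 example) of how the same sale price looks like a bargain against the anchor but a loss against the item’s actual worth:

```python
# Illustrative sketch of anchoring in price judgments.
# All numbers are hypothetical, taken from the $50/$30/$15 example above.

def perceived_savings(anchor_price: float, sale_price: float) -> float:
    """Savings as judged against the anchor (the listed regular price)."""
    return anchor_price - sale_price

def actual_savings(true_worth: float, sale_price: float) -> float:
    """Savings as judged against what the item is actually worth."""
    return true_worth - sale_price

regular, sale, worth = 50.0, 30.0, 15.0
print(perceived_savings(regular, sale))  # 20.0 -- feels like a $20 deal
print(actual_savings(worth, sale))       # -15.0 -- actually a $15 overpay
```

The same structure applies to feedback: the first evaluation we receive sets the reference point against which later information is judged.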

If we are initially given feedback on how we complete a task, and then we’re later told that the initial feedback was invalid, we may persevere in believing the initial feedback because it acts as an anchor for how we rate our abilities.

Creating causal explanations

The persistence of discredited beliefs may relate to the causal explanations or scenarios we come up with to account for the initial belief. These explanatory stories can take on a life of their own and become the basis on which the belief rests, so the belief is no longer dependent on the initial information. When that initial information is discredited, those explanations may remain alive and well in our heads and keep the belief going.

One technique that can reduce the belief perseverance effect is trying to imagine how the opposite belief might be true, as if you had to argue that side in a debate. Generating new causal theories reduces the biasing effect of the initial causal attributions.

The backfire effect

This post originally began as a post on the backfire effect, something I first heard about in an awfully impressive comic on The Oatmeal. While belief perseverance refers to beliefs persisting after finding out the information they were based on was false, the backfire effect goes a step further, suggesting that presenting people with corrective information actually increases their belief in their misconceptions.

It turns out that there’s some debate about whether the backfire effect actually exists. It’s not observed in people across the board, but it may be more likely when the original beliefs are strongly held, especially if they’re political. It may be that it’s not a distinct psychological effect of its own, but rather something that arises from a combination of other effects.

Illusory truth effect

The illusory truth effect means that we’re more likely to believe information the more often we hear it. When correcting misinformation, then, it may be useful to present the correction as a “truth sandwich” (i.e. truth, then the misinformation, then truth again). Part of the backfire effect may come from repeating the misinformation without effectively tying it to the correction, thereby feeding the illusory truth effect.

Reactance

When you tell someone what to do or not do and they perceive it as a threat to their freedom, they may push back and do the opposite to exert their freedom; this is known as reactance. If belief-discrediting information is presented in a way that’s perceived as a threat to freedom, that could have consequences along the lines of the backfire effect.

Our weird and wonderful minds

I find it so interesting to learn about the ways that we think. Our minds are definitely biased, and that bias doesn’t tend to be in the direction of accuracy.

We can probably all think of many examples of people being stubborn in their beliefs, but does it surprise you that it’s a pretty automatic thing that we’re all prone to?



Ashley L. Peterson

BScPharm BSN MPN

Ashley is a former mental health nurse and pharmacist and the author of four books.

31 thoughts on “What Is… Belief Perseverance”

    1. What’s still missing for me is how we do things differently. I’ve come across various things that help to explain why things happen the way they do, but I still have no idea how we can do better in the future.

  1. Really good examples for some of these effects and biases so you can get a good feel for how they can present.

    I actually feel like belief perseverance, and the backfire effect in particular, apply quite well to politics and racism. That sense of stubbornness is almost impossible to argue against. Getting into a discussion with someone who won’t allow in new information or reconsider their stance means you’re very unlikely to get anywhere. I like to say “you can’t argue with stupid” when it comes to hatefulness against Covid protections or racist nastiness. What makes some people hold on so hard despite contradictory evidence? Closed-mindedness, fear, egotism?

    1. I agree, you can’t argue with stupid. What I have a hard time wrapping my head around is why people accept some of these nonsense beliefs in the first place. Like with COVID, people are selectively rejecting science-based information but eagerly grabbing onto every new bit of cray-cray nonsense that comes along. It’s so weird.

  2. I’m not surprised by this, but I am surprised by the prevalence. A few summers ago, I was camping with friends and came to realize that they believed cardinal directions were subjective. As in, the west you use to go to the west coast is different from the west you use when you want to go to the lake. I was unable to convince them that compass directions are fixed. I like knowing the name, however; “naïve theories” fits, though the experience was somewhat hair-pulling for me 😂

  3. Enjoyed this very much. I’m sure I have some stubbornly wrong beliefs, but I’m not sure what they are… I count on others to point them out, lol. It’s definitely easy to notice OTHER people’s wrong ideas, like all the men I dated who declared themselves to be geniuses but were utter failures… and naturally failing (at whatever thing) only served to convince them of their superiority, since it was other people’s lack of awareness that they were special which ruined everything!

  4. Yes, horribly relevant. I guess we don’t construct beliefs based on pure logic. A friend and I were discussing the logic of sacrifice recently, as in to a deity, and he pointed out that the rewards are random, but people still believe that it works.

    1. I think there are quite a few common practices that yield rewards a small portion of the time, but it’s very easy for people to come up with reasons to reconcile why it didn’t work all those other times.

  5. Interesting Ashley, I was thinking about something I saw on the news the other day…I am too tired to explain it….but at the end of the day it came down to unshifting beliefs and others who were baffled and appalled by anyone who could think that. I was thinking about how powerful beliefs are in shaping us and our decisions.
