In this series, I dig a little deeper into the meaning of psychology-related terms. This week’s term is belief perseverance.
Belief perseverance is the tendency to hold onto beliefs even after the information that they’re based on is shown to be false. I’m a big fan of this line from a paper by Savion (2009):
“Belief perseverance—clinging to explicitly discredited beliefs—is ubiquitous to the point of serving as the ultimate evidence of the feebleness of our mind.”
According to Oxford Bibliographies, “People’s proclivity to passionately cling to, and advocate for, beliefs or attitudes that exist in the absence of evidentiary support manifests in a range of life domains… In fact, this propensity to develop, maintain, and unwaveringly cling to one’s beliefs in the absence of sufficient evidence is one of the most well-established tendencies in the social-psychological canon.”
A common research method for studying this effect is the debriefing paradigm. Researchers give participants information at the beginning of an experiment, and then later tell them that the initial information given was in error; for example, the researchers might tell the participants that the information actually applied to something/someone else. Belief perseverance is seen when participants continue to believe the initial information even when they’ve been explicitly told it was wrong.
Types of beliefs that persevere
Evidence has been found for three different types of beliefs that we tend to persevere with:
- self-impressions (beliefs we have about ourselves)
- social impressions (beliefs we have about specific other people)
- naïve theories (beliefs about how the world in general works)
Self-impressions
These are the beliefs that we have about ourselves and our abilities. If you’re given feedback about whether you performed a task well or poorly, and then later told that feedback was actually about someone else’s performance, your perception of your ability at the task is still likely to be influenced by the initial feedback.
If we’re given feedback that poses a threat to our core self-concept, the motivation to maintain our self-concept may make it easier to discount the initial feedback and help to weaken the belief perseverance effect.
Naïve theories
Naïve theories are ideas we’ve developed about how the world works and what causes the various things that we observe.
One kind of naïve theory that we tend to hang onto is the pet theory. From a young age, we develop intuitive theories about how the world works and draw causal connections. Such theories may incorporate bits of fact, but we then reach conclusions of our own that aren’t fact-based. Even when presented with disconfirming information, we tend to stick with our pet theories.
This category of beliefs also includes social theories, which are our beliefs about how people generally think, feel, behave, and interact with one another.
Factors that may influence belief perseverance
Anchoring
Anchoring is the cognitive bias that leads us to rely most heavily on the first piece of information we’re presented with. For example, we’ll evaluate the reasonableness of the price we end up paying for something based on the price that was originally quoted. That means that if something with a $50 regular price is on sale for $30, but it’s really only worth $15, we’ll still tend to use that $50 as our anchor for how good a deal we’re getting.
If we are initially given feedback on how we complete a task, and then we’re later told that the initial feedback was invalid, we may persevere in believing the initial feedback because it acts as an anchor for how we rate our abilities.
Creating causal explanations
The persistence of discredited beliefs may relate to the causal explanations or scenarios we come up with to account for the initial belief. These explanatory stories can take on a life of their own and become the means by which the belief persists, so the belief is no longer dependent on the initial information. When that information is discredited, the explanations may remain alive and well in our heads and keep the belief going.
One technique that can reduce the belief perseverance effect is trying to imagine how the opposite belief might be true, as if you had to argue that side in a debate. Generating new causal theories reduces the biasing effect of the initial causal attributions.
The backfire effect
This post originally began as a post on the backfire effect, something I first heard about in an awfully impressive comic on The Oatmeal. While belief perseverance refers to beliefs persisting after finding out the information they were based on was false, the backfire effect goes a step further, suggesting that presenting people with corrective information actually increases their belief in their misconceptions.
It turns out that there’s some debate about whether the backfire effect actually exists. It’s not observed in people across the board, but it may be more likely when the original beliefs are strongly held, especially if they’re political. It may be that it’s not a distinct psychological effect of its own, but rather something that arises from a combination of other effects.
Illusory truth effect
The illusory truth effect means that we’re more likely to believe information the more often we hear it, so when correcting misinformation, it may be useful to present it within a “truth sandwich” (i.e. truth, what the misinformation was, truth). Part of the backfire effect may have to do with repeating the misinformation without effectively tying it to the correction, thus feeding into the illusory truth effect.
Reactance
When you tell someone what to do or not do and they perceive it as a threat to their freedom, they may push back and do the opposite to assert that freedom; this is known as reactance. If belief-discrediting information is presented in a way that’s perceived as a threat to freedom, that could have consequences along the lines of the backfire effect.
Our weird and wonderful minds
I find it so interesting to learn about the ways that we think. Our minds are definitely biased, and that bias doesn’t tend to be in the direction of accuracy.
We can probably all think of many examples of people being stubborn in their beliefs, but does it surprise you that it’s a pretty automatic thing that we’re all prone to?
- Anderson, C. A. (2007). Belief perseverance. In R. F. Baumeister & K. D. Vohs (Eds.), Encyclopedia of Social Psychology (pp. 109-110). Sage.
- Anderson, C. A., Lepper, M. R., & Ross, L. (1980). Perseverance of social theories: The role of explanation in the persistence of discredited information. Journal of Personality and Social Psychology, 39(6), 1037.
- Guenther, C. L., & Alicke, M. D. (2008). Self-enhancement and belief perseverance. Journal of Experimental Social Psychology, 44(3), 706-712.
- Guenther, C. L., & Smith, A. M. (2020). Belief perseverance. Oxford Bibliographies: Psychology.
- Savion, L. (2009). Clinging to Discredited Beliefs: The Larger Cognitive Story. Journal of the Scholarship of Teaching and Learning, 9(1), 81-92.
- Swire-Thompson, B., DeGutis, J., & Lazer, D. (2020). Searching for the backfire effect: Measurement and design considerations. Journal of Applied Research in Memory and Cognition, 9(3), 286-299.
The Psychology Corner has an overview of terms covered in the What Is… series, along with a collection of scientifically validated psychological tests.
Ashley L. Peterson
BScPharm BSN MPN
Ashley is a former mental health nurse and pharmacist and the author of four books.