In this series, I dig a little deeper into the meaning of psychology-related terms. This week’s term is cognitive bias.
Do you think you’re biased? Well, your brain certainly is. A cognitive bias is a type of shortcut your brain takes to make tasks easier and more automatic. Sometimes that’s helpful, but often it’s not, especially when we’re unaware of it.
Types of Cognitive Biases
There are loads of cognitive biases. These are just some of them.
Ambiguity effect
We’re naturally inclined to go for the “sure thing” rather than something with an uncertain outcome; this is true even when the result would be better if the uncertain outcome occurred. This is essentially “better the devil you know than the devil you don’t.”
This is part of why change tends to be difficult. The results of change are often not a sure thing, but we limit ourselves if we put too much weight on our biased thinking.
Anthropomorphism
This involves attributing human characteristics to non-human objects. An example is my belief that my guinea pigs love me. Love is a human emotion, and I have no way of knowing what it’s like to be in a guinea pig’s head. It’s probably pretty safe to assume that it bears little resemblance to my anthropomorphizing.
Belief bias
The strength of a logical argument is evaluated based on the subjective believability of the conclusion rather than on the strength of the arguments that led to the conclusion.
If I believe the earth is flat, I’m more likely to believe a logically weak argument about the movement of the moon that’s consistent with the notion of the earth being flat than I am to believe a strong argument that suggests the moon moves in a way that demonstrates the earth isn’t flat.
Belief perseverance
Belief perseverance is the tendency to hang onto beliefs even once the initial information that they’re based on has been discredited. If I were to tell you that Joanne failed a test, and then I later told you that oops, it was actually Mark who failed the test, you’re likely to still have lingering doubts about Joanne’s performance.
You can read more on this in the post What Is… Belief Perseverance.
Ben Franklin effect
If someone has previously done you a favour, they’re more likely to do you another favour in the future than if you had originally done the favour for them. So, if you’re trying to get someone to view you more positively, ask them to do you a small favour.
Bias blind spot
We tend to believe that we are less biased than other people, regardless of our actual level of bias. This means we believe our own beliefs are mostly based on facts and objective reality, while we view other people’s beliefs as more influenced by prejudice.
An example of this is that most people think they’re better drivers than average. The math can’t work out for the majority of people to be above average, so that’s a good indicator that the bias blind spot is coming into play.
Confirmation bias
We tend to seek out, believe, and focus on information that confirms what we already believe, and ignore information that goes against our beliefs.
The combination of selective attention and confirmation bias can produce the frequency illusion. This occurs when something comes to your attention and all of a sudden it seems like you’re seeing it everywhere.
Curse of knowledge
This is the assumption that other people have the same level of background knowledge that you do, even if you have specialized knowledge that most people wouldn’t have. I tend to get caught up in this, so I’ve never been very good at dealing with students in my work as a nurse, because I always assume they have more knowledge than they actually do.
Dunning-Kruger effect
The Dunning-Kruger effect is a type of cognitive bias that causes people who know the least about a subject to feel confident that they have greater knowledge or competence than they actually do. This occurs when people lack the basic knowledge and meta-awareness to recognize just how much they don’t know.
There’s more on this in the post What Is… the Dunning-Kruger Effect.
Framing effect
The same information is evaluated differently depending on how it’s framed. If a news outlet presents an accurate piece of information in the context of other information that is negative, it will be evaluated differently than if it were framed within a context of positive information.
Gambler’s fallacy
The gambler’s fallacy is the belief that the odds of an event happening change based on events that have already occurred. This may cause someone to believe that if they’ve had half an hour of losing at a particular slot machine, that machine must be due to pay out imminently.
You can also see this with a simple coin toss; if you toss a coin and get ten straight heads, the gambler’s fallacy would tell you the next toss is more likely to be tails. However, the coin doesn’t remember and doesn’t care about previous toss results; the odds stay 50/50 for each toss regardless of previous results.
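The coin’s lack of memory is easy to check empirically. As a quick illustrative sketch (not from the original post; the run length and toss count are arbitrary choices), the simulation below counts how often heads follows a run of three straight heads:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Simulate a million fair coin tosses; True = heads
tosses = [random.random() < 0.5 for _ in range(1_000_000)]

# Count what follows a run of 3 consecutive heads.
# If the gambler's fallacy were real, tails would follow more often.
heads_after_run = 0
total_after_run = 0
for i in range(3, len(tosses)):
    if tosses[i - 3] and tosses[i - 2] and tosses[i - 1]:
        total_after_run += 1
        heads_after_run += tosses[i]  # bool counts as 0 or 1

print(heads_after_run / total_after_run)  # ~0.5: the coin has no memory
```

The frequency of heads after a streak stays at roughly 50%, exactly as it does after any other sequence.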
Hindsight bias
An outcome seems like it should have been obviously predictable after it’s already happened. It’s very hard to un-know what you’ve learned in the interim. It can become a problem if you assume your superpowers of hindsight translate into an ability to foresee and solve the problems still to come.
You can see this with public figures being judged for having done things in the past that are now considered socially unacceptable. For example, Canadian prime minister Justin Trudeau made the news in 2019 for having worn brownface makeup with an Aladdin costume back in 2001. Had I been at that party in 2001, I wouldn’t have given his costume a second thought, as what’s unacceptable now wasn’t viewed the same way back then.
Hyperbolic discounting
Hyperbolic discounting is the preference for a smaller short-term gain over a larger but delayed gain. For example, given the choice, people are more likely to choose being given $100 a month for a year over being given $1500 at the end of the year, even though waiting would mean an extra $300.
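A common way to model this is a one-parameter hyperbolic discount function, V = A / (1 + kD), where A is the amount, D the delay, and k the discount rate. The sketch below is my illustration, not from the post, and the rate k = 0.15 per month is an arbitrary value chosen to show the effect:

```python
def discounted_value(amount, delay_months, k=0.15):
    """Hyperbolic discounting: subjective value V = A / (1 + k * D)."""
    return amount / (1 + k * delay_months)

# Option A: $100 at the end of each month for 12 months ($1200 total)
option_a = sum(discounted_value(100, m) for m in range(1, 13))
# Option B: $1500 in a single payment after 12 months
option_b = discounted_value(1500, 12)

# option_a ≈ 655, option_b ≈ 536 with k = 0.15:
# the earlier monthly stream feels more valuable, even though $1200 < $1500
print(round(option_a), round(option_b))
```

With any appreciable discount rate, the sooner payments dominate subjectively, which is exactly the preference the bias describes.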
Illusion of causality
Illusions of causality happen when we see cause-and-effect relationships in things that are actually unrelated. This is more likely to occur the more frequently the outcome of interest occurs and the more frequently the identified potential cause is present.
People often engage in alternative health practices because of illusions of causality, even though there’s no scientific evidence to support their effectiveness. Other forms of pseudoscience may also be supported by illusions of causality, like the belief in astrology that Mercury’s apparent retrograde motion disrupts communication on earth.
You can find out more about this in the post What Is… the Illusion of Causality.
Illusion of transparency
We tend to overestimate how well others can know what’s going on inside our heads, and also how well we know what’s going on in other people’s heads. You may think the anxiety running circles in your head is obvious to other people, but it’s probably a whole lot less obvious than you think it is.
Illusory pattern perception
Illusory pattern perception, or apophenia, is the tendency to find patterns where there aren’t any. Pareidolia involves pattern-finding (and in particular, finding faces) in visual information.
We’re not very good at conceptualizing what true randomness looks like, and we tend to find patterns in data sets that actually are random. Illusory pattern perception can even be reassuring in the face of a world that seems unpredictable and uncontrollable.
There’s more on this in the post What Is… Apophenia (Finding Patterns Where None Exist).
Illusory truth effect
The illusory truth effect means that we’re more likely to believe information the more we’re exposed to it, even if the information itself is false. The effect of familiarity can override rationality, and this is often exploited by political campaigns.
You can read more in the post What Is… the Illusory Truth Effect.
Just world fallacy
We’re naturally inclined to think good things happen to good people who behave properly, and bad things happen to bad people who do the wrong thing. The just world fallacy can feed into victim-blaming, as people don’t want to believe that bad things can happen to those who behave properly, because that would mean they’re at risk too.
Mere exposure effect
The mere exposure effect, also known as the familiarity effect, means that we tend to have a preference for things that we’re more familiar with.
Mere ownership effect
We tend to like things more simply because we own them, making us reluctant to give things up once we have them. This plays a role in the psychology of advertising and free trial period offers.
Negativity bias
We’re more likely to notice and remember things that are negative compared to things that are positive. This had a survival function back in the caveman days, because it was important to remember where the poisonous plants and the tigers’ den were. Even though that survival function is far less necessary now, it’s hardwired into our brains.
Observer-expectancy effect
Expecting a certain outcome causes the observer to unconsciously do things to influence that outcome. Clinical trials are often designed to be double-blinded to account for this: neither the researchers dealing directly with the study participants nor the participants themselves know whether a given participant is receiving the treatment intervention or a placebo.
Pluralistic ignorance
Pluralistic ignorance occurs when members of a group go along with something they don’t agree with because they perceive it to be the group norm, even though it’s actually not. For example, if a professor gives a lecture that didn’t make any sense, students may refrain from asking questions because they assume everyone else understands, and they don’t want to be the one person that looks like an idiot. In the meantime, everyone else is thinking the exact same thing, so the expected group norm (understanding the lecture) isn’t the norm at all.
There’s more on this in What Is… Pluralistic Ignorance.
Post-purchase rationalization
Post-purchase rationalization, also known as the choice-supportive bias, means that even if you spent far too much money on something, you’ll likely try to convince yourself afterwards that it was totally worth it. This may occur by enhancing its attributes, minimizing its flaws, or dismissing the validity of other options.
Proportionality bias
If something big and significant happens, we tend to expect there to be a big and significant reason behind it, even if the actual explanation is simple or totally random. This can help to fuel conspiracy theories, where an elaborate conspiracy may feel like it’s more appropriate than a simple explanation.
Reactive devaluation
If someone you don’t like makes a suggestion, you’ll tend to automatically assume it’s a bad suggestion, regardless of the merit of the suggestion itself. This can occur on an individual level or on a group level, such as a policy proposal from a political party you don’t support.
Rhyme as reason effect
This seems utterly bizarre, but Wikipedia gives this example from O.J. Simpson’s trial: “If the gloves don’t fit, then you must acquit.” If things rhyme, they feel right, and we find that convincing, even though it has no actual bearing on the veracity of the statement.
Scarcity
The more scarce we perceive something to be, the greater the value that we tend to place on it.
During the great toilet paper crisis of 2020 and other panic buying situations, the scarcity effect was reinforced by the mere exposure effect from seeing empty shelves in stores and on the news.
Spotlight effect
We tend to think that other people are thinking about us and paying attention to us far more than they actually are. In reality, they’re too busy thinking about themselves and what other people think of them to pay too much attention to what we’re doing.
Sunk cost fallacy
This is the tendency to think that you should stick with something because you’ve already put a lot of time/effort/money in.
For example, let’s say you bought $20,000 worth of stock, and that stock has dropped to $8,000 in value. Rather than taking your $8K and getting out of that disaster while you can, the sunk cost fallacy makes you likely to hang onto that stock as it keeps on dropping.
Unit bias
Even if you’re only hungry enough to need a small plate of food, if given a large plate, you’ll judge that full plate as being the appropriate amount.
There are many more examples of cognitive biases. Some of them have a greater degree of voluntariness, such as the ostrich effect (i.e. sticking one’s head in the sand to ignore something bad happening). Others we likely wouldn’t realize if they weren’t pointed out to us. Regardless, the ways that we perceive the world around us are often not an objectively accurate representation.
Source: Wikipedia: List of cognitive biases
The Psychology Corner has an overview of terms covered in the What Is… series, along with a collection of scientifically validated psychological tests.
Ashley L. Peterson
BScPharm BSN MPN
Ashley is a former mental health nurse and pharmacist and the author of four books.