In this series, I dig a little deeper into the meaning of psychology-related terms. This week’s term is cognitive bias.
Do you think you’re biased? Well, your brain certainly is. A cognitive bias is a type of shortcut your brain takes to make tasks easier and more automatic. Sometimes that’s helpful, but often it’s not, especially when we’re unaware of it.
Here are some examples.
Ambiguity effect
We’re naturally inclined to go for the “sure thing” rather than an option with an uncertain outcome, even when the uncertain option could lead to a better result.
Anthropomorphism
This involves attributing human characteristics to non-human objects, e.g. my belief that my guinea pigs love me.
Belief bias
The strength of a logical argument is evaluated based on the subjective believability of the conclusion rather than on the strength of the arguments that led to the conclusion.
Ben Franklin effect
If you’ve previously done someone a favour, you’re more likely to do them another favour than if they had been the one to do you a favour first.
Bias blind spot
We tend to believe that we are less biased than other people are, regardless of our own actual level of bias.
Confirmation bias
We tend to seek out, believe, and focus on information that confirms what we already believe, while ignoring information that goes against our beliefs.
Curse of knowledge
This is the assumption that other people have the same level of background knowledge as you do. I tend to get caught up in this; I’ve never been very good with nursing students because I always assume they should have more knowledge than they actually do.
Framing effect
The same information is evaluated differently depending on how it’s framed. If a news outlet presents an accurate piece of information in the context of other, negative information, it will be evaluated differently than if it were framed within the context of positive information.
Gambler’s fallacy
This is the belief that because you’ve had half an hour of losing at a particular slot machine, the machine is due to pay out imminently. You can also see this with a simple coin toss: if you toss a coin and get ten straight heads, the gambler’s fallacy would tell you the next toss is more likely to be tails. However, the coin doesn’t remember previous results; the odds stay 50/50 for each toss regardless of what came before.
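The coin’s lack of memory is easy to check for yourself. Here’s a quick simulation sketch in Python (the toss counts and variable names are just illustrative choices, not anything from a real study): toss a virtual coin a couple of million times, and each time a run of ten straight heads occurs, record what the very next toss was.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

# Each time we see a run of ten straight heads, record the very next toss.
# If the gambler's fallacy were right, tails would dominate these recorded
# tosses; independence says they should stay roughly 50/50.
after_streak = []
streak = 0
for _ in range(2_000_000):
    toss = random.choice("HT")
    if streak >= 10:          # this toss immediately follows ten straight heads
        after_streak.append(toss)
    streak = streak + 1 if toss == "H" else 0

share_heads = after_streak.count("H") / len(after_streak)
print(f"tosses following a ten-heads streak: {len(after_streak)}")
print(f"share that came up heads: {share_heads:.3f}")
```

Across runs, the share of heads hovers around 0.5 — exactly what independent tosses predict, and nothing like the tails-is-due intuition.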
Hindsight bias
An outcome seems like it should have been obviously predictable after it’s already happened.
Hyperbolic discounting
Hyperbolic discounting is the preference for a smaller short-term gain over a larger but delayed gain. For example, given the choice, people are more likely to choose being given $100 a month for a year over being given $1500 at the end of the year, even though waiting would mean an extra $300.
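The arithmetic behind that example, written out explicitly (a trivial sketch using only the dollar figures from the paragraph above):

```python
# The two options from the example above, as plain arithmetic.
monthly_total = 100 * 12   # $100 a month for a year -> $1200
lump_sum = 1500            # single payment at the end of the year

print(f"monthly payments total: ${monthly_total}")        # $1200
print(f"year-end lump sum:      ${lump_sum}")             # $1500
print(f"cost of impatience:     ${lump_sum - monthly_total}")  # $300
```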
Illusory truth effect
The illusory truth effect means that we’re more likely to believe information the more we’re exposed to it, even if the information itself is false. The effect of familiarity can override rationality, and this is often exploited by political campaigns.
Just world fallacy
We’re naturally inclined to think that good things happen to good people who behave properly, and bad things happen to bad people who do the wrong thing. The just world fallacy can feed into victim-blaming, as people don’t want to believe that bad things happen to people who have behaved properly, because that would mean they themselves are at risk.
Mere ownership effect
We tend to like things more simply because we own them, making us reluctant to give things up once we have them.
Negativity bias
We’re more likely to notice and remember things that are negative compared to things that are positive. This had a survival function back in the caveman days, when missing a threat could be fatal.
Observer-expectancy effect
Expecting a certain outcome causes the observer to unconsciously do things to influence the outcome; clinical research trials are often designed to be double-blinded to account for this.
Post-purchase rationalization
Even if you spent far too much money on something, you’ll likely try to convince yourself afterwards that it was totally worth it.
Proportionality bias
If something big and significant happens, we tend to expect there to be a big and significant reason behind it, even if the actual explanation is simple or totally random.
Reactive devaluation
If someone you don’t like makes a suggestion, you’ll tend to automatically assume it’s a bad one, regardless of its actual merit.
Rhyme as reason effect
Statements that rhyme are judged as more truthful than statements that don’t. This seems utterly bizarre, but Wikipedia gives this example from O.J. Simpson’s trial: “If the gloves don’t fit, then you must acquit.”
Sunk cost fallacy
This is the tendency to think that you should stick with something because you’ve already put a lot of time/effort/money in.
Unit bias
Even if you’re only hungry enough for a small plate of food, if given a large plate, you’ll judge that full plate as being the appropriate amount.
There are many, many more examples of cognitive biases. Some involve a greater degree of voluntariness, such as the ostrich effect (i.e. sticking one’s head in the sand to ignore something bad happening). Others we likely wouldn’t notice unless they were pointed out to us. Regardless, the ways we perceive the world around us are often not an objectively accurate representation of it.
Source: Wikipedia: List of cognitive biases