In this series, I dig a little deeper into the meaning of psychology-related terms. This week’s term is heuristic.
Heuristics (from the Greek “to discover”) are mental rules of thumb or shortcuts that allow our brains to process information and arrive at conclusions more quickly. A post not long ago covered philosophical razors, which are a type of logical heuristic. This post will cover psychological heuristics, which you probably use without even realizing it.
Heuristics aren’t necessarily a bad thing. They’re efficient, they don’t require conscious thought, and they handle unimportant stuff pretty well. The problem comes when we get too attached to our conclusions and believe that we arrived at them logically, failing to see that our mind just made a flying leap and that’s where it happened to land.
Heuristics were first described by psychologists Amos Tversky and Daniel Kahneman in a 1974 paper. They identified three of them: representativeness, availability, and anchoring and adjustment. Several more have since been identified.
Anchoring and adjustment
People will evaluate pricing based on the initial anchor point they’re given. A used car might be worth $6000, but if the dealer’s asking price is $10,000, that’s what your brain wants to use as a reference point. You end up thinking you got an awesome deal by only paying $7500, while the dealer is happy you fell for their trick.
When you go to a clothing store, either in-person or online, sale items will always be clearly marked with the original price. That serves as the anchor, and makes you think you’re getting a great deal, even if the original price was way higher than it deserved to be in the first place. I can get sucked into this one; I am cheap, and feeling like I got a great deal is very satisfying.
Availability
We judge the frequency of events based on the example that comes to mind most easily. Car crashes are waaaaaay more common than plane crashes, but because plane crashes are big news and car crashes are not, our estimations of likelihood are skewed by plane crashes being more front of mind. Car crashes become background risk, yet some people are terrified of flying.
We also overestimate violent crime rates and misjudge trends in those rates because those types of crimes get a lot of media attention.
Familiarity
This is a variation of the availability heuristic, and means that if something comes to mind quickly, we’re likely to think it’s the right/safe choice. We’re also likely to overestimate the safety of familiar environments, to the point that we may miss seeing major hazards. This may be part of why you’re most likely to have a car accident when you’re close to home.
Representativeness
Our brains like to create categories and come up with representative examples of those categories. When someone comes along who reminds us of one of those representative examples, we slot them into that category and apply all the beliefs and judgments we have around the category to the person we’ve just shoved in there. Hello, stereotypes.
This is also why we’re bad at identifying what’s random and what’s not. We expect a random set of data to look a certain way… but we’re totally out to lunch. If you flip a coin ten times and it comes up heads each time, you might think that it’s a trick coin or, at the very least, that there’s a higher likelihood of flip #11 coming up tails, because 11 heads in a row just doesn’t happen randomly. Except it does, and you’d be wrong, because the odds of every flip are 50:50, and the coin has no memory of what’s already happened. Even if it did, it probably wouldn’t care.
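For the sceptics, the coin-flip claim is easy to check with a quick simulation (a purely illustrative Python sketch, not from the original post): ten heads in a row has probability (1/2)^10, just under 0.1%, so it absolutely does happen at random, and the next flip stays 50:50 regardless.

```python
import random

random.seed(42)  # fixed seed so the run is reproducible
trials = 100_000

# How often do 10 fair flips all come up heads? Theory says (1/2)**10.
runs_of_ten = sum(
    all(random.random() < 0.5 for _ in range(10))
    for _ in range(trials)
)
print(f"10 heads in a row: {runs_of_ten / trials:.4%} "
      f"(theory: {0.5 ** 10:.4%})")

# The coin has no memory: flip #11 is still 50:50 no matter what came before.
next_heads = sum(random.random() < 0.5 for _ in range(trials))
print(f"flip #11 heads: {next_heads / trials:.2%} (theory: 50.00%)")
```

Run it and the simulated frequencies land right on the theoretical values, which is exactly why eleven heads in a row is unlikely *in advance* but tells you nothing about flip twelve.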
Scarcity
The rarer something is, the more valuable it’s believed to be. When toilet paper started flying off shelves in spring 2020, the scarcity heuristic kicked in and people started hoarding enough toilet paper to last the next decade.
Think of diamonds and cubic zirconia. Diamonds cost more because they’re rare, so people are willing to shell out the big bucks for them even though they look pretty much the same to the average person.
Social proof
We are sheeple, and if other people are doing it, we think we probably should be too. Add this to the scarcity heuristic, and it’s not just a few nutters hoarding toilet paper; everyone’s getting in on the action.
Common is moral
Similarly to the social proof heuristic, we make some moral evaluations based on how commonly we observe a behaviour. The more common a behaviour is, the more likely we are to deem it a moral action, even if the behaviour itself is actually selfish. Baaaaaa!
I find it fascinating that our brains try so hard to be helpful, but can totally miss the boat. Perhaps the takeaway is that thoughts just aren’t reliably accurate. Someone (although I don’t know who) has said “don’t believe everything you think”, and that’s very good advice. There’s some good stuff that our minds come up with, but there’s also a lot of nonsense floating around in there. Deciding which is which isn’t necessarily easy.
Do these heuristics sound familiar to you (not necessarily the names of the heuristics, but the shortcuts they refer to)? How do you think we can get around our natural tendency to believe in nonsense thoughts?
- American Psychological Association
- Association for Psychological Science: Heuristics Revealed
- Psychology – 1st Canadian Edition: Problem-Solving: Heuristics and Algorithms
- The Decision Lab: Heuristics
- Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124-1131.