In this series, I dig a little deeper into the meaning of psychology-related terms. This week’s term is the illusory truth effect.
The illusory truth effect is a type of cognitive bias that makes us more likely to believe false information is correct the more often we’re exposed to it. It was first described in 1977.
When we evaluate whether something is true, we consider it in the context of what we already know and whether it feels familiar. However, the illusory truth effect shows that familiarity can override rationality. Familiar statements are also processed more quickly and easily, and the brain mistakes that ease of processing for an indicator of truth.
The problem with hitting back with truth
This effect is exploited by political campaigns. With the massive number of false statements being made all around, fact-checking seems like a very reasonable thing to do. The problem, though, is that repeating a false claim in order to debunk it increases its familiarity and thereby reinforces it. We also don’t process negations very effectively, so if the misinformation is that Jane is a thief, and the corrective information is that Jane is not a thief, the “not” can start to get kind of fuzzy in our minds.
U.C. Berkeley Professor Emeritus George Lakoff suggests dealing with this by using a truth sandwich: state the truth first, then briefly note the falsehood, then return to the truth. While his tweet specifically mentions Donald Trump, the advice applies just as well to false statements coming from politicians who might not be quite as prolific in their lying.
Believability isn’t required
A study that looked at fake news headlines that appeared on Facebook during the 2016 U.S. presidential campaign found that only a single exposure was enough to trigger the illusory truth effect. This occurred even when the headlines themselves had low believability, were flagged by fact-checkers, and were inconsistent with the viewer’s political beliefs. However, blatantly absurd headlines did not trigger the illusory truth effect. The researchers said the results “suggest that social media platforms help to incubate belief in blatantly false news stories.”
The mental illness-gun violence non-connection
I decided to write this post after reading about this effect in the context of public views on mental illness and gun violence. The issue is far more nuanced than it’s commonly made out to be, but without fail, every time there’s a mass shooting, politicians, law enforcement, the National Rifle Association, and various other bandwagoneers start their spiel about how gun violence is because of crazy people, and guns don’t kill people, crazy people do.
It’s inaccurate, but because it’s so familiar, the illusory truth effect kicks in, and people start to be a lot more afraid of their neighbour on one side who has a mental illness than their neighbour on the other side who’s an angry white dude with a semi-automatic rifle that takes high-capacity magazines, who’s a patriot because he refuses to wear a face mask.
The scary thing is that the illusory truth effect isn’t just a stupid/ignorant people cognitive bias. It’s a mental shortcut that we’re all prone to taking.
Someone needs to come up with a good truth sandwich for the gun violence and mental illness issue. I don’t think I’m qualified because I’d use too many expletives.
- Pennycook, G., Cannon, T. D., & Rand, D. G. (2018). Prior exposure increases perceived accuracy of fake news. Journal of Experimental Psychology: General, 147(12), 1865.
- Psychology Today: When Correcting a Lie, Don’t Repeat It. Do This Instead.
- Wikipedia: Illusory truth effect