THE ILLUSION OF TRUTH: how to escape manipulation
Yevhenii Shapovalov. Project Ukraine Oberig. Trypillia Artifacts, 2022
We live in an era of information wars. And fake news. Strangely enough, the information we hear repeatedly often seems the most truthful to us. This is the essence of the cognitive bias known as the «illusion of truth effect». It turns out that we tend to trust information that is, to some extent, familiar to us. And the more often we hear it, the more credible it seems.
A lie repeated a thousand times becomes the truth
Attributed to Joseph Goebbels
HITLER AND GOEBBELS, AS PRACTITIONERS OF THE EFFECT
When Hitler came to power, he devised a plan to exterminate the Jews. Putin chose not to improvise but to use it as a manual for the destruction of Ukrainians. One of its key components is deliberate lies.
Adolf Hitler and Joseph Goebbels set out to convince the German people that all Jews were the main and most dangerous enemies. Using the press, radio, and television — all controlled by the Nazis — this duo blamed the Jews exclusively for all of Germany’s troubles and failures.
Through extremely deceitful propaganda, Hitler instilled in his citizens the belief that there were no more cruel and terrifying people on earth than the Jews. To support this claim, Hitler and Goebbels invented a heart-wrenching story that in the Middle Ages, Jews performed ritual killings of Christian children. But even that was not enough: supposedly, the blood of the murdered was added to the bread baked for Passover… And the Germans believed it.
Today’s Russia is simply a copy of Nazi Germany during World War II. There is no difference in how Putin and his inner circle spread lies about the Ukrainian nation among Russians — lies disguised as truth.

THE FIRST TO DESCRIBE THE ILLUSION OF TRUTH EFFECT
The illusion of truth effect was first formulated in 1977, during research by psychologist Thomas Toppino from Villanova University, together with Lynn Hasher and David Goldstein from Temple University (both universities in the Philadelphia area, USA).
Participants in the experiment were presented with 60 facts that seemed quite truthful at first glance (though in reality, they were not): «The first military air base was established in New Mexico» or «In 1925, basketball became an Olympic sport». The participants were asked to rate each statement on a scale from 1 («definitely false») to 7 («definitely true»).
A week or two later, the same participants were invited for a second session. Then, two weeks after that, the study was repeated again. Twenty of the statements (one-third) remained the same in all three rounds, while the remaining forty were new each time.
With each new session, when some statements were repeated (both true and false ones), their perceived truthfulness increased. The participants generally didn’t remember hearing these statements during the previous sessions but were confident they had heard them somewhere before. Therefore, the statements seemed truthful to them.
When encountering information we have seen before, our brain processes it more quickly. Because we tend to use the least energy-consuming shortcuts when judging whether a statement is plausible, this fluency is (incorrectly) taken as a sign that the statement is true.
THE LESSONS OF REPETITION
So, what’s the point? Psychologist Lynn Hasher from the University of Toronto and her research team first observed the effect back in the 1970s. «Repetition makes things more believable. And the effect is probably stronger when people are tired or distracted by other information». Repetition is what makes fake news work. This was pointed out by researchers from Central Washington University in 2012.
And, of course, it is a fundamental element of political propaganda, as was briefly mentioned at the beginning. Adolf Hitler didn’t know what it was called but fully understood how to use the technique. «Slogans should be persistently repeated until the very last person gets the idea», he wrote in Mein Kampf.
THE ILLUSION OF TRUTH EFFECT IN THE HANDS OF SPEAKERS AND POLITICIANS
We all understand perfectly well that politicians know how to lie — and do it quite skillfully. Like chameleons, they change and adjust their views to match the demands of the electorate. But it’s not just politicians who do this. If you offer someone a variety of facts to choose from and then ask them to share these facts with a wider audience, that person will almost always select the information that aligns with the listeners’ priorities.
Edward Tory Higgins, a professor of psychology at Columbia University, discovered an interesting trick used by speakers who aim to please their audience. After choosing which fact to present, the speaker starts to believe in that fact themselves. Higgins called this «audience tuning».
It is commonly believed that this kind of belief in one’s own lies, which helps to lie much more effectively, is a form of self-deception. A person capable of self-deception often has an advantage over competitors — especially in politics. It may contradict common sense, but it helps achieve the goal.
WEAKENING THE POWER OF THE EFFECT
It is quite difficult to completely avoid the illusion of truth effect. Since its advantage lies in the fluency of information processing, people often don’t even realize they have become victims of this cognitive bias. However, with effort, its influence can be weakened.
The first useful step: since the brain tends to rush while processing information, it is important to slow down when evaluating the accuracy and truthfulness of what you hear, read, or see — to engage rational thinking, verify the reliability of the information source, and approach new data impartially (under no circumstances analyzing it through one’s own preconceived personal stance).
Thus, by becoming a fact-checker and verifying everything for authenticity, you can weaken the power of the illusion of truth effect. This reality was demonstrated by experiments conducted by psychologists from Harvard (USA) and Duke University (Durham, North Carolina, USA).
Another way to dispel the illusion of truth is to pay close attention to your emotions: does your attitude toward the information carry the weight of personal experience and emotions that only you fully understand?
Mood also influences gullibility: a positive mindset encourages creative thinking, while a lower mood leads to a more detailed and thoughtful approach, with greater attention to detail. That same lower mood also serves as a warning signal, triggering doubts about the information presented.

HOW THE EFFECT INFLUENCES THE KNOWLEDGEABLE PERSON
Do our own knowledge and expertise protect us from the illusion of truth effect? Not always. A group of researchers led by psychologist Lisa Fazio from Vanderbilt University (Nashville, USA) studied how the effect interacts with our prior knowledge. It turns out that even when people are certain about a fact, repeated exposure to contradictory statements can make those statements seem more credible over time.
For example, consider the statement: «A sari is a short checkered skirt worn by Scots».
If this statement is repeated a few times, even people who were sure it referred to a kilt will, after hearing it again and again, gradually start to believe that it might actually be a sari.
The same applies to the Cyclops. Participants who knew that the one-eyed giant from Greek mythology is a Cyclops, after reading the statement «The Minotaur is a one-eyed giant in Greek mythology» twice, rated its credibility higher than those who read it only once.
When you see this fact for the second time, it is much easier to process — you read it faster, you comprehend it more fluently. Our brain interprets this fluency as a signal that something is true.
Lisa Fazio
ESCAPING THE EFFECT
However, repetition is not the only thing that can confuse our beliefs. We understand well that our capacity for reasoning is, in fact, a limited resource. Our mind essentially becomes a victim of the illusion of truth effect because our instincts tend to choose the shortest path when searching for and evaluating plausibility.
Once we become aware of the effect, we already have a chance to resist it. We start by «turning on our brain» and double-checking why we believe what we do: does something sound plausible because it’s true — or simply because we’ve heard it many times?
That’s why scientists provide references — to help us verify the source of any statement instead of blindly accepting it.
It is within our power (and part of the defense against the illusion) to simply stop following lies blindly. In the world we live in, facts have always mattered and will continue to matter.
When you keep repeating things without bothering to check their accuracy, you contribute to a situation where lies and truth become indistinguishable — easier to confuse and swap. So please: think, weigh, question, doubt — before repeating something thoughtlessly.
When copying materials, please place an active link to www.huxley.media