THE BACKFIRE EFFECT: Why Is It So Hard to Change Your Mind?

Author: Huxley
© Huxley – an almanac about philosophy, art and science
Evgeny Shapovalov. Antique still life, 2017

 

Each of us holds certain biases — and sometimes, we suffer because of them. That’s exactly what the cognitive distortion known as the Backfire Effect is about.

Many people, when confronted with evidence that contradicts their beliefs, will begin to question themselves. The natural reaction in such cases is to adjust our thinking and try to understand where we may have gone wrong. This may lead to new beliefs or, at the very least, a revised version of our existing ones based on the new information.

However, some people do the opposite — they cling even more tightly to their original position, despite clear and undeniable evidence to the contrary.

 

Try not to get too attached to a hypothesis just because it’s yours

 

Carl Sagan

 

THE DISCOVERY OF THE BACKFIRE EFFECT

 

The Backfire Effect was first described by professors Brendan Nyhan and Jason Reifler. Although the term itself appeared in 2010, their research began around 2005–2006. The professors distributed fake newspaper articles to their students — essentially «news hoaxes» — covering politically sensitive issues.

One such example was a claim that weapons of mass destruction had been found in Iraq prior to the U.S. invasion. After reading these fake reports, the participants were shown real articles that fully debunked the claim, including a CIA report confirming that no such weapons existed in Iraq.

The Backfire Effect was most noticeable among participants with conservative views, who refused to accept the correction and instead insisted that the weapons had simply been hidden or destroyed.

Nyhan and Reifler concluded that corrections «often fail to reduce misperceptions among targeted ideological groups» and, in fact, that «corrections can actually increase misperceptions».

 

HOW THE BACKFIRE EFFECT WORKS

 

The Backfire Effect is the tendency of some people to reject evidence that contradicts their existing beliefs — and not just reject it, but double down on what they already believe, even in the face of solid counter-evidence. This phenomenon is closely related to confirmation bias, which leads us to give more weight to information that supports our views while dismissing anything that challenges them.

It’s a cognitive error that undermines our ability to think critically, because it prevents us from evaluating all the available evidence. Even when we know, on some level, that we might be wrong, we often refuse to admit it. Those affected by the Backfire Effect end up ignoring arguments that may be crucial to forming an accurate picture.

 

AND YET — THE EARTH IS FLAT!

 

«Of course the Earth is round, and it spins!» you might say. «No way — it’s flat!» would argue the participants of the first International Flat Earth Conference, held in 2017 in North Carolina, USA. Tickets to the event cost $249, and around 400 people attended.

According to them, the Earth is a flat disc 40,000 km in diameter, centered somewhere near the North Pole. There is no South Pole at all. The Sun and Moon are positioned directly above the disc. «But what about satellite photos?» you might counter. «Fake!» the flat-earthers would reply with confidence.

So what can you do with people who are so sure they’re right? Why would they need this «new» information about photos from space? Clearly, the Backfire Effect has done its damage.

 

WHY DO PEOPLE EXPERIENCE THE BACKFIRE EFFECT? IS IT EMOTIONAL?

 

When someone encounters information that contradicts their beliefs, it often triggers a wave of negative emotions. They may feel threatened by the realization that they’re wrong or distressed that something they deeply believed has turned out to be false.

The contradicting information doesn’t even have to attack a person’s principles directly. People instinctively reject anything that undermines their core beliefs — and this is precisely what causes them to defend their (possibly flawed) views even more aggressively.

 

 

HOW TO AVOID FALLING INTO THE BACKFIRE EFFECT YOURSELF

 

Cognitive biases are something we all fall victim to from time to time. But there are ways to minimize their influence. The first step is to recognize where you’re most vulnerable. Each of us has areas where we feel absolutely certain we’re right — and it’s precisely in these areas that the Backfire Effect strikes hardest.

Secondly, it’s important to acknowledge that your beliefs can be wrong. Even when diving deeper into subjects where you consider yourself an expert, try to stay open-minded. Let others question your views calmly — and don’t rule out the possibility that someone else might present information that challenges your core principles. In the end, take a step back, evaluate the situation, and apply critical thinking when new evidence comes your way.

 

HOW TO AVOID TRIGGERING THE BACKFIRE EFFECT IN OTHERS

 

It can be difficult to challenge people who are firmly attached to their views. When engaging with someone you believe is under the influence of the Backfire Effect, it’s crucial to approach them with respect. Understand that even just presenting new information can unintentionally feel like an attack.

Use simple, clear explanations, and adjust your delivery if you see that your message isn’t getting through. Ask whether the person has any questions about the new information — and answer them with patience and respect.

Research shows that people are more likely to reconsider their beliefs when their questions are acknowledged and addressed thoughtfully. The key is not to get angry or try to force someone to change their mind. The Backfire Effect is powerful — and pushing too hard may damage the relationship.

 

REAL-WORLD EXAMPLES OF THE BACKFIRE EFFECT

 

A number of studies have demonstrated how this effect operates in different contexts:

  • Voting preferences: When voters were presented with negative facts about the candidate they supported, many doubled down in their support instead of reconsidering.
  • Misconceptions on controversial issues: Topics like tax reform or stem cell research revealed that detailed factual information often reinforced existing false beliefs, especially when the facts clashed with ideological views.
  • Child vaccination: Parents who were skeptical of vaccines became even more convinced that vaccines cause autism after being informed about their importance.
  • Flu shots: One study found that when people were told the risk of flu vaccines was negligible, their willingness to get vaccinated decreased.
 
THE COST OF THE BACKFIRE EFFECT

 

Backfire effects are not as common as we tend to believe. In fact, we can’t reliably predict the specific circumstances under which they occur.

 

The Debunking Handbook 2020

 

Despite the impression that the Backfire Effect is widespread and frequently triggered, research shows it’s not as thoroughly understood as we might wish. This is where The Debunking Handbook 2020 offers valuable insight. The handbook was authored by over 20 contributors, including cognitive scientist Stephan Lewandowsky of the University of Bristol and John Cook, a cognitive science researcher from the University of Western Australia.

The authors define the Backfire Effect as follows:

«A backfire effect occurs when a correction inadvertently increases a person’s belief in, or confidence in, misinformation, compared to their original level of belief before the correction — or even in the absence of any correction at all».

The key term here is misinformation — meaning false information spread either unintentionally or with the intent to mislead.

Back in 2010, researchers and practitioners feared that correcting misinformation might actually reinforce it — strengthening people’s belief in the falsehood. However, more recent studies have dispelled these concerns, showing that backfire effects are relatively rare, and that the risk of triggering them is much lower than previously thought.

So the main takeaway is this:

Do not hesitate to correct misinformation out of fear that it might backfire or strengthen false beliefs.

In the context of today’s aggressive information war between our country and the aggressor state, this advice couldn’t be more timely — or more crucial.

 
