
In this blog post, we look at why misinformation still spreads even after it’s been debunked, and how charities can make their myth-busting as effective as possible.
Why is misinformation so prevalent in health?
Health is one of the most personal and emotive subjects in our lives. No one wants to get ill, and if they do, they want to get better quickly. So it stands to reason that health misinformation spreads quickly. Everyone’s looking for the next “health hack”, the “one simple trick” that clickbait headlines promise.
Most people who spread misinformation have good intentions. They’re concerned about the health of their friends and family. Any information you have to hand that could protect them (particularly if it’s shocking) is worth sharing, right? So scary information relating to health – about COVID-19, cancer, or vaccines, for example – is information that spreads easily.
Sadly, there are also people out there who have a lot of influence to gain (and money to make) by spreading false information about health issues. For example, a recent report by the Center for Countering Digital Hate reveals that only 12 people are responsible for the majority of anti-vaxxer content on social media. Many of the so-called ‘Disinformation Dozen’ have businesses based on promoting false cures (including dietary supplements and bleach) as alternatives to vaccinations.
Why are charities on the receiving end of myths?
I recently asked charity people on Twitter to share the most common myths they come up against in their charity jobs. There was a breadth of answers, and I’m ashamed to admit that some of the myths were ones I had thought were true.
One of the reasons why charities often come up against myths is that they are on the front line. Because charities are accessible and trusted, the public will often ask questions based on misinformation they’ve encountered – “I’ve heard that… is that true?”, or “my friend told me that…”.
Certain issues that charities engage with are also a bit of a mystery or taboo to discuss. For example, several people mentioned myths about hospices, which I feel is partly down to the reluctance people have to talk about death (or even think about it). Myths and assumptions tend to fill the void where facts would normally be.
In other situations, raising awareness about (and funds for) a cause is likely to meet some resistance. Some topics are more politically charged than others – say, immigration, homelessness, abortion, climate change, or racial equality. Myths and misinformation may already be circulating, acting as counterarguments that slow down change. Charities often need to deal with these because they are part of the force driving that change.
The challenges facing myth-busting
Myth-busting or debunking is a common way for charities to respond to myths. A typical approach is content that states the misinformation, followed by an explanation refuting it.
However, much research has gone into the effectiveness of debunking and the persistence of misinformation. The outcome of this research is that a myth can continue to influence us even after it has been ‘busted’ or debunked – a phenomenon psychologists call the ‘continued influence effect’.
To understand why myth-busting sometimes doesn’t work and how to make it more effective, it’s worth exploring the explanations for the continued influence effect.
The details of this section are based on a fascinating review article by Stephan Lewandowsky and colleagues, “Misinformation and Its Correction: Continued Influence and Successful Debiasing”, published in 2012. It’s definitely worth a read if you have the time.
However, more recent research deepens our understanding of effective strategies for debunking. The latest evidence is summarised in the 2020 update of “The Debunking Handbook”, which is free to download from: https://www.climatechangecommunication.org/debunking-handbook-2020/
Stories

Facts (or misinformation) built around stories (or ‘mental models’) are particularly convincing. It’s easy to remember a logical sequence of events – Factor A combined with Factor B leads to Outcome C.
But let’s say that Factor B is based on a myth. You might assume that a retraction of Factor B collapses the whole story and the whole argument. But this isn’t the case.
The rest of the story remains intact, except there’s now a gap where Factor B once was. Our brains find this difficult to reconcile, so we would rather continue to rely on Factor B. It ‘fits’ the story we have in our heads about the events. To some extent, we’d rather stick to a false story and ignore the inconvenient truth.
This is why, when faced with a debunking, someone may fall back on anecdotes and personal accounts – their own, or those of friends, family, or strangers.
It’s also why myths are often quite simple, and seem intuitive or ‘just common sense’. A simple story is easier to remember, but the correction may be more complicated, and therefore more difficult to remember (and accept).
Familiarity
The more a myth is repeated, the more that people accept it as true, and the more popular we believe the viewpoint to be.
In the context of debunking, people were once concerned about something called the ‘familiarity backfire effect’. In this perverse situation, repeating the myth, even while debunking it (as most myth-busting content does), can ironically make the myth even stronger.
However, more recent evidence suggests that this backfire effect is weak, if it exists at all. Correcting the myth usually outweighs any effect of repeating it. In fact, repeating the myth may increase the effectiveness of the correction.
Worldview

People are more likely to accept information as true if it fits with their pre-existing beliefs, and so a person’s worldview influences how well debunking works. One example is the ‘birther’ myth: that President Barack Obama was not born in the USA, and was therefore not eligible to be US President. Even after the myth was thoroughly debunked, one survey in 2010 found that only 57% of Republican voters believed that Obama was ‘definitely’ or ‘probably’ born in the USA, compared to 85% of Democrat voters.
It’s much easier for us to reject a correction of misinformation if it goes against our personal worldview. We often feel that our views and beliefs are part of who we are. We welcome new information that affirms those beliefs – including a myth that fits them. But having our views and beliefs challenged is uncomfortable, so our brains would rather reject the correction entirely.
How to make myth-busting more effective
Bearing in mind the challenges above, how can we make sure that myth-busting and debunking are as effective as possible?
Using the latest evidence, Lewandowsky and colleagues updated their recommendations in the 2020 edition of ‘The Debunking Handbook’. Here are a few of them:
- Provide a better story. Correcting a myth can leave a gap in a story which needs to be filled. Where possible, provide an alternative explanation which fills this gap, and creates a new story in someone’s mind.
- Reinforce the facts. People may have heard the myth many times before, so counteract that by repeating the facts just as often. This repetition will make the correction stronger, and more likely to quell any remaining influence of the myth. Don’t be concerned about the ‘familiarity backfire effect’ – any familiarity gained by repeating the myth is strongly outweighed by the correction.
- Give people a heads-up. When repeating the myth in order to correct it, make it absolutely clear, in advance, that the information to follow is misleading.
- Make the explanation as memorable as possible. Use the strongest, most memorable arguments to counteract a myth first. Keeping things simple will help make it more memorable.
- Affirm worldviews. Understand your audience and what they hold dear. When correcting misinformation, frame the evidence in a way that affirms these views rather than goes against them.
Misinformation inoculation, or “pre-bunking”

If myths stick in our heads, it could be more effective to prevent them from sticking in the first place than to try to ‘unstick’ them afterwards.
This is the idea behind ‘inoculation’ or ‘pre-bunking’, which works much like inoculation against disease. A vaccine gives someone a weakened version of a virus, training their immune system to be prepared when it’s up against the real thing. Similarly, giving someone a weakened (refuted) version of a myth helps prepare them against being persuaded when they encounter the myth ‘in the wild’.
According to an article published in 2016, there are two important parts to inoculation. First, you have to make the ‘threat’ clear: tell people that their “existing position on an issue is vulnerable”, as the authors put it – that there are people out there looking to spread rumours and myths and change their minds. The second part is to provide the myth along with the information needed to argue against it – a ‘weakened’ version of the myth. Then, when people come up against the myth, they’ll be more sceptical and better prepared to reject it.
What misinformation do you encounter often in your work? Have you ever found out that what you believed was actually a myth? Have you seen myth-busting articles that have changed your mind?
I’d love to hear from you. Get in touch by email, tweet me @DrRichardBerks, or follow me on LinkedIn.