Sceptics Beware: The Dangers of Debunking Myths

December 11, 2009 at 5:32 pm (Miscellaneous)

Here is a PDF of a paper by Schwarz et al that discusses attempts to improve decision-making – and the frequent failures of these attempts.

Ironically, the more people try to consider the opposite, the more they often convince themselves that their initial judgment was right […]

Similar surprises arise in the domain of public information campaigns. Presumably, erroneous beliefs can be dispelled by confronting them with contradictory evidence. Yet attempts to do so often increase later acceptance of the erroneous beliefs, as known since Allport and Lepkin’s pioneering research (1945) into rumor transmission. Again, the unintended effect arises because the educational strategy focuses solely on information content and ignores the metacognitive experiences that are part and parcel of the reasoning process.

The key point of that quote – that attempts to dispel erroneous beliefs often increase later acceptance of those beliefs – has implications for sceptics wishing to debunk myths, including those about climate change, HIV/Aids, vaccination and other important topics.

On page 20 of the PDF, the authors state that:

Any attempt to explicitly discredit false information necessarily involves a repetition of the false information, which may contribute to its later familiarity and acceptance. Although this problem has been known since Allport and Lepkin’s research (1945) into wartime rumors, the idea that false information needs to be confronted is so appealing that it is still at the heart of many information campaigns.

The authors then discuss “Spreading Myths by Debunking Them”, using the example of a flyer published by the Centers for Disease Control (CDC). It illustrates a common format of information campaigns that counter misleading information by confronting “myths” with “facts.” This format is also used by sceptical bloggers. The myths and facts, in the case of the CDC flyer, relate to flu vaccines.

Skurnik, Yoon, and Schwarz (2007) gave participants the CDC’s “Facts & Myths” flyer or a parallel “Facts” version that presented only the facts. Here is the bit I find most interesting:

Right after reading the flyer, participants had good memory for the presented information and made only a few random errors, identifying 4% of the myths as true and 3% of the facts as false. Thirty minutes later, however, their judgments showed a systematic error pattern: They now misidentified 15% of the myths as true, whereas their misidentification of facts as false remained at 2%.

It turns out that familiar statements are more likely to be accepted as true: “This familiarity bias results in a higher rate of erroneous judgments when the statement is false rather than true, as observed in the present study.” The attempt to debunk myths actually facilitates their acceptance after a short delay (only 30 minutes in the study by Skurnik, Yoon, and Schwarz).

So: if fisking misleading articles, debunking myths, and tackling misinformation involve repeating the myths, then sceptics (or, if you prefer, “skeptics”) may actually be making the situation worse and helping to propagate the myths and misinformation in question.

What to do? Well, we could make the facts familiar to people – and make the familiarity bias work in our favour. If you have factual information that would help counter misinformation, then the best tactic may be to state the information, and to repeat it sufficiently often for the true statement(s) to ‘enter into the public consciousness’.

Schwarz et al consider the possibility of using “memorable slogans that link the myth and fact [and] may provide a promising avenue”, but they do not seem to recommend this course of action:

In most cases, however, it will be safer to refrain from any reiteration of the myths and to focus solely on the facts. The more the facts become familiar and fluent, the more likely it is that they will be accepted as true and serve as the basis of people’s judgments and intentions.

I recently wrote about squalene, an ingredient in some vaccines (including the swine flu / H1N1 vaccine). That post related to myths and misinformation regarding squalene, and in order to write it, I had to make reference to the misinformation. Even by pointing out that the link between squalene and a certain syndrome was spurious, I may have been helping to perpetuate the myth. Perhaps, rather than mentioning the link, I should simply have written:

Squalene is a component of some adjuvants that are added to vaccines to enhance the immune response. A naturally occurring substance found in plants, animals and humans, squalene is synthesized in the liver and circulates in the human bloodstream. It is also found in a variety of foods, cosmetics, over-the-counter medications and health supplements.

Tackling misinformation without inadvertently helping to spread it may be tricky, but I think it is something that is worth attempting. If you have any bright ideas, then please feel free to post them in the comments section below.

10 Comments

  1. endlesspsych said,

    I was going to blog on this! What with that there background I have in decision making research!

You scurrilous rogue!

    Good post btw

  2. Tweets that mention Sceptics Beware: The Dangers of Debunking Myths « Stuff And Nonsense -- Topsy.com said,

    […] This post was mentioned on Twitter by jdc 325, Keir Liddle. Keir Liddle said: RT @jdc325: Sceptics beware: debunking dangerous. http://bit.ly/528rPQ […]

  3. jdc325 said,

    Well, I hope you blog it anyway! I’d be interested to see your take on this.

  4. al capone junior said,

    As fascinating as this aspect of myth debunking might be for psychologists et al, it sure doesn’t make it any easier to debunk stuff!! Very interesting blog tho… will keep it in mind.

    al

  5. AndyD said,

    I’m not convinced. When I see a claim I feel to be spurious, I’m not usually comforted by encyclopaedic-style statements that don’t specifically discuss the claim. So if, for example, I wondered about the dangers of squalene in vaccines, your “wikipedic” statement above wouldn’t tell me enough to dispel specific claims against it.

    I see the point being made but the positive and informative statements would somehow need to address the spurious claims even if the claims aren’t mentioned. So emphasising the commonality/safety of squalene to the extent of labouring the point would be mandatory in order to appease a concerned researcher. I think.

  6. dt said,

    I have been wondering about this for some time, speaking as I do from the perspective of a sometime “myth buster”, and wondering how effective my countermeasures are.
    One only has to see the Mail’s offering to realise how right you may be.
    http://www.dailymail.co.uk/health/article-1233989/Swine-flu-jab-dilemma-Parents-pregnant-women-refuse-experts-insist-safe–whos-right.html
    I suspect many people reading this who fancy that the vaccine causes brain damage and is dangerous in pregnancy will have their fears reinforced, rather than allayed.

  7. Fred Vanderpoel said,

  8. DonQuixote99 said,

    And another thing…if your debunking results in the posting of associated AdSense ads promoting the very myths you debunked, this also is a counterproductive effect.

  9. Sceptical about Skeptics? « Purely a figment of your imagination said,

    […] the (in)effectiveness of fact-based argument and how it may actually strengthen irrational belief, jdc325 reviews an interesting paper from Schwarz et al. on the subject – PDF link in the […]

  10. Correcting Misinformation « Stuff And Nonsense said,

    […] in 2009, I wrote about a paper from Schwarz et al that discussed attempts to improve decision-making and found that efforts to discredit false […]
