Correcting Misinformation

September 24, 2012 at 7:31 pm (Miscellaneous)

Back in 2009, I wrote about a paper from Schwarz et al. that discussed attempts to improve decision-making and found that efforts to discredit false information could backfire, actually spreading myths further through repetition of the misinformation. Schwarz is a co-author of a more recent paper, Lewandowsky et al., which looks at ways to reduce the impact of misinformation.

They begin with a discussion of persistent myths (using myths surrounding Barack Obama, the MMR vaccine, and Listerine as examples). They go on to discuss the societal costs of misinformation – in the case of the MMR vaccine, they write that the myth “had dire consequences for both individuals and societies, including a marked increase in vaccine-preventable disease and hence preventable hospitalizations, deaths, and the unnecessary expenditure of large amounts of money for follow-up research and public-information campaigns aimed at rectifying the situation”.

There is a section on the origins of misinformation. The particular sources discussed by the authors are rumours and fiction; governments and politicians; vested interests; and the media. Another section deals with how people assess the truth of a statement. The authors argue that there is a default of acceptance and that when people do “thoughtfully evaluate the truth value of information”, they take into account a small number of considerations – whether it is compatible with their beliefs, whether the source is credible, whether other people believe it, and whether it is coherent.

Following a section on the failure of retractions to eliminate the influence of misinformation, we come to “Reducing the Impact of Misinformation”.

To date, only three factors have been identified that can increase the effectiveness of retractions: (a) warnings at the time of the initial exposure to misinformation, (b) repetition of the retraction, and (c) corrections that tell an alternative story that fills the coherence gap otherwise left by the retraction.

The authors write that warning people in advance that the information they are about to receive may be misleading can reduce the effects of misinformation, as it changes the default expectation that information is valid.

Such a warning would allow recipients to monitor the encoded input and “tag” it as suspect. Consistent with this notion, Schul (1993) found that people took longer to process misinformation when they had been warned about it, which suggests that, rather than quickly dismissing false information, people took care to consider the misinformation within an alternative mental model. Warnings may induce a temporary state of skepticism, which may maximize people’s ability to discriminate between true and false information.

Repetition of the retraction seems more problematic.

The success of retractions can also be enhanced if they are repeated or otherwise strengthened. Ecker, Lewandowsky, Swire, and Chang (2011) found that if misinformation was encoded repeatedly, repeating the retraction helped alleviate (but did not eliminate) misinformation effects.

…the repetition of corrections may ironically decrease their effectiveness. On the one hand, some evidence suggests a “protest-too-much” effect, whereby overexerting a correction may reduce confidence in its veracity (Bush, Johnson, & Seifert, 1994). On the other hand, as noted above, corrections may paradoxically enhance the impact of misinformation by repeating it in retractions (e.g., Schwarz et al., 2007).

It seems that repeating misinformation has more of an effect than repeating a retraction of that misinformation. The authors note that this is particularly unfortunate in the domain of social networking media.

As mentioned earlier, one of the few things people tend to take into account when they evaluate the truth of a statement is whether it is coherent. Lewandowsky et al. argue that if a retraction leaves a coherence gap, that gap may motivate continued reliance on the misinformation despite the retraction, and that providing an alternative explanation can fill it.

Studies have shown that the continued influence of misinformation can be eliminated through the provision of an alternative account that explains why the information was incorrect (e.g., “There were no gas cylinders and oil paints, but arson materials have been found”; “The initial suspect may not be guilty, as there is an alternative suspect”; H. M. Johnson & Seifert, 1994; Tenney, Cleary, & Spellman, 2009).

To successfully replace the misinformation, the alternative explanation provided by the correction must be plausible, account for the important causal qualities in the initial report, and, ideally, explain why the misinformation was thought to be correct in the first place (e.g., Rapp & Kendeou, 2007; Schul & Mazursky, 1990; Seifert, 2002).

They go on to note that explaining the motivation behind an incorrect report may make a correction more successful, that people generally prefer simple explanations to complex accounts, and that providing too many counter-arguments may backfire.

The next section deals with worldview and skepticism. The authors argue that people’s worldview or personal ideology plays a key role in the persistence of misinformation, and give examples of Republicans being more likely than Democrats to believe the “birthers” and liberals being “less well calibrated than conservatives” when it comes to the consequences of higher oil prices. A retraction can be undermined if it challenges a pre-existing belief system, and may even lead to people becoming more entrenched in their original views. The authors suggest that:

The research on preexisting attitudes and worldviews implies that debiasing messages and retractions must be tailored to their specific audience, preferably by ensuring that the correction is consonant with the audience’s worldview.

While some attitudes can have a distorting effect, Lewandowsky et al. write that others can safeguard against misinformation effects. Attitudes like skepticism.

In particular, skepticism can reduce susceptibility to misinformation effects if it prompts people to question the origins of information that may later turn out to be false. For example, people who questioned the official casus belli for the invasion of Iraq (destroying WMDs) have been shown to be more accurate in processing war-related information in general (Lewandowsky et al., 2005). Suspicion or skepticism about the overall context (i.e., the reasons for the war) thus led to more accurate processing of specific information about the event in question. Importantly, in this instance, skepticism also ensured that correct information was recognized more accurately, and thus did not translate into cynicism or a blanket denial of all war-related information.

The authors write that skepticism is most likely to exert an influence if experienced at the time of message exposure and that skepticism and distrust do not always protect people from unreliable or intentionally misleading sources. I found it interesting that as well as making people less susceptible to misinformation, skepticism also made it easier to recognise correct information – at least in the example discussed by the authors.

Later in the paper, there is a summary from the authors of the main points from the literature on debiasing:

  • Consider what gaps in people’s mental event models are created by debunking and fill them using an alternative explanation.

  • Use repeated retractions to reduce the influence of misinformation, but note that the risk of a backfire effect increases when the original misinformation is repeated in retractions and thereby rendered more familiar.

  • To avoid making people more familiar with misinformation (and thus risking a familiarity backfire effect), emphasize the facts you wish to communicate rather than the myth.

  • Provide an explicit warning before mentioning a myth, to ensure that people are cognitively on guard and less likely to be influenced by the misinformation.

  • Ensure that your material is simple and brief. Use clear language and graphs where appropriate. If the myth is simpler and more compelling than your debunking, it will be cognitively more attractive, and you will risk an overkill backfire effect.

  • Consider whether your content may be threatening to the worldview and values of your audience. If so, you risk a worldview backfire effect, which is strongest among those with firmly held beliefs. The most receptive people will be those who are not strongly fixed in their views.

  • If you must present evidence that is threatening to the audience’s worldview, you may be able to reduce the worldview backfire effect by presenting your content in a worldview-affirming manner (e.g., by focusing on opportunities and potential benefits rather than risks and threats) and/or by encouraging self-affirmation.

  • You can also circumvent the role of the audience’s worldview by focusing on behavioral techniques, such as the design of choice architectures, rather than overt debiasing.

More

The full text of the paper is available, free, here.

Two of the authors, Cook and Lewandowsky, have written a “Debunking Handbook”, which is available, free, here. They call for an emphasis on the facts, explicit warnings of forthcoming misinformation, alternative explanations, and graphic display of core facts.

3 Comments

  1. Martin said,

    The so-called “debunking handbook” is, if I recall, merely an instruction manual for spinning your own beliefs in a form that is supposed to be more convincing, at least to yourself. It contains little about refuting, let alone debunking, other interpretations of evidence and has nothing at all about objective assessments. Not a good advert for the scientific competence of either of the authors.

    For some ‘proper’ research on biases and the ways we (that includes you, me, and Lewandowsky and his only vaguely plausible opinions of why we believe stuff) don’t do rationality very well, read anything by Kahneman and the late Tversky. Ideally get this from your local friendly library: http://www.amazon.co.uk/Judgment-under-Uncertainty-Heuristics-Biases/dp/0521284147 and while you’re at it this: http://www.amazon.co.uk/Choices-Values-Frames-Daniel-Kahneman/dp/0521627494 and you’ll have some idea of why sensible people believe weird stuff.

  2. Martin said,

    …or, for someone aligned to bad-sciences viewpoints (heh) but less emotionally involved than cook/lewandowsky – ie, actually interested rather than fantasising away the causes of disagreement – try Kahan, eg “Cultural Cognition of Scientific Consensus” http://papers.ssrn.com/sol3/papers.cfm?abstract_id=1549444

    “Yet public debates rarely feature open resistance to science; the parties to such disputes are much more likely to advance diametrically opposed claims about what the scientific evidence really shows.” A bit more, well, scientific than frothing about ‘misinformation’

  3. jdc325 said,

    Thanks for the recommended reading Martin. I’ve downloaded the Kahan PDF and I’ll see if my library has the books you’ve recommended.

    With regard to the Debunking Handbook, yes it’s just a series of recommendations about how to present a refutation. I don’t know if there is a debunking handbook that tells readers how to determine if something is a myth but if you fancy writing one I’d be happy to work on it with you ;)
