Beware The Bias Blind Spot

December 23, 2010 at 3:20 pm (Miscellaneous)

The term bias blind spot refers to “the cognitive bias of failing to compensate for one’s own cognitive biases”.

I’ve written previously that we are all irrational, and asserted that you have to be aware of your irrationality in order to guard against it. It might be the case that being aware of the cognitive biases that contribute to our irrationality is not enough.

Awareness of cognitive biases should (one might think) help us to guard against them. However, once informed of the existence of cognitive biases, people tend to rate themselves as less prone to those biases than others are.

Pronin, Lin, and Ross authored a paper titled Perceptions of Bias in Self Versus Others, which suggested that individuals see the existence and operation of cognitive and motivational biases much more in others than in themselves:

Study 1 provides evidence from three surveys that people rate themselves as less subject to various biases than the “average American,” classmates in a seminar, and fellow airport travelers.

[…]

Participants in one follow-up study who showed the better-than-average bias insisted that their self-assessments were accurate and objective even after reading a description of how they could have been affected by the relevant bias. Participants in a final study reported their peer’s self-serving attributions regarding test performance to be biased but their own similarly self-serving attributions to be free of bias.

In a later paper on the subject, Pronin and Kugler [PDF] show that one source of the bias blind spot is the value that people place, and believe they should place, on introspective information.

The blind spot persisted when observers had access to the introspections of the actor whose bias they judged. And, participants claimed that they, but not their peers, should rely on introspections when making self-assessments of bias.

Happily, after being “educated about the importance of nonconscious processes in guiding judgment and action”, participants ceased denying their relative susceptibility to bias.

As with Dunning & Kruger in Unskilled and Unaware of It [PDF], the authors don’t just describe a cognitive bias – they offer a potential solution.

More

The researchers mentioned above have argued that introspection illusion is responsible for the bias blind spot. Wikipedia has pages on cognitive bias and illusory superiority (illusory superiority is what the unskilled are said to suffer from in the Dunning-Kruger effect).

Edit, 4:10pm 23/12/2010: there’s a PDF of the 2002 Pronin/Lin/Ross paper here. Here is a blogpost written by Emily Pronin, one of the authors of the papers on the bias blind spot I link to in this post.

Note: I am under no illusions as to my inferiority in this area. Should anyone more knowledgeable feel like offering some constructive criticism, please feel free to do so in the comment section below.

Skeptics and the bias blind spot

Skeptics, of all people, should probably be aware of this cognitive bias. And yet the refrain “I’m a skeptic but…” is still uttered by those who self-identify as skeptics but have failed to apply sufficient skepticism in one area or another. In fact, “I’m a skeptic but…” is sometimes used to defend irrational beliefs.

Claiming to be a skeptic (the implication being that, being a skeptic, you would not be easily fooled) does not mean that your use of chiropractic is justified. Nor does it mean that you can safely enthuse about the magic of “the powerful placebo” (it’s fascinating, yes – but it’s not magic and I’m not sure whether “powerful” is an accurate epithet) or the brilliance of the political party you voted for at the last election. Being a rational skeptic, sadly, does not make you immune to cognitive biases.

“I’m a skeptic but…” amounts to a bogus appeal to authority on the basis of a self-professed skepticism.

Bogus skepticism

There is also a tendency among some commentators to use bogus skepticism. The Daily Mail provided a wonderful example in a puff piece for the Universal Contour Wrap, a ‘detox body wrap’. The author of the piece was gushing about the merits of this body wrap, but before the enthusiastic praise began there was this:

When it comes to body wraps — those mummifying treatments involving bandages and mud that claim to make you instantly, effortlessly, thinner — I’ve always been the biggest sceptic around.

I’ve assumed any benefits would just be temporary fluid loss — you’d be back to normal dimensions after the next glass of water.

I could never understand why celebrities such as Zoe Ball, Tamzin Outhwaite and Rachel Stevens swore by them.

This bogus skepticism seems to be an extra effort to convince the reader of the merits of the body wrap – ‘listen, I wasn’t sure myself at first (in fact, I thought it was nonsense), but this really is great!’ And the author wasn’t just skeptical – oh no – they were “the biggest sceptic around”, and apparently always had been.

The self-professed skepticism in this case is so at odds with the gushing, uncritical praise that follows that it seems obviously bogus. Other examples of bogus skepticism may be less obvious.

8 Comments

  1. Amélie Gourdon said,

    I think self-professed skepticism also acts to communicate that “I am such a sceptic that if I believe in this thing, it must be for rational reasons; I just would never go for bad science or quackery”.

  2. jdc325 said,

    @Amélie
    Yes, it does seem to me to be a case of “I am rational, therefore what I believe must be rational”.

  3. Al Capone Junior said,

    I am skeptical about the bias of your skepticism whilst simultaneously being quite sure that my own skepticism about your skeptical bias is free of bias in and of itself. Or maybe I’m just suffering from the Dunning-Kruger effect. Or something.

    nice blog J

  4. martin said,

    nice article. As you say, being aware of bias doesn’t automatically compensate for it – in fact thinking that you have can make it worse. this is why it’s v important to have independent checks, esp from people outside the community, to develop objective-ish assessments.

    Following poster above re i am rational therefore this thing i believe is rational, the same applies externally: ie i am rational, therefore you who disagree must be irrational / motivated by xxxx / stupid / part of some conspiracy.

  5. jdc325 said,

    Thanks for your comments Al, martin.

    @martin – I think you may be right that thinking that you have corrected for bias can make it worse. This might be a factor in the whole “I’m a skeptic but…” thing.

  6. The Year In Nonsense. And Stuff. « Stuff And Nonsense said,

    […] guide to Reflexology, which is basically a foot rub with added pseudoscience, and a post about the bias blind-spot – the cognitive bias of failing to compensate for one’s own cognitive […]

  7. jace said,

    thanks for introducing me to the Dunning-Kruger effect. the most notable part about it, to me, is the cultural implications. as an unsatisfied citizen of the USA, i find it very amusing that “american” arrogance can be demonstrated in so many studies…

  8. A Rough Guide to Evidence-Based Medicine « Stuff And Nonsense said,

    […] The control group, the randomisation, and the blinding of participants and researchers are all attempts to minimise bias. There is no minimisation of bias in anecdotal experience, and there are many forms of cognitive bias that may make anecdotal experience unreliable. You cannot trust the evidence of your own eyes – because that evidence has been filtered by a brain prone to a huge number of cognitive biases; there is even a cognitive bias of failing to compensate for your own cognitive biases, the bias blind spot. […]
