Jack of Kent has posted some interesting questions on his blog regarding the image of skepticism. I’m posting this partly in response to some of his questions.
Jack has asked whether, to those whose cherished views are being questioned, skepticism comes across with “Tebbitite stridency” and posed the question “Is skepticism getting a reputation for arrogance and smugness?”
I think the answer to those questions is probably “yes”. I’m not sure whether a reputation for arrogance and smugness is justified, but you have to address the perceptions people have rather than the perceptions you would like them to have.
I tend to assume that while robust criticism may persuade fence-sitters that x is wrong, it will likely elicit a defensive response from those who currently believe that x is right – particularly if the criticism uses language that might be considered rude or unkind. I admit that I do not always practice what I preach, and that is a failing I should attempt to address.
Skeptics who disagree with me and favour a more robust approach than mine might like to think about how they feel when they are referred to as smug or arrogant – and bear that feeling in mind when they are about to call someone stupid or deluded. (With reference to the use of “deluded”: Nicholas Marsh has a post that touches on the question of whether it is ethical to criticise people by implying, or explicitly stating, that they are mentally ill and notes that “attacking someone’s integrity or mental health risks undermining the open discussion that one wishes to defend – you aren’t going to win many converts by calling people frauds, mad, or both.”)
My comments above, like the majority of those on Jack of Kent’s blogpost, are speculative. It’s interesting to note that people (myself included) have tended to respond to Jack’s questions with subjective opinion. I’m not sure if much evidence has been gathered regarding the perception of skeptics, or the relative success of different approaches to arguing with people we disagree with, but I’m a little surprised that skeptics don’t seem to have asked whether there is any evidence to support the positions being taken. (Well, not at the time of writing – someone will surely quote the skeptic’s motto of “evidence or STFU” before too long.)
The only research into persuasion that I have read is contained in Robert Cialdini’s book Influence: Science and Practice.
Cialdini describes the successful use of what he calls ‘weapons of influence’ – including reciprocal concessions, commitment and consistency, and liking. If these approaches are successful, it seems logical to assume that contrary approaches will be less effective at persuading and may even be counterproductive.
I’m writing a long blogpost on anti-vaccine campaigns at the moment and have written that the response of some parents to the misinformation they have been subjected to is understandable. The MMR scare is a case in point.
People were exposed to extremely partial reporting of Wakefield’s research, were unaware of its flaws, and were not told about published research that disagreed with Wakefield’s conclusions (while unpublished research that purportedly agreed with Wakefield was promoted uncritically). Digression: There is also the issue of whether people have the tools to appraise scientific evidence. People are considering how this should be addressed. For example, Ben Goldacre asked the following question on his blog:
given that we struggle to engage children with science, while half of all science coverage is health, and people are clearly very engaged in these issues around risk: should evidence based medicine, basic epidemiology and trial design, be taught in schools?
Calling people who are concerned about vaccination “morons” (even if the concerns are ill-founded) is, I think, likely to be counterproductive. Instead of persuading them, the effect is to put them on the defensive.
Would it be practical for skeptics to use any of the weapons of influence described by Cialdini when debating? I don’t know, but I will now make some tenuous connections and speculate wildly.
Authority seems like the most obvious weapon of influence that could be deployed. The debunking of anti-vaccine canards might seem more convincing if it came from an infectious diseases specialist or immunologist rather than from me.
Liking also seems to be a weapon of influence that could be utilised. Cialdini has a chapter subheading titled “Making Friends to Influence People” and tells the story of Joe Girard, a man who made $200,000 a year selling cars and claimed to use the simple formula of presenting customers with a fair price and someone they’d like to buy from. Cialdini also lists some of the reasons for liking: physical attractiveness; similarity; compliments; contact & cooperation.
Reciprocation is the third and final of Cialdini’s weapons of influence that I shall mention. He discusses the effect of reciprocal concessions and how they might help to persuade. Interestingly, Cialdini speculates that reciprocal concessions may explain Watergate.
G Gordon Liddy apparently first presented a scheme that cost $1m to be spent on bugging, a “chase plane”, break-ins, kidnapping, mugging squads, and a yacht featuring “high-class call girls”. This was rejected – at which point Liddy came back with a slightly less ambitious plan costing $500,000. Again, Liddy’s proposal was rejected. Finally, he submitted his burglary and bugging plan that cost $250,000. Cialdini writes that “this time the plan, still stupid but less so than the previous ones was accepted”. If Cialdini is right, then this is a good example of just how powerful these weapons of influence can be.
Trying to persuade people by using weapons of influence might in some cases be difficult, impractical, or even impossible. But if the alternative people reach for is to call parents who don’t vaccinate “stupid” or “deluded”, then perhaps those weapons of influence are worthy of consideration.
One area where there is some evidence of the importance of presentation is that of debunking myths.
Schwarz et al [PDF] discuss attempts to improve decision-making – and the frequent failures of these attempts.
Presumably, erroneous beliefs can be dispelled by confronting them with contradictory evidence. Yet attempts to do so often increase later acceptance of the erroneous beliefs […] Any attempt to explicitly discredit false information necessarily involves a repetition of the false information, which may contribute to its later familiarity and acceptance. […] In most cases it will be safer to refrain from any reiteration of the myths and to focus solely on the facts. The more the facts become familiar and fluent, the more likely it is that they will be accepted as true and serve as the basis of people’s judgments and intentions.
What to do? Well, we could make the facts familiar to people – and make the familiarity bias work in our favour. If you have factual information that would help counter misinformation, then the best tactic may be to state the information and repeat it sufficiently often for the true statements to ‘enter into the public consciousness’.
If debunking myths can have such unintended consequences, then what else in the presentation of skeptical views can be counterproductive? I think the language used can probably be unhelpful in some instances and, like the debunking of myths, can entrench opposition to the skeptical views being espoused. There are probably other problematic elements of the presentation of skeptical views that have not even occurred to me.
I don’t think I have the answers to the questions that have been raised regarding the image of skeptics or the debunking of myths, but I do think that the questions are worthy of discussion.
This post was brought to you by Self-Doubters Anonymous, in conjunction with the Navel-Gazing Council of Great Britain. Thank you for reading it.