The art of persuasion
Scientists like me can be naïvely idealistic, believing that well-conducted scientific research is the kind of indisputable evidence that changes minds. We are sometimes blind to the possibility that non-scientists might resist our evidence. In fact, my research suggests that presenting scientific evidence on controversial issues can do more harm than good.
Some beliefs are resistant to change. When people hold a belief that supports a value important to them, is linked to their sense of self or group identity, or is charged with emotion, they defend it tenaciously against information that might threaten it.
For example, climate scientists might like to think that the wealth of studies supporting human involvement in climate change should convince politicians and their constituents. However, some people oppose at a gut level the thought of restricting access to technology, taxing carbon usage, or regulating the free market. Such measures threaten their core values, especially when political parties have staked out clear positions on the issue.
Instead of changing their beliefs, these people are likely to reject the evidence. They may criticize the methodological quality of the research, question the objectivity of the researchers (for example, ‘climategate’), or otherwise discredit the evidence.
Research in my lab suggests that one way people discredit scientific information that challenges a cherished belief is to conclude that science cannot be used to answer questions on that particular topic. Relative to control groups, people who read belief-threatening scientific conclusions also reported that science was impotent to answer questions on a variety of other topics.
This finding is perhaps the most disconcerting of all. Repeated exposure to scientific information that challenges an important belief can erode a person’s trust in science’s ability to answer real-world questions.
Scientists should therefore be very cautious in presenting evidence that runs counter to beliefs or attitudes rooted in political ideology, religious dogma, or other powerful value-laden structures, and should take care to present that evidence in non-threatening ways.
The science is not so threatening when one’s value system is reinforced. Faced with free-market individualists who shudder at the thought of smug tree-huggers supporting government regulations and taxes on technology, a climate scientist presenting evidence of human involvement in climate change should frame the implications as an opportunity for humans to demonstrate their resourcefulness by developing technological innovations that will stimulate the next economic boom.
Provoke positive emotions
Reducing gut opposition can make people more accepting of the evidence. For example, a person who speaks with humility, optimism, and warmth provokes positive emotions rather than the anger, disgust, and guilt provoked by sarcastic pundits, arrogant ‘know-it-alls’, or ‘holier-than-thou’ moralizers. Similarly, discovery-type scientific phrases (for example, ‘we learned that…’ or ‘the studies revealed that…’) might be less emotionally provocative than debate-type phrases (for example, ‘we argue that…’ or ‘we disagree with so-and-so and contend that…’).
Avoid culture wars
Anything that draws attention to an ingroup-outgroup divide is likely to lead to defensive resistance if it appears that the science or its source represents the outgroup. So avoiding culture war divisions is crucial. If possible, climate scientists should solicit endorsements of their research from representatives of big energy companies, conservative political parties, religious institutions, and others from the cultural communities that are usually more resistant to the conclusions of climate scientists.
Finally, as a college professor, I believe that frequent exposure to critical thinking skills, practice in applying them, and quality feedback on that thinking help people understand how their own biases can affect their analysis of information, producing open-minded thinkers who are skeptical yet not defensive.
Dr Geoffrey Munro is Associate Professor in Psychology at Towson University, Maryland