The social sciences have shown beyond reasonable doubt that many of our everyday decisions are not entirely in our control: even if we have all the relevant information about the options available to us, irrelevant features of our environment often end up shaping our decisions, and—what’s worse—we don’t even notice it.
This feature of the human mind has deep consequences for politically contested environmental issues like global warming. Here’s why: even when people have sufficient information about the state of scientific understanding of an issue, that relevant information might be unable to shape their attitudes or behaviour. That’s why mentioning scientific facts to your climate-denier friends usually doesn’t change a thing.
What are the psychological reasons for this disconnection between evidence, on the one hand, and beliefs, attitudes, and decisions on the other? For a general introduction, in this video behavioural economist Dan Ariely shows how irrational and context-dependent our decision-making can often be. For specific research on climate science communication, you should listen to the recent debate between two experts, Stephan Lewandowsky and Dan Kahan.
Here’s the basic issue: many different factors shape our beliefs, and relevant available information is only one of them. There’s also the influence of our worldview—which includes our political ideology. And then there are the ties of group membership: ‘What do my friends and people like me believe?’
In the debate, organized by Climate Desk, both experts agree that worldview plays a strong role in determining belief. But Lewandowsky’s recent studies conclude that telling people about the scientific consensus does mitigate the worldview’s influence on belief. So even if you’re a conservative libertarian, when you hear about the 97% scientific consensus you’re more likely to stop denying that humans are causing global warming. Whatever your ideological position, going against the overwhelming majority of the world’s experts is something you’d rather avoid.
Okay, those sound like positive results. The problem, Kahan contends, is that people are still social creatures: even if relevant information matters to them in isolation, once they get together with others they are different beasts. We tend to make our beliefs consistent with those of the groups we belong to (our families, our friends, our colleagues, our churches). This, among other things, is why if you take global warming seriously you’re likely to have no friends who deny it: social groups tend to homogenize themselves, and to separate themselves from divergent groups.
So Kahan’s research suggests that, given the social nature of belief, for scientific information to be effective it must be presented in ways that make it palatable to the reader, understood as a member of certain groups. Put more simply: if you want people to believe something, you have to surround the idea with things your audience already believes; you have to make it clear that people like them already believe it.
Does this kind of message-framing sound suspicious, or even immoral? Should we just state the facts and trust people’s judgement? Well, I must say no. As I mentioned at the beginning, social science has already demonstrated that our capacities for critical judgement are very limited. So, now that we know that apparently irrelevant features of the context strongly affect people’s attitudes to a given message, how could we keep treating them like they’re perfectly rational beings? That would benefit only the interests of our opponents.