Communicating ideas needs a narrative to get the point across
For those of us trying to communicate science to a wide audience, the common pitfalls are that we’ll bore, confuse or patronise our audience. And, let’s be honest, sometimes we’ll slip up and misinform them too.
All the same, the attempt feels worth making; indeed, it’s vital to maintaining well-informed public debate. It’s sobering, then, to be told by a recent preprint that the old adage ‘a little knowledge is a dangerous thing’ might be true.1 Federico Francisco and Joana Gonçalves-Sá report evidence that negative attitudes towards science peak not among people who know the least about it, but among those whose knowledge is moderate – and whose confidence in that knowledge is disproportionately high.
Does this mean that, as the authors suggest, ‘offering information that is incomplete, partial, or over-simplified, as science communicators often do, might indeed backfire, as it may offer a false sense of knowledge to the public, leading to over-confidence, and less support [for science]’? That’s less clear.
For one thing, science communication should not simply be ‘offering information’. It should also provide some way to contextualise that information. It’s not terribly useful to tell people, in isolation, that the world’s average temperature looks set to rise by at least 2°C. We also need to say why that doesn’t simply mean summer and winter will be this ‘tiny’ bit warmer (sounds fine, right?) – and tell them what the alarming actual consequences are likely to be.
Indeed, it’s debatable whether the best ‘information’ is in the form of bare facts and numbers anyway. At a recent meeting at the Royal Society on science and narrative, one speaker (Chatham House rule, sorry) pointed out how little impact a BBC programme on climate change had several years ago when it summarised its message in three numbers, compared with the hard-hitting Climate Change: The Facts broadcast this April, which took a more narrative approach. There was a general feeling at the meeting that minds are rarely changed by a deluge of ‘facts’ – what tends to work is giving people a different narrative.
As Francisco and Gonçalves-Sá point out, providing scientific facts might not just be ineffective; it can actually backfire. A study in 2010 showed that presenting people with scientific evidence that undermines their pre-existing beliefs might just lead them to deny science is relevant to the issue – or to other, related issues.2 In that case, the participants were tested on stereotypical prejudices about homosexuality, specifically that it is associated with mental illness. But it’s easy to see the relevance to climate change too (‘you can’t conclude anything from computer models’, as a correspondent to the Financial Times recently claimed).
In their study, Francisco and Gonçalves-Sá used data from a survey called the Science and Technology Eurobarometer, which from 1989–2005 conducted more than 84,000 interviews with people in 34 European countries about their attitudes to science. Participants’ level of scientific knowledge was gauged with 13 basic questions such as whether ‘electrons are smaller than atoms’, while their attitudes were tested with statements like ‘because of their knowledge, scientific researchers have a power that makes them dangerous’. In general, antiscientific feeling was stronger among those with intermediate levels of knowledge than among those with low levels.
But such people also showed a high degree of confidence in their view. This is consistent with the Dunning-Kruger effect, a psychological phenomenon according to which (as the authors put it) ‘confidence grows much faster than knowledge’, creating individuals ignorant of their ignorance. Such people are apt to know the scientific consensus on, say, climate change or vaccines, but to somehow persuade themselves that they know better than the experts.
All this supports the suggestion of Francisco and Gonçalves-Sá that we need ‘science communication strategies that offer a good balance between sharing not only accurate and precise information, but also large doses of humility, both on the scientists and the lay public’s side’. That is a challenging task. But I think it is more multidimensional than they acknowledge, and certainly not a matter of avoiding ‘incomplete or partial’ information in science communication (how can it ever be otherwise?). The danger is that this just becomes a variant of the old ‘deficit model’ idea: just give them more information and they’ll agree with us.
We know all too well that when scientists are humble and honest about uncertainties it is ruthlessly exploited by ‘merchants of doubt’,3 arguing that ‘they don’t really know anything for sure’. And, without wanting to be overly pessimistic about the possibility of persuasion, apparently unrelated beliefs tend to form clusters that reflect affiliation to particular (often political) world views. You won’t easily change one belief without changing the others. Once again, it comes down to that issue of narrative: your story about how the world works.
Francisco and Gonçalves-Sá’s interesting results feel like a slice through a higher-dimensional space. The answer to better science communication is not ‘give more information’, and perhaps not even simply ‘give better information’. But thinking about your narrative might be a place to start.
1 F Francisco and J Gonçalves-Sá, arXiv, 2019, 1903.11193
2 G D Munro, J. Appl. Soc. Psychol., 2010, 40, 579
3 N Oreskes and E M Conway, Merchants of Doubt, 2010, Bloomsbury