In the age of social media, scientific mis- and disinformation spreads far and fast – with deadly consequences. During the early days and peak of the COVID-19 pandemic, for example, torrents of false and misleading information encouraged highly risky behaviours, undermined mitigation efforts and vaccine uptake, and even resulted in preventable deaths. Beyond the pandemic, science disinformation is also particularly rampant and harmful when it comes to the climate crisis, which presents an existential threat to the world. Fighting science mis- and disinformation with evidence-based tools and resources is therefore of paramount importance, not just for the scientific community, but for policymakers, the media, and the public.
Dr Carlo Martini, who leads PERITIA’s work on Behavioural Tools for Building Trust, speaks to ALLEA Digital Salon about how scientific disinformation is becoming more sophisticated and harder to detect, and about the resulting need for equally vigorous counter-measures by professional science journalists.
“Scientific disinformation is different, because it is often rather complex to debunk, and it tends to stand on pseudo-evidence, that is, something that looks like scientific evidence but is not obtained through rigorous scientific methodology.”
Question: In a recent interview you emphasise that expertise consists substantially in the possession of two traits: experience and competence. Could you elaborate on the importance of these two in the make-up of an expert?
Carlo Martini: I view experience and competence as the backward-looking and forward-looking components of expertise. What that means is that experts need experience, typically in a very narrow field of human knowledge, to gain the capacity and proficiency to deal with new problems and tasks, which is usually called competence. Experience alone, however, is not always enough to acquire competence, and sometimes competence can be acquired through other means (for example, instruction manuals). The relationship between experience and competence is thus a complex one. For example, lots of experience will yield little competence if said experience is acquired by mere repetition of the same task.
Q: The ease of access to communication technology makes it easier for pseudo-science to spread to ever-larger audiences. What tools or resources do laypeople have to recognise the “bogus” experts and their pseudo-scientific claims?
CM: Without a filter at the source, laypeople can only rely on critical thinking to vet the information they receive. Scholars disagree on how “gullible” people are, but unfortunately, it is a fact that there are many bogus “professional” experts, often very well-funded, who are very keen on and skilled at constructing and spreading disinformation. This type of professional-looking disinformation is rather hard to spot without specific skills that are acquired through the study and application of critical thinking and digital literacy.
Q: What about legitimate disagreements between experts? How can laypeople make important decisions on topics where experts who are on equal epistemic standing express conflicting views or recommendations?
CM: Legitimate disagreement among experts is a thorny issue for laypeople’s decision-making. First, though, it should be established that the disagreement is genuine; unfortunately, much of what appears to be “disagreement among experts” is bogus. Once we have done that, however, and we are still faced with disagreeing parties, a few options remain. Sometimes the disagreement may mask different assumptions about, for instance, risk attitudes and values.
For instance, there was a lot of bogus disagreement during the COVID-19 pandemic; but some disagreements were legitimate, and they were sometimes the result of different stances about how much value to assign to human life, as opposed to, for example, the economic and psychological suffering deriving from restrictions. If, nonetheless, experts’ views about ethics and risks are aligned but they still disagree, it probably makes sense to sit on the fence, as it were, and wait until new evidence is available. Unfortunately, there are situations in which sitting on the fence is not an option.
“Experts may not realise that their incompatible conclusions may each be supported by good evidence if they start from different stances about the evaluation of some basic moral facts.”
Q: The work of the EU-funded research project PERITIA, in which you are one of the lead researchers, deals with the topic of disinformation. What is the difference between scientific misinformation and disinformation, and why is it important to make this distinction?
CM: We can be disinformed about many diverse topics, from politics to pop culture. Let us imagine we hear that an actor we particularly love has broken up with their partner. Is it true? Is it false? A tabloid or a social media account may spread disinformation to gather readership or clicks. But often this kind of disinformation is a lie with no legs to stand on, like the infamous “Pizzagate” affair during the 2016 US presidential election. Scientific disinformation is different: it is often rather complex to debunk, and it tends to stand on pseudo-evidence, that is, something that looks like scientific evidence but is not obtained through rigorous scientific methodology. One shouldn’t generalise, but it is safe to say that most scientific disinformation is supported by pseudoscience.
Q: Your research focus within PERITIA deals with the emotional and cognitive components of trusting behaviour. What are the key facts that your research has found on this front?
CM: One of the foci of our research is the idea that people often do not trust information based on the content of what they read or hear; rather, they tend to trust familiar sources, irrespective of the objective quality of their content.
For example, in one of our studies, we tried to improve people’s ability to spot disinformation by giving them critical-thinking prompts. In the first round of experiments, we ran into the problem that the familiarity of sources was masking the effect of our intervention, because people tended to judge as accurate those sources they perceived as trustworthy and familiar. To detect the effect of our prompts, we therefore refined our approach and ran a second round of experiments using only unfamiliar sources, testing whether the prompts were helping people become more accurate in their search for reliable information.
“We need professional scientific journalism back, and the competition coming from scientific disinformation and click-bait style journalism is unfortunately not helping.”
Q: Part of your work also focuses on the role of expertise in knowledge transfer from science to policy. How has the role of experts in policy advice changed in recent years? What do you see as positive developments, and what must still be improved?
CM: I think it’s fair to say that in recent years we have witnessed opposing trends. On the one hand, crises like Brexit have been fuelled by and, in turn, magnified a wave of negative feelings towards expert advice and evidence-based policy-making. Experts have been accused of protecting a worldview, rather than holding superior knowledge. On the other hand, the COVID-19 pandemic was an eye-opener on how much science (and experts) can accomplish when they coordinate with each other and with policymakers. Some experts even attained celebrity status during the pandemic.
My research team and I ran in-depth interviews with several major COVID-19 experts who were prominent public communicators during the first wave of the pandemic, and one of the key takeaways they tended to agree on was that communication should be improved. We need professional scientific journalism back, and the competition coming from scientific disinformation and click-bait style journalism is unfortunately not helping.
About Carlo Martini
Dr Carlo Martini is Associate Professor of Philosophy of Science in the Faculty of Philosophy at Vita-Salute San Raffaele University (UNISR). His primary research interests are in philosophy of the social sciences and social epistemology. He works on the role of expertise in knowledge transfer from science to policy, on expert disagreement and on public trust in scientific experts. He is a visiting fellow at the Centre for Philosophy of Social Science, University of Helsinki. Before taking up his post at UNISR (Milan) he was a senior researcher at the Academy of Finland Centre of Excellence in the Philosophy of the Social Sciences, after completing his Ph.D. at the Tilburg Centre for Logic and Philosophy of Science in 2011.
Dr Martini also leads PERITIA’s work package on Behavioural Tools for Building Trust. PERITIA is an EU-funded research project investigating public trust in expertise. ALLEA is one of the partners of the consortium, which is composed of 11 organisations from across Europe.