Posts

PERITIA Conference ‘Trust in Expertise in a Changing Media Landscape’

The EU-funded research project PERITIA is organising the virtual scientific conference ‘Trust in Expertise in a Changing Media Landscape’, to be held on 18-19 March. The registration and programme are available on the event website.

The event will bring together researchers from all over the world, discussing how best to assess, establish and maintain the credibility and trustworthiness of expertise in a rapidly changing media environment. 

Scholars will present their latest findings on distrust and disinformation, populist responses to expertise, and the role of journalism and platform algorithmic design in the formation of public trust in science.

The highlights of the multidisciplinary conference include keynotes by Onora O’Neill (University of Cambridge), Christoph Neuberger (Free University of Berlin), Natali Helberger (University of Amsterdam), and Michael Latzer (University of Zurich).

A welcome keynote will be delivered by the organisers José van Dijck and Donya Alinejad (Utrecht University), followed by two days with more than 40 speakers and a dozen panel discussions. Among the topics covered by the programme are the pandemic, climate change, conspiracy theories, algorithms and social media platforms.

The conference will close with a roundtable discussion featuring Stefan Larsson (Lund University), Jo Pierson (Free University Brussels), Judith Simon (University of Hamburg), and José van Dijck (moderator, Utrecht University).

Registered participants will be invited to join the Digital Café, a networking platform powered by wonder.me, where they can informally meet other participants and speakers during the coffee breaks and after the event.

ALLEA is part of PERITIA as one of the project partners working on coordination, communications and dissemination. The research project, funded by the EU’s Horizon 2020 research and innovation programme, was launched as a continuation of the research work developed under ALLEA’s working group Truth, Trust and Expertise.

“Inoculating people against being manipulated will be crucial”

What are the main approaches to win the fight against misinformation? And how do the fact-checking methods applied by social media platforms affect the actual spread of conspiracy myths? Stephan Lewandowsky, professor of cognitive science at the University of Bristol and member of the ALLEA scientific committee Fact or Fake?, gives an insight into current research on trust in science and why it is essential to foster deliberative communication formats.

Question: Mr. Lewandowsky, conspiracy myths and misinformation are not really a new thing. However, they are currently making headlines again. Are we really experiencing a rise in misinformation during the pandemic?

S. L.: I don’t know of an evidence-based answer to this question, as I do not have data on the quantity of misinformation and conspiracy myths. What we do know is that people’s trust in science and research has increased in response to the coronavirus pandemic. On that we do have data from European countries such as Germany and the U.K., as well as the US. In Germany, for example, the science barometer by Wissenschaft im Dialog (Science in Dialogue) showed a dramatic increase in trust to over 70 per cent in April. That has been accompanied by a vastly smaller number of people who have gone the other way and have been swept up in the toxic brew of Covid denialism and anti-vaccination movements. I think these are the developments we can identify, based on the data.

We also see that the media is paying a lot of attention to conspiracy myths and misinformation. While it is important to do so, talking about them a lot also increases the prevalence of misinformation. So that is something to watch.

“(A rise of trust in science) has been accompanied by a vastly smaller number of people who have gone the other way and have been swept up in the toxic brew of Covid denialism and anti-vaccination movements.”

Q.: Why are pandemics a good breeding ground for conspiracy myths?

S. L.: Pandemics are always a trigger for conspiracy myths, and that has been true throughout history. People are frightened, their sense of control over their lives is disrupted, and whenever that happens, people are drawn towards conspiracies. Psychologically, people seek comfort in the assumption that evil people are responsible for the bad things that are happening, because then there is potential for the world to be better. If you have an enemy that is responsible for bad things, you can pretend that things would be better if they were not there. Accepting that a virus is responsible means accepting something that is out of our control. That is frightening, and that is why these times are a breeding ground for conspiracy myths.

Q.: In Germany we are currently seeing protests against measures the government has taken. To what extent are they due to the uncertainty around the introduction of measures, especially masks?

S. L.: Most Germans actually think that the government is doing a good job with the measures. So once again we should not pay too much attention to the minority of protesters. Correspondingly, we see a decline in support for the AfD because they do not offer any solutions for the problems at hand. I think we have to be careful not to exaggerate the uncertainties that existed. Social distancing, for example, was never doubted as an effective measure against the pandemic, and even though there was uncertainty about masks, a lot of scientific advice was actually quite consistent. Of course it would have been nice if the science on masks had been available more quickly, but I do not think uncertainty was a trigger for conspiracy myths in this case.

Q.: If trust is rising, why should we still care about fighting conspiracy myths? 

S. L.: The mere exposure to conspiracy myths can potentially reduce people’s trust in official institutions and can induce people to disengage from politics. So mere exposure has adverse consequences, and that is before we even consider the people who believe in them. Secondly, we have data showing that people who believe in conspiracy myths are less likely to comply with social distancing measures. So there is an association between not doing what you are supposed to do and believing in conspiracy myths. We do not know if there is a causal relationship, but we know there is an association. The final thing is that conspiracy theorists are ultimately more prone to violence than others and are more likely to endorse violence as a means to resolve conflicts. So there are a number of reasons why we should be concerned about them and why we need to tackle the problem at hand.

“We do know that it is better to inoculate people before they are exposed to conspiracy myths than to fight them after they are spread.” 

Q.: What are the main approaches to win the fight against misinformation?

S. L.: First of all, we do know that it is better to inoculate people before they are exposed to conspiracy myths than to fight the myths after they have spread. Ideally, right at the beginning of the pandemic, we would have communicated up front not only what we know about the virus but also what might happen during the pandemic as conspiracy myths develop. There is evidence showing that telling people how they will be misled is actually beneficial to building up resistance. On a societal level the moment to do so has passed, but we can still do it with new disinformation that may come along.

The second thing is that you can correct things and you can get through to people who are spreading conspiratorial narratives; it has been shown that not all people are completely resistant to correction. Sometimes the narratives are just used as a rhetorical device, and for that group of people corrections can work and are a good tool. For hard-core conspiracy theorists, for whom the myths have become part of their identity, that is not the case, and talking them out of their beliefs is very difficult.

“If (scientists) communicate well and explain things online and offline, they can be an asset in the fight against misinformation.”

Q.: What can scientists themselves do to combat fake news?

S. L.: A lot. Scientists are among the most trusted people in most societies, including Germany. If they communicate well and explain things online and offline, they can be an asset in the fight against misinformation. The same is true for physicians, who are very influential and can play a large role. I think by now most scientists – especially younger ones – are very capable of communicating well and know how to use social media.

“Algorithms should not draw attention to outrage and myths and that is something we have to tackle and deal with.” 

Q.: Some of the social media platforms like Facebook or Twitter have started introducing fact checking. What is your opinion on those?

S. L.: I do not think there is a single silver bullet for this problem. We instead need to add up different measures and put them together to solve the issue. Labeling – if done correctly – can be very effective. What Twitter is doing is OKish but not good enough. What Facebook has done with Covid misinformation has been much better, because they put an opaque banner over it that hid the headline so that you could not see it at first glance. That is much more effective than the little button Twitter put underneath the information. To be effective, you have to introduce friction that impedes access to the information in question. Not totally, of course, because that would be censorship, but sufficiently so that it causes friction. Facebook’s intervention cut sharing of misinformation by 95 per cent, which is very good, and that shows that labeling can work if it is done right.

But even before you get there, what really needs to be done and discussed are the algorithms of the platforms. Nothing you see on Facebook or Twitter is there accidentally; it is put there by the algorithms. Those algorithms often guide users to extremist content, and even though the platforms knew that, they did not do anything about it because they were afraid it would cut into their revenue. We therefore have to take a close look at the information diet that is created for us, and we have to hold the platforms accountable for their activities. This is not about censorship but about mandating information and about holding platforms accountable. Algorithms should not draw attention to outrage and myths, and that is something we have to tackle and deal with.

Q.: How likely do you think it is that this will happen sooner than later?

S. L.: In the United States we are probably not going to get there any time soon. In Europe the chances are much higher. The European Union will be taking action; I have written an in-depth report for them, and hopefully they will use some of those ideas when it comes to introducing regulations.

“Dialogue can be successful and positive in formats that focus on deliberation, on sharing data and on moderated debates in which people can participate.”

Q.: What would a good online discourse look like?

S. L.: Dialogue can be successful and positive in formats that focus on deliberation, on sharing data and on moderated debates in which people can participate. That is something we are not finding online at the moment, but we know it works from deliberative assemblies like those in Ireland, which debated topics like abortion and gay marriage. These are topics with the potential to tear a country apart, but that did not happen because the assemblies were led successfully. There is evidence that this can work online as well if you design spaces in which it can work. The moment you create those spaces and make them work, you move away from the terribly polluted spaces we currently have.

Q.: One topic people are currently worried about is vaccinations and trust in vaccines. Are you worried that this will be a huge breeding ground for conspiracy myths?

S. L.: It depends on the country you are talking about. I am worried about the situation in the U.K. because the government does not exactly have a good track record in managing the pandemic, and thus it is very likely to be problematic. In Germany I think it is much more likely to work well. Countries like Germany, New Zealand or Australia, with well-functioning governments acting in the interest of the people, will be able to deal with the situation well. What is crucial is to make the vaccine easily available and to make uptake easy. I don’t think we will be facing insurmountable problems; especially if vaccination is made mandatory for taking part in certain activities, we will be fine. Once again, inoculating people against being manipulated will be crucial, and we should be planning those campaigns right about now.

 

Stephan Lewandowsky is professor of Cognitive Science at the University of Bristol. His research examines people’s memory, decision making, and knowledge structures, with a particular emphasis on how people update their memories if information they believe turns out to be false. This has led him to examine the persistence of misinformation and the spread of “fake news” in society, including conspiracy theories.

He will speak at the session “Disinformation, Narratives and the Manipulation of Reality“ organized by ALLEA at the International Forum on Digital and Democracy on the 10th and the 11th of December. The session will present some of the findings of the JRC Report on Technology and Democracy: Understanding the influence of online technologies on political behaviour and decision-making by Stephan Lewandowsky and Laura Smillie.

This interview was conducted by Rebecca Winkels and was first published on the Wissenschaftskommunikation.de website. Picture credit: Stephan Lewandowsky.

New PERITIA Video: Why Trust Experts?

Our EU-funded research project PERITIA has launched a new animation video, “Why Trust Experts?”. Inspired by the article “Trust in Experts: Why and Why Not” by its principal investigator Maria Baghramian, the video invites everyone to reflect on the role of expertise in our daily lives.

The Covid-19 pandemic has shown once again that experts play a key role in advising politicians and citizens. There may be no better time to ask ourselves some relevant questions about trust in expertise.

  • How does trust in experts work?
  • How is trust in science related to trust in media?
  • Why is trust in expertise important for democracies?
  • How can we learn to trust trustworthy experts?

The short animation video summarizes the key questions of PERITIA’s research in the context of today’s pandemic crisis and raises some relevant points. It touches upon the different dimensions of trust in expertise from a philosophical perspective, the influential role of media (and social media) in how we access scientific information, and the difficult balance between scientific independence and policymaking.

On the dedicated webpage “Why Trust Experts?”, PERITIA delves into these key questions and gathers related resources. The page is available to help you learn more about the topic and find further scientific contributions to the debate from the team and its partners.

About PERITIA

PERITIA is a Horizon 2020-funded research project exploring the conditions under which people trust expertise used for shaping public policy. The project brings together philosophers, social and natural scientists, policy experts, ethicists, psychologists, media specialists and civil society organisations to conduct a comprehensive multi-disciplinary investigation of trust in and the trustworthiness of policy-related expert opinion. As part of a consortium of 11 partners from 9 countries, ALLEA leads the project’s work on public engagement and interaction.

Trust in Expertise at times of Covid-19

The EU-funded research project PERITIA just launched its first newsletter dedicated to Covid-19 and trust in expertise. The issue includes highlights from the first five months of the project with a selection of essays, news, interviews, blog posts, and podcasts from its team dealing with how the pandemic is affecting trust in expertise and science advice systems. A general introduction to the project’s research agenda emphasizes three key questions:

  • What is the role of expertise in democracies?
  • How should science inform political decisions?
  • How can we prevent a populist backlash against expertise?

If you are curious about how PERITIA’s team has engaged in public debates and research around these questions, we kindly invite you to take a look and let us know what you think. If you enjoy it, don’t forget to subscribe here.

The project is conducting a comprehensive multi-disciplinary investigation of trust in, and the trustworthiness of, policy-related expert opinion. Its research will develop a theoretical framework to understand the fundamentals of trust, which will be complemented empirically with surveys and in-lab experiments.

Science advice and public engagement

A central part of PERITIA’s work will consist of a comparison of existing science advice mechanisms in four European countries. PERITIA researchers will investigate how expert advice is elicited and which of the available models is most trust-enhancing.

The project’s plans also reach beyond research. Investigators seek to design effective indicators and tools to build trust in expertise that informs policy. Their conclusions will be tested in a series of citizens’ forums where experts, policymakers, and citizens will engage in face-to-face discussions on climate change.

ALLEA is a partner in the PERITIA consortium, which is formed by eleven organisations from nine countries, and is leading its work on communications and public engagement. The project is a follow-up to the ALLEA working group Truth, Trust and Expertise.