ALLEA Permanent Working Group on Science & Ethics Meets in Brussels to Discuss Emerging Topics in Research Ethics

On 2 and 3 April 2024, the ALLEA Permanent Working Group on Science & Ethics (PWGSE) met in Brussels to reflect on the experiences and impact of recent activities, discuss emerging topics in research ethics and research integrity, and scope future activities. 

The meeting was generously hosted by the Royal Flemish Academy of Belgium for Science and the Arts (KVAB) at the neoclassical Academy Palace in Brussels. It brought together research ethics and integrity experts from ALLEA Member Academies under the chairwomanship of Dr Maura Hiney (Royal Irish Academy). 

The European Code of Conduct for Research Integrity: growing impact  

A central and recurring theme of the meeting was the revised edition of the European Code of Conduct for Research Integrity, which was developed following extensive consultation with European research stakeholders. Since its publication in June 2023, the revised Code has been steadily finding its way into the research system, with 12 translations already released on the ALLEA website and many more to follow in the coming weeks and months.

In addition, the Code has prompted the development and updating of national and institutional codes of conduct and is increasingly referenced in discipline-specific guidelines and in European and national policy documents. For example, the Code informed the detailed general and field-specific guidelines for responsible Open Science developed by the Horizon-funded ROSiE (Responsible Open Science in Europe) project, as well as the recently published guidelines on the responsible use of AI in research by the European Research Area Forum.

The future of research ethics and integrity within EU-funded projects 

The working group welcomed Isidoros (Dorian) Karatzas, Head of the Ethics and Research Integrity Sector at the European Commission's Directorate-General for Research and Innovation (DG RTD), to the meeting for an extended discussion on the impact of the Code and the future of research ethics and integrity in EU-funded projects. Joint reflections identified additional outreach strategies, both with Academies and the wider research community, to further improve awareness and knowledge of good research practices. They also pinpointed a number of clauses in the Code where the PWGSE may be able to support researchers and their organisations by providing further context, help with interpretation, and additional resources.

Agenda highlights: insights and initiatives discussed  

Further items on the agenda included reflections on and lessons learnt from the group’s recent publication on Predatory Publishing, as well as a joint statement with the ALLEA Working Group on Science Education on Scientific Literacy for Young Learners. In addition, the group discussed possible tensions that can arise when academic researchers collaborate with or are funded by the private sector, a request for feedback by the European Commission on support for projects with dual-use potential, and progress of ALLEA’s various activities as part of the Coalition for Advancing Research Assessment (CoARA).  

ALLEA Addresses Dangerous and Exploitative Predatory Publishing Practices during International Open Access Week

Over the past two decades, open access publishing has rapidly grown into a global industry, making scholarly publications readily available to researchers, policymakers, and the general public. While this has generally been seen as a positive development, the predominant “Gold” open access route has also given rise to unforeseen challenges.

On the occasion of International Open Access Week 2023, ALLEA is contributing to the discourse around “Community over Commercialization” through the publication of a statement issued by the Permanent Working Group on Science and Ethics, with Professor László Fésüs (Hungarian Academy of Sciences) as principal author: ‘Curbing Predatory Practices in Open Access Publishing’.

The statement is both a set of guidelines and an appeal to the broader research community to collectively identify and disempower so-called “predatory” journals with subpar editorial and publication standards. The proliferation of such outlets comes with increasingly sophisticated exploitative practices, impacting researchers, eroding research integrity, and wasting financial and human resources. Indeed, the 2023 revised edition of the European Code of Conduct for Research Integrity explicitly states that supporting or using journals, publishers, events, or services that undermine research quality is a violation of research integrity norms and is considered misconduct.

Upholding the integrity and quality of scholarly work is a fundamental pillar of ALLEA’s mission, and as such, our Member Academies play a crucial role in promoting publishing outlets with appropriate editorial and publication standards recognised by the broader research community. We are committed to ensuring that the best interests of the academic community and the public remain at the forefront of open access initiatives.

— ALLEA President Antonio Loprieno

 

Read the full statement here

“Questions, Not Answers, Are Better Suited to Start a Reflection on Ethical Issues”

Technology has immense power to shape our world across many spheres, from communication to education, work, health, transportation, climate, politics, and security. New and innovative technologies with such far-reaching potential for socio-cultural and economic impact (often referred to as “emerging technologies”) are therefore often fraught with ethical questions, ranging from concerns about privacy breaches to manipulation, fairness, and the exacerbation of power gaps and exploitation. Because they could affect every aspect of our lives, it is important to acknowledge and address these ethical questions at the outset, as early as possible in the process of technological design and implementation.

In this relatively nascent field of ethics for emerging technologies, TechEthos (Ethics for Technologies with High Socio-Economic Impact), a Horizon 2020-funded project, published a report on the ethical issues that need to be considered for three technology families: Digital eXtended Reality, including visually eXtended Reality (XR) and Natural Language Processing (NLP); neurotechnologies; and climate engineering, including Carbon Dioxide Removal (CDR) and Solar Radiation Management (SRM).

Dr Laurynas Adomaitis, Tech Ethicist, CEA

In this Digital Salon interview, we speak with the lead author of the report, Dr Laurynas Adomaitis, Tech Ethics Researcher at Commissariat à l’Énergie Atomique et aux Énergies Alternatives (CEA), on the ethical dilemmas inherent to emerging technologies, how researchers can effectively use the tools in the report, and the role for policymakers and funding organisations in promoting the integration of ethics into every stage of technology research.

 

Question: Are the core ethical dilemmas in emerging technologies fundamentally similar to ethical considerations inherent to all research? How are they different?

Laurynas Adomaitis: Emerging technologies are often based on research, so there is definitely overlap with the core dilemmas we discuss in research ethics. For example, while looking at climate engineering, we discovered that one point of contention was whether research into Solar Radiation Management (reflecting/refracting solar energy back into space) is ethically justified. One of the arguments against it is that researching such techniques presents the world with a “plan B”, which may distract from climate change mitigation efforts.

We also found a lot of issues with consent in XR (extended reality) and neurotech, which cuts across research ethics. For example, there are ethical concerns with so-called “deadbots” – chatbots constructed based on conversational data from deceased individuals. How is consent possible for an application that did not exist when the person was conscious? Likewise, in neurotech we must be aware of changing people’s mental states. For example, sometimes a treatment is required before consent can be given, but then can it be revoked by the patient? Or, if a BCI (brain-computer interface) changes a person’s mental states, can it also change how they feel about consent?

 

“Each technology family has many issues and at least one beastly challenge to conquer.”

 

Q: Which of the three technology families did you find particularly fraught with ethical issues? Why?

LA: The three technology families – XR, neurotech, and climate engineering – are at very different stages of development. Many applications in XR are already in production and available to the public; neurotech is starting in medical tests but is mainly based on future promise, whereas climate engineering is only beginning to be explored with huge issues on the horizon.

Each technology family has many issues and at least one beastly challenge to conquer. For climate engineering, it’s irreversibility – can we make irrevocable changes to the planet? For neurotech, it’s autonomy – how can we enhance cognitive abilities while respecting independent and free thinking? For XR, it’s a set of particular issues, like nudging, manipulation, deep fakes, concerns about fairness, and others. I think it’s a wider array of issues for XR because it is already hitting the reality of implementation, where many practical problems arise. There are even skeptical researchers who think that virtual realities should not exist at all because of the moral corruption they may cause, especially with children. This fundamental issue still lingers, spurring the need for empirical studies.

 

Q: What were some overarching ethical themes common to all three technology families?

LA: There are cross-cutting issues that relate to uncertainty, novelty, power, and justice. But the most important aspect that kept reappearing was the narratives about new technologies that are found in lay reactions to them.

We used a framework to elucidate this in the report that was developed in the DEEPEN (Deepening ethical engagement and participation in emerging Nanotechnologies) project over 10 years ago. It worked very well in the context of our ethical analysis. Many concerns were along the lines of five tropes of lay reactions to novelty: “Be careful what you wish for”, based on the motifs of exact desire and too big a success; “Messing with Nature”, based on the motifs of irreversibility and power; “Opening Pandora’s box”, based on the motifs of irreversibility and control; “Kept in the dark”, based on the motifs of alienation and powerlessness; and “The rich get richer, the poor get poorer”, based on the motifs of injustice and exploitation. Although these reactions are natural, and sometimes justified, we had to keep asking ourselves whether they are the most pressing ones. It’s still astonishing that the same narratives apply across times and technologies.

 

“There are cross-cutting issues that relate to uncertainty, novelty, power, and justice. But the most important aspect that kept reappearing was the narratives about new technologies that are found in lay reactions to them.”

 

Source: TechEthos Report on the Analysis of Ethical Issues

 

Q: How can the research community best implement the tools/findings in this report?

LA: The report is structured hierarchically, starting with core dilemmas that form the foundation of the reasoning, followed by applications and, finally, values and principles. The value sections are the most important for researchers and practitioners. They cover the key considerations, and each value section ends with a set of questions. We wrote these questions with a researcher in mind: What should one consider when trying to explore, design, and implement the technology? What are the checks and balances with respect to the value in question? We intended these questions to be operationalisable so that they offer the best value for implementation.

 

Q: How can policymakers better support the integration of “ethics by design” in emerging technologies?

LA: Technology research should be in step with ethical research on the technologies. The time difference between the development in tech and ethical or policy research creates a divide, where we have to work retroactively, and it’s very inefficient. Imagine if carbon-intensive technology and industry were developed alongside climate preservation from the very beginning. Of course, there have been philosophers and ethicists, like Hans Jonas, as early as the 1970s calling for ecological activism and responsibility for future generations. But they were mavericks and pioneers, working with passion but without support. We should try to open up these perspectives and take them seriously at the policy level when the technologies are emerging.

 

“Technology research should be in step with ethical research on the technologies. The time difference between the development in tech and ethical or policy research creates a divide, where we have to work retroactively, and it’s very inefficient.”

 

Q: What role can funding organisations play in centering ethics in emergent tech?

LA: It’s a difficult question to answer since causality is very uncertain in provoking ethical reflection. Ethical reflection is, as we like to call it, opaque. It’s not always transparent when it happens or why. What will actually cause people – researchers and industry alike – to stop and reflect? In our report, we avoided guidelines or directives that would offer “solutions”. Instead, we focused on questions that should be asked. Questions are better suited for starting a reflection on ethical issues. For example, if you’re building a language model, how will it deal with sensitive historical topics? How will it represent ideology? Will it have equal representation for different cultures and languages?

There is no “one way” to address these challenges, but the questions are important and researchers should at least be aware of them. If the standards for dealing with them are not yet clear, I would prefer to see each research project find its own way of tackling them. That will lead to more original approaches and, if a working consensus emerges, to standardisation. But the central role of funding bodies could be to guide researchers towards the relevant questions and start the reflection. We intended our report to provide some instruction on that.

 


You can read our summary of the TechEthos report by Dr Adomaitis on the analysis of ethical issues in Digital eXtended Reality, neurotechnologies, and climate engineering here, and the full report here. 

TechEthos is led by AIT Austrian Institute of Technology and will be carried out by a team of ten scientific institutions and six science engagement organisations from 13 European countries over a three-year period. ALLEA is a partner in the consortium of this project and will contribute to enhancing existing legal and ethical frameworks, ensuring that TechEthos outputs are in line with and may complement future updates to The European Code of Conduct for Research Integrity.

New Report Explores the Ethics of Digital eXtended Reality, Neurotechnologies, and Climate Engineering

TechEthos project publishes two analyses of the ethics and laws applicable to the three technology families under study

In June 2022, TechEthos (Ethics for Technologies with High Socio-Economic Impact), a Horizon 2020-funded project, published a draft report on the ethical issues that need to be considered for the three technology families under study:

  • Digital eXtended Reality, including the techniques of visually eXtended Reality (XR) and the techniques of Natural Language Processing (NLP)
  • Neurotechnologies
  • Climate Engineering, including Carbon Dioxide Removal (CDR) and Solar Radiation Management (SRM)

The report, co-authored by tech ethicist Laurynas Adomaitis and physicist Alexei Grinbaum at the Commissariat à l’Énergie Atomique et aux Énergies Alternatives (CEA), along with Dominic Lenzi from the University of Twente (UT), is currently under review by the European Commission. It is based on literature studies, original research, expert consultation, and digital ethnographies.


Source: TechEthos Report on the Analysis of Ethical Issues

In addition to briefly describing the technologies in each family, the report identifies core ethical dilemmas, describes key applications and case studies, identifies ethical values and principles in line with the “ethics by design” methodology (the implementation of ethical, legal, and societal values and principles from the conception to the implementation stage of technology design), provides operational checks and balances for each value and principle in the form of questions, and outlines mitigation strategies for each.

The 142-page report is structured into four chapters, which include an introduction to technology ethics and cross-cutting issues in the three technology families, and a deep dive into each one. Some examples of the ethical issues unique to the different technologies include:

  • The impact of digital eXtended Reality on the values and principles of transparency, dignity, privacy, non-manipulation, and responsibility, as well as their relevance for the analysis of risk reduction, environmental impact, dual use and misuse, gender bias, and power and labour relations
  • The lack of human-like reasoning or understanding in NLP systems, spontaneous anthropomorphisation of chatbots, and the influence of artificial emotions on human users
  • The impact of neurotechnologies on the values and principles of autonomy, responsibility, privacy, risk reduction, and informed consent
  • The potential for less costly but less effective climate engineering solutions to divert resources away from more sustainable but more expensive initiatives
  • The potential for climate engineering to be more wasteful

Beyond the well-researched and in-depth analysis of the conceptual arguments, there are also helpful use cases and questions that stakeholders can ask when dealing with the ethics of the technologies in each family.

Analysis of international and EU law and policy applied to Digital eXtended Reality, Neurotechnologies, and Climate Engineering

In July 2022, following the analysis of the ethical dilemmas inherent to each technology family studied by TechEthos, a second draft report was published, which examined international and EU laws and policies for their relevance and applicability to Digital eXtended Reality, Neurotechnologies, and Climate Engineering. Although there is no dedicated EU or international law governing these three technology families, several existing legal frameworks could be applied to them.

The report reviews these legal domains and related obligations at the international and EU levels, identifies potential implications for fundamental rights and the principles of democracy and the rule of law, and reflects on the issues and challenges of existing legal frameworks in addressing the current and future implications of the technologies. The 242-page report covers human rights law, rules on state responsibility, environmental law, climate law, space law, the law of the sea, and the law related to artificial intelligence (AI), digital services, and data governance, among others, as they apply to the three technology families.

The report was co-authored by Nicole Santiago, Ben Howkins, Julie Vinders, Rowena Rodrigues, and Zuzanna Warso from Trilateral Research (TRI), Michael Bernstein from the AIT Austrian Institute of Technology, and Gustavo Gonzalez and Andrea Porcari from the Associazione Italiana per la Ricerca Industriale (Airi). It aims to present an evidence base for the TechEthos project’s development of recommendations for policy and legal reform, and is currently being reviewed by the European Commission.


TechEthos is led by AIT Austrian Institute of Technology and will be carried out by a team of ten scientific institutions and six science engagement organisations from 13 European countries over a three-year period. ALLEA is a partner in the consortium of this project and will contribute to enhancing existing legal and ethical frameworks, ensuring that TechEthos outputs are in line with and may complement future updates to The European Code of Conduct for Research Integrity.

ALLEA Welcomes Council Conclusions on Research Assessment and Open Science

ALLEA welcomes the adoption of the Conclusions on Research Assessment and Implementation of Open Science by the Council of the European Union on 10 June. See ALLEA’s full response here.

The Conclusions are in line with points that ALLEA has made over the years, in particular on the need to appropriately implement and reward open science practices and to develop research assessment criteria that follow the principles of excellence, research integrity, and trustworthy science.

At the same time, ALLEA continues to stress that it matters how we open knowledge, as the push for Open Access publishing has also paved the way for various unethical publishing practices. The inappropriate use of journal- and publication-based metrics in funding, hiring and promotion decisions has been one of the obstacles in the transition to a more open science, and furthermore fails to recognize and reward the diverse set of competencies, activities, and outputs needed for our research ecosystem to flourish.

ALLEA therefore welcomes the principles set out in the Conclusions for designing novel approaches to research assessment, with particular weight given to (1) the critical role of peer review in research assessment and (2) the importance of integrity and ethics in developing criteria focused on quality and impact.

ALLEA underscores that the described reforms are urgently needed and require concerted efforts from the international academic community, supported by infrastructures for exchanging best practices as well as the necessary financial resources to implement these. 

Read ALLEA’s full response

ALLEA Joins the European Commission Coalition on Research Assessment Reform

ALLEA has joined the European Commission’s core group working on reforming research assessment. The group will support the drafting of an agreement led by the European University Association, Science Europe and the European Commission on key issues and timelines for implementing changes.

The coalition is composed of funding organisations, research performing organisations, national and regional assessment authorities or agencies, and associations of research funders, research performers, and researchers, as well as learned societies and other relevant organisations.

ALLEA is represented by Deborah Oughton, member of the ALLEA Permanent Working Group on Science & Ethics and representative of the Norwegian Academy of Science and Letters. She is a Professor at the Faculty of Environmental Sciences and Natural Resource Management of the Norwegian University of Life Sciences.

Towards a Research Assessment Reform

In 2021, the European Commission published the scoping report ‘Towards a reform of the research assessment system’. The publication presents the findings of a consultation with European research stakeholders and identifies the goals that a reform of research assessment should pursue. The report proposes a coordinated approach based on principles and actions that could be agreed upon by a coalition of research funding and research performing organisations committed to implementing changes.

Research assessment reform is one of the topics ALLEA has worked on jointly with its Member Academies and partners in recent years. In July 2021, ALLEA and the Global Young Academy (GYA) published a report covering the key takeaways of their webinar ‘Research Assessments that Promote Scholarly Progress and Reinforce the Contract with Society’. The event brought together science and policy stakeholders to rethink current research assessment models.

The key areas for research assessment identified by the stakeholders were how to strike a balance between funding research to advance scientific progress and ensuring public accountability, how to assess the societal relevance of research and who defines the criteria, and how research assessment should be carried out.

In 2020, ALLEA, the Global Young Academy and STM (International Association of Scientific, Technical and Medical Publishers) organised a series of workshops about the future of peer review in scholarly communications. A short summary report is available here.