New report outlines foundations and practices to foster responsible computing research
Computing technologies are integral to our lives and have transformed the way societies organise and communicate. Amidst their rising impact and rapid pace of development, it is vital that the ethical and social impacts of computing systems are fully considered, to ensure their uses benefit society and avoid harms such as perpetuating structural biases, amplifying false information, infringing on privacy, and diminishing human agency.
In a recent SRI Seminar, Barbara J. Grosz presented the recently published U.S. National Academies report, “Fostering Responsible Computing Research: Foundations and Practices,” which investigates the social consequences of computing research and its applications, and offers recommendations and practical steps to ensure that the ethical and societal impacts of computing technologies are more fully considered.
Grosz is the Higgins Research Professor of Natural Sciences in the Paulson School of Engineering and Applied Sciences at Harvard University. Her research has made fundamental advances in natural-language processing, theories of multi-agent collaboration and their application to human-computer interaction, and innovative models to improve health coordination and science education. She is a co-founder of Harvard’s Embedded EthiCS program, which integrates the teaching of ethics into core computer science courses, and has played leading roles in the establishment of interdisciplinary institutions and the advancement of women in science.
In her talk, Grosz summarised the three core sections of the U.S. National Academies report, which focus on foundational research approaches through which computing technologies evolve. The report’s authoring committee, chaired by Grosz, included experts from across computer science as well as engineering, the social sciences, philosophy, and law. This interdisciplinary scope is a key source of the report’s insights and interventions.
Expanding concepts and methods for computing research
A key insight of the National Academies report is that computing researchers ought to make better use of core ethical concepts and methods from the social and behavioural sciences. The authors suggest that effective engagement requires researchers in computing, the humanities, and the social sciences to become familiar with each other’s techniques and to develop modes of collaboration.
While other reports to date have offered principles for computing technologies, “Fostering Responsible Computing Research” fills a gap by providing guidance on how to implement such principles in a way that is connected to practice. The report examines the fundamental building blocks of moral theories and looks at conflicts that may arise in trade-offs among values. Grosz clarified these with some examples: workplace cameras, for instance, can be viewed as ensuring security or as violating employee privacy. Similarly, scheduling involves a trade-off between optimising for the autonomy and well-being of employees and optimising for profit at an organisational level.
It is important for researchers to think carefully in advance about the impacts of their research through the lens of ethical and moral values. Although most researchers would like a checklist to ensure they adequately consider such principles, there is no such shortcut. Instead, the report proposes a path toward responsible research: to plan effectively for the social impacts of their work, researchers should collaborate across disciplines to develop recommendations grounded in practice.
Envisioning ethical and social challenges
Building on these foundations, the report identifies sources of ethical challenges and problematic social impacts in computing research. Many of these challenges emerge downstream, at the level of product design, so it is important for researchers to foresee such consequences and think ahead about both anticipated and unanticipated uses.
Grosz categorises ethical challenges and inherent limitations into five groups:
Interactions of social settings with computing technologies, such as aligning with existing norms, structures, and practices. For example, telemedicine during the COVID-19 pandemic was especially problematic for vulnerable children who often did not have access to internet-connected devices.
Limitations of human capabilities, such as the problems in “open worlds” where researchers have limited knowledge about the situation in which a system will operate.
Societal influences on design and deployment, such as engaging relevant stakeholders.
Failure to follow best practices for design and implementation of computing systems.
Limits of purely technical approaches where problems require solutions through human engagement, such as privacy and content moderation.
Recommendations for researchers
The report offers three key conclusions for ways computing research must change to engage in more responsible approaches. First, researchers must consider the ethical and societal impacts of the systems they develop, and seek effective means to address these factors. Second, researchers must engage the full spectrum of stakeholders and deploy rigorous methods to address the social contexts in which their systems are deployed. Third, governments must establish policies and regulations that protect against adverse impacts, with support from researchers who are best able to identify limitations and scenarios that require intervention.
Drawing on these conclusions, the report provides a set of practical recommendations to guide computing researchers to address potential ethical challenges and societal concerns, which Grosz explored in her seminar by way of examples and context:
Reshape computing research: The computing research community should include participants with expertise in social and behavioural sciences and ethics, and provide reporting on limitations and downstream risks of computing methods.
Foster and facilitate new types of research: Resources should be available for the participation of scholars from fields other than computer science and engineering, and ethical and societal impacts should be considered as a part of performance reviews.
Develop student and researcher expertise in responsible computing: This can be done by reshaping computer science and engineering curricula in universities, and educating students and researchers to carry out and assess responsible computing research.
Provide access to expertise on ethical and social implications: Scientific societies and research institutions should help researchers find and collaborate with scholars who have the societal-impact and domain expertise relevant to their projects.
Integrate ethical and societal considerations into computing research sponsorship.
Integrate ethical and societal considerations into publications and publicly released artefacts.
Adhere to best practices: These include revisions to standards for system design, deployment, oversight, and monitoring to ensure accessibility, security, and privacy, identify potential unanticipated uses, and mitigate harm.
Support engagement with the public and the public interest.
As part of her seminar, Grosz engaged participants in a short exercise focused on identifying actions that scientific and scholarly organisations should take to help computing researchers address some of the challenges these recommendations raise. This workshop portion was led by Steven Coyne, an assistant professor in the Department of Philosophy at the University of Toronto with a cross-appointment in the Department of Computer Science, who delivers ethics modules for computer science classes as part of U of T’s Embedded Ethics Initiative. The recommendations from the discussion groups, which brought together a range of disciplinary backgrounds, will be reviewed to inform the initiative’s future recommendations.
Grosz concluded her talk by reiterating that ethical and societal impacts must be first-order concerns not only for researchers but also for the general public. Although computing researchers cannot eliminate every potential risk, they can proactively identify challenges and harms by following the recommendations in “Fostering Responsible Computing Research: Foundations and Practices,” and by broadening their assessments to include not only performance analysis and mathematical advances but also the ethical and societal impacts of their work.
About the author
Parand Alizadeh Alamdari is a PhD student in the Department of Computer Science at the University of Toronto under the supervision of Sheila McIlraith, and a member of the Vector Institute for Artificial Intelligence. Alamdari received her BSc in computer engineering from Sharif University of Technology. Her research focuses on AI safety, reinforcement learning, deep learning, and how to build AI agents that are aligned with human values.