Ozlem Ulgen, Birmingham City University, UK
Title: Legal responsibility as a first-order ethical norm in the design and development of autonomous systems
AI and robotics are at the forefront of emerging technologies affecting all aspects of society. Manufacturing, automotive, finance, healthcare, employment, legal services, and the military are among the sectors exploiting such technologies. Fundamental legal and ethical issues arise concerning what values AI and robotics will adopt, and the legal consequences of any resulting harm. Where there is potential or actual harm from the use of AI and robotics, it is in the public interest to know who will be held legally responsible. A broad spectrum of agents may be attributed with legal responsibility for harms and unintended consequences (e.g. designers, programmers, engineers, manufacturers, operators, and owners). In this talk I will explain why legal responsibility is a first-order ethical norm in the design and development of autonomous systems, and differentiate “responsibility” from “accountability”. I will use autonomous weapons as a case study to demonstrate legal responsibility throughout the pre-deployment and deployment stages of design, development, manufacturing, and use.
Ozlem Ulgen is Reader in International Law and Ethics at Birmingham City University, UK. She specialises in moral and legal philosophy, weapons law, international humanitarian law, and public international law. She has published works on cosmopolitan ethics in warfare, Kantian ethics and human dignity in the age of artificial intelligence and robotics, and the law and ethics of autonomous weapons. She has a forthcoming publication with Routledge, The Law and Ethics of Autonomous Weapons: A Cosmopolitan Perspective. Ozlem is involved in various international legal and ethical standard-setting initiatives. She is Chair of the Accountability Expert Focus Group for the IEEE Ethics Certification Program for Autonomous and Intelligent Systems (ECPAIS), developing accountability requirements for ethical certification of autonomous and intelligent systems in the public and private sectors. She is involved in drafting legal and ethical rules at the United Nations Group of Governmental Experts on Lethal Autonomous Weapons Systems, and has produced reports defining lethal autonomous weapons systems, identifying human control elements in weapons systems, and reviewing regulatory models (Definition and Regulation of LAWS; Command Responsibility and LAWS). Ozlem is also an expert member of IEEE Standards Working Groups P7007 (Ontological Standard for Ethically Driven Robotics and Automation Systems) and P7000 (Model Process for Addressing Ethical Concerns During System Design). She was involved in drafting the chapter on Classical Ethics and A/IS for IEEE’s Ethically Aligned Design, a global treatise of high-level ethical principles, key issues, and practical recommendations intended to inform the public, engineers, policy makers, and manufacturers of autonomous and intelligent systems.