Governing AI safety through independent audits
Falco, Gregory; Shneiderman, Ben; Badger, Julia; Carrier, Ryan; Dahbura, Anton; Danks, David; Eling, Martin; Goodloe, Alwyn; Gupta, Jerry; Hart, Christopher; Jirotka, Marina; Johnson, Henric; LaPointe, Cara; Llorens, Ashley J.; Mackworth, Alan K.; Maple, Carsten; Pálsson, Sigurður Emil; Pasquale, Frank; Winfield, Alan; Yeong, Zee Kin
Corresponding repository author: Professor Alan Winfield, Professor of Robot Ethics (Alan.Winfield@uwe.ac.uk)
Abstract
Highly automated systems are becoming omnipresent. They range in function from self-driving vehicles to advanced medical diagnostics and afford many benefits. However, there are assurance challenges that have become increasingly visible in high-profile crashes and incidents. Governance of such systems is critical to garner widespread public trust. Governance principles have been previously proposed offering aspirational guidance to automated system developers; however, their implementation is often impractical given the excessive costs and processes required to enact and then enforce the principles. This Perspective, authored by an international and multidisciplinary team across government organizations, industry and academia, proposes a mechanism to drive widespread assurance of highly automated systems: independent audit. As proposed, independent audit of AI systems would embody three ‘AAA’ governance principles of prospective risk Assessments, operation Audit trails and system Adherence to jurisdictional requirements. Independent audit of AI systems serves as a pragmatic approach to an otherwise burdensome and unenforceable assurance challenge.
Journal Article Type | Article |
---|---|
Acceptance Date | Jun 8, 2021 |
Online Publication Date | Jul 20, 2021 |
Publication Date | Jul 20, 2021 |
Deposit Date | Jul 21, 2021 |
Publicly Available Date | Jan 21, 2022 |
Journal | Nature Machine Intelligence |
Print ISSN | 2522-5839 |
Electronic ISSN | 2522-5839 |
Publisher | Nature Research |
Peer Reviewed | Peer Reviewed |
Volume | 3 |
Issue | 7 |
Pages | 566-571 |
DOI | https://doi.org/10.1038/s42256-021-00370-7 |
Public URL | https://uwe-repository.worktribe.com/output/7562797 |
Publisher URL | https://doi.org/10.1038/s42256-021-00370-7 |
Files
Governing AI safety through independent audits
(99 Kb)
PDF
Licence
http://www.rioxx.net/licenses/all-rights-reserved
Publisher Licence URL
http://www.rioxx.net/licenses/all-rights-reserved
Copyright Statement
This is a post-peer-review, pre-copyedit version of an article published in Nature Machine Intelligence. The final authenticated version is available online at: https://doi.org/10.1038/s42256-021-00370-7