Marcus Ho, ‘The Question of Autonomy, Liability Attribution and Black Boxes Decision Making’

ABSTRACT
This paper canvasses the legal consequences of AI development, diving deep into the domains of Autonomous Vehicles and Smart Contracts to reveal a myriad of unresolved legal questions in the Law of Tort (Civil Wrongs) and the Law of Contract. In the first part, this paper examines the interlocking regimes in Tort governing liability attribution in the context of autonomous vehicles. The question one must unravel here is: who should be liable? The driver, the vehicle manufacturer, or the AI developer? With the aim of reducing legal uncertainty and promoting strong product standards, this paper argues that strict liability (under which the AI manufacturer ought to be held liable) ought to be the legal regime governing autonomous vehicles, as a solution based on negligence or product liability will not sit well within the overarching scope of the Law of Tort.

In the second part, this paper examines the role of AI in Smart Contracts and the issue of ‘black boxes’ preventing AI decisions from being transparently disclosed. This paper identifies the use of data by AI in Smart Contracts as being of key importance, since the use of erroneous data could lead to unexpected outputs. The existence of ‘black boxes’ ultimately prevents innocent parties from ever finding out the logical structure adopted by the AI, leaving current AI systems in conflict with legislative regimes. Hence, this paper argues that transparency by design ought to exist from the ground up, thereby resolving the issue of transparency created by AI in the Law of Contract.

In its final analysis, this paper argues that regulators will have to work with industry experts to strike a balance between the key principles of consumer protection and innovation, so as to combat the effects of AI disruption as society moves forward.

Ho, Marcus, The Question of Autonomy, Liability Attribution and Black Boxes Decision Making (September 13, 2020). University of Oxford, Broad Street Humanities Review, volume 3, pp 139-149 (2020).
