The idea of robots possessing autonomous capabilities and intelligence, ungoverned by human direction or supervision, dates back several decades. These once-futuristic ideas have steadily been incorporated into very real, present-day technology. Combined with Artificial Intelligence (“AI”), such products and machines disrupt traditional notions of agency and the role of human beings in manufacturing and the provision of services. How should liability be constructed when there is no apparent agency or personhood, or when actions are almost inherently unforeseeable? More specifically, in the context of AI-based robots, do models of products liability or other tort liability fit this new framework? This Article seeks to explain why current law and doctrine, such as products liability and negligence, cannot provide an adequate framework for these technological advancements, mainly because of the lack of personhood and agency and the inability to predict and explain robot behavior. Zooming out from specific doctrines, the Article also suggests that none of the three main liability regimes (strict liability, negligence, and no-fault mandatory insurance) adequately resolves the challenges posed by AI-based robots. Ultimately, this Article aims to suggest supplementary rules that, together with existing liability models, could provide legal structures better suited to AI-based robots. Such supplementary rules would function as quasi-safe harbors or predetermined levels of care: meeting them would shift the burden back to current tort doctrines, while failing to meet them would lead to liability. Such safe harbors may include a monitoring duty, built-in emergency brakes, and ongoing support and patching duties. The argument is that these supplementary rules could serve as a basis for presumed negligence that complements the existing liability models. If adopted, they could establish clear rules or best practices that determine the scope of potential liability of designers, operators, and end-users of AI-based robots.