ABSTRACT
This article discusses the legal implications of a novel phenomenon, namely, digital reincarnations of deceased persons, sometimes known as postmortem avatars, deepfakes, replicas, holograms, or chatbots. To avoid juggling these multiple names, we use the term ‘ghostbots’. The piece is an early attempt to discuss the potential social and individual harms arising from ghostbots, roughly grouped around notions of privacy, property, personal data and reputation; how they are regulated; and whether further regulation is needed. For reasons of space and focus, the article does not deal with copyright implications, fraud, consumer protection, tort, product liability, and pornography laws, including the non-consensual use of intimate images (‘revenge porn’). This paper focuses on law, although we fully acknowledge and refer to the role of philosophy and ethics in this domain. We canvass two interesting legal developments with implications for ghostbots, namely, the proposed EU Artificial Intelligence (AI) Act and the 2021 New York law amending publicity rights to protect the rights of celebrities whose personality is used in postmortem ‘replicas’. The latter especially evidences a remarkable shift from the norm we have chronicled in previous articles, of no respect for postmortem privacy, to a growing recognition that personality rights do need protection postmortem in a world where pop stars and actors are routinely recreated using AI. While the legislative motivation here may still be primarily to protect economic interests, we argue it also shows a concern for dignitary and privacy interests. Given the apparent concern for appropriation of personality postmortem, possibly in defiance or ignorance of what the deceased would have wished, we propose an early solution to regulate the rise of ghostbots, namely an enforceable ‘do not bot me’ clause in analogue or digital wills.
Edwards, Lilian and Harbinja, Edina and McVey, Marisa, Governing Ghostbots (November 19, 2023).