Grief tech: AI afterlife raises questions over legal status


Would you create an interactive “digital twin” of yourself that can communicate with loved ones after your death? Generative artificial intelligence (AI) has made it possible to seemingly resurrect the dead. So-called griefbots or deathbots, AI-generated voices, video avatars or text-based chatbots trained on a deceased person’s data, are proliferating in the growing digital afterlife industry, also known as grief tech.
Deathbots are usually created by the bereaved, often as part of grieving. But some services allow people to create digital twins while still alive. That raises an obvious question: why not create one for after death?
As with any new technology, digital immortality raises legal questions, many without clear answers.

To create an AI digital twin, users sign up for a service and provide data about themselves through questionnaires, recorded memories, stories and voice samples. Visual likeness can be added through images or video. The AI then creates a replica based on this training data. After death, once the company is notified, loved ones can interact with the digital twin.
In doing so, however, users delegate agency to a company to simulate their identity after their death.

This differs from using AI to “resurrect” someone who never consented. Here, a living person licenses personal data to an AI afterlife company before death, creating AI-generated content deliberately for posthumous use.

Yet many issues remain unresolved. What happens to copyright and privacy? What if the company closes or technology becomes obsolete? Could data be sold? Would a digital twin “die” again, potentially affecting grieving relatives?

Australian law currently does not protect identity, voice, presence, values or personality. Unlike the US, Australia has no general publicity or personality right. This means Australians do not legally own or control their identity, including their voice, image or likeness.
In short, the law does not recognise proprietary rights in most attributes that make a person unique.

Under copyright law, personal identity or presence is considered abstract, similar to an idea. Copyright protects only material forms such as books or photographs. However, typed responses or voice recordings submitted for AI training are material. The training data could therefore be protected, but fully autonomous AI-generated outputs are unlikely to carry copyright. Under current Australian law, they would probably be regarded as authorless because they originate from machines rather than human intellectual effort.

Moral rights, which protect a creator’s reputation and guard against false attribution or derogatory treatment of their work, are unlikely to apply to digital twins. These rights attach only to works created by human authors, not AI-generated output.

This creates further uncertainty. Companies may claim ownership of AI-generated content through their terms and conditions, grant users limited rights, or retain broad reuse powers. Users must examine these terms carefully.

Beyond legal issues, ethical risks remain. Even if training data is locked after death, others will continue interacting with it. AI systems, which operate probabilistically, may gradually distort responses, reducing resemblance to the original person. There is little clarity about legal recourse if such drift occurs.

The current legal landscape suggests stronger regulation is needed for the expanding grief tech industry. Even with consent, individuals cannot predict how future technologies might repurpose their data.
For now, anyone considering creating a digital afterlife should scrutinise contractual terms carefully. Ultimately, users remain bound by the agreements they sign.

The Conversation


DT Next
www.dtnext.in