TECHNOLOGY: THE TALKING DEAD

Published August 4, 2024

Imagine a future where your phone pings with a message that your dead father’s “digital immortal” bot is ready. This promise of chatting with a virtual version of your loved one — perhaps through a virtual reality (VR) headset — is like stepping into a sci-fi movie, both thrilling and a bit eerie.

As you interact with this digital dad, you find yourself on an emotional rollercoaster. You uncover secrets and stories you never knew, changing how you remember the real person.

This is not a distant, hypothetical scenario. The digital afterlife industry is rapidly evolving. Several companies promise to create virtual reconstructions of deceased individuals, based on their digital footprints.

From artificial intelligence (AI) chatbots and virtual avatars to holograms, this technology offers a strange blend of comfort and disruption. It may pull us into deeply personal experiences that blur the lines between past and present, memory and reality.

As the digital afterlife industry grows, it raises significant ethical and emotional challenges. These include concerns about consent, privacy and the psychological impact on the living.

What is the digital afterlife industry?

VR and AI technologies are making virtual reconstructions of our loved ones possible. Companies in this niche industry use data from social media posts, emails, text messages and voice recordings to create digital personas that can interact with the living.

Although the market is still small, the number of players in it is growing.

HereAfter allows users to record stories and messages during their lifetime, which can then be accessed by loved ones posthumously. MyWishes offers the ability to send pre-scheduled messages after death, maintaining a presence in the lives of the living.

Hanson Robotics has created robotic busts that interact with people using the memories and personality traits of the deceased. 

Project December grants users access to so-called “deep AI” to engage in text-based conversations with those who have passed away.

Generative AI also plays a crucial role in the digital afterlife industry. These technologies enable the creation of highly realistic and interactive digital personas. But the high level of realism may blur the line between reality and simulation. This may enhance the user experience, but may also cause emotional and psychological distress.

A technology ripe for misuse

Digital afterlife technologies may aid the grieving process by offering continuity and connection with the deceased. Hearing a loved one’s voice or seeing their likeness may provide comfort and help process the loss.

For some of us, these digital immortals could be therapeutic tools. They may help us to preserve positive memories and feel close to loved ones, even after they have passed away.

But for others, the emotional impact may be profoundly negative, exacerbating grief rather than alleviating it. AI recreations of loved ones can cause psychological harm if the bereaved end up having unwanted interactions with them, essentially being subjected to a “digital haunting.”

Other major issues and ethical concerns surrounding this tech include consent, autonomy and privacy.

For example, the deceased may not have agreed to their data being used for a “digital afterlife.”

There’s also the risk of misuse and data manipulation. Companies could exploit digital immortals for commercial gain, using them to advertise products or services.

Digital personas could be altered to convey messages or behaviours the deceased would never have endorsed.

We need regulation

To address concerns around this quickly emerging industry, our legal frameworks need updating to cover issues such as digital estate planning, who inherits the digital personas of the deceased, and ownership of digital memories.

The European Union’s General Data Protection Regulation (GDPR) recognises post-mortem privacy rights, but faces challenges in enforcement.

Social media platforms control access to deceased users’ data, often against the wishes of heirs, and clauses such as “no right of survivorship” complicate matters further. These platform practices limit the GDPR’s effectiveness. Comprehensive protection demands a reevaluation of contractual rules so they align with human rights.

The digital afterlife industry offers comfort and memory preservation, but it also raises ethical and emotional concerns. Thoughtful regulation and ethical guidelines can honour both the living and the dead, and ensure digital immortality enhances our humanity.

What can we do?

Researchers have recommended several ethical guidelines and regulations. Some recommendations include: obtaining informed and documented consent from people, before they die, for the creation of their digital personas; age restrictions to protect vulnerable groups; clear disclaimers to ensure transparency; and strong data privacy and security measures.

Drawing on ethical frameworks from archaeology, a 2018 study suggested treating digital remains as integral to personhood, and proposed regulations to ensure they are handled with dignity, especially by re-creation services.

Dialogue between policymakers, industry and academics is crucial for developing ethical and regulatory solutions. Providers should also offer ways for users to respectfully terminate their interactions with digital personas.

Through careful, responsible development, we can create a future where digital afterlife technologies meaningfully and respectfully honour our loved ones.

As we navigate this brave new world, it is crucial to balance the benefits of staying connected with our loved ones against the potential risks and ethical dilemmas.

By doing so, we can make sure the digital afterlife industry develops in a way that respects the memory of the deceased and supports the emotional well-being of the living.

The writer is Associate Professor of Digital Strategy and Data Science at Monash University in Australia

Republished from The Conversation

Published in Dawn, EOS, August 4th, 2024
