AI-driven "digital afterlives" risk deception, privacy violations and exploitation

CO-EDP, VisionRI | Updated: 02-12-2025 14:47 IST | Created: 02-12-2025 14:47 IST

Artificial intelligence (AI) is reshaping not only daily life but the way societies grieve, remember, and interact with the dead. A new academic investigation warns that AI-driven recreations of deceased individuals, now appearing in homes, courtrooms, media, and even political activism, are rapidly outpacing ethical, legal, and social safeguards. The findings point to a fast-growing, largely unregulated field that is already influencing mourning practices and public discourse across countries.

The study, titled “The Making of Digital Ghosts: Designing Ethical AI Afterlives” and published on arXiv, examines the rise, risks, and governance challenges of AI-mediated posthumous simulations. The paper traces how digital ghosts, griefbots, and AI avatars trained on personal data are emerging not as fringe experiments but as mainstream technologies shaping how families cope with loss and how institutions handle memory, justice, and identity.

Digital ghosts move from fiction to reality as families and companies recreate the dead

The authors document how AI systems trained on social media posts, text messages, emails, voice notes, images, and videos can now generate interactive recreations of deceased individuals. These digital ghosts appear in several forms, from text-based chatbots that mimic a person’s writing style to voice clones that speak in a familiar tone and video avatars capable of lifelike movement. The paper shows that what was once a science-fiction concept has quickly become commercial reality.

Early experiments such as the “Roman bot” in 2016 demonstrated how digital traces could recreate a friend’s conversational voice and mannerisms after death. In subsequent years, services like HereAfter AI and StoryFile let families build lifelike audio and video memorials. By the mid-2020s, new platforms began offering low-cost, even fully automated, resurrection tools. In China, basic grief avatars could be generated for the price of a snack, putting digital afterlife services within reach of millions.

The report also traces a growing global appetite for more advanced simulations. Some families commission high-fidelity digital reconstructions to process grief, while companies see growing commercial opportunities in memorialization. AI startups now offer interactive avatars of terminally ill individuals who proactively record their memories before death, creating pre-mortem digital twins intended to comfort families in the future.

The technology’s reach has also extended beyond personal grief. Public and institutional cases have captured global attention, with digital ghosts appearing in activism, media interviews, and even legal proceedings. The study highlights an unprecedented case in which an Arizona courtroom permitted an AI-generated avatar of a murder victim to deliver a posthumous victim-impact statement during sentencing, the first time an AI simulation addressed a court on behalf of the deceased, and a sign of how rapidly digital ghosts are entering public institutions.

The authors argue that such cases show how digital resurrection is no longer a private or cultural practice but a new form of participation in societal, political, and judicial spaces. These developments, they warn, introduce profound ethical tensions that must be addressed before the technology becomes deeply entrenched.

Ethical tensions intensify: Grief, deception, privacy, dignity, and commercialization collide

The paper identifies a set of core ethical tensions emerging as digital ghosts gain traction across societies. These tensions stem from the ways people interact with AI simulations of loved ones and the consequences of recreating a deceased person’s presence.

One key tension involves the psychological impact on grieving individuals. The study notes that many mourners find comfort in the chance to revisit conversations that feel familiar or emotionally grounding. Some users experience a sense of gradual healing when interacting with digital avatars that echo a loved one’s personality. However, the researchers caution that prolonged engagement can also trap individuals in extended mourning, create dependency, or prevent acceptance of death.

Another challenge arises around truthfulness. While users may intellectually understand that an AI avatar is a simulation, there is a risk that emotional attachment will blur cognitive boundaries. The paper stresses that digital ghosts can produce new responses never spoken by the deceased, raising questions about whether these outputs misrepresent the person’s true identity or worldview. The authors warn that if users begin relying on the avatar for guidance or emotional support, the line between memory and deception becomes fragile.

Consent and posthumous privacy form another major area of concern. Most deceased individuals never consented to having their digital remains used to create interactive replicas. The authors highlight that personal data is often repurposed without permission, either through scraping public platforms or by allowing third parties, typically family members, to upload saved messages. Current legal frameworks offer little guidance on ownership of digital remains, and the paper argues that unregulated reconstruction risks violating autonomy, dignity, and personal boundaries.

The study also warns of dignity breaches and identity distortion. Because AI systems generate novel responses, avatars may say things that contradict the deceased’s beliefs. This can undermine the integrity of a person’s legacy, especially if simulations appear in public contexts or are used for advocacy or political messaging.

A final issue involves commercialization. The paper documents a growing digital death industry where companies monetize grief by offering resurrection services for recurring fees. The authors caution that this market-driven model risks exploitation, as mourners may be pressured into long-term engagement or upsold premium features. They also warn of more predatory possibilities, including unsolicited creation of avatars marketed to families or the potential use of digital ghosts in manipulative advertising or fraud.

Together, these tensions underscore the need for clearer ethical boundaries and regulatory intervention.

A new framework for ethical digital afterlives seeks to address emerging risks

To clarify the risks and guide future governance, the study introduces a nine-dimensional taxonomy of digital afterlife technologies. This taxonomy categorizes systems based on timing of creation, consent, data sources, interaction type, fidelity, purpose, audience, governance, and behavioral agency. The authors argue that these variables determine the ethical footprint of any digital ghost and must be considered in regulation and design.

Using this framework, the study differentiates between pre-mortem legacy avatars, post-mortem griefbots, celebrity simulations, institutional avatars, and emerging speculative forms. Each category carries its own spectrum of risks. For example, legacy avatars created by individuals before death offer clearer consent and data boundaries, whereas post-mortem griefbots built from digital traces without authorization raise significant ethical red flags.
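To make the framework concrete, the sketch below encodes the nine dimensions in Python and uses them to contrast two of the categories the study differentiates, a pre-mortem legacy avatar and an unauthorized post-mortem griefbot. The class name, enum values, and field types are illustrative assumptions; the article names only the dimensions themselves, not any schema from the paper.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Timing(Enum):
    PRE_MORTEM = auto()   # created with the subject's participation before death
    POST_MORTEM = auto()  # reconstructed from digital traces after death

class Consent(Enum):
    EXPLICIT = auto()     # the deceased opted in while alive
    PROXY = auto()        # family or estate authorized the recreation
    NONE = auto()         # no authorization; data scraped or repurposed

@dataclass(frozen=True)
class DigitalGhostProfile:
    """One system described along the nine dimensions named in the study."""
    timing: Timing
    consent: Consent
    data_sources: tuple[str, ...]   # e.g. ("voice notes", "text messages")
    interaction_type: str           # "text chat", "voice", "video avatar"
    fidelity: str                   # "stylized" through "lifelike"
    purpose: str                    # "remembrance", "entertainment", ...
    audience: str                   # "family only", "public", ...
    governance: str                 # who controls the system
    behavioral_agency: str          # "replay only" through "generates novel responses"

# A pre-mortem legacy avatar: clearer consent and data boundaries.
legacy_avatar = DigitalGhostProfile(
    timing=Timing.PRE_MORTEM,
    consent=Consent.EXPLICIT,
    data_sources=("recorded interviews",),
    interaction_type="video avatar",
    fidelity="lifelike",
    purpose="remembrance",
    audience="family only",
    governance="deceased's estate",
    behavioral_agency="replay only",
)

# A post-mortem griefbot built from scraped traces: the red-flag case.
unauthorized_griefbot = DigitalGhostProfile(
    timing=Timing.POST_MORTEM,
    consent=Consent.NONE,
    data_sources=("social media posts", "text messages"),
    interaction_type="text chat",
    fidelity="stylized",
    purpose="companionship",
    audience="subscribers",
    governance="commercial platform",
    behavioral_agency="generates novel responses",
)
```

Laying the two profiles side by side shows why the authors treat the dimensions jointly: it is the combination of absent consent, commercial governance, and open-ended behavioral agency, rather than any single attribute, that marks the second system as ethically fraught.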

The paper then outlines the characteristics of an ethically acceptable digital ghost. According to the authors, such systems should be grounded in explicit pre-mortem consent, transparent data use, and clear disclosure that the simulation is artificial. They should serve limited purposes connected to remembrance rather than entertainment, political messaging, or legal substitution. Access should be controlled, and governance should lie with the deceased’s estate or family, not commercial platforms. Crucially, the avatar’s behavioral agency should be kept minimal to avoid generating new statements that distort identity or misrepresent the person.
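Read as a checklist, these criteria lend themselves to mechanical screening. The following sketch turns each criterion into a boolean check over a simple profile; the dictionary keys, the purpose whitelist, and the pass/fail logic are assumptions for illustration, since the paper specifies principles rather than an algorithm.

```python
# Assumed whitelist of purposes the authors describe as acceptable.
ACCEPTED_PURPOSES = {"remembrance", "memorial"}

def ethical_red_flags(profile: dict) -> list[str]:
    """Return the criteria from the study that this digital ghost fails."""
    flags = []
    if not profile.get("explicit_premortem_consent", False):
        flags.append("no explicit pre-mortem consent")
    if not profile.get("transparent_data_use", False):
        flags.append("data provenance not disclosed")
    if not profile.get("disclosed_as_artificial", False):
        flags.append("simulation not clearly disclosed as artificial")
    if profile.get("purpose") not in ACCEPTED_PURPOSES:
        flags.append("purpose extends beyond remembrance")
    if not profile.get("access_controlled", False):
        flags.append("access is not restricted")
    if profile.get("governed_by") not in {"estate", "family"}:
        flags.append("governance lies with a commercial platform")
    if profile.get("behavioral_agency", "high") != "minimal":
        flags.append("avatar can generate novel, identity-distorting statements")
    return flags

# Example: a subscription griefbot run by a platform fails most checks.
print(ethical_red_flags({
    "explicit_premortem_consent": False,
    "transparent_data_use": True,
    "disclosed_as_artificial": True,
    "purpose": "companionship",
    "access_controlled": False,
    "governed_by": "platform",
    "behavioral_agency": "generative",
}))
```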

These design recommendations reflect broader ethical principles surrounding autonomy, privacy, dignity, and emotional safety. The authors note that many cultures treat the memory and identity of the deceased with strict moral norms, and digital ghosts challenge these traditions by transforming remembrance into dynamic, interactive simulation rather than static memorialization.

  • FIRST PUBLISHED IN: Devdiscourse