In the iconic 1990 fantasy film Truly, Madly, Deeply, Nina (Juliet Stevenson) mourns the recent loss of her lover, Jamie (Alan Rickman). Touched by her deep sorrow, Jamie returns as a ghost to guide her through her grief. As anyone who has seen the movie knows, his spectral reappearance nudges her to reassess her memories of him, leading her to realize he may not have been as flawless as she once believed. Flash forward to 2023, where AI-driven “grief tech” offers us the chance to interact with our departed loved ones in various forms. But instead of a benevolent apparition like Jamie, we must rely on artificial intelligence to recreate those we’ve lost. What could go wrong?
AI platforms like ChatGPT and Midjourney dominate the current discourse, yet we seem to be neglecting larger ethical questions around grief and mourning. Pondering the digital afterlife of your loved ones isn’t as palatable as trendier AI topics. And if you think AI replicas of the departed are still far off, you’re mistaken: at least one firm already offers digital afterlife services, and they’re both expensive and eerie.
Consider Re;memory, a service from Deepbrain AI, a Korean company known for its “virtual assistant” screens and AI news anchors. The firm has taken its expertise in chatbots and generative AI video to a somewhat morbid extreme. For $10,000 and a few hours in a studio, you can create an avatar of yourself that your family can then visit at a separate location for an extra fee. The approach draws on Korean mourning traditions, such as “Jesa,” the annual visit to a loved one’s grave.
Presently, even by Deepbrain’s own account, the service doesn’t offer a particularly nuanced replica of a person’s personality; the limited dataset means the avatar can exhibit only one “mood.” As Michael Jung, Business Development and Strategy Lead at Deepbrain, explained to BuyTechBlog, Re;memory isn’t trying to create a perfect copy of a person. Rather, it’s a basic interactive likeness that family can visit occasionally, though one hopes it has more character than a virtual hotel receptionist.
Where Re;memory stands out by offering a video avatar that can respond to queries, the audio-based HereAfter AI aims to capture more of a person’s character through a series of interview questions. The result is an audio chatbot that friends and family can converse with, receiving verbal answers and even stories from the past. The chatbots answer convincingly in their owners’ voices, at least until the illusion shatters when they respond mechanically to a question they can’t answer.
Whether these technologies can create a believable avatar today is not the main issue; AI’s rapid progress will likely smooth over those shortcomings. The harder questions involve who owns the avatar after a person’s death, the security of the memories and data behind it, and the potential impact on those left behind.
Joanna Bryson, Professor of Ethics and Technology at the Hertie School of Governance, likens the current surge of grief tech to the era when Facebook became a hub for young people to commemorate departed friends. She notes that the interactivity AI avatars add only intensifies concerns about their effect on our grieving minds, and she warns that the technology could be exploited well beyond its original intent.
Creating an AI version of a living person would, admittedly, require a vast amount of data and their consent, but that may not be a hurdle for much longer. AI songs mimicking famous artists are already possible, and there may soon be enough public data on ordinary people to feed a generative AI without the need for celebrity status. Microsoft’s VALL-E, for instance, can already clone a voice from just a few seconds of source material.
A related ethical concern stems from the inherent complexity of human beings. There are details about us, often deliberately hidden, that make us whole individuals: our private digital messages may hold secrets about our sexuality, our politics, or even an affair. If we’re not cautious, that data could inadvertently end up in AI training material, revealing those secrets after death.
Even with the recreated individual’s consent, there’s no guarantee that a digital version of you won’t fall into the wrong hands and be misused. Presently, such misuse generally falls into the same legal category as credit card theft, unless it’s used publicly, when other laws may apply. Either way, these protections are usually reserved for the living.
Bryson suggests a possible model for protecting this data may already exist: the locally stored biometric data that unlocks our phones. Still, data will always be at risk, no matter where or how it’s stored. As we edge toward digital immortality, questions of cost, accuracy, and uncanniness loom. Despite these concerns, Bryson emphasizes that this technology forces us to confront our mortality in new ways, which can ultimately help us appreciate the relationships we have now. After all, an AI version of someone will always be an imperfect imitation, so why not deepen our connection with the real person while they’re still here?
All products mentioned by BuyTechBlog are chosen by our editorial team independently of our parent company. Some of our articles may contain affiliate links. If you purchase something through one of these links, we might earn an affiliate commission. Prices are accurate at the time of publication.
Frequently Asked Questions (FAQs) about AI grief technology
What is the concept of digital immortality?
Digital immortality refers to the preservation of a person’s identity and personality through artificial intelligence. It often involves creating a digital avatar that represents the person after their death.
What is Re;memory service by Deepbrain AI?
Re;memory is a service provided by Deepbrain AI that allows users to create an AI-based video avatar of themselves, or of their deceased loved ones, for interaction. The service currently offers basic interaction, with each avatar having one dominant “mood”.
What does HereAfter AI offer?
HereAfter AI provides an audio-based service that uses AI to create chatbots representing individuals. These chatbots answer questions and share stories or anecdotes from the person’s life, providing a more personal interaction than text-based bots.
What are the ethical issues around AI avatars of deceased people?
AI avatars of deceased people raise a number of ethical issues, including data privacy and security, the potential for misuse of the avatars, and possible psychological harm to grieving individuals.
Can AI avatars truly replicate a person’s personality?
As of the time of writing, AI avatars cannot fully replicate a person’s personality. While they can mimic certain aspects, like voice and appearance, capturing the full depth and nuance of a person’s personality is beyond the current capabilities of AI.
What are the potential psychological impacts of interacting with AI avatars of deceased loved ones?
Interacting with AI avatars of deceased loved ones could potentially prolong the grieving process or lead to unhealthy obsession. It could also create confusion and distress if the avatar does not behave in a manner consistent with the deceased person’s personality.
More about AI grief technology
- Digital Immortality and Artificial Intelligence
- Ethical implications of AI and Machine Learning
- The ethics of creating AI versions of the deceased
- The psychological impact of prolonged grief
- Deepbrain AI’s Re;memory service
- HereAfter AI’s website