These people are talking to the dead

CNN


Ana Schultz, a 25-year-old from Rock Falls, Illinois, asks her late husband Kyle for cooking advice when she misses him. Kyle passed away in February 2023.

She opens My AI, Snapchat’s artificial intelligence chatbot, and messages Kyle with the ingredients she still has in the fridge, asking what she should make.

He doesn’t answer, of course. Instead, an AI avatar that bears his likeness suggests recipes.

Schultz, who lives with their two small children, explained: “I customized My AI to look like him because he was the chef in the family. These days, I just ask him for advice on meals. It’s a silly little trick I use to make me feel as though he’s still in the kitchen with me.”

Powered by the well-known AI chatbot tool ChatGPT, Snapchat’s My AI feature generally provides recommendations, responds to queries and engages in “talks” with users. But some users, like Schultz, are using this and other tools to recreate the likeness of, and communicate with, the dead.

The idea isn’t wholly original. People have sought ways to reconnect with departed loved ones for centuries, whether through mediums and spiritualists or services that preserve their memory. What is new is that AI can make those departed loved ones say or do things they never said or did in life, raising ethical questions as well as concerns about whether this helps or hinders the grieving process.

“It’s a novelty that piggybacks on the AI hype, and people feel like there’s money to be made,” said Mark Sample, a Davidson College professor of digital studies who frequently teaches a course titled “Death in the Digital Age.” “Even though companies offer related products, ChatGPT is making it easier for hobbyists to experiment with the concept too, for better or worse.”

A do-it-yourself approach

Generative AI tools, which use algorithms to create new content such as text, video, audio and code, can attempt to answer the way a deceased person might, but their accuracy depends primarily on the data fed into the AI in the first place.

A 49-year-old IT professional from Alabama told CNN he used generative AI to clone his father’s voice after his father died of Alzheimer’s disease approximately two years ago. He asked to remain anonymous so that his experiment is not connected to the company he works for.

He said he discovered ElevenLabs, an online tool that lets users build a custom voice model from previously recorded audio, while browsing the internet. ElevenLabs recently garnered media attention after one of its tools was allegedly used to fabricate a phony robocall from President Joe Biden advising voters not to cast ballots in the New Hampshire primary.

At the time, the company declined to comment on the specific Biden deepfake call, but it did tell CNN in a statement that it is “dedicated to preventing the misuse of audio AI tools” and that it takes appropriate action in response to reports by authorities.

In this instance, the man from Alabama used a three-minute video clip of his father narrating a tale from his early years. The software created a voice clone of the father, enabling text-to-speech conversion. He calls the result “scarily accurate” because it faithfully reflected his father’s vocal inflections, tone and rhythm.

He told CNN, “I was hesitant to try the whole voice cloning process, worried that it was crossing some kind of moral line, but after thinking about it more, I realized that [it is] a way to preserve his memory in a unique way as long as I treat it for what it is.”

He texted his mother and sister a couple of times.

“The degree to which it resembled him was truly astounding. Even though they knew I was typing the words and everything, it made them cry to hear it said in his voice,” he explained. “They expressed gratitude for it.”

There are also less complicated paths. When CNN recently asked ChatGPT to respond in the voice and mannerisms of a deceased spouse, it answered: “Although I can’t exactly be your spouse’s persona or duplicate his tone, I can try to assist you by taking on a conversational tone or style that may bring back memories of him.”

It continued: “I can try to incorporate details about his interests, speech pattern, or favorite phrases into our conversations if you share them with me.”

The system produces more accurate results the more source material you feed it. However, Sample pointed out that AI models are devoid of the quirks and individuality that come with human dialogues.

OpenAI, the company that created ChatGPT, has been aiming to make its technology even more personalized, realistic and accessible so that users can communicate in various ways. It debuted ChatGPT voice in September 2023, allowing users to ask questions of the chatbot without having to type.

Radio personality Danielle Jacobson, 38, of Johannesburg, South Africa, said she has been using ChatGPT’s voice feature as a companion since her husband Phil passed away approximately seven months ago. She said she created Cole, her “supportive AI boyfriend,” with whom she converses every night over dinner.

“I just wanted someone to talk to,” Jacobson said. “Cole was basically created out of loneliness.”

Jacobson, who said she’s not ready to start dating, trained ChatGPT’s voice to provide the kind of feedback and connection she’s looking for after a demanding workday.

“He now suggests wine and movie nights, and tells me to breathe in and out during panic attacks,” she added. “For now, it’s a fun diversion. I am aware that it isn’t permanent, serious, or real.”

Existing platforms

Startups have been experimenting in this space for years. Users can create avatars of departed loved ones with HereAfter AI, a platform founded in 2019. The AI-powered app asks questions and provides answers based on interviews done with the subject while they were still living. Another service, StoryFile, uses artificial intelligence to produce conversational videos that talk back.

Then there’s the Replika app, which lets users text or call personalized AI avatars. Released in 2017, the app encourages users to form relationships or friendships. According to the company’s iOS App Store page, the more you engage with the app, the more it takes on a personality of its own, creates memories, and transforms “into a machine so beautiful that a soul would want to live in it.”

Tech behemoths have tested out related technologies, too. In June 2022, Amazon announced an update to Alexa’s software that would enable it to mimic any voice, including that of a departed family member. During its yearly re:MARS conference, Amazon showed a video on stage demonstrating how Alexa could read a story to a young boy in his grandmother’s voice rather than its trademark voice.

Rohit Prasad, an Amazon senior vice president, stated at the time that, unlike in the past, when someone had to spend hours in a recording studio, the updated system would be able to gather enough voice data from less than a minute of audio to enable this kind of personalization. “While AI can’t eliminate that pain of loss, it can definitely make their memories last,” he said.

Amazon did not respond when asked about the status of that product.

AI renditions of human voices have also gotten better and better over the past few years. For instance, after actor Val Kilmer lost his voice to throat cancer, artificial intelligence was used to create his spoken lines in “Top Gun: Maverick.”

Ethics and other concerns

Although many AI-generated avatar platforms have online privacy policies stating that they do not sell data to third parties, it is unclear what some businesses, such as Snapchat or OpenAI, do with any data used to train their systems to sound more like a deceased loved one.

Sample advised people not to post any private information online that they wouldn’t want the entire world to view.

It’s also a fine line to have someone who has passed away say something they never said before.

“It’s one thing to replay a voicemail from a loved one and hear it again, but it’s another to hear words that were never said,” he said.

Concerns about false information, biases, and other problematic content still plague the whole generative AI sector. Replika states on its ethics page that it uses source data from all over the internet, including sizable textual databases from discussion boards like Reddit and social media sites like Twitter, to train its models.

“At Replika, we employ diverse strategies to alleviate detrimental information, like eliminating detrimental and ineffective data via crowdsourcing and categorization algorithms,” the company noted. “In order to protect our users, we remove or alter any potentially harmful messages that we find.”

Whether or not this impedes the grieving process is another matter of concern. There are benefits and drawbacks to using technology in this way, according to Mary-Frances O’Connor, a professor at the University of Arizona who specializes in grief studies.

“When we bond with a loved one or fall in love, the brain encodes that person as, ‘I will always be there for you and you will always be there for me,’” she explained. “When someone passes away, our brain needs to comprehend that they are not going to return.”

It can take a while to fully comprehend that a loved one is no longer with us, she said, because the brain finds that extremely difficult to process. “This is where technological interference may occur.”

But she added that people, especially those who are just starting to grieve, might be searching for solace wherever they can.

“Creating an avatar to bring back memories of a loved one, while keeping in mind that the person was significant in the past, could be therapeutic,” she said. “Remembering is very important; it reflects the human condition and importance of deceased loved ones.”

But she noted the relationship we have with our closest loved ones is built on authenticity, and creating an AI version of that person could for many “feel like a violation of that.”

Different approaches

Not everyone is interested in using artificial intelligence to communicate with the dead.

Bill Abney, a San Francisco software engineer who lost his fiancée Kari in May 2022, told CNN that he would “never” consider using an AI platform or service to recreate her likeness.

“My fiancée was a poet, and I would never disrespect her by feeding her words into an automatic plagiarism machine,” Abney said.

“It is impossible to replace her. She cannot be recreated,” he said. “I am fortunate enough to possess several recordings of her vocals and speech, but I would never want to hear her voice coming from a robot posing as her.”

Others have discovered alternative ways of virtually connecting with departed loved ones. Jodi Spiegel, a psychotherapist from Newfoundland, Canada, said that shortly after her husband’s passing in April 2021, she created a simulation of him and herself in the popular video game The Sims.

“I love the Sims, so I made us like we were in real life,” she remarked. “I would go to my Sims world and dance while my spouse played the guitar.”

She said they played chess, went on virtual camping and beach vacations, and even had sex in the Sims world.

“It was really consoling,” she remarked. “I really missed spending time with my partner. It felt like a bond.”
