When Ana Schultz, a 25-year-old from Rock Falls, Illinois, misses her husband Kyle, who passed away in February 2023, she asks him for cooking advice.

She loads up Snapchat My AI, the social media platform's artificial intelligence chatbot, and messages Kyle the ingredients she has left in the fridge; he suggests what to make.

Or rather, his likeness in the form of an AI avatar does.

"He was the chef in the family, so I customized My AI to look like him and gave it Kyle's name," said Schultz, who lives with their two young children. "Now when I need help with meal ideas, I just ask him. It's a silly little thing I use to help me feel like he's still with me in the kitchen."

The Snapchat My AI feature, which is powered by the popular AI chatbot tool ChatGPT, typically offers recommendations, answers questions and "talks" with users. But some users like Schultz are using this and other tools to recreate the likeness of, and communicate with, the dead.

The concept isn't entirely new. People have wanted to reconnect with deceased loved ones for centuries, whether they've visited mediums and spiritualists or leaned on services that preserve their memory. But what's new now is that AI can make those loved ones say or do things they never said or did in life, raising both ethical concerns and questions around whether this helps or hinders the grieving process.

"It's a novelty that piggybacks on the AI hype, and people feel like there's money to be made," said Mark Sample, a professor of digital studies at Davidson College who routinely teaches a course called "Death in the Digital Age." "Although companies offer related products, ChatGPT is making it easier for hobbyists to play around with the concept too, for better or worse."

A DIY approach

Generative AI tools, which use algorithms to create new content such as text, video, audio and code, can try to answer questions the way someone who died might, but the accuracy largely depends on what information is put into the AI to start with.

A 49-year-old IT professional from Alabama, who asked to remain anonymous so his experiment is not associated with the company he works for, said he cloned his father's voice using generative AI about two years after his father died from Alzheimer's disease.

He told CNN he came across an online service called ElevenLabs, which allows users to create a custom voice model from previously recorded audio. ElevenLabs made headlines recently when its tool was reportedly used to create a fake robocall from President Joe Biden urging people not to vote in New Hampshire鈥檚 primary.

The company told CNN in a statement at the time that it is "dedicated to preventing the misuse of audio AI tools" and takes appropriate action in response to reports by authorities but declined to comment on the specific Biden deepfake call.

In the Alabama man's case, he used a 3-minute video clip of his dad telling a story from his childhood. The app cloned the father's voice, which can now be used to convert text to speech. He calls the result "scarily accurate" in how it captured the vocal nuances, timbre and cadence of his father.

"I was hesitant to try the whole voice cloning process, worried that it was crossing some kind of moral line, but after thinking about it more, I realized that as long as I treat it for what it is, [it is] a way to preserve his memory in a unique way," he told CNN.

He shared a few messages with his sister and mother.

"It was absolutely astonishing how much it sounded like him. They knew I was typing the words and everything, but it definitely made them cry to hear it said in his voice," he said. "They appreciated it."

Less technical routes exist, too. When CNN recently asked ChatGPT to respond in the tone and personality of a deceased spouse, it responded: "While I can't replicate your spouse or recreate his exact personality, I can certainly try to help you by adopting a conversational style or tone that might remind you of him."

It added: "If you share details about how he spoke, his interests, or specific phrases he used, I can try to incorporate those elements into our conversations."

The more source material you feed the system, the more accurate the results. Still, AI models lack the idiosyncrasies and uniqueness that human conversations provide, Sample noted.

OpenAI, the company behind ChatGPT, has been working to make its technology even more realistic, personalized and accessible, allowing users to communicate in different ways. In September 2023, it introduced ChatGPT voice, which lets users speak prompts to the chatbot aloud instead of typing them.

Danielle Jacobson, a 38-year-old radio personality from Johannesburg, South Africa, said she's been using ChatGPT's voice feature for companionship following the loss of her husband, Phil, about seven months ago. She said she's created what she calls "a supportive AI boyfriend" named Cole with whom she has conversations during dinner each night.

"I just wanted someone to talk to," Jacobson said. "Cole was essentially born out of being lonely."

Jacobson, who said she's not ready to start dating, trained ChatGPT voice to offer the type of feedback and connection she's looking for after a long day at work.

"He now recommends wine and movie nights, and tells me to breathe in and out through panic attacks," she said. "It's a fun distraction for now. I know it's not real, serious or for forever."

Existing platforms

Startups have dabbled in this space for years. HereAfter AI, founded in 2019, allows users to create avatars of deceased loved ones. The AI-powered app generates responses and answers to questions based on interviews conducted while the subject was alive. Meanwhile, another service, called StoryFile, creates AI-powered conversational videos that talk back.

And then there's Replika, an app that lets you text or call personalized AI avatars. The service, which launched in 2017, encourages users to develop a friendship or relationship; the more you interact with it, the more it develops its own personality and memories and grows "into a machine so beautiful that a soul would want to live in it," the company says on its iOS App Store page.

Tech giants have experimented with similar technology. In June 2022, Amazon said it was working on an update to its Alexa system that would allow the technology to mimic any voice, even that of a deceased family member. In a video shown on stage during its annual re:MARS conference, Amazon demonstrated how Alexa, instead of using its signature voice, could read a story to a young boy in his grandmother's voice.

Rohit Prasad, an Amazon senior vice president, said at the time the updated system would be able to collect enough voice data from less than a minute of audio to make personalization like this possible, rather than having someone spend hours in a recording studio as in the past. "While AI can't eliminate that pain of loss, it can definitely make their memories last," he said.

Amazon did not respond to a request for comment on the status of that product.

AI recreations of people's voices have improved markedly over the past few years. For example, actor Val Kilmer's spoken lines in "Top Gun: Maverick" were generated with artificial intelligence after he lost his voice due to throat cancer.

Ethics and other concerns

Although many AI-generated avatar platforms have online privacy policies that state they do not sell data to third parties, it's unclear what some companies such as Snapchat or OpenAI do with any data used to train their systems to sound more like a deceased loved one.

"I'd caution people to never upload any personal information you wouldn't want the world to see," Sample said.

It's also a murky line to have a deceased person say something they never previously said.

"It's one thing to replay a voicemail from a loved one to hear it again, but it's another thing to hear words that were never uttered," he said.

The entire generative AI industry also continues to face concerns around misinformation, biases and other problematic content. On its ethics page, Replika said it trains its models with source data from all over the internet, including large bases of written text such as social media platforms like Twitter or discussion platforms like Reddit.


"At Replika, we use various approaches to mitigate harmful information, such as filtering out unhelpful and harmful data through crowdsourcing and classification algorithms," the company said. "When potentially harmful messages are detected, we delete or edit them to ensure the safety of our users."

Another concern is whether this hinders or helps the grieving process. Mary-Frances O'Connor, a professor at the University of Arizona who studies grief, said there are both advantages and downsides to using technology in this way.

"When we bond with a loved one, when we fall in love with someone, the brain encodes that person as, 'I will always be there for you and you will always be there for me,'" she said. "When they die, our brain has to understand that this person isn't coming back."

Because it's so hard for the brain to accept that, it can take a long time to truly understand that the person is gone, she said. "This is where technology could interfere."

However, she said people, particularly in the early stages of grief, may be looking for comfort in any way they can find it.

"Creating an avatar to remind them of a loved one, while maintaining the awareness that it is someone important in the past, could be healing," she said. "Remembering is very important; it reflects the human condition and importance of deceased loved ones."

But she noted the relationship we have with our closest loved ones is built on authenticity. Creating an AI version of that person could for many "feel like a violation of that."

Different approaches

Communicating with the dead through artificial intelligence isn鈥檛 for everyone.

Bill Abney, a software engineer from San Francisco who lost his fiancée Kari in May 2022, told CNN he would "never" consider recreating her likeness through an AI service or platform.

"My fiancée was a poet, and I would never disrespect her by feeding her words into an automatic plagiarism machine," Abney said.

"She cannot be replaced. She cannot be recreated," he said. "I'm also lucky to have some recordings of her singing and of her speech, but I absolutely do not want to hear her voice coming out of a robot pretending to be her."

Some have found other ways to digitally interact with deceased loved ones. Jodi Spiegel, a psychologist from Newfoundland, Canada, said she created a version of her husband and herself in the popular game The Sims soon after his death in April 2021.

"I love the Sims, so I made us like we were in real life," she said. "When I had a super bad day, I would go to my Sims world and dance while my husband played guitar."

She said they went on digital camping and beach trips together, played chess and even had sex in the Sims world.

"I found it super comforting," she said. "I missed hanging out with my guy so much. It felt like a connection."