TAIPEI, Taiwan — Whenever stress at work builds, Chinese tech executive Sun Kai turns to his mother for support. Or rather, he talks with her digital avatar on a tablet device, rendered from the shoulders up by artificial intelligence to look and sound just like his flesh-and-blood mother, who died in 2018.
“I do not treat [the avatar] as a kind of digital person. I truly regard it as a mother,” says Sun, 47, from his office in China’s eastern port city of Nanjing. He estimates he converses with her avatar at least once a week. “I feel that this might be the most perfect person to confide in, without exception.”
The avatar of Sun’s mother was made by Silicon Intelligence, the company where Sun works as an executive on voice simulation. The Nanjing-based firm is part of a boom of technology startups, in China and around the world, that create AI chatbots using a person’s likeness and voice.
The idea of digitally cloning people who have died is not new, but until recent years it remained the stuff of science fiction. Now, increasingly powerful chatbots like Baidu’s Ernie or OpenAI’s ChatGPT, which have been trained on huge amounts of language data, along with heavy investment in computing power, have enabled private companies to offer affordable digital “clones” of real people.
These companies have set out to prove that relationships with AI-generated entities can become mainstream. For some clients, the digital avatars they produce offer companionship. In China, they have also been spun up to cater to families in mourning who are seeking to create a digital likeness of their lost loved ones, a service Silicon Intelligence dubs “resurrection.”
“Whether she is alive or dead does not matter, because when I think of her, I can find her and talk to her,” says Sun of his late mother, Gong Hualing. “In a sense, she is alive. At least in my perception, she is alive,” says Sun.
The rise of AI simulations of the deceased, or “deadbots” as academics have termed them, raises questions without clear answers about the ethics of simulating human beings, dead or alive.
In the United States, companies like Microsoft and OpenAI have created internal committees to evaluate the behavior and ethics of their generative AI services, but there is no centralized regulatory body in either the U.S. or China for overseeing the impacts of these technologies or their use of a person’s data.
Data remains a bottleneck
Browse Chinese e-commerce sites and you will find dozens of companies selling “digital cloning” and “digital resurrection” services that animate photographs to make them appear to speak, for as little as the equivalent of $2.
Silicon Intelligence’s most basic digital avatar service costs 199 yuan (about $30) and requires less than one minute of high-quality video and audio of the person while they were living.
More advanced, interactive avatars that use generative AI technology to move on screen and converse with a client can cost thousands of dollars.
But there’s a big bottleneck: data, or rather, the lack of it.
“The crucial bit is cloning a person’s thoughts, documenting what a person thought and experienced daily,” says Zhang Zewei, the founder of Super Brain, an AI firm based in Nanjing that also offers cloning services.
Zhang asks clients to describe their foundational memories and important experiences, or those of their loved ones. The company then feeds those stories into existing chatbots to power an AI avatar’s conversations with a client.
(Due to the rise in AI-powered scams using deepfakes of a person’s voice or likeness, both Super Brain and Silicon Intelligence require authorization from the person being digitally cloned, or authorization from family and proof of kin if the person is deceased.)
The most labor-intensive step of generating an avatar of a person is then cleaning up the data they provide, says Zhang. Relatives often hand over low-quality audio and video, marred by background noise or blurriness. Photos depicting more than one person are also no good, he says, because they confuse the AI algorithm.
However, Zhang admits that making a digital clone truly lifelike would require far more data, with clients preparing “at least 10 years” ahead of time by keeping a daily diary.
The scarcity of usable data is compounded when someone unexpectedly dies and leaves behind few notes or videos.
Fu Shou Yuan International Group, a publicly listed Chinese company based in Shanghai that maintains cemeteries and provides funeral services, instead bases its AI avatars primarily on the social media presence a person maintained in life.
“In today's world, the internet probably knows you the best. Your parents or family may not know everything about you, but all your information is online — your selfies, photos, videos,” says Fan Jun, a Fu Shou Yuan executive.
A taboo against death
Fu Shou Yuan is hoping generative AI can lessen the traditional cultural taboo around discussing death in China, where mourning is accompanied by extensive ritual and ceremony, though expressions of daily grief are discouraged.
In Shanghai, the company has built a cemetery, landscaped like a sun-dappled public park, but it’s no ordinary burial ground. This one is digitized: Visitors can hold up a cellphone to scan a QR code placed on select headstones and access a multimedia record of the deceased’s life experiences and achievements.
“If these thoughts and ideas were to be engraved like in ancient times, we would need a vast cemetery like the Eastern Qing tombs for everyone,” Fan says, referring to a large imperial mausoleum complex. “But now, it is no longer necessary. All you might need is a space as small as a cup with a QR code on it.”
Fan says he hopes the experience will better “integrate the physical and the spiritual,” that families will see the digital cemetery as a place to celebrate life rather than a site that invokes fear of death.
So far, fewer than 100 customers have opted to place digital avatars on their loved ones’ headstones.
“For the family members who have just lost a loved one, their first reaction will definitely be a sense of comfort, a desire to communicate with them again,” says Jiang Xia, a funeral planner for the Fu Shou Yuan International Group. “However, to say that every customer will accept this might be challenging, as there are ethical issues involved.”
Nor are Chinese companies the first to attempt digital simulations of the dead. In 2017, Microsoft filed a patent application for simulating virtual conversations with someone who had died, but an executive of the U.S. tech giant later said there was no plan to turn it into a commercial service, calling the idea “disturbing.”
Project December, a platform originally built on OpenAI’s GPT-3 technology, gave several thousand customers the ability to talk with chatbots modeled on their loved ones. OpenAI later terminated the platform’s access to its technology, citing the potential for emotional harm.
Ethicists are warning of potential emotional harm to family members caused by life-like AI clones.
“That is a very big question since the beginning of humanity: What is a good consolation? Can it be religion? Can it be forgetting? No one knows,” says Michel Puech, a philosophy professor at the Sorbonne Université in Paris.
“There is the danger of addiction, and [of] replacing real life. So if it works too well, that’s the danger,” Puech told NPR. “Having too much consoling, too much satisfying experience of a dead person will apparently annihilate the experience, and the grief, of death.” But that experience, Puech says, is in fact largely an illusion.
Most people who have decided to digitally clone their loved ones are quick to admit every person grieves differently.
Sun Kai, the Silicon Intelligence executive who digitally cloned his mother, has deliberately disconnected her digital avatar from the internet, even if it means the chatbot will remain ignorant of current events.
“Maybe she will always remain as the mother in my memory, rather than a mother who keeps up with the times,” he tells NPR.
Others are more blunt.
“I do not recommend this for some people who might see the avatar and feel the full intensity of grief again,” says Yang Lei, a resident of the eastern city of Nanjing, who paid a company to create a digital avatar for his deceased uncle.
Low-tech solutions to high-tech problems
When Yang’s uncle passed away, he feared the shock would kill his ailing, elderly grandmother. Instead of telling her about her son’s death, Yang sought to create a digital avatar that was realistic enough to make video calls with her to maintain the fiction that her son was still alive and well.
Yang says he grew up with his uncle, but their relationship became more distant after his uncle left their village looking for work in construction.
After his uncle’s death, Yang struggled to unearth more details of his life.
“He had a pretty straightforward routine, as most of their work was on construction sites. They work there and sleep there, on site. Life was quite tough,” Yang says. “It was just a place to make money, nothing more, no other memories.”
Yang scrounged through family group chats on various social media apps on his own phone and came up with enough voice messages and video of his late uncle to create a workable digital clone of his likeness. But there was no getting around how few personal records and social media accounts, and thus how little data, his uncle had left behind.
Then Yang hit upon a more low-tech solution: What if a company employee pretended to be his uncle but disguised their face and voice with the AI likeness of his uncle?
In spring 2023, Yang put his plan into motion, though he later came clean with his grandmother once her health improved.
The experience has left Yang contemplating his own mortality. He says he is definitely going to clone himself digitally in advance of his death. However, doing so would not create another living version of himself, he cautioned, nor would such a digital avatar ever replace human life.
“Do not overthink it,” he cautions. “An AI avatar is not the same as the human it replaced. But when we lose our flesh and blood body, at least AI will preserve our thoughts.”
Aowen Cao contributed research from Nanjing, China.
Transcript
A MARTÍNEZ, HOST:
Since ChatGPT launched in 2022, we've been debating how generative artificial intelligence is changing life. Now, in China, it's also changing the way people approach death. Here's NPR's Emily Feng.
EMILY FENG, BYLINE: In 2018, Sun Kai's mother suddenly died. He was devastated. He could not accept he'd never see her again - or could he? At the time of her death, Sun, who is a tech executive, was working on modeling human voices.
SUN KAI: (Through interpreter) Then I thought, if I'm modeling voices, why not model my mom's likeness as well? I raised this question with the company chairman.
UNIDENTIFIED PERSON: (Non-English language spoken).
FENG: After weeks of fine-tuning, they managed to create this AI rendering of his mother, which Sun says he now talks to every day.
SUN: (Through interpreter) I don't see her as a digital avatar but as my real mother. When work pressure ramps up, I just want to talk to her. There are some things you can only tell your mother.
FENG: Recreating qualities of our lost loved ones with technology is not a new idea. But it has become more affordable and more common. Silicon Intelligence, the company Sun works for, says it can create a basic avatar for as little as $30 and with just a few seconds of video. The avatars look and talk almost exactly like the real person. But AI companies do still struggle with simulating the personality and life experiences of people. Here's Zhang Zewei, the founder of Super Brain, another company. It offers what it calls a resurrection service.
ZHANG ZEWEI: (Through interpreter) The crucial bit is cloning a person's thoughts, documenting what a person thought and experienced daily.
FENG: For an AI avatar to be truly generative and to chat like a person, Zhang admits it would take an estimated 10 years of prep to gather data and to take notes on a person's life. In fact, although generative AI is progressing, the desire to remember our lost loved ones usually outpaces the technology we have and our regulation. Chinese AI firms only allow people to digitally clone themselves or for family members to clone the deceased. But ethicists are already warning about the unforeseen emotional impacts this could have.
MICHEL PUECH: The wish to be remembered is not new. But is it a good thing? I'm not sure.
FENG: This is Michel Puech, a philosophy professor at the Sorbonne University in Paris. He cautions against overhyping the ability of AI technology currently to go beyond what existing technology already does. For example, looking at a photograph or hearing a recording of a dead loved one's voice evokes memories, just as AI clones aim to do.
PUECH: So it's just a better technology to deal with something we already do.
FENG: But Puech says the uncanny realism of AI clones or avatars carries an extra risk: the risk of addiction to a simulation rather than true consolation from grief.
PUECH: And replacing real life. So if it works too well, that's the danger. So in a sense, having too much satisfying experience of that person will apparently annihilate the grief of death. But in fact, it's largely an illusion.
FENG: And AI avatars raise as-yet unanswered regulatory questions, such as: Is it OK to destroy an avatar based on a real person? Whose data can be used, and under what circumstances? Yang Lei, a resident of the eastern city of Nanjing, faced these questions when his uncle suddenly died.
YANG LEI: (Speaking Mandarin).
FENG: But as he explains it, he wanted to hide the news from his ailing grandmother. She had already lost numerous family members.
YANG: (Speaking Mandarin).
FENG: And he feared the shock of yet another death would kill her. So he turned to technology to help him with a little white lie. He asked an AI company to create a video avatar of his uncle.
YANG: (Speaking Mandarin).
FENG: Then Yang hired a real person who, disguised with this digital filter, would then call Yang's grandmother on holidays pretending to be her son. Yang eventually did tell his grandma her son had died, but only when she was in better health. He says he does not regret lying to his grandmother, and he says he would definitely consider digitally cloning himself in the future.
YANG: (Speaking Mandarin).
FENG: But he says he does not believe an AI clone is the same thing as the human it replaced.
Emily Feng, NPR News.
(SOUNDBITE OF INSTUPENDO SONG, "FLEUR (FEAT. TEEN DAZE)") Transcript provided by NPR, Copyright NPR.