Do 'grief bots' help mourners deal with loss?

This article originally appeared on Undark.

Several commercial products known as “grief bots” create a simulation of a lost loved one. Built on artificial intelligence that uses large language models, or LLMs, the bots imitate the specific way the deceased spoke, drawing on their emails, text messages, voice recordings and more. The technology is meant to help relatives cope with grief by letting them chat with the bot as if they were talking to the person. But there is little evidence that it actually helps the bereaved come to terms with loss.

People have been using technology to cope with feelings of loss for more than a century. Post-mortem photographs, for example, gave nineteenth-century Victorians a likeness of their dead to remember them by when they could not afford a painted portrait. Recent studies have shown that having a drawing or photograph as a keepsake helps some survivors grieve. Yet researchers are still learning how people grieve and what helps surviving relatives deal with loss.

An approach to grief that focuses on continuing bonds with the deceased suggests that finding closure involves more than letting the person go. Research and clinical practice show that reshaping the bond with someone they have lost can help mourners come to terms with the death. By that reasoning, grief bots could help the bereaved by letting them transform their relationship with their deceased loved one. But a strong, ongoing bond only helps the bereaved if they can make sense of their loss, and an impersonation of the loved one could make it harder for people to do that and accept that the person is gone.

Carla Sofka, a professor of social work at Siena College in upstate New York, is an expert on technology and grief. As the Internet grew in the mid-1990s, she coined the term “thanatechnology” to describe any technology – including digital or social media – that helps a person cope with death, grief and loss. Families and friends might post together on the social media profile of a deceased loved one, or create a website in their memory. Other survivors find comfort in rereading emails from the deceased or listening to their recorded voice messages, and some may do so for years as they come to terms with the intense emotions of loss.


When companies start building AI simulations of the deceased, “they need to talk to the people who think they want this technology” to better create something that meets their needs, Sofka said. Current commercial grief bots target different groups. Seance AI's grief bot, for example, is intended for short-term use to provide a sense of closure, while the company You, Only Virtual – or YOV – promises to keep someone's loved one with them forever, so users “never have to say goodbye.”

But if companies can create convincing simulations of people who have died, Sofka says, it's possible this could change the entire reality of the person who is no longer there. While we can only speculate, this may affect the way people who knew them grieve. As Sofka wrote in an email, “Everyone is different in the way they process grief.” Grief bots can give the bereaved a new tool to cope with grief, or they can create the illusion that the loved one is not gone, forcing mourners to suffer a second loss if they ever want to stop using the bot.

Public health and technology experts like Linnea Laestadius of the University of Wisconsin-Milwaukee worry that grief bots could trap mourners in ongoing online conversations, preventing them from moving on with their lives. Her work on chatbots suggests that humans can form powerful emotional bonds with virtual personas, leaving them dependent on the program for emotional support. Given how difficult it is to predict how such chatbots will affect the way people grieve, Sofka wrote in an email: “It is a challenge for social scientists to develop research questions that capture all possible responses to this new technology.”

That hasn't stopped companies from bringing their products to market. But developing grief bots responsibly isn't just about knowing how to create an authentic bot and then running it, says Wan-Jou She, an assistant professor at the Kyoto Institute of Technology.

She collaborated with Anna Xygkou, a PhD candidate at the University of Kent, and other co-authors on a research project examining how chatbot technologies can be used to support grieving people. They interviewed ten people who used virtual characters created by various apps to cope with the loss of a loved one. Five of the participants chatted with a simulation of the person they lost, while the others used chatbots that took on different roles, such as a friend. Xygkou said the majority of them spoke to the characters for less than a year. “Most of them used it as a transitional phase to overcome grief, in the first phase,” she said, “when the grief is so intense that you can't cope with the loss.” Left to their own devices, these mourners chose a short-term tool to help them cope with loss; they didn't want to recreate a loved one to keep by their side for life. While this research suggests that grief bots may be useful for some bereaved people, more studies will be needed to show that the technology does not harm them – and that it helps beyond this small group.

Plus, the grief bots didn't have to convince anyone that they were human. The users interviewed knew they were talking to a chatbot, and they didn't mind. They suspended their disbelief, Xygkou said, to talk to the bot as if they were talking to their loved one. As anyone who has used LLM-driven chatbots knows, it's easy to get the feeling that there's a real person on the other side of the screen. During the emotional upheaval of losing a loved one, giving in to that fantasy can be especially problematic. That's why, Xygkou said, simulations must make it clear that they are not a person.


Crucially, according to She, chatbots are not currently subject to any regulation, and without it, it is difficult to get companies to prove that their products help users cope with loss. The lack of regulation has already encouraged other chatbot apps to claim they can improve mental health without providing any evidence. As long as these apps categorize themselves as wellness products rather than therapy, the U.S. Food and Drug Administration will not enforce its requirements, including the requirement that apps prove they do more good than harm. While it is unclear which regulatory agency will ultimately be responsible, it is possible that the Federal Trade Commission may handle false or unsubstantiated claims about such products.

Without much evidence, it's uncertain what impact grief bots will have on the way we deal with loss. Usage data does not appear to be public, but She and Xygkou had such difficulty finding participants for their study that Xygkou thinks not many mourners are currently using the technology. That could change as AI continues to spread into our lives. More people may turn to grief bots as the shortage of qualified mental health professionals worsens. People may feel more comfortable talking to computers, or lax oversight may mean that many people don't even know they are talking to a computer in the first place. So far, neither questionable ethics nor enormous costs have stopped companies from using AI every chance they get.

But no matter what comfort a survivor finds in a bot, they should not trust it under any circumstances, She said. When an LLM talks to someone, “it's just predicting what's the next word.”
