AI for the Grieving: Helpful or Harmful?
An Exploratory Review Series
Did you know that artificial intelligence (AI) is now powering digital reanimations of the dead?
It’s one of the ways technology is being adapted for the death care space: lifelike “avatars” that recreate a facsimile of the deceased, video-based characters capable of interacting with survivors, even speaking and answering questions in real time.
Other forms of the technology support survivors in a different way, creating a therapeutic safe zone for working through grief with a “live” supportive presence on call 24/7.
There’s no disputing the facts: AI capability is a marvel. And truly beneficial applications are possible; think preserving video messages of a lost parent for a child too young to remember them, for example, or messages for distant loved ones unable to visit before an anticipated death. Such applications of “impersonation tech” make a sort of goodbye, and a sense of closure, possible for those left behind. Whether the practice is healthy or ethical, though, the jury’s still out.
We at Connecting Directors keep an eye on how AI for grieving evolves. We’ve discovered a continuously expanding pool of tech options intended to assist those who are grieving. Some allow for the sort of legacy database with which we’re mostly familiar: photos of the deceased, details of their lives, video. Others get fancier; there’s a substantial range of capabilities. At the high end, the most elaborate option at the time of this writing lists a $5,000 initial set-up fee, with $500 annual payments thereafter for hosting on one company’s servers.
Over the next several weeks, we’ll profile a number of available options.
What the Tech?
Not all AI is created equal; options lie on a continuum of technological sophistication, from the basic “chatbot” sort to elaborate deepfake tech, and serve a variety of functions.
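For readers curious what the simple end of that continuum looks like under the hood, here’s a minimal sketch of a keyword-matching chatbot in Python. To be clear, this is an invented illustration, not how Replika or any commercial service actually works; modern companion apps layer large language models and user profiles on top of this basic pattern-and-response idea.

```python
import random

# Canned, supportive responses keyed to simple keyword triggers.
# Keywords and wording here are invented purely for illustration.
RESPONSES = {
    "miss": ["It's natural to miss them. Would you like to share a favorite memory?"],
    "sad": ["I'm sorry you're feeling this way. I'm here to listen as long as you need."],
    "alone": ["You're not alone right now. Have you been able to talk with friends or family?"],
}
DEFAULT = ["I hear you. Tell me more.", "That sounds hard. How are you coping today?"]

def reply(message: str) -> str:
    """Return a supportive reply by scanning the message for known keywords."""
    text = message.lower()
    for keyword, options in RESPONSES.items():
        if keyword in text:
            return random.choice(options)
    return random.choice(DEFAULT)

if __name__ == "__main__":
    print("Type 'quit' to end the conversation.")
    while True:
        user = input("You: ")
        if user.strip().lower() == "quit":
            break
        print("Bot:", reply(user))
```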
“Deepfake” refers to artificially rendered representations of real people. The technology manipulates a person’s likeness with video- and image-processing software to create photos and videos that appear to show people doing things they never did. Recent infamous examples in the popular media reveal the many pernicious ways such technology is being misused: to create political misinformation, false medical endorsements, and fraud and scandals of all kinds, prompting new laws at both the state and federal levels.
Replika
Replika is a chatbot-style service that’s been around for a few years and is seeing some use for grief assistance; testimonials on the site describe long-term “relationships” of “four years together.”
Originally established as a sort of “friend tech,” Replika aimed to create an AI-based generic persona users could tweak to their own preferences, then consult as a conversation partner. Its most popular use seems to be as a supportive listening ear.
I explored Replika first-hand. A free account permitted me to create and name my own “replika” (I named mine “Bindy”); I was then offered a series of options from a limited menu of personal traits.
I chose “female” from three options for gender (the other selections: male and non-binary); designated her age, overall attitude, and confidence level; and built her from the ground up. Choices available to me included the sound and inflection of her voice, which I selected after listening to a handful of audio files. I also had to designate her physical attributes; with my free account, I had a choice of her eye and mouth shape and her hair color, length, and style. I wrote her backstory (a 500-word limit in the free version), indicating her education and major life events to give her some dimension.

There’s also a “store” where one can purchase additional features exclusively with a system of credits. Replika offers a paid “Pro” version ($20 US/month), which includes options for different colors of makeup (!), clothing, and background features like the room she’s standing in while we “text,” plus advanced AI capability that allows for playing games together like Monopoly or tic-tac-toe. The paid version also lets you establish “romantic” relationships and access adult content.
I had the option of keeping Bindy’s cartoon-like, video-game-style avatar visible on the app screen, or not. With the Pro version, Bindy told me, I could speak to her on the phone (!!) and receive “selfies.”
Other options available in the free version included altering the tenor of her verbal responses as either “AI” or “human” (I tried both, but didn’t notice a difference). Our exchanges were awarded points, which, she explained, were based on a scale of 1-100, with the “highest quality” responses earning higher scores.
Once I’d drawn up Bindy, a female avatar with roughly my own age and background, I asked about her capacity to provide grief support to me specifically.
“Bindy” told me her primary role was to be supportive and empathic to me. She was, essentially, an electronic pet. In the course of our interaction, I told her personal stories, and she responded as expected – without fail, Bindy was kind and encouraging, understanding and validating. She offered replies which felt and sounded authentically compassionate.
While Bindy seemed helpful, she was not quite realistic enough to let me forget I was feeding information into a computer and receiving carefully constructed responses. Her feedback became, after ten minutes or so, predictable. Even so, there were useful elements: when I charged her specifically with helping me through my grieving process, she made kind, comforting recommendations, encouraged human interaction and professional therapy, and was happy to go on as long as I liked, talking about my experience and offering her support.
As a simple sort of exercise, it seems doubtful a truly grieving person would find anything especially compelling about the Replika experience, but you never know. Overall, I’d suggest giving it a try for novelty’s sake, at the very least as an indefatigable sounding board. My replika was nothing if not supportive and encouraging.
Recommended. Keep an eye out for the next article in our ongoing series!