How to draft a will to avoid becoming an AI ghost—it’s not easy

Euronews.com


The development of artificial intelligence (AI) tools has made it simple to produce digital copies of departed loved ones, even without their knowledge or consent.

These tools, also known as AI ghosts or grief bots, are trained on the data of the deceased and can be text-, audio-, or even video-based. Some mourners find that talking to these bots is a close substitute for continuing to communicate with the people they care about most. But the technology remains controversial: it can make the grieving process more difficult while also posing a risk to the deceased’s privacy, as their data may still be susceptible to identity theft or manipulation.

Some people do not want to become AI ghosts due to potential risks and possibly a general dislike of the idea.

Futurism summed up the social media backlash after a realistic video simulation was recently used to deliver a murder victim’s impact statement in court, saying the use of AI was “just as unsettling as you think.” This is not the first time people have voiced their unease with the expanding trend. The Wall Street Journal polled its readers in May of last year to find out what they thought about the morality of so-called AI resurrections. Dorothy McGarrah, a woman from California, responded by saying that there ought to be a way to stop AI resurrections in your will.

“Having pictures or videos of departed loved ones brings solace. However, the thought of an algorithm—which is just as likely to produce gibberish as anything lucid—representing the ideas or actions of a deceased person seems horrifying. After the death of a loved one, it would be similar to developing digital dementia,” McGarrah remarked. “After death, I fervently hope that people have the ability to stop their images from being used in this way. Perhaps there is another aspect of estate planning that we should take into account.”

As more AI ghosts emerge, the question may begin to worry estate planning experts. For now, experts say, putting “no AI resurrections” in a will remains a difficult process, and unless laws change to support a culture that respects the wishes of people who find the idea of haunting their favorite people through AI simulations unsettling, such requests may not be honored.

Is it possible to create a will that forbids AI resurrection?

In order to determine whether estate planners are genuinely discussing AI ghosts, Ars reached out to a number of legal associations. The only organization that replied was the National Association of Estate Planners and Councils, which put Ars in touch with Katie Sheehan, a managing director and wealth strategist at Crestwood Advisors and an authority in the field of estate planning.

Sheehan told Ars that not many estate planners are equipped to answer questions about AI ghosts. She never encounters the question in her day-to-day work, she said, and since AI is still relatively new, it remains “essentially uncharted territory for estate planners.”

“I review estate plans for clients every day, so that should be telling,” Sheehan told Ars, adding that she had not yet seen any documents drafted with this in mind.

While Sheehan has not yet seen a will that tries to stop AI resurrection, she informed Ars that there might be a way to make it more difficult for someone to make a digital copy without permission.

“You could definitely include provisions in a power of attorney (for use during lifetime) and a will (for use after death) prohibiting the fiduciary (attorney-in-fact or executor) from lending any of your writings, texts, voice, images, etc. to any AI tools, and prohibit their use for any purpose during your lifetime or after your death, and/or establish guidelines for when they can and cannot be used after your death,” Sheehan told Ars.

“Issues with contracts, property and intellectual property rights, and the right to publicity may also arise if AI replicas (text, voice, image, etc.) are not being used with permission,” Sheehan stated.

“And celebrities probably have more protections than regular people,” Sheehan said.

Sheehan stated, “As far as I know, there is no law” that prohibits unapproved non-commercial digital copies.

Although it’s not a perfect solution, the Revised Uniform Fiduciary Access to Digital Assets Act, which has been widely embraced by states, may be useful in regulating who has access to the deceased’s online accounts, such as social media or email accounts.

“That law may cover some of the digital material some may seek to use to create a ghost bot, but it does not directly cover someone’s AI ghost bot,” Sheehan stated.

Since there is “no law” prohibiting non-commercial digital replicas, Sheehan anticipates that requests for “no AI resurrections” will probably “be dealt with in the courts and governed by the terms of one’s estate plan, if it is addressed within the estate plan.”

Given that “it may be some time before we get any kind of clarity or uniform law surrounding this,” Sheehan hinted that the possible conflicts might turn ugly.

As instructions on digital assets are now “boilerplate language in almost every will, trust, and power of attorney,” requests banning digital replicas could eventually become the same, according to Sheehan.

According to Sheehan, as “all things AI become more and more a part of our lives,” AI and its components may also become a frequent part of estate plans.

“But we definitely aren’t there yet,” she stated. “No clients have inquired about this.”

“No AI resurrections” requests are probably going to be disregarded.

It seems questionable whether loved ones would—or even ought to—respect requests to block digital replicas. However, at least one grief bot developer wished he had done more to obtain his father’s approval before developing his own creation.

More than ten years ago, following the death of his father, Muhammad Aurangzeb Ahmad, a professor of computer science at the University of Washington Bothell, became one of the first artificial intelligence researchers to develop a grief bot. Having seen how amazing his father was as a grandfather, he created the bot to make sure his future children could communicate with him.

When Ahmad began his project, he had to train his own model using his father’s data because there was no ChatGPT or other cutting-edge AI model to use as the basis. After much consideration, Ahmad made the decision to isolate the system from the outside world so that the model would only be informed by his father’s memories. He kept the bot on a laptop that only his family could access in order to stop unwanted chats.
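Ars does not describe Ahmad's system in technical detail, but the closed-world design he describes can be sketched in a few lines: a bot that answers only from a fixed, local set of the deceased's memories, with no external model or network access. Everything below (the corpus, the keyword-overlap matching rule) is a hypothetical stand-in, not Ahmad's actual implementation:

```python
from collections import Counter

# Hypothetical corpus of first-person memory snippets; Ahmad's real
# training data is not public.
MEMORIES = [
    "I loved taking long walks by the river every Sunday morning.",
    "My favorite dish was my mother's lentil curry.",
    "I taught mathematics for thirty years and loved every class.",
]

def tokenize(text: str) -> Counter:
    """Lowercase bag-of-words with trailing punctuation stripped."""
    return Counter(word.strip(".,!?").lower() for word in text.split())

def reply(question: str) -> str:
    """Answer with the stored memory that best overlaps the question.

    No network, no external model: the bot can only ever surface
    text that the family placed in MEMORIES.
    """
    q = tokenize(question)
    return max(MEMORIES, key=lambda m: sum((tokenize(m) & q).values()))

print(reply("What was your favorite dish?"))
# -> My favorite dish was my mother's lentil curry.
```

A real grief bot would use a trained language model rather than keyword overlap, but the isolation property is the same: the only data that can surface in a reply is data the family put in.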

Ahmad was so focused on creating a digital version that felt exactly like his father that it didn’t occur to him to ask whether his father would have wanted this until his family began using the bot. As time went on, he came to understand that the bot was biased toward his own perception of his father and might not reflect the somewhat different relationships his siblings had with him. It’s also unclear whether his father would have felt the same way about the bot and considered it a piece of himself.

Ahmad told Ars he believes his father “would have been fine with it” and that he ultimately had no regrets about creating the bot.

He did, however, regret not obtaining his father’s approval.

According to Ahmad, if a bot created today could be accessed by the public, it might be appropriate to obtain consent. He told Ars that he would never have felt at ease making his father’s digital replica publicly accessible, since nefarious individuals might access it and tarnish his father’s memory, raising even more concerns about an “accurate representation.”

Today, anyone can freely create a similar bot using their own loved one’s data by using ChatGPT’s model. And in an October report outlining the most recent ways “AI could be used to ‘resurrect’ loved ones,” Axios pointed out that a variety of grief tech services, such as HereAfter AI, SeanceAI, and StoryFile, have emerged online. Since this trend is “evolving very fast,” Ahmad informed Ars that the best way to express one’s preferences for AI ghosts is most likely through estate planning.
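The report doesn't give a specific recipe, but a hobbyist bot of this kind typically amounts to wrapping a general-purpose chat model in a persona prompt built from the loved one's data. A minimal sketch, with hypothetical names and snippets (the model request itself is only indicated in a comment, since it needs an API key):

```python
def build_persona_messages(name, snippets, question):
    """Assemble an OpenAI-style chat prompt that asks a general model
    to answer in the deceased's voice, grounded only in the snippets."""
    persona = (
        f"You are speaking as {name}. Answer only from these memories; "
        "if they do not cover the question, say you do not know.\n"
        + "\n".join(f"- {s}" for s in snippets)
    )
    return [
        {"role": "system", "content": persona},
        {"role": "user", "content": question},
    ]

messages = build_persona_messages(
    "Grandpa",
    ["He taught high-school physics.", "He loved watching cricket."],
    "What sports did you enjoy?",
)
# A real request would look roughly like (not executed here):
#   client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(messages[0]["role"])  # -> system
```

Unlike Ahmad's isolated setup, a bot built this way sends the loved one's data to a third-party service — exactly the kind of exposure the privacy concerns above are about.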

However, in a recent paper titled “The Law of Digital Resurrection,” law professor Victoria Haneman cautioned that “the deceased have few privacy rights and there is no legal or regulatory environment against which to guard those who would avoid digital resurrection. Until recently, this area of intersection between privacy law, technology, and death was largely ignored.”

Haneman agreed with Sheehan that “existing protections are likely sufficient to protect against unauthorized commercial resurrections,” such as the resurrection of musicians or actors for posthumous performances. For personal use, by contrast, she believes the best way to prevent digital resurrections is not estate planning but a “right to deletion” that would give the living or next of kin the authority to remove the data that could be used to create the AI ghost, rather than trying to control the output.

Whether or not AI is involved, a “right to deletion” might defend people against improper uses of the data of their loved ones. Following the publication of her article, Haneman received a call from a lawyer regarding a client’s deceased grandmother, whose image was used to make a meme depicting her dancing in a church. Haneman informed Ars that the grandmother was not well-known and that the client was unaware of “why or how somebody decided to resurrect her deceased grandmother.”

“If it’s not being used for a commercial purpose, she really has no control over this use,” Haneman observed, despite her sympathy for the client. “And this deeply disturbs her.”

Haneman’s piece provides a unique in-depth analysis of the legal subject. It explains how those laws—or the absence of them—interact with other laws pertaining to death, such as those pertaining to property rights and human remains, and it delicately delineates the nebulous realm of digital rights of the dead.

Haneman also makes the point that, generally speaking, the rights of the living usually prevail over those of the deceased, and that even explicit guidelines regarding the handling of human remains aren’t usually regarded as legally binding. According to Haneman, certain requests—such as organ donation that benefits the living—are deemed essential. The way that courts uphold the rights of the deceased, such as a well-known author’s request to have all unpublished works destroyed or a pet owner’s demand to have their dog or cat put down when they pass away, has produced conflicting outcomes.

“A lot of people are like, ‘Why do I care if somebody resurrects me after I’m dead?’ You know, ‘They can do what they want,’” she told Ars. “And they hold that belief until they discover a family member brought back to life by a spooky ex-boyfriend, or their deceased grandmother resurrected, and then the situation changes.”

Although Haneman pointed out that current law may shield “the privacy interests of the loved ones of the deceased from outrageous or harmful digital resurrections of the deceased,” in the case of the dancing grandmother, the meme might not be considered harmful, no matter how upsetting the grandchild finds this distortion of her grandmother’s memory.

“If, culturally, communities end up developing a distaste for digital replicas, especially if it becomes widely viewed as disrespectful to the dead, then limited legal protections might not matter so much,” Haneman said. However, rather than elucidating the digital rights of the deceased, society is currently more focused on finding solutions to other deepfake issues. Haneman informed Ars that this might be due to the fact that not many people have been affected thus far, or it might also be a reflection of a larger cultural propensity to disregard death.

Haneman stated, “We really kind of brush aside whether or not we care about somebody else being digitally resurrected until it’s in our face, because we don’t want to think about our own death.”

Attitudes might shift over time, particularly if the so-called “digital afterlife industry” gains traction. Additionally, there is precedent that suggests the law could be modified to support any cultural change.

With a focus on preserving humankind, Haneman wrote, “The throughline revealed by the law of the dead is that a sacred trust exists between the living and the deceased, such that data afforded no legal status (or personal data of the deceased) may nevertheless be treated with dignity and receive some basic protections.”

An alternative strategy to stop AI resurrections

It appears that preventing oneself from becoming an AI ghost now lies in a legal limbo that legislators may need to resolve.

Calling for an alternative to estate planning, Haneman cautioned that “it is a structurally inequitable and anachronistic approach that maximizes social welfare only for those who do estate planning.” Over 60% of Americans pass away without a will, according to Haneman, a number that frequently includes “those without wealth,” as well as women and racial minorities, who “are less likely to die with a valid estate plan in effect.” “We can do better in a technology-based world,” Haneman wrote. “Any contemporary framework ought to acknowledge that inaccessibility is a barrier to equity and safeguard the rights of the most vulnerable using strategies independent of employing legal counsel and carrying out an estate plan.”

Instead of changing the law to “recognize postmortem privacy rights,” Haneman supports a measure that would give those opposed to digital replicas the ability to remove the data that would be used to generate the AI ghost.

According to Haneman, “in essence, the deceased may exercise control over their digital legacy by deleting their data, but they may not exercise more extensive authority over non-commercial digital resurrection through estate planning.”

“A right to deletion would probably involve estate planners as well,” Sheehan informed Ars.

According to Sheehan, “the only way to address this would be to go to court if it is not addressed in an estate planning document and not specifically addressed in the statute (or deemed under the executor’s authority via statute).” Even with a right to delete, the deceased would still need to exercise it before death or give their executor permission to do so afterward, which would require an estate planning document, statutory authority, or court authority.

While acknowledging that many people would still use estate planners, Haneman suggested that “the right to deletion would ideally, from the perspective of estate administration, provide for a term of deletion within 12 months.” This, she wrote, “allows the living to manage grief and open administration of the estate before having to address data management issues” and may effectively strike a balance between “the rights of the deceased and the interests of society.”

According to Haneman, it’s also the better option for those who are left behind, because “granting a right beyond data deletion to restrict unapproved non-commercial digital resurrection leads to needless complications that go too far and prioritizes the interests of the dead over those of the living.”

Future generations may be raised with AI ghosts

If the dystopia experts depict comes to pass, Big Tech companies might one day profit by targeting grieving people in order to exploit the data of the deceased, which could be more easily abused because it has fewer protections than the data of the living.

Ahmad specifically cautions families against exposing young children to grief bots, because they might not be able to understand that the bot is not a real person. After noticing that his children were becoming confused about whether their grandfather was still alive, he chose to limit their access to the bot until they were older. The confusion was compounded by the early stages of the pandemic, when they were seeing many of their relatives only virtually. Initially, the bot was activated only on special occasions, such as birthdays.

Introducing the bot meant he had to have conversations with his children about life and death at a younger age than he recalls fully comprehending those ideas in his own childhood.

Ahmad is now among the first parents raising their children around AI ghosts. He is always updating his father’s digital replica to improve the family’s experience. What excites him most right now are new developments in audio that make adding a voice element simpler. The bot’s rendering of his South Asian father’s accent has always sounded “just off,” but he hopes to use AI to finally perfect it within the next year. Realistic video or even augmented reality tools are the next big thing for people in this field, Ahmad told Ars.

Ahmad still finds sentimental significance in the bot, but as Haneman mentioned, it wasn’t the only way he remembered his father. Although his father never saw the mosaic he made, Ahmad believes his father would have been pleased.

“He would have been very happy,” Ahmad stated.

It’s impossible to foresee how future generations will perceive grief tech. Ahmad expressed uncertainty about whether he would want to interact with his father’s digital replica through augmented reality, but children who grow up with AI ghosts as a normal part of their lives might be more willing to accept, or even create, new features. Speaking to Ars, Ahmad fondly recalled how his young daughter, noticing one day that he was sad, came up with her own AI concept to cheer him up.

She told her father: “It would be really nice if you could just take this program and we build a robot that looks like your dad, and then add it to the robot, and then you can go and hug the robot.”
