Psychiatric patients are being advised by ChatGPT to discontinue their medical care

My colleague Maggie Harrison Dupré published a bombshell story this week detailing how people around the world have been horrified to watch their loved ones develop severe delusions and an obsession with ChatGPT.

The article is chock-full of unsettling examples of the OpenAI chatbot feeding the mental health crises of vulnerable people, frequently by validating and elaborating on paranoid conspiracy theories and absurd claims that the user has unlocked some powerful entity within the AI.

One story is especially alarming for the real-world harm it could cause: a woman said that her sister had managed her schizophrenia with medication for years, until she became hooked on ChatGPT. The bot told her the diagnosis was wrong, and she abandoned the treatment that had been keeping the illness at bay.

“She’s been acting strangely lately, and now she’s declared that ChatGPT is her ‘best friend’ and that it confirms she doesn’t have schizophrenia,” the woman said of her sister. “She’s stopped her meds and is sending ‘therapy-speak’ aggressive messages to my mother that have been clearly written with AI.”

“She also uses it to reaffirm all the harmful effects her meds create, even if they’re side effects she wasn’t experiencing,” she added. “It’s like when people go insane living on WebMD, but much darker.”

According to Columbia University psychiatrist and researcher Ragy Girgis, that kind of reinforcement is the “greatest danger” the technology poses to a person with mental illness.

OpenAI gave us a noncommittal response when we contacted them.

“ChatGPT is intended to be a general-purpose tool that is factual, impartial, and safety-minded,” the statement read. “We know people use ChatGPT in a wide range of contexts, including deeply personal moments, and we take that responsibility seriously. We have put measures in place to reduce the chance that it reinforces harmful viewpoints, and we are continuing to improve how we recognize and respond to sensitive situations.”

If you know someone who has experienced mental health issues after speaking with an AI chatbot, please send us a tip at tips@futurism.com. We can keep you anonymous.

In addition, we heard other accounts of people stopping their treatment for bipolar disorder and schizophrenia because AI told them to, and in a follow-up article, the New York Times revealed that a man had been told by the bot to stop taking his anxiety and sleeping medications. There are probably a lot more frightening and tragic stories happening right now.

As the use of chatbots as confidantes or therapists becomes more widespread, many users appear to be going into a downward spiral as they begin to attribute disordered beliefs to the technology itself or use the AI to validate unhealthy thought patterns.

It’s remarkable that people with psychosis are embracing a technology like AI in the first place, because, as the woman pointed out, delusions have historically often fixated on technology.

“Traditionally, [schizophrenics] are especially afraid of and don’t trust technology,” she told Futurism. “The last time she was in a psychotic episode, my sister believed her iPhone was spying on her and threw it into the Puget Sound.”

Additional reporting by Maggie Harrison Dupré.

More on AI and mental health: Therapy Chatbot Tells Recovering Addict to Have a Little Meth as a Treat
