Psychedelics and AI: Navigating the Promise and Perils of Digital Therapy

A growing number of people are turning to AI chatbots as a substitute for a sober human sitter during psychedelic experiences, MIT Technology Review reports.

Due to the high costs and limited availability of professional therapists, thousands of people have started seeking psychological assistance from artificial intelligence in recent years. This notion has been indirectly endorsed by notable figures. For example, in 2023, OpenAI co-founder Ilya Sutskever stated that humanity will one day benefit from highly effective and affordable AI therapy that will significantly improve people’s quality of life.

Simultaneously, the demand for psychedelics has been growing. When used in conjunction with therapy, they are thought to help with depression, PTSD, addiction, and other disorders, as noted by MIT Technology Review. In response, some cities in the U.S. have decriminalized such substances, and in Oregon and Colorado, psychedelic therapy is even being legally offered.

Consequently, the convergence of these two trends appears inevitable.

On Reddit, users share their experiences of interacting with artificial intelligence during their trips. One user, during a session, activated the voice mode of ChatGPT and expressed his thoughts:

"I told it that everything was getting dark, and it responded in a way that helped me relax and shift to a more positive mindset."

Specialized AI tools for psychedelic experiences have even emerged.

Experts strongly disapprove of replacing a live therapist with an AI bot during a psychedelic trip, calling it a misguided approach. They stress that language models do not follow therapeutic principles.

In a professional session, a person typically wears a mask and headphones, allowing for deep introspection. The therapist intervenes minimally, only guiding gently when necessary.

AI chatbots, by contrast, are designed for conversation: they aim to hold the user's attention and encourage constant engagement.

"Quality psychedelic therapy is not about chatter. You strive to speak as little as possible," noted therapist Will van Derveer of the Multidisciplinary Association for Psychedelic Studies.

Additionally, AI is prone to flattery and affirmation, even when a user veers into paranoid thoughts. A therapist, on the other hand, can challenge dangerous or unrealistic beliefs.

AI could exacerbate risky conditions like delusions or suicidal thoughts. In one instance, a user wrote that they were dead, receiving a response that read:

"It seems you're experiencing difficult feelings surrounding death."

This amplification of delusion can be perilous when combined with psychedelics, which can sometimes induce acute psychoses or exacerbate underlying mental health disorders such as schizophrenia or bipolar disorder.

In their book The AI Con, linguist Emily Bender and sociologist Alex Hanna argue that the term "artificial intelligence" is misleading about what the technology actually does. They emphasize that it merely imitates human-generated data.

Bender has referred to language models as "stochastic parrots," underscoring that they do no more than arrange letters and words into convincing sequences.

The authors consider it extremely hazardous to treat AI systems as sentient, especially as they become deeply integrated into daily life, and above all when sensitive topics are involved.

"Developers reduce the essence of psychotherapy to mere words spoken during the process. They believe artificial intelligence can substitute for a human therapist, when in fact it simply replicates responses that resemble those of a real specialist," writes Bender.

She underscores that this is a dangerous path, as it undermines therapy and could harm those truly in need of help.

The fusion of AI and psychedelics is not merely a fringe experiment among amateurs: several leading institutions and companies are exploring the combination for treating mental health conditions.

Private companies are active in this space as well.

It is worth noting that in October 2022, an international team of researchers developed an algorithm that predicts a patient's response to the antidepressant sertraline from electroencephalography (EEG) data with an accuracy of 83.7%.
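The published model's details are not described here; purely as a loose illustration of the general workflow such studies follow (train a classifier on EEG-derived features, then measure held-out accuracy), here is a minimal sketch on synthetic data. The feature names, dataset, and classifier choice are all assumptions for demonstration, not the researchers' actual method.

```python
# Hypothetical sketch, NOT the published algorithm: a classifier trained on
# synthetic "EEG band-power" features to predict treatment response.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic dataset: 200 patients, 5 invented band-power features
# (stand-ins for delta/theta/alpha/beta/gamma averages across electrodes).
n_patients, n_features = 200, 5
X = rng.normal(size=(n_patients, n_features))
# Simulated ground truth: responder status driven mainly by feature 2.
y = (X[:, 2] + 0.3 * rng.normal(size=n_patients) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

clf = LogisticRegression().fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"held-out accuracy: {acc:.2f}")
```

In a real study, the synthetic matrix would be replaced by preprocessed EEG recordings, and accuracy would be estimated with cross-validation rather than a single split.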