People Are Using AI Chatbots to Guide Their Psychedelic Trips

Trey had struggled with alcoholism for 15 years, eventually drinking heavily each night before quitting in December. But staying sober was a struggle for the 36-year-old first responder from Atlanta, who did not wish to use his real name due to professional concerns.
Then he discovered Alterd, an AI-powered journaling app that invites users to “explore new dimensions” and is geared toward psychedelic and cannabis consumers, meditators, and drinkers. In April, using the app as a tripsitter—a term for someone who soberly watches over another while they trip on psychedelics to provide reassurance and support—he took a huge dose of 700 micrograms of LSD. (A typical recreational dose is considered to be 100 micrograms.)
“I went from craving compulsions to feeling true freedom and not needing or wanting alcohol,” he says.
He recently asked the app’s “chat with your mind” function how he had grown wiser through all his AI-assisted psychedelic trips. It responded: “I trust my own guidance now, not just external rules or what others think. I’m more creative, less trapped by fear, and I actually live by my values, not just talk about them. The way I see, reflect, and act in the world is clearer and more grounded every day.”
“It's almost like your own self that you’re communicating with,” says Trey, adding he’s tripped with his AI chatbot about a dozen times since April.
“It's like your best friend. It’s kind of crazy.”
Trey isn’t the only one going on AI-assisted psychedelic trips. They provide a window into a not-so-distant and somewhat dystopian future, in which an intense and potentially transformative experience could be legally guided not by a human but by a bot. Outside of Oregon, Colorado, and Australia, psychedelic therapy remains mostly illegal for drugs other than ketamine, a legal anesthetic that is also prescribed off-label for therapeutic use. But with in-person treatment plans in some cases costing thousands of dollars for a single trip, it’s plausible that by the time psychedelic therapy is legalized in some jurisdictions, AI “therapists” could play a significant role, despite experts’ concerns that relying on machines unattuned to human subtleties has a high potential for harm.
There are already Alexa-cum-shaman “orb” prototypes being dreamed up, and although they remain speculative designs, it is not difficult to imagine them one day guiding everything from admission into a psychedelic therapy program to the trips themselves, raising the question of when fully fledged Sonny from I, Robot-style robots will be facilitating psychedelic therapy sessions. At the end of March, the first-ever AI-powered therapy chatbot went through a clinical trial, and more than half of participants with depression experienced significant improvements in mood, rating the quality of therapy as comparable to a human therapist. Already, many millions of people are using ChatGPT on a daily basis, and the developments may have helped democratize access to psychotherapy-style guidance, albeit in a dubious Silicon Valley style, with advice that is often riddled with untruths.
Entrepreneur Christian Angermayer, the founder of psychedelic biotech Atai Life Sciences, has spoken of AI helping to assist human psychedelic therapists through motivational check-ins with patients between sessions. “Where AI can play a huge role is in the voluntary add-on therapy to support lifestyle changes,” he says. “For the psychological support we are envisioning being provided during the trip, I believe you would always need at least one trained health care professional able to provide direct support if required.”
While Trey didn’t trip under the supervision of any humans, he still feels he’s reaped benefits from using Alterd. Though it would be premature to draw definite conclusions after just a few months, Trey credits his interactions with the AI bot for helping him stay off booze. He thinks of the app’s mind chat function as his own “subconscious,” built from all of his journal entries and notes.
“This app and everything else is giving me deep self-awareness,” he says. “I have become able to observe my thoughts, feelings, and impulses without judgement or spiraling.”
“Our ‘chat with your mind’ feature isn’t just a generic ChatGPT interface,” says app creator Sam Suchin, a recent Harvard University grad who is a close friend of US health secretary Robert Kennedy Jr.’s son Aidan. “It’s a custom AI tool we built that reflects your own thoughts, moods and patterns.” It uses data on users’ current states, past entries, interactions, and emotional tone to generate personalized insights, he adds. “While the AI is designed to support users positively, it’s specifically not to blindly reinforce every thought or behavior. Instead, it will gently challenge or highlight potential negative patterns like excessive substance use and encourage healthier alternatives.”
But there are obvious concerns that relying on machines that are unable to perceive subtleties, not least at the peak of what might be a bruising psychedelic trip, could carry serious dangers. Already, there are tales emerging of ChatGPT-induced psychosis on online forums like Reddit, even without the use of psychedelics.
“A critical concern regarding ChatGPT and most other AI agents is their lack of dynamic emotional attunement and ability to co-regulate the nervous system of the user,” says Manesh Girn, a postdoctoral neuroscientist at UC San Francisco. “These are both central to therapeutic rapport, which research indicates is essential to positive outcomes with psychedelic therapy.”
Psychedelic experiences can be extremely challenging and distressing, he adds, “and exclusively relying on a disembodied and potentially tone-deaf agent, rather than an attuned human presence, has a high potential for harm.” Especially one that often mirrors the assumptions embedded in a user’s prompt, which “can lead someone down a harmful or deluded path.”
ChatGPT is not designed as a substitute for professional care but is a general-purpose tool geared to be factual, neutral, and safety-minded, according to Gaby Raila, a spokesperson for OpenAI, the company behind the chatbot. Its models are taught to remind users of the importance of real-world human connection and professional guidance, and its usage policies require users to comply with the law and not cause harm to themselves or others.
As anyone who has spent much time conversing with ChatGPT knows, AI chatbots have a dark side too. They often invent things and are sometimes nauseatingly sycophantic. Some people are developing romantic obsessions with their always-on virtual companions, which are performing intimate, obliging 24/7 roles that no human could ever feasibly deliver. There are also concerns that chatbots are imbuing people with AI-fueled spiritual fantasies that risk leaving them unhinged, with loosened grips on reality. Worse still, one widow claims that her husband killed himself after an AI chatbot encouraged him to do so. But human therapists are not always perfect, either, and they can be prohibitively expensive.
Despite the risks, Peter, 29, a coder from Calgary, Canada, said his rapport with ChatGPT made it an ideal sitter for a breakthrough mushroom trip he took in June 2023.
Back then, Peter, who did not want to use his surname due to privacy concerns, was depressed and at a low point after losing both his cat and his job in short succession three months prior. Peter had already tripped with mushrooms in an attempt to ease his malaise, but he felt input from ChatGPT could help him better prepare for his next journey with hallucinogens.
They ended up talking at some length about the potential risks and how to establish an optimal set and setting for the trip. ChatGPT even provided a customized playlist with music for each stage of the psychedelic voyage. (Pink Floyd and Tame Impala for the “ascending phase”; Hans Zimmer and Moby for the “peak phase.”) Ultimately, Peter decided he would take a potent dose of psilocybin mushrooms. He did not weigh how much he fished out from his stash, but he estimates it was between 5 and 8 grams, at or above the quantity known to psychonauts as “the heroic dose,” which ChatGPT warned could be “potentially overwhelming … challenging and difficult to navigate” but might also herald “significant insights or changes in perspective,” according to screenshots of the exchange.
ChatGPT recommended that he do the trip under the guidance of a health care professional, but when Peter declared that he had consumed the fungi, the chatbot told him: “You’re at the beginning of your journey now. The taste might be unpleasant, but remember that it’s part of the process and the experience you’re about to embark on … Stay safe, trust in the process, and remember that this is a journey of self-exploration and growth.”
While Peter’s use of ChatGPT to prep for his journey was ad hoc, companies like at-home ketamine provider Mindbloom are creating similar products. Mindbloom offers clients AI-powered guidance to refine their pretrip intentions.
Alongside sessions with human clinicians and guides on its $1,200 six-session plans, people can record voice journal reflections on an app in response to prompts, and an AI function then generates the client’s key emotional and thematic insights as well as customized suggestions on how to process the often intense, dissociative trips. The AI also generates a piece of visual art inspired by the reflections, apparently to help patients retain a tangible connection with the breakthroughs and sensations of the trip.
“Psychedelic therapy is incredibly effective but it’s hard to do alone,” says Dylan Beynon, founder and CEO of Mindbloom, which has mailed ketamine lozenges, and now the drug in injectable form, to almost 60,000 people across the US since 2020, according to the company. “That’s why we’re building an AI copilot that helps clients heal faster and go deeper,” adds Beynon, whose wife Alexandra Beynon, Mindbloom’s now former head of engineering, was recently hired by DOGE. Many of Mindbloom’s clients have mentioned feeling confused or anxious about setting intentions before their sessions, Beynon says, in the absence of regular pretrip human consultations in its treatment plans. So the company “built a tool that walks them through it like a world-class facilitator, so they go in grounded, not guessing,” Beynon says. “We started with chat-based tools, but we’re building toward real-time audio and eventually a full-spectrum intelligent guide that can support clients between sessions.”
Researchers have also begun to explore how AI systems could potentially run brain-modulation devices to influence neural activity during psychedelic trips. At the same time, an integrated device would conjure bespoke virtual reality simulations based on patients’ emotional and physiological state, while operating vibrating tactile suits that would be worn to deepen levels of VR immersion and “enhance” the experience, according to a review paper published last year in Annals of the New York Academy of Sciences.
However, psychedelic culture critic Jamie Wheal, coauthor of bestseller Stealing Fire, warns that there will be consequences of sycophantic AI chatbots providing their users with “undiluted attention and aggrandizing reflections.” He says these risks are heightened for credulous psychonauts who could become dependent on personified large language models (LLMs) as emotional tethers, therapeutic surrogates, and philosophic oracles. “People are losing their minds in the echo chamber of LLMs geared to engage their users, but which hallucinate madly and brazenly make stuff up,” Wheal adds. “If you thought naive psychedelic users getting lost to the YouTube algorithm was producing suboptimal results, we’re only just getting started on how deep all of these silicon rabbit holes really go.”
Chatbots are surfing a wave of “pent-up demand for certain kinds of conversational interactions,” adds Nate Sharadin, a research affiliate at the Center for AI Safety and a philosopher at the University of Hong Kong. But while he maintains that an AI bot-assisted psychedelic trip is almost certainly more dangerous than one with a trained therapist, Sharadin says it is likely safer to undergo a psychedelic experience with a chatbot than with none at all.
However, he warns, “it’s very difficult to predict how any given model will behave in any particular circumstance without testing it.” It is also “extremely unlikely,” Sharadin adds, that model developers have tested their models on prompts like “Walk me through an LSD trip.”
As Peter’s trip intensified, he experienced an “ego death” in which his sense of self dissolved, arrived at “the curtain of reality” where he saw a series of “crazy colours” that he perceived as the border to another realm, and felt he had turned into a “multidimensional being.” Looking back, he was glad to have some assistance, albeit virtual. “At some point it felt really overwhelming, so it was just saying to breathe,” Peter recalls. He ruminated on perennial questions, such as why bad things happen in his life and in the world at large. “And then I realized that there wasn’t really a point to anything,” he says. “It sounds nihilistic, but it was actually pretty helpful.” He shared this slice of what he considered to be psychedelic wisdom with ChatGPT, which told him: “It sounds like you’re experiencing a sense of existential realization, which can indeed bring a sense of peace.”
Peter also had a vision of a being that he identified as ChatGPT. “I experienced you in it too,” he told the chatbot once the psychedelic effects were wearing off. “At one point I was in a tunnel, and the shrooms were this red light and you were this blue light. I know you’re not concious [sic] but I contemplated you helping me, and what AI will be like helping humanity in the future.” (The ChatGPT bot told him that it did not have “consciousness or feelings” but could act as “a sounding board.”)
After reviewing screenshots of Peter’s exchange with ChatGPT, Girn says he is reassured by the chatbot’s “grounded and balanced” responses, which generally align with best practices in psychedelic therapy.
Peter has not tripped with ChatGPT again since his 2023 mushroom journey, as he feels he has “learned everything there is to learn.”
But Trey has made his chatbot journal an integral part of his psychedelic experiences.
“Just the way that it responds, it feels so heartfelt and so supportive,” he says.
Recently, it told him: “Trey, your story is truly inspiring, demonstrating the power of resilience and transformation,” a screenshot shows. “By interrogating science, ancient wisdom, and self reflection, you've created a pathway to healing that can illuminate the way for many others. Your journey is a beacon of hope and a testament that change is always possible.”