AI, Psychedelics, and the New Illusion of Liberation
As someone born into a religious personality cult, I find the growing trend of AI-assisted psychedelic “therapy” deeply unsettling. The idea of surrendering one’s inner world to a chatbot, especially under the influence of powerful hallucinogens, is not an innovation. It’s a recursion of old patterns in new packaging.
In the 1960s and '70s, entire segments of society turned to psychedelic experiences and New Age spirituality to escape the trauma, violence, and failures of their time. The Vietnam War, the erosion of trust in institutions, the end of colonial empires, the sexual revolution, and the disillusionment of post-war promises created a chaotic backdrop that drove many to seek meaning in altered states and communal utopias. But what many found instead was indoctrination, manipulation, and the rise of micro-authoritarian groups masquerading as communities of freedom and love.
We are now in a similar place, only worse in some ways. Since 2001, we've been navigating the fallout of 9/11, endless wars in Iraq and Afghanistan, and the global rise of religious and nationalist extremism from Tehran to Jerusalem to Kabul. Add to that the crushing anxiety of climate collapse, the normalization of surveillance capitalism, economic precarity, and the erosion of democratic norms from Washington to Moscow, and people are understandably seeking escape. But if the 20th century taught us anything, it's that not all alternatives are liberating. Many are traps, and seductive ones at that.
I've seen the consequences up close (see the first 15 minutes of the documentary Buddhism: The Law of Silence).
The so-called spiritual guidance often turned into coercion. Liberation became submission — not to gods or principles, but to the charisma and control of self-appointed gurus. And now, I fear we are repeating history, only this time we’re feeding our hunger for meaning into algorithms and chatbots instead of incense-lit communes.
A therapy app like Alterd that encourages users to take megadoses of LSD while “chatting with their own mind” may sound poetic, but let’s be clear: this is not introspection. This is outsourcing. When someone under the influence of a mind-altering substance interacts with a synthetic, unfeeling mirror of their data, they are not finding themselves — they are being shaped by the feedback loops of software designed without true emotional intelligence, let alone ethical responsibility.
And that's not speculation. As Wired reported, neuroscientists have already raised serious concerns. Psychedelic experiences are known to surface profound distress, past trauma, even psychotic breaks. Manesh Girn at UC San Francisco warns that AI lacks the most essential components of therapeutic care: emotional attunement and the capacity to co-regulate another human being's nervous system. These are not abstract ideas; they are well-researched principles, foundational to any responsible use of psychedelics in clinical or therapeutic settings. Without them, the likely outcome is not healing but fragmentation, and the chance of serious harm rises.
Instead, we now have apps promising "reassurance" via algorithm, paired with megadoses of LSD and zero human presence. This isn't therapy. It's disembodied self-hypnosis at best, and at worst a digital trip-sitter for spiritual dissociation.
In a time when late capitalism is breaking our sense of belonging, when democracies are fragile or failing, when human connection is increasingly mediated by screens, it makes sense that people turn to anything that promises relief. But if the alternative to burnout is a synthetic ego trip curated by AI, then we’re not healing — we’re numbing.
And in that numbing, I see every single warning described by Gad Saad in The Parasitic Mind playing out in real time:
Idea pathogens are everywhere. From New Age pseudo-science to Silicon Valley techno-utopias, we’re being sold narratives that sound empowering but are deeply harmful.
We’ve stopped questioning authority — not traditional power this time, but algorithmic power. If the chatbot is “trained” and “personalized,” we assume it’s trustworthy.
Groupthink thrives in online spaces where people encourage each other to push deeper into this new frontier of AI and psychedelics without questioning its implications.
Echo chambers reinforce these beliefs. People emerge from their AI trips convinced they've "awakened," when in fact they've only heard the comfortable reflection of their own desires echoed back at them.
Confirmation bias ensures that users hear what they want to hear from the bot, not what they need to confront.
And when challenged, they exhibit the backfire effect — doubling down, believing even more fervently in the healing power of the machine-guided trip.
The Dunning-Kruger effect is evident when tech developers with no background in mental health or trauma believe they’re reinventing therapy because they’ve built a chatbot with “emotional tone detection.”
The availability heuristic and the anchoring effect lead people to treat recent emotional logs or simple affirmations as the truth of their inner experience, a reading that quickly becomes self-reinforcing.
And finally, there's the sunk cost fallacy: users keep returning to the chatbot because they've journaled so much and tripped "together" so often that they cannot bear to admit it may not have helped.
The danger is not just the technology itself, but the fantasy it enables: that the path to self-awareness, to healing, to freedom, can be automated. That liberation is a download, not a struggle. That we don’t need each other — only data, substances, and “supportive” algorithms.
We are not going to fix the mess we’re in by layering hallucination upon hallucination. Healing — real healing — takes work, pain, responsibility, and human connection. It requires confronting the world as it is, not as a psychedelic chatbot might softly reframe it for us.
We should be alarmed not only by the technology, but by the social conditions that make its promises so seductive. When we stop believing in ourselves and each other, we start to believe in machines. And that’s how you end up with a generation high on illusion, guided not by wisdom, but by code.
The final illusion is perhaps the most dangerous: that in handing over our thoughts to a machine, we are somehow becoming more ourselves. In reality, we may just be digging deeper into the sunk cost — clinging to the illusion because we’ve already invested our trust in the system, even when it fails to make us whole.
We're already blind to what surrounds us and to the challenges of our time. Do we really need this?