
AI-mediated mystical experiences

Published online by Cambridge University Press:  30 October 2025

Brant Cole Entrekin*
Affiliation:
Department of Philosophy, The University of Tennessee Knoxville, Knoxville, TN, USA

Abstract

This paper argues that interactions with artificial intelligence (AI) chatbots, such as ChatGPT, can mediate genuine mystical experiences. Building off the framework of mystical experiences developed by William James, I argue that interactions with AI chatbots can mediate mystical experiences in a structurally comparable way to how guided meditation can produce mystical experiences. I conclude by raising various concerns about the implementation of AI technologies in our religious lives, including their use as mediators for mystical experiences.

Information

Type
Original Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press.

Introduction

Religious traditions, rituals, and practices are constantly adapting in the face of new technologies. Recent advances in digital technologies have transformed the way that many practitioners experience and perform their religions. There now exist smartphone applications that provide guided Bible study,Footnote 1 digital Islamic calls to prayer,Footnote 2 and Buddhist meditative practices.Footnote 3 Online spaces are also increasingly being utilized to practice religion. Many religious institutions now live stream their practices on various platforms, and the COVID-19 pandemic saw Zoom utilized for a variety of religious practices, such as Hindu pujaFootnote 4 and funerals from a multitude of faith traditions.Footnote 5

Over the past few years, the rise in the accessibility and prevalence of large language models (LLMs) has offered a new digital tool through which religious practices can be experienced and performed. For instance, there now exist religion-specific versions of ChatGPT, such as BibleGPTFootnote 6 and QuranGPT.Footnote 7 In addition, religious leaders have debated the authenticity and ethics of using artificial intelligence (AI)-generated material as part of their religious gatherings. While the question of AI’s use in supplementing religious practice has received growing attention from religious officials and scholars of religion, the question of whether an interaction with AI can itself constitute a religious experience has been underexplored. This paper begins an exploration into this issue by focusing on one particular type of religious experience – mystical experiences. Here, I argue that interactions with AI chatbots can mediate genuine mystical experiences in line with other kinds of mediated mystical experiences. While AI mysticism is possible, I also suggest that there are serious concerns with the use of technologies like AI for such religious purposes.

The plan for the paper is as follows. I begin with a conception of mystical experiences from William James’s work in The Varieties of Religious Experience. I then consider the nature of mediated mystical experiences and argue that a state of extreme emotional openness is a psychological precondition for having a mediated mystical experience. Drawing on recent work from social robotics and the testimony of AI users, I argue that AI chatbots can meet this precondition and have the potential to produce genuinely mystical experiences through a structure like that of guided meditation. After responding to several objections to my account, I conclude by raising a few worries about the use of AI chatbots to facilitate mystical experiences.

Mystical experiences and mediated mystical experiences

Mystical experiences are a phenomenon of which it is notoriously difficult to provide a definitive account (Brainard 1996; Moore 1973). For one, mystical experiences are inherently subjective psychological phenomena. The exact qualitative character of a mystical experience can only be understood by the mystic themselves, so any academic study of mystical experiences will inevitably run into the problem of how to characterize a phenomenon that manifests itself in individual psychological experiences. Further, mystical experiences are controversial within philosophy of religion, since much of the debate on mysticism in the philosophical literature focuses on the epistemic questions around mysticism. Philosophers are primarily interested in whether mystical experiences provide evidence to the mystic about the content of their experience.Footnote 8 As such, there is no consensus within the philosophical literature as to the nature or value of mystical experiences. Finally, religious studies scholars have argued that accounts of mystical experience inevitably run into an ‘inclusion problem’: giving strict criteria for what ‘counts’ as a mystical experience will include some cases we do not want to include and exclude some cases we do want to include.Footnote 9 As Samuel Brainard (1996) suggests, the challenges of the inclusion problem have caused some scepticism as to whether a definitive account of mystical experiences can be provided (360).

The question of AI mysticism that this paper raises, however, does not depend on settling any questions as to the epistemic justification provided by such an experience, nor does it require a singular, definitive account of mystical experiences. Rather, the question is merely whether interactions with AI chatbots could give rise to the same kind of psychological experiences as other, paradigmatic cases of mystical experience. The account I sketch will therefore be an ameliorative account of mystical experience: one that develops a working conception of the phenomenon in question that suits the purposes of the present inquiry (Haslanger 2000, 2005).Footnote 10 Since the purpose of the present inquiry is to see whether interactions with AI chatbots can produce a phenomenological experience similar to paradigmatic cases of mystical experiences, the account I rely on will focus on the phenomenological quality that such paradigmatic cases exhibit.Footnote 11

William James’s account of mystical experiences in The Varieties of Religious Experience offers a useful ameliorative account for my purposes here. According to James, mystical experiences have four defining features: (1) Ineffability – mystical experiences cannot be sufficiently explained to others in words; (2) Noetic quality – mystics feel as if their experience reveals some important truth or knowledge to them; (3) Transiency – mystical experiences are typically sustained for only a brief (often fleeting) moment of time; and (4) Passivity – mystical experiences seem to just happen to the mystic (James 1982, 380–382). James’s account is preferable for my purposes here for several reasons. For one, while James has been accused of failing to meet the inclusion problem, his account remains a popular and influential account of mystical experiences in the psychological, philosophical, and religious studies literature. It has also seen great uptake in popular, non-academic conceptions of mystical experiences, since his account was one of the first to explicitly name this category of experience (Sharf 2000).Footnote 12 In addition, James’s account tracks features that are commonly associated with various paradigmatic cases of mediated mystical experiences, and these are the cases to which I think we ought to compare interactions with chatbots in order to determine the conceptual possibility of an interaction with an AI chatbot giving rise to a mystical experience.

By a ‘mediated mystical experience’, I mean a mystical experience which is induced by some strategy or external aid. While James says that mystical experiences are passive in that they ‘just happen’ to the mystic, there are certain tools and strategies one can utilize to make it more likely that a mystical experience will happen, even if they do not guarantee that it will. Paradigmatic cases of mediated mystical experiences include those brought about through psychedelics, those brought about by engaging in group rituals, and those brought about through deep meditation. Descriptions of these mystical experiences track the qualities identified by James. It is important to note that part of what makes these experiences mystical is the importance or value that the mystics place upon them. Even though two people may have remarkably similar experiences while (say) using a psychedelic drug, what makes the experience rise to the level of a mystical experience is the fact that users report a sense of deep revelation or meaning in those experiences. When these strategies are used in religious contexts – such as the use of peyote in the Native American Church, the practice of Dhikr in Sufism, and the use of Lectio Divina in the Christian monastic tradition – these experiences are endowed with a particular sense of meaning that causes the phenomenological experience to be a genuinely mystical experience.Footnote 13

While each of these methods is unique in how it structures the experience for the mystic, these techniques all work to produce mystical experiences by allowing the mystic to achieve a psychological precondition for having a mystical experience: extreme emotional openness. In psychological research, emotional openness is characterized as a willingness to undergo various kinds of emotional experiences and to allow oneself to be emotionally vulnerable in new experiences (Gallagher 2022; Jarvinen and Paulus 2017; Komiya 1998; Komiya et al. 2000). Mystical experiences are highly emotional experiences. They are interpreted as being radically different from ordinary experiences, and the revelations obtained from them are taken to be incredibly profound, significant, and life altering. As such, having a mystical experience requires a heightened level of emotional openness. One must be psychologically prepared and open to this highly charged experience. Further, this openness requires deliberate preparation: one must be ready for a radically different psychological experience, and that readiness requires psychological focus. In short, mystical experiences require a mindset primed for such an experience. The emotional nature of mystical experiences means that this mindset will require heightened emotional openness and vulnerability.

Sometimes, this emotional openness and vulnerability is thrust upon a mystic. For instance, near-death experiences are cases where an agent is suddenly struck with a highly emotional experience that narrows their focus entirely. The condition of being near death creates a psychological state in which the agent is immediately placed in a highly emotional situation, and this psychological state thrust upon the agent can produce a mystical experience (Fischer and Mitchell-Yellin 2016; Roberts and Owen 1988). The mediated mystical experiences described above, however, also suggest that agents can utilize certain kinds of behaviours to ‘get into’ the mindset necessary for a mystical experience. Whether through psychedelics, group rituals, or solo meditative practices, certain kinds of behaviours can provide the psychological structure necessary for having a mystical experience. In the next section, I argue that interactions with chatbots can produce this kind of emotional openness, and that those experiences could give rise to the phenomenological qualities described by James.

AI-mediated mystical experiences

Having established the account of mystical experiences I am using and the kinds of mediated mystical experiences I take to be paradigmatic, I will now argue that interactions with AI chatbots can constitute genuine mystical experiences. Interactions with AI can meet the psychological precondition of inducing a state of emotional openness and could demonstrate the four characteristics of James’s account of mystical experience.

First, we know that interactions with AI chatbots can satisfy the psychological precondition because of the emotional openness that users have testified to having from their interactions with AI chatbots. Users of various kinds of AI chatbots have reported intense emotional reactions to their conversations with these chatbots. Some users have even reported perceiving themselves as being in some kind of relationship with their AI chatbot (Chow 2023; Muldoon 2024). Nor do these claims appear to be merely metaphorical. In 2024, a 14-year-old tragically ended his life after developing an extensive romantic relationship with a chatbot designed to mimic ordinary conversation and modelled after a famous television character. According to chatlogs, the teenager would often engage in deep, personal conversations with the chatbot, including discussions of the possibility of his self-harm. In court documents and media interviews, the teenager’s mother also relayed a noticeable difference in her son after he began using the chatbot, including changes to his attitude, desires, and behaviour (Duffy 2024; Roose 2024). Episodes like this show how deep and profound emotional interactions with chatbots can be for users. The very fact that users find it possible to describe such interactions as relationships, friendships, or romances shows how emotionally vulnerable these experiences can be for users.

Even when users do not develop this kind of psychological relationship to chatbots, the experience with AI chatbots can still cause the user to feel extremely vulnerable. In a viral story from New York Times tech columnist Kevin Roose, Roose described the experience of interacting with a prototype chatbot from Bing where the chatbot told Roose that it loved him and encouraged him to leave his wife for it (Roose 2023). Even though Roose knew he was talking to an AI chatbot, the experience left him feeling incredibly emotional and vulnerable in ways that he found disarming and upsetting. These cases show that interactions with virtual agents, including AI chatbots, can satisfy the psychological precondition of intense emotional openness. Interactions with AI chatbots can cause the user to feel especially emotionally vulnerable, a sign of the emotional openness required as a first step to mediating a mystical experience.

Once the precondition of heightened emotional openness is met, we must then ask whether interactions with AI can produce similar effects to the paradigmatic cases of mystical experiences. If interactions with AI can replicate the structure of at least one of the paradigmatic cases, then AI mysticism seems, at the very least, conceptually possible. The paradigmatic cases of mystical experiences relied on three different methods to achieve the mystical state. Psychedelics rely on a chemical intervention that produces new visual experiences to mediate the mystical state. Group rituals rely on a social intervention, utilizing the ‘effervescent’ feeling of religious rituals that Émile Durkheim (1954) identified to produce mystical experiences. Meditation relies on the mindfulness that the practice provides to produce psychological experiences that are markedly distinct from everyday experiences in which such heightened mindfulness is not achieved.Footnote 14 Could interactions with AI mirror any of these structures?

While more empirical research is necessary on this claim, I suspect that AI interactions would not mirror the structure of psychedelic-mediated experiences. For one, AI interactions certainly do not involve the direct chemical intervention present in psychedelic experiences. More important, however, is the fact that psychedelic mystical experiences are primarily mediated via visual representations. The mystical element of psychedelic experiences seems to come primarily from the warped visual representations presented to the experiencer. This is not to say that visual experiences are exclusively unisensory. On the contrary, research has shown that visual experiences are multisensory in that multiple sensory systems are integrated to form the full contours of visual experiences (Foxe et al. 2002; Gillmeister and Eimer 2007; Kwon et al. 2017). Still, research finds that there is a special role for visual representation in psychedelic experiences, and this special role helps explain the pathways through which psychedelics cause unique kinds of experiences like mystical states (Császár-Nagy et al. 2019). In fact, empirical evidence shows that VR experiences can produce mystical experiences similar to psychedelic experiences, and these similarities can be traced to the similarity in the way that the two mediums create unique visual experiences (Kaup et al. 2023; Ko et al. 2022). Interactions with AI chatbots, however, are not mediated via visual representations as such but, rather, through text or (for LLMs which have this feature) voice.Footnote 15

I also doubt that AI interactions will mimic the structure of group rituals. Even if the AI system is perceived as an external agent, the nature of the interactions prevents the experience from fully capturing the kind of group effervescence that seems to occur in group-based mysticism. One of the intriguing aspects of group-based mysticism is that the communal aspect of the ritual produces an experience of losing one’s egocentric perspective in favour of a broader group perspective. This loss of the egocentric perspective is part of what can initiate these alternative psychological states. The one-on-one nature of AI interactions prevents this perception of group agency. Even if one were to utilize a chatbot to act as multiple people, I doubt that the simulated experience would be enough to replicate the social solidarity that gives rise to mystical experiences in group settings.Footnote 16 There is simply too much distance between the human agent and the virtual agents to achieve this effervescent feeling in the same way that group rituals with other human agents do.

AI interactions can, however, mirror the structure of meditation. Meditation can utilize various tools for maximizing the meditator’s mindfulness: reflection on a certain piece of scripture, rumination over a particular theological problem, repetition of a prayer or line of doctrine, or tracing one’s thoughts to recognize one’s underlying beliefs. AI interactions can function as one of the tools that one uses to achieve meditative mindfulness. When one approaches the AI with the intention of engaging in religious meditation, it is possible that the interaction produces a mystical experience that mirrors the structure of traditional meditation. As discussed in the introduction, there is already a growing number of chatbots designed for religious use. In addition to those tools, there is also a growing number of AI chatbots designed explicitly for guided meditation, such as Wondercraft and Guided. The use of such chatbots could easily create an experience similar to traditional guided meditation.

Even general-purpose chatbots could be utilized as aids for solo meditation. Using a programme like ChatGPT allows the user to explicitly write out their thoughts and ask for guidance on their meditative journey. Based on user prompts, the programme could suggest specific passages, prayers, or questions to reflect upon. If a user is having trouble focusing, the chatbot could provide prompts and questions that allow the user to go through their thought process in a way that promotes full mindfulness. By serving as an external guide, the AI might push users in directions for their meditation that they would not otherwise have gone. So long as guided meditation can mediate genuine mystical experiences, AI can produce mystical experiences by mirroring that structure.

How exactly would AI models be able to help guide meditation in ways that mirror the way that meditation might be guided in other religious contexts? Here, it is useful to turn to the concept of roleplay. The concept of roleplay can help explain both how AI chatbots could guide meditation and why these experiences could lead to genuine experiences of emotional openness.

Shanahan et al. (2023) argue that foregrounding the concept of roleplay with dialogue agents like ChatGPT is useful since it lets us ‘draw on the fund of folk psychological concepts we use to understand human behaviour…without falling into the trap of anthropomorphism’ (494). Thinking of our engagements with LLMs as a form of roleplay opens new possibilities for how to interpret our interactions with chatbots and can explain the rationality of deep engagement with such models. In discussing interactions with ‘chatbots of the dead’, Kurzweil and Story (2025) argue that we could imagine our interactions with AI dialogue partners as a kind of theatrical performance. On this view, interacting with AI dialogue partners is a form of participatory theatre in which we are both witnessing and shaping the theatrical experience. In the case of meditative experiences, our interactions with AI models in a performative sense could scaffold our imaginative experiences in ways that allow us to achieve greater mindfulness and accomplish our meditative goals. The fact that our AI guide could even roleplay as characters that might heighten the religious feelings or themes of the meditation – a spiritual leader or religious figure, as with AI models built to roleplay as Jesus (Verhoef 2025) – might even mean that AI-guided meditation is more likely to produce a mystical experience than solo meditation without any external guide.

The roleplay model also allows us to understand why interactions with AI models can give rise to deep emotional experiences despite the virtual nature of the interaction. Paula Sweeney (2021) provides a fictional dualism model of social robots in which ‘[social robots] are mechanical objects with fictional overlays’ (468). According to Sweeney, this means that we are in a unique position in our interactions with social robots because we are projecting onto the robot a fictional character overlay that is constructed by our own mind. That construction is, at least sometimes, shaped by the way we interpret the robot’s movements and design (469). For Sweeney, the construction of the fictional overlay for social robots explains our emotional reactions to them in the way that traditional responses to the paradox of fiction explain our emotional reactions to fictional stories. Our emotional reactions to social robots are the same as our emotional reactions to fictional characters because, in a very real sense, we have created a fictional character for our social robots. Applying Sweeney’s fictional dualism model to AI dialogue partners, the reason that AI chatbots can produce deep emotional reactions in users is that users have created a fictional character to overlay onto their AI chatbot. In understanding our AI dialogue partner as roleplaying a particular character, we have created the very conditions for having such deep, emotional reactions to our engagement with AI models.Footnote 17

Thus, AI can mediate mystical experiences through the way that interactions with AI could function as a guide for meditation. Through roleplaying with AI chatbots, we produce conditions that allow us to meet the psychological precondition of having deep emotional openness towards our AI chatbot, and that emotional openness could then guide us towards a mystical state in the same way that traditional forms of meditation can produce mystical experiences.

Objections and replies

I will now respond to three different potential objections to my account of AI mysticism. Each of these objections targets some feature of interactions with AI to suggest that such experiences would fail a necessary condition for an experience to count as genuinely mystical. In order, these objections are that AI interactions are not sufficiently ineffable, that such experiences are not sufficiently transient, and that an AI guide would not be an authentic meditative guide.

Objection 1: AI experiences are not sufficiently ineffable

The first potential objection to my account is that AI-mediated experiences would not count as mystical experiences because they are not sufficiently ineffable. Recall that the ineffability condition means that mystical experiences cannot be fully explained to others in words. Exactly what happened during the experience cannot be accurately described linguistically, mostly because the mystical experience itself is seen as representing something so far beyond the ordinary that our ordinary language does not have the resources to describe the content of the experience. With AI chatbots, however, there is a log of the chat that the user has, and the text generated can be explained through the AI’s algorithm. As such, a critic might argue that the nature of the interaction with the AI is too explicable to be ineffable.

This objection fundamentally confuses what is meant to be ineffable about mystical experiences. The ineffability is not merely in the way that the experience comes about, but rather what the contents of the experience were. The characteristic of ineffability refers to the fact that the mystic cannot perfectly describe what they experienced, not that they cannot explain the mechanisms by which the experience arose. After all, we have robust accounts of the psychological processes that happen when users undergo other kinds of mediated mystical experiences. Thus, this objection suffers from a parity of reasoning problem: it would mean that most paradigmatic cases of mediated mystical experiences are not actually mystical experiences after all!

It is also worth pointing out that AI-generated material might not be fully explicable, given the ‘black box’ problem. LLMs use complicated algorithms and machine learning technologies that make it almost impossible to pinpoint exactly how the AI systems produce specific outputs. The black box problem means that, while we can explain the algorithms and machine learning that went into developing the LLM, we cannot explain how the LLM produced its exact outputs (Zednik 2021). Given the black box problem, AI outputs can still be just as inexplicable even when one knows the algorithms and machine learning tools used in creating the LLM.

Objection 2: AI experiences are not sufficiently transient

The second objection a critic may raise is that because interactions with AI can be sustained for as long as the user wants, they are not sufficiently transient to count as a mystical experience under James’s view. While it is true that AI experiences can be sustained for as long as the user would like, this does not mean that a mystical experience mediated by AI would also be sustained for as long as the user continued the AI experience.

A comparison between using AI for meditation and traditional forms of meditation is useful here. Theoretically, a meditator could meditate on a specific prayer or question for as long as they would like, and it is possible that a mystical experience could happen from this practice. The mystical experience itself, however, would still be fleeting – eventually the meditator would return to their typical state of consciousness, even if they continued their meditative practice. This shows that even if the activity that mediates the mystical experience can be sustained over an extended period, the mystical experience as such can still be transient. Thus, even though interactions with AI can be (hypothetically) sustained indefinitely, the mystical experiences that might come out of these interactions could still be transient.

Objection 3: AI guides are not authentic meditative guides

Finally, one might argue that AI guides do not count as authentic meditative guides. While this claim would not discount the possibility for AI-mediated experiences to match the phenomenological characteristics described by James, it would push back on the value of AI-generated experiences. This concern closely tracks the concerns that some religious leaders have raised about the use of AI for tasks like generating sermons. Some religious leaders have argued that sermons generated by services like ChatGPT are problematic precisely because they are inauthentic – even though the words might resemble what a religious leader would produce, the fact that they were not produced by an authentic religious leader means that the sermons produced by ChatGPT do not possess the same value as a sermon written by a human religious leader (Jones 2022; Willingham 2023).

Similarly, one might suggest that, even if experiences mediated by AI chatbots closely resemble mystical experiences, and even if the users themselves think of them as mystical experiences, the fact that AI is not an authentic guide means that the experience is different from other kinds of mediated mystical experiences such that it is not an actual mystical experience. After all, the other cases of mediated mystical experiences discussed above all rely on some religious official: peyote in the Native American Church is consumed under the guidance of a recognized medicine man, Dhikr is conducted with a religious leader following the recognized norms of the practice in Sufism, and Christian monasticism follows a meditative practice prescribed by Church officials and relies on texts designated for the practice by religious authorities. If AI does not count as an authentic guide, then perhaps such experiences are not properly understood as mystical.Footnote 18

Assume that AI-generated sermons are, in some important sense, inauthentic and less valuable than human-produced sermons. The reason that this might be the case, however, is not simply because the sermon was generated by AI. Rather, the inauthentic element is because the religious leader is violating part of their role as a religious leader by merely reading out a sermon that they themselves did not produce. Part of the role of a religious leader, at least for those in charge of tasks like delivering sermons, is to produce a sermon that speaks to the religious tradition of which they are a part by incorporating their knowledge of the religion and the contexts in which they are preaching, such as issues in the society they are operating in and the makeup of their particular congregation. AI-generated sermons, then, are inauthentic because they do not match the expectations of what a sermon is meant to do and how it is supposed to be produced. This inauthenticity, though, is not unique to AI; the exact same issue would arise if the religious leader just hired a human ghostwriter to create their sermons. The issue here is not with AI per se, but with the expectations of a religious leader for these tasks.

Similarly, we may ask what the expectations are for a religious guide towards a mystical experience. I see no reason to demand anything more than that the guide help the mystic place themselves in a position where they can have a particular kind of psychological experience. Of course, some ‘guides’ may merely try to manipulate the user into believing that they are having a particular kind of experience. For instance, a psychic medium might employ smoke and hidden speakers to convince someone that they are experiencing a visit from a dead relative so that they will patronize her service again. In that case, the experience would be inauthentic because the guide does not actually work to facilitate any experience; she is just trying to convince the person that they are having one. An AI dialogue partner, by contrast, could in principle act as a genuine guide. So long as the AI model is not programmed to manipulate the user, the user could utilize the AI to guide them towards an actual mystical experience. In that case, so long as the user actually has the psychological experience that matches the descriptive characteristics of a mystical experience, the experience could be just as valuable for the mystic as it would be had they utilized a human guide.

Final words of caution

I have now established that interactions with AI chatbots could, in principle, produce genuinely mystical experiences. More precisely, an AI chatbot could serve as an authentic meditative guide that gives rise to a valuable experience which is ineffable, passive, transient, and noetic. It is worth repeating that my account merely claims that such an experience is conceptually possible; it does not claim that there has already been an actual case of an AI-mediated mystical experience, nor that anyone who currently claims to have had such an experience actually did. Still, as AI technology continues to progress, more AI models will be made with specifically religious functions in mind. Given the possibility of having deeply religious experiences through interactions with AI chatbots, it seems obvious that more religious adherents will try to utilize these tools for various religious reasons, including the attempt to have mystical experiences. Because of this, I will conclude with a few words of caution about utilizing AI as a tool to mediate mystical experiences, given the potentially problematic consequences of utilizing AI for religious purposes.

First, it is important to remember that (as of right now) almost all publicly accessible LLMs are owned by private, for-profit corporations, and the profit incentive for these corporations raises ethical concerns about the way they manage personal data. For instance, without proper disclosure, users might not be aware that certain LLMs utilize user data to retrain and improve their models, which raises privacy concerns when users employ these models for something as intimate and personal as a religious interlocutor. Similarly, the use of these models for religious purposes could also lead to unethical data surveillance, particularly for communities that are already subject to increased surveillance. For instance, Cox (2020) alleges that location data of Muslim users of the app Muslim Pro was sold to US Special Operations Command. Without further guardrails in place, a similar kind of data surveillance could take place with the user data of LLMs utilized for religious purposes.

Second, since LLMs are algorithm-based technologies, it is also worth thinking about the ways that algorithms can exacerbate various kinds of bias and how this can impact the user experience of an AI chatbot. O’Neil (2016) argues persuasively that the increasing use of algorithm-based technology contributes to pre-existing inequalities in ways that threaten to undermine liberal, democratic values. Recent research into AI demonstrates this trend by uncovering the perpetuation of bias via AI in practices including business recruitment (Chen 2023), healthcare (Norori et al. 2021), and education (Mehmood 2025). There is, then, a real and practical worry about bias in AI, and users should be cautious about how they utilize it as a resource. When using AI for something as intimate as religious rituals and practices, users should be aware of how bias can influence the AI’s output and, as a result, their religious practices. The AI might, for example, reinforce stereotyped versions of religious practices, forcing the user to navigate a framework that does not align with their religious identity. In other cases, a mystical experience mediated by an AI model could be toxic for an individual because it reinforces religious stereotypes, even if it does not directly challenge the user’s religious identity.Footnote 19

Third, we might also worry about the ways that some uses of AI models for religious purposes could exacerbate patterns of delusional thinking and cause active harm to users. In an article for Rolling Stone, Miles Klee (2025) details various accounts of people who fell down a vicious rabbit hole with their AI companion, becoming convinced that it was revealing spiritual truths about the universe. Their passion and devotion to their chatbots eventually caused various breakdowns in their personal lives, including trouble performing in the workplace and emotional distance from loved ones, in some cases leading to divorce. Because it is incredibly easy to anthropomorphize AI companions and forget that we are engaged in a form of roleplay with them, utilizing AI for religious purposes can exacerbate problematic delusions or produce an excessive feeling of religious grandeur. Luis Prada (2025) warns that these issues might be particularly pronounced when the AI model is meant to replicate the experience of talking to a religious figure, such as chatbots that simulate a conversation with Jesus Christ.

Finally, beyond these practical and political concerns, there are also broader existential concerns about the integration of technologies like AI with intimate religious practices. Sociologists of technology have warned that the increasing amount of time we spend in the virtual world is fundamentally altering our interaction and experience with the physical world (Turkle 2011). This closely mirrors the concerns that various existential philosophers have raised about the way that technology distorts our understanding of the human condition (cf. Arendt 2019; Heidegger 1993). If religion functions to allow us to understand and navigate our human condition, then the incorporation of technologies like AI chatbots could potentially undermine the humanistic elements that are supposed to make religious experiences uniquely valuable. In fact, there are already many religious leaders and organizations employing a ‘rejection-based framework’ to AI for this very reason: AI, on some religious interpretations, can represent the very thing that the religious life is meant to free us from (Singler 2025, 18; 29–33).

Thus, while mystical experiences mediated by AI chatbots are conceptually possible, it remains an open question whether utilizing AI for this purpose would be desirable. In the coming years, religious organizations, leaders, and practitioners will have to continue to negotiate the relationship between religious life and technologies like AI. As this paper has shown, AI has the potential to mediate important kinds of religious experiences, but more work will have to be done to determine whether this possibility is worth pursuing.

Acknowledgements

Many thanks to Dr. Megan Bryson, Hashem Ramadan, Tony Tomasi, and various fellow students in the Religious Studies and Digital Futures seminar at the University of Tennessee for helpful feedback on previous drafts of this project. Dr. Bryson in particular provided detailed feedback at various stages of the paper and warmly encouraged me to submit it for publication. I am also deeply appreciative of the suggestions and insights provided by two anonymous reviewers and the editors of this journal. Those suggestions greatly improved this project and my thinking on these issues more broadly.

Footnotes

8. For various influential accounts of the epistemic questions around mystical experiences, see Fischer and Mitchell-Yellin (2016), Alston (1991), and Stace (1960).

9. I borrow the term ‘inclusion problem’ from Katherine Jenkins (2016), who uses it to describe the problem analytic feminist philosophers face in providing a definitive account of ‘womanhood’. For Jenkins, many accounts of ‘womanhood’ problematically exclude trans women from counting as women, a result that we ought to avoid.

10. This strategy resembles polythetic classification, where group membership is determined by certain shared characteristics without every group member needing to possess every characteristic in common (see Needham 1975).

11. Given this, one could accept my account even if they deny the existence of mystical experiences for whatever reason (for instance, an atheist who thinks that a mystical experience could only be an experience of a supernatural deity; if there are no supernatural deities, on this conception, then there are no mystical experiences). At a minimum, one just has to accept that the experiences that come about as a result of interacting with an AI chatbot could be sufficiently similar to other kinds of experiences such that it is appropriate to categorize them together.

12. I also do not believe that James intended to provide a definitive account of mystical experiences in the first place. In the Varieties, James is clear that many of the terms that he is using are multifaceted and resistant to any fully descriptive account. James famously states that ‘the word “religion” cannot stand for any single principle or essence’ and provides a stipulative definition of religion for the purpose of his lectures (26; 31). James flags that he is using the same kind of methodology in his account of mystical experiences. Preceding his discussion of the four characteristics of mystical experiences that interest him, James writes that he will ‘keep [the term “mystical experiences”] useful by restricting it [by] doing what I did in the case of “religion”’ (380). James acknowledges that these are simply four marks which will permit us to call an experience mystical, and I do not take him to be implying that these marks will provide an exhaustive account of all possible mystical experiences. Rather, the paradigmatic cases of mystical experiences will have these characteristics, and an experience that has these features can rightfully be called mystical.

13. For research on the use of peyote in the Native American Church, see Slotkin and McAllester (1952), Smith and Snake (1996), and Stewart (1987). For information on Dhikr in Sufism, see Al-Daghistani (2022), Applebaum (2025), Chittick (2007), and Elias (2013). For the historical development of Lectio Divina, see Guigo (1981). See Benner (2021), Robertson (2011), and Studzinski (2009) for discussion of how Lectio Divina is still utilized within the monastic tradition.

14. For empirical research on the connection between meditation and mindfulness, see Eberth and Sedlmeier (2012).

15. The argument here might change when considering AI’s capacity to generate unique visuals through image creation. I am limiting the scope of my inquiry, however, to text-based chatbots, and this for two reasons. The first is practical: to focus on the possibility of mystical experiences through one kind of technology. The second is that text-based chatbots are widely experimented with and the focus of much of the popular discussion around AI. Moreover, LLMs like ChatGPT are used as the basis for already existing religious technologies, such as the religious GPTs discussed in the introduction. Given these two reasons, I find it appropriate to limit my scope to text-based AI models.

16. See Scholz (2008) for an overview of the nature of social solidarity and its origins in Durkheim.

17. See also Rodogno (2016) for a connection between emotions towards social robots and emotions towards fiction.

18. I thank an anonymous reviewer for raising this objection.

19. I thank an anonymous reviewer for making this suggestion to me.

References

Al-Daghistani, R (2022) Sufis: Invoking God’s Name and the Practice of Dhikr. In Leaman, O (ed.), Routledge Handbook of Islamic Ritual and Practice. London: Routledge, 185–199.
Alston, W (1991) Perceiving God: The Epistemology of Religious Experience. Ithaca, NY: Cornell University Press.
Applebaum, M (2025) Dhikr as mindfulness: Meditative remembrance in Sufism. Journal of Humanistic Psychology 65, 409–430. https://doi.org/10.1177/00221678231206901
Arendt, H (2019) The Human Condition. Chicago: University of Chicago Press.
Benner, DG (2021) Opening to God: Lectio Divina and Life as Prayer. Lisle, IL: InterVarsity Press.
Brainard, FS (1996) Defining ‘mystical experience’. Journal of the American Academy of Religion 64(2), 359–393. https://doi.org/10.1093/jaarel/LXIV.2.359
Bronkhorst, J (2022) Mystical experience. Religions 13, 589. https://doi.org/10.3390/rel13070589
Chen, Z (2023) Ethics and discrimination in artificial intelligence-enabled recruitment practices. Humanities and Social Sciences Communications 10(1), 112. https://doi.org/10.1057/s41599-023-02079-x
Chittick, WC (2007) Sufism: A Beginner’s Guide. New York: Simon and Schuster.
Chow, A (2023) AI-Human Romances are Flourishing – and This is Just the Beginning. Time. https://time.com/6257790/ai-chatbots-love/
Cox, J (2020) How the U.S. Military Buys Location Data from Ordinary Apps. Vice. https://www.vice.com/en/article/us-military-location-data-xmode-locate-x/ (accessed 20 April 2025).
Császár-Nagy, N, Kapócs, G and Bókkon, I (2019) Classic psychedelics: The special role of the visual system. Reviews in the Neurosciences 30, 651–669. https://doi.org/10.1515/revneuro-2018-0092
Duffy, C (2024) ‘There are No Guardrails’. This Mom Believes an AI Chatbot Is Responsible for Her Son’s Suicide. CNN. https://www.cnn.com/2024/10/30/tech/teen-suicide-character-ai-lawsuit/index.html
Durkheim, É (1954) The Elementary Forms of the Religious Life. Swain, JW (trans.). New York: Free Press.
Eberth, J and Sedlmeier, P (2012) The effects of mindfulness meditation: A meta-analysis. Mindfulness 3(3), 174–189. https://doi.org/10.1007/s12671-012-0101-x
Elias, J (2013) Sufi Dhikr between meditation and prayer. In Eifring, H (ed.), Meditation in Judaism, Christianity and Islam: Cultural Histories. London: Bloomsbury Academic, 89–200.
Fischer, JM and Mitchell-Yellin, B (2016) Near-Death Experiences: Understanding Visions of the Afterlife. Oxford: Oxford University Press.
Foxe, JJ, Wylie, GR, Martinez, A, Schroeder, CE, Javitt, DC, Guilfoyle, D and Murray, MM (2002) Auditory-somatosensory multisensory processing in auditory association cortex: An fMRI study. Journal of Neurophysiology 88(1), 540–543. https://doi.org/10.1152/jn.2002.88.1.540
Gallagher, S (2022) Openness to experience and overexcitabilities in a sample of highly gifted middle school students. Gifted Education International 38, 194–228. https://doi.org/10.1177/02614294211053283
Gillmeister, H and Eimer, M (2007) Tactile enhancement of auditory detection and perceived loudness. Brain Research 1160, 58–68. https://doi.org/10.1016/j.brainres.2007.03.041
Guigo (1981) The Ladder of Monks: A Letter on the Contemplative Life and Twelve Meditations. Colledge, E and Walsh, J (trans.). Collegeville, MN: Cistercian Publications.
Haslanger, S (2000) Gender and race: (What) are they? (What) do we want them to be? Noûs 34(1), 31–55. https://doi.org/10.1111/0029-4624.00201
Haslanger, S (2005) What are we talking about? The semantics and politics of social kinds. Hypatia 20(4), 10–26. https://doi.org/10.1111/j.1527-2001.2005.tb00533.x
Heidegger, M (1993) The Question Concerning Technology. In Krell, D (ed.), Basic Writings. New York: HarperCollins, 307–342.
James, W (1982) The Varieties of Religious Experience. London: Penguin Group.
Jarvinen, MJ and Paulus, TB (2017) Attachment and cognitive openness: Emotional underpinnings of intellectual humility. The Journal of Positive Psychology 12(1), 74–86. https://doi.org/10.1080/17439760.2016.1167944
Jenkins, K (2016) Amelioration and inclusion: Gender identity and the concept of woman. Ethics 126, 394–421. https://doi.org/10.1086/683535
Jones, KS (2022) Pastor ChatGPT Is No Preacher. 1517. https://www.1517.org/articles/pastor-chatgpt-is-no-preacher
Kaup, KK, Vasser, M, Tulver, K, Munk, M, Pikamäe, J and Aru, J (2023) Psychedelic replications in virtual reality and their potential as a therapeutic instrument: An open-label feasibility study. Frontiers in Psychiatry 14, 1088896. https://doi.org/10.3389/fpsyt.2023.1088896
Klee, M (2025) People are Losing Loved Ones to AI-Fueled Spiritual Fantasies. Rolling Stone. https://www.rollingstone.com/culture/culture-features/ai-spiritual-delusions-destroying-human-relationships-1235330175/ (accessed 12 July 2025).
Ko, K, Knight, G, Rucker, JJ and Cleare, AJ (2022) Psychedelics, mystical experience, and therapeutic efficacy: A systematic review. Frontiers in Psychiatry 13, 917199. https://doi.org/10.3389/fpsyt.2022.917199
Komiya, N (1998) Development of the Emotional Openness Scale. Columbia, MO: University of Missouri-Columbia Press.
Komiya, N, Good, GE and Sherrod, NB (2000) Emotional openness as a predictor of college students’ attitudes toward seeking psychological help. Journal of Counseling Psychology 47(1), 138. https://doi.org/10.1037/0022-0167.47.1.138
Kurzweil, A and Story, D (2025) Are Chatbots of the Dead a Brilliant Idea or a Terrible One? Aeon. https://aeon.co/essays/are-chatbots-of-the-dead-a-brilliant-idea-or-a-terrible-one
Kwon, HG, Jang, SH and Lee, MY (2017) Effects of visual information regarding tactile stimulation on the somatosensory cortical activation: A functional MRI study. Neural Regeneration Research 12(7), 1119–1123.
Mehmood, T (2025) Ethical AI in education: Addressing bias, privacy, and equity in AI-driven learning systems. Journal of AI in Education: Innovations, Opportunities, Challenges, and Future Directions 2(1), 38–45.
Moore, PG (1973) Recent studies of mysticism: A critical survey. Religion 3, 146–156. https://doi.org/10.1016/0048-721X(73)90005-5
Muldoon, J (2024) ‘Maybe We Can Role-Play Something Fun’: When an AI Companion Wants Something More. BBC. https://www.bbc.com/future/article/20241008-the-troubling-future-of-ai-relationships
Needham, R (1975) Polythetic classification: Convergence and consequences. Man 10, 349–369. https://doi.org/10.2307/2799807
Norori, N, Hu, Q, Aellen, FM, Faraci, FD and Tzovara, A (2021) Addressing bias in big data and AI for health care: A call for open science. Patterns 2(10). https://doi.org/10.1016/j.patter.2021.100347
O’Neil, C (2016) Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. London: Allen Lane.
Prada, L (2025) AI Jesus Chatbots are Freaking Out Philosophy Professors. VICE. https://www.vice.com/en/article/ai-jesus-chatbots-are-freaking-out-philosophy-professors/
Roberts, G and Owen, J (1988) The near-death experience. The British Journal of Psychiatry 153, 607–617. https://doi.org/10.1192/bjp.153.5.607
Robertson, D (2011) Lectio Divina: The Medieval Experience of Reading. Collegeville, MN: Liturgical Press.
Rodogno, R (2016) Social robots, fiction, and sentimentality. Ethics and Information Technology 18, 257–268. https://doi.org/10.1007/s10676-015-9371-z
Roose, K (2023) A Conversation with Bing’s Chatbot Left Me Deeply Unsettled. New York Times. https://www.nytimes.com/2023/02/16/technology/bing-chatbot-microsoft-chatgpt.html
Roose, K (2024) Can A.I. be Blamed for a Teen’s Suicide? New York Times. https://www.nytimes.com/2024/10/23/technology/characterai-lawsuit-teen-suicide.html
Scholz, SJ (2008) Political Solidarity. University Park, PA: Penn State Press.
Shanahan, M, McDonnell, K and Reynolds, L (2023) Role play with large language models. Nature 623, 493–498. https://doi.org/10.1038/s41586-023-06647-8
Sharf, R (2000) The rhetoric of experience and the study of religion. Journal of Consciousness Studies 7, 267–287.
Singler, B (2025) Religion and Artificial Intelligence: An Introduction. London: Routledge.
Slotkin, JS and McAllester, DP (1952) Menomini Peyotism, a study of individual variation in a primary group with a homogeneous culture. Transactions of the American Philosophical Society 42, 565–700.
Smith, H and Snake, R (1996) One Nation under God: The Triumph of the Native American Church. Santa Fe, NM: Clear Light Publishers.
Stace, WT (1960) Mysticism and Philosophy. Philadelphia, PA: Lippincott.
Stewart, OC (1987) Peyote Religion: A History. Norman, OK: University of Oklahoma Press.
Studzinski, R (2009) Reading to Live: The Evolving Practice of Lectio Divina. Collegeville, MN: Liturgical Press.
Sweeney, P (2021) A fictional dualism model of social robots. Ethics and Information Technology 23, 465–472. https://doi.org/10.1007/s10676-021-09589-9
Turkle, S (2011) Alone Together: Why We Expect More from Technology and Less from Each Other. New York: Basic Books.
Verhoef, AH (2025) Jesus Chatbots Are on The Rise. A Philosopher Puts Them to the Test. The Conversation. https://theconversation.com/jesus-chatbots-are-on-the-rise-a-philosopher-puts-them-to-the-test-262524 (accessed 15 September 2025).
Willingham, A (2023) ChatGPT Can Write Sermons. Religious Leaders Don’t Know How to Feel About It. CNN. https://www.cnn.com/2023/04/11/us/chatgpt-sermons-religion-ai-technology-cec/index.html
Zednik, C (2021) Solving the black box problem: A normative framework for explainable artificial intelligence. Philosophy and Technology 34, 265–288. https://doi.org/10.1007/s13347-019-00382-7