Generative AI and reassurance-seeking in OCD

Published online by Cambridge University Press:  18 December 2025

Dara Friars*
Affiliation:
General Adult Psychiatry, Cluain Mhuire Mental Health Services, Dublin, Ireland
Gráinne Flynn
Affiliation:
General Adult Psychiatry, Cluain Mhuire Mental Health Services, Dublin, Ireland
*
Corresponding author: Dara Friars; Email: dara.friars@gmail.com
Type: Letter to the Editor
Copyright
© The Author(s), 2025. Published by Cambridge University Press on behalf of the College of Psychiatrists of Ireland

Sir,

Re: Generative AI Large Language Models and Reassurance-Seeking in Obsessive–Compulsive Disorder.

I write to highlight – and prompt reflection on – the evolving relationship between generative artificial intelligence (AI), particularly large language models (LLMs) such as ChatGPT, and obsessive–compulsive disorder (OCD). While research in this area is still emerging, there is growing recognition among clinicians and academics that LLMs are facilitating novel patterns of compulsive reassurance-seeking and checking which merit consideration.

Classically, reassurance-seeking in OCD has involved repeated appeals to family, friends, or health professionals in a vain pursuit of certainty. With the advent of the internet over the past three decades, a digital analogue has emerged in the form of the Google search bar. “Cyberchondria” – a pattern of repeated internet searches for medical information that ultimately results in distress rather than reassurance – has become familiar across clinical settings (Doherty-Torstrick et al. 2016; Starcevic & Berle, 2013). Although this phenomenon is more closely tied to health anxiety, it exemplifies a broader trend of online reassurance-seeking, which has also been observed in OCD – especially among individuals with more severe symptoms – where such behaviours reinforce the disorder’s pathological cycle (Parsons, 2025; Parsons & Alden, 2022).

The arrival of LLMs represents a further evolution. Unlike static webpages or the inherently finite patience of human interlocutors, ChatGPT and its competitors provide seemingly limitless courteous and adaptive responses. Each reply shifts subtly in response to the nuances of the user’s question, creating a sense of personalised engagement, while their conversational yet authoritative tone fosters a human-like credibility (Parsons, 2025). Available round the clock and devoid of the social cues that typically signal “enough is enough”, these models may unintentionally give rise to cycles of compulsive reassurance-seeking that would have been self-limiting in bygone years (Golden & Aboujaoude, 2024).

As AI tools become embedded in daily life, their potential to reinforce compulsive reassurance-seeking grows increasingly relevant to clinical practice. Clinicians may consider screening for LLM use during OCD assessments or discussing healthy ways of interacting with AI tools during psychoeducation. With early research into AI’s role in OCD diagnosis and treatment now underway, a healthy scepticism is as important as ever to ensure these technologies support, rather than impede, our patients’ recoveries (Golden & Aboujaoude, 2024; Kim et al. 2025).

Yours sincerely,

Dr Dara Friars,

Psychiatry Registrar, Cluain Mhuire Mental Health Service, Blackrock, Co. Dublin

Dr Gráinne Flynn,

Consultant Psychiatrist, Cluain Mhuire Mental Health Service, Blackrock, Co. Dublin

Funding statement

The authors have no funding to declare.

Competing interests

The authors have no conflicts of interest to declare.

References

Doherty-Torstrick, ER, Walton, KE and Fallon, BA (2016) Cyberchondria: parsing health anxiety from online behavior. Psychosomatics 57, 390–400. https://doi.org/10.1016/j.psym.2016.02.002
Golden, A and Aboujaoude, E (2024) Describing the framework for AI tool assessment in mental health and applying it to a generative AI obsessive–compulsive disorder platform: tutorial. JMIR Formative Research 8, e62963. https://doi.org/10.2196/62963
Kim, J, Pacheco, JPG, Golden, A, Aboujaoude, E, van Roessel, P, Gandhi, A, Mukunda, P, Avanesyan, T, Xue, H, Adeli, E, Kim, JP, Saggar, M, Stirman, SW, Kuhn, E, Supekar, K, Pohl, KM and Rodriguez, CI (2025) Artificial intelligence in obsessive–compulsive disorder: a systematic review. Current Treatment Options in Psychiatry 12, 23. https://doi.org/10.1007/s40501-025-00359-8
Parsons, CA (2025) Online Reassurance-Seeking in Obsessive–Compulsive Disorder. Available at: https://open.library.ubc.ca/collections/24/items/1.0448291 (accessed 28 July 2025).
Parsons, CA and Alden, LE (2022) Online reassurance-seeking and relationships with obsessive–compulsive symptoms, shame, and fear of self. Journal of Obsessive–Compulsive and Related Disorders 33, 100714. https://doi.org/10.1016/j.jocrd.2022.100714
Starcevic, V and Berle, D (2013) Cyberchondria: towards a better understanding of excessive health-related internet use. Expert Review of Neurotherapeutics 13, 205–213. https://doi.org/10.1586/ern.12.162