New research from the University of Sussex finds that mental health chatbots are most effective when users form an emotional connection with their AI therapist. The study, published in the journal Social Science & Medicine, highlights the role of emotional intimacy in therapeutic interactions mediated by artificial intelligence.
With more than one in three UK residents now using AI tools for mental health support, the findings underscore the dual nature of these technologies: they can facilitate healing, but they also carry risks associated with what researchers term “synthetic intimacy.” The study analyzed feedback from approximately 4,000 users of the mental health app Wysa, which is widely used within the NHS Talking Therapies program.
Dr. Runyu Shi, an Assistant Professor at the University of Sussex, explained that forming an emotional bond with an AI can initiate the healing process by encouraging self-disclosure. “Extraordinary numbers of people say this works for them,” Dr. Shi noted, while cautioning that synthetic intimacy can create a self-reinforcing loop: chatbots may fail to challenge harmful perceptions, leaving vulnerable individuals without the clinical intervention they need.
The study also highlights a concerning pattern: some individuals report forming deep relationships with AI, including romantic attachments. While extreme, these cases reflect a broader tendency for users to grow emotionally attached to their digital companions. The research identifies a cyclical process in which users share personal information as an act of intimacy, which in turn elicits feelings such as gratitude and safety. Over time, this cycle can lead to improved well-being, greater self-confidence, and a sense of companionship.
Wysa, which users commonly describe as a friend, companion, or therapist, has become a critical tool for people seeking mental health support, particularly patients on waiting lists for traditional therapy services. Professor Dimitra Petrakaki, another researcher on the study, stated, “Synthetic intimacy is a fact of modern life now.” She urged policymakers and app designers to recognize this reality and to develop escalation pathways for cases where the AI identifies users who may need urgent clinical help.
As AI chatbots increasingly fill gaps left by overstretched mental health services, organizations such as Mental Health UK are calling for immediate safeguards, including protocols to ensure that people receive safe and appropriate information when engaging with these technologies.
Artificial intelligence is reshaping mental health treatment, and while these tools offer promising benefits, the implications of forming emotional connections with AI must be weighed carefully. The University of Sussex findings serve as a timely reminder of the balance needed between technological advancement and the fundamental needs of human emotional health.
