Researchers at Cedars-Sinai have developed a virtual reality AI mental health support tool called eXtended-reality Artificial Intelligence Assistant (XAIA).
The study, conducted by Cedars-Sinai researchers led by Brennan M.R. Spiegel and published in Nature's npj Digital Medicine, used AI, spatial computing, and VR to immerse users in calming, nature-inspired environments where they hold therapeutic conversations with an AI avatar.
The system used GPT-4 to deliver immersive therapy sessions to 14 individuals with mild to moderate anxiety or depression. XAIA can be accessed via the Apple Vision Pro headset.
Lead researcher Brennan Spiegel, MD, MSHS, wrote in a Cedars-Sinai blog: “Apple Vision Pro provides a gateway into Xaia's world of immersive, interactive behavioral health support – making advances that I can only describe as a quantum leap beyond previous technologies.”
He continued: “With Xaia and the stunning Apple Vision Pro display, we're able to harness every pixel of this remarkable resolution and the full spectrum of vivid colors to create a form of immersive therapy that's engaging and deeply personal.”
To train the AI, Spiegel and his team used transcripts of cognitive behavioral therapy (CBT) sessions conducted by experienced therapists, focusing on empathy, validation, and effective communication.
The AI's responses were further refined through iterative testing in which therapists role-played various clinical scenarios, driving continuous improvement in the system's psychotherapeutic communication.
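The paper doesn't publish Xaia's prompts or training pipeline, but the general pattern, a large language model steered by instructions distilled from therapist transcripts, is straightforward to sketch. Below is a minimal, hypothetical illustration using the OpenAI Python client; the system prompt text, model choice, and chat-loop structure are assumptions for illustration, not the study's actual configuration.

```python
# Minimal sketch of an LLM-backed therapy chat loop (illustrative only).
# The system prompt is a hypothetical stand-in for guidance distilled
# from CBT transcripts; it is NOT Xaia's actual prompt.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

CBT_SYSTEM_PROMPT = (
    "You are a supportive conversational agent informed by cognitive "
    "behavioral therapy. Respond with empathy, validate the user's "
    "feelings, ask open-ended questions, and do not give diagnoses "
    "or medical advice."
)

history = [{"role": "system", "content": CBT_SYSTEM_PROMPT}]

def reply(user_message: str) -> str:
    """Append the user's turn, query the model, and return its answer."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model="gpt-4", messages=history)
    answer = response.choices[0].message.content
    history.append({"role": "assistant", "content": answer})
    return answer

print(reply("I felt left out when my friends made plans without me."))
```

Keeping the full message history in each request is what lets a model like this maintain therapeutic context across turns.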
Researchers used spatial computing, VR and AI (GPT-4) to develop an immersive therapy chatbot. Source: Nature.
Participants discussed various topics with the AI, allowing researchers to document its use of psychotherapeutic techniques. Overall, XAIA was noted for its ability to express empathy, compassion, and affirmation, enhancing the therapeutic experience.
For example, XAIA's empathetic response to one participant's experience of feeling left out was: “I'm sorry to hear that you felt rejected in such a clear way, especially when you cared about what was important to you. It must have been a tough experience.”
The researchers conducted a qualitative thematic analysis of participant feedback, which indicated a general appreciation for the AI's non-judgmental nature and the quality of the VR environments.
Some said XAIA could offer a useful alternative to traditional therapy, particularly for those in search of anonymity or wary of in-person sessions.
Others emphasized the importance of human interaction and the unique advantages of connecting with a human therapist.
The study also identified areas for improvement, such as the AI's tendency to over-interview participants or under-explore emotional responses to major life events.
Explaining the tool's mission, Brennan Spiegel clarified: “While this technology is not intended to replace psychologists, but rather to augment them, the system we developed can provide meaningful mental health support.”
It appears to be an interesting starting point for a deeper exploration of immersive therapy environments, which could certainly benefit those who lack access to in-person therapy or who wish to keep their conversations private and anonymous.
AI for analyzing therapy conversations
In addition to playing the role of the therapist, AI has also been used to analyze the dynamics of real therapy conversations.
In a 2023 study, researchers used AI to uncover the layers of psychotherapy sessions, revealing that certain speech patterns may be key to understanding the bond between therapists and their patients.
The impetus behind this research arises from a long-standing dilemma in psychotherapy: How can we accurately assess and improve the therapeutic alliance?
Published in the journal iScience, the study showed how personal pronouns and speech hesitations signal the depth of the therapeutic alliance.
This term refers to the essential relationship between therapists and their patients, a vital foundation for effective therapy.
Traditionally, understanding this relationship has been a subjective matter, relying on personal accounts and third-party observations, which, while useful, may miss the fluid dynamics of actual therapy sessions.
Researchers at the Icahn School of Medicine at Mount Sinai saw an opportunity to use machine learning to shed light on how therapeutic communication works.
The study took place in clinics in New York City and included 28 patients and 18 therapists who participated in various therapy sessions. Before sessions began, patients reflected on their previous therapeutic relationships and attachment styles using online surveys.
The researchers used machine learning to analyze session transcripts with natural language processing (NLP), focusing on the use of pronouns such as “I” and “we” as well as disfluency markers such as “um” and “like”.
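The study's exact NLP pipeline isn't reproduced in the article, but the core idea of extracting pronoun and disfluency frequencies from a transcript can be sketched in a few lines of Python. The token lists and the per-100-word normalization below are illustrative assumptions, not the study's actual feature definitions.

```python
import re
from collections import Counter

# Illustrative token sets; the study's actual feature lists differ,
# and naive token matching conflates, e.g., filler "like" with the verb.
FIRST_PERSON = {"i", "we", "me", "us", "my", "our"}
DISFLUENCIES = {"um", "uh", "like"}

def speech_features(transcript: str) -> dict:
    """Return per-100-word rates of first-person pronouns and disfluencies."""
    tokens = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(tokens)
    total = len(tokens) or 1
    return {
        "first_person_rate": 100 * sum(counts[t] for t in FIRST_PERSON) / total,
        "disfluency_rate": 100 * sum(counts[t] for t in DISFLUENCIES) / total,
    }

print(speech_features("Um, I think we, like, made real progress this week."))
```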
The way therapists and patients used personal pronouns appeared to influence the alliance.
For example, the study found that therapists' frequent use of “we” didn't always strengthen the alliance as expected, particularly in cases involving personality disorders. This contradicts the common assumption that inclusive language automatically strengthens connections.
Additionally, either party's overreliance on “I” was associated with lower alliance ratings, suggesting the potential pitfalls of over-focusing on the self in therapy sessions.
The authors wrote: “Our primary finding was that more frequent use of first-person pronouns by both therapists and patients (“we,” “I do,” “I believe,” “if I”) correlated with lower alliance ratings.”
An unexpected finding was that hesitation, often viewed as a negative in conversation, was associated with higher alliance ratings, suggesting that pauses can promote authenticity and engagement.
Previous research has found that pauses are an essential part of truly thoughtful conversation.
In the researchers' words: “We found that higher non-fluency among patients (e.g., “like,” “um”), but not therapists, characterized sessions with higher alliance ratings by patients.”
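For readers curious how such an association is quantified, correlating per-session disfluency rates with alliance ratings takes only a few lines. The sketch below uses scipy's Pearson correlation; the numbers are placeholder values for illustration and do not come from the study.

```python
from scipy.stats import pearsonr

# Placeholder per-session values for illustration only; NOT study data.
disfluency_rate = [2.1, 3.4, 1.8, 4.0, 2.9]   # disfluencies per 100 words
alliance_rating = [4.2, 4.8, 3.9, 5.0, 4.5]   # patient-reported alliance score

r, p = pearsonr(disfluency_rate, alliance_rating)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")
```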
The researchers also cautioned that these correlations aren't entirely reliable, owing to the study's small size and observational nature.
AI has also been used for speech analysis in other medical settings, such as when researchers at UCL and the University of Oxford developed a model for detecting potential schizophrenia from language patterns.