LAWRENCE — People have long turned to support groups to find assurance and connection from others with similar experiences in dealing with some of life's most difficult situations. But little is known about which nonverbal behaviors, especially in a virtual group session, indicate whether two people are connecting. New research from the University of Kansas and the University of Southern California is measuring the behavioral markers of alliance between support group participants.
Researchers analyzed data from 18 support groups with 96 participants. They measured dyadic alliance, or the connection between two individuals, by surveying participants about how connected they felt with each other. Using computational algorithms to detect verbal and nonverbal communication features across language, audio and visual modalities, the study also explored the viability of using machine learning to measure dyadic alliance and its potential role in mental health settings.
Mental health service demand has been steadily increasing in recent years, according to the researchers. While it has been difficult for professionals to meet demand, people have also increasingly turned to artificial intelligence.
"This project came into inception when we were looking at burnout among mental health professionals, especially with the rising need of support groups and one-on-one counseling during and after the COVID-19 pandemic," said Yunwen Wang, assistant professor of journalism & mass communications, faculty affiliate of the Institute for Information Sciences at KU and one of the study's authors. "We kept talking about getting back to normal after COVID, but I felt like there is this lingering effect of COVID, and there are a lot of thoughts that we had about the opportunity of AI and how we can ethically leverage it and large language models — instead of replacing human therapists — but for increasing mental health access to more people."
Researchers recruited participants to support groups for general anxiety. Participants were measured for their mental health and affective states before and after taking part in a support group session and reported their connection with each other afterward. The session took place through online video conferencing and was facilitated by a virtual conversational agent with a robot embodiment. Participants saw on their screens a virtual robot agent that facilitated the conversation, but a person was operating the agent and could intervene if necessary.
From the consented session recordings, participants' verbal communication was transcribed, and nonverbal gestures such as head nodding, smiling, eyebrow raising, frowning, and side-to-side head movement and tilt were detected using computational tools. Researchers also recorded the number of words spoken per utterance or per unit of time, pitch variation, and visual factors such as smile intensity and how long a gesture lasted. The agent was programmed to facilitate conversation via a script developed by researchers to ask participants, who were college students, how their semester was going, about academic stress and related topics.
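The article does not name the specific detection tools, but one way to picture this kind of feature extraction is counting nod-like events in a per-frame head-pitch signal. The sketch below is purely illustrative: the threshold, frame values and function name are invented, not taken from the study.

```python
# Hypothetical sketch: counting head nods from a per-frame head-pitch
# signal (in degrees), as one example of the kind of nonverbal feature
# the researchers extracted computationally. All values are illustrative.

def count_head_nods(pitch, threshold=5.0):
    """Count nod-like events: a downward pitch excursion past `threshold`
    degrees from the starting pose, followed by a return toward baseline."""
    nods = 0
    in_nod = False
    baseline = pitch[0]
    for p in pitch:
        if not in_nod and p - baseline > threshold:      # head dips down
            in_nod = True
        elif in_nod and p - baseline < threshold / 2:    # head returns up
            nods += 1
            in_nod = False
    return nods

# A toy signal containing two clear downward-then-upward excursions:
signal = [0, 2, 7, 9, 6, 1, 0, 1, 8, 10, 4, 0]
print(count_head_nods(signal))  # 2
```

A real pipeline would first estimate head pose per video frame; this only shows how a continuous signal becomes a countable gesture feature like "number of nods."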
Analysis comparing different combinations of verbal and nonverbal communication features showed listeners' and speakers' head poses and facial expressions were strong predictors of dyadic alliance, according to the researchers. A speaker felt more alliance with a listener who nodded more frequently, raised their brows more and frowned less. For speakers themselves, more pitch variation, brow raises, head nods and head pose variation, along with less intense smiles and frowns, also increased dyadic alliance.
Researchers said the findings suggest verbal and nonverbal communication should be considered when measuring alliance among participants in a group setting and add to previous research showing that alliance among pairs can also benefit overall group engagement, among other outcomes.
The study, written by Kevin Hyekang Joo, Zongjian Li, Yuanfeixue Nan, Mina Kian, Shriya Upadhyay, Maja Matarić, Lynn Carol Miller and Mohammad Soleymani, all of the University of Southern California, and Wang of KU, was published in the Proceedings of the 27th International Conference on Multimodal Interaction of the Association for Computing Machinery.
The findings suggest a potential role for LLMs and AI in computationally identifying behavioral markers of dyadic alliance. However, the researchers said they are not advocating unrestricted use of AI in mental health settings and that further research is needed.
"The goal wasn't to replace humans or to compare human versus AI-assisted mental health support facilitators," Wang said. "It's comparing machine learning fusion of unimodal, bimodal and trimodal communication features to computationally estimate how humans perceive their relationship with each other in the group. We consider this work as one of the first steps to understand the boundaries of AI's applications in mental health. As chatbots are gaining popularity, there have been discussions, including legislation discussions in states like California on questions including, 'Should we allow or ban AI therapists? Where's the ethical boundaries? And in which cases can AI technologies be fairly and responsibly incorporated into mental health applications?'"
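The "unimodal, bimodal and trimodal" comparison Wang describes can be pictured as early fusion: concatenating feature vectors from each modality and testing every combination. The sketch below is a hypothetical illustration; the modality groupings are from the article, but the individual feature names are invented.

```python
# Hypothetical sketch of comparing unimodal, bimodal and trimodal
# feature fusion. The three modalities match the article; the
# feature names inside each are invented for illustration.
from itertools import combinations

MODALITIES = {
    "language": ["words_per_utterance", "sentiment"],
    "audio": ["pitch_variation", "speech_rate"],
    "visual": ["nod_count", "smile_intensity", "brow_raises"],
}

def fused_features(modality_names):
    """Early fusion: concatenate the feature lists of the chosen modalities."""
    feats = []
    for name in modality_names:
        feats.extend(MODALITIES[name])
    return feats

# Enumerate every modality combination a model could be trained on,
# mirroring the unimodal/bimodal/trimodal comparison in the study.
for k in (1, 2, 3):
    for combo in combinations(MODALITIES, k):
        print(combo, "->", len(fused_features(combo)), "features")
```

Each fused vector would then feed a predictive model of reported dyadic alliance; the study's finding was that certain visual and audio features (head pose, facial expression, pitch variation) carried strong predictive signal.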
The research team continues to examine those questions and others, such as privacy-preserving technologies for safety and security in agent-assisted support groups on more serious topics like substance use disorders. Wang said her ongoing work with the team investigates how much users report trusting AI agents in mental health settings, and the outcomes when participants perceive different levels of AI involvement.
Ultimately, the results can help improve mental health services by better understanding when people in group sessions are fostering genuine connections with others, Wang said.
"The goal is to see, with many people needing mental health support and services, and a limited number of trained professionals, if the use of AI-assisted agents or systems can be perceived acceptable by users. Ultimately, in support groups, it may be the human-to-human dynamic that's really helpful for people to come together and share their experiences," Wang said. "Or even with different experiences, a place for them to be compassionate with each other and provide empathy, talking through unique experiences. So in this case, the AI agent is more of like a facilitator of conversation, where human-to-human relationship is still the key."