Nearly Half of UK Adults Favor ChatGPT for Counseling

Bournemouth University

More than 4 in 10 adults in the UK are happy to use ChatGPT for their mental health support, new research suggests.

The study, led by Bournemouth University, surveyed nearly 31,000 adults in 35 countries about their use of Artificial Intelligence (AI) large language models such as ChatGPT. The research also discovered that:

  • One quarter of UK adults would be happy to delegate the role of teaching their children to AI.
  • Globally, 45% of people would trust AI models to take on the role of their doctor.
  • Three quarters of people surveyed said they would use an AI chat tool as a companion and a friend.

The study has been published in the journal AI and Society.

Dr Ala Yankouskaya, Senior Lecturer in Psychology at Bournemouth University, who led the study, said: "With the rapid development and mass availability of AI, more people are placing their trust in it. We wanted to learn more about how people would trust generative AI tools, such as ChatGPT, to carry out some of the most important roles in their daily lives."

AI for mental health support

41% of participants from the UK, and 61% globally, said that they would be happy to use AI for counselling services. The researchers suggest that for the UK, this could be the result of the waiting times many people face to access the mental health services that they need.

"If someone is experiencing depression, they do not want to wait months for an appointment, so instead they can turn to AI," Dr Yankouskaya said. "However, when I tested some of the tools myself, I found the language used very vague and confusing because the developers are careful not to jump into providing diagnoses. So, it is no substitute for speaking to a health professional."

The researchers also noted that users were already familiar with NHS chatbots, which use similar AI technology, and this could be normalising their use of AI in other apps such as ChatGPT for their mental health care.

AI as a teacher

A quarter of people in the UK and half of everyone surveyed globally said that they would trust AI to carry out the role of a teacher, which the research team found particularly concerning.

"It really knocked me down when I saw how many people would be willing to delegate AI to the role of teaching their children," Dr Yankouskaya explained. "We still do not know the long-term effects that using these tools for education could have on children's memory and cognitive functions. We could be heading to the stage where we are developing children who are good at putting prompts into AI tools but not as good at taking the information in," she continued.

The researchers were also concerned about the long-term physical effects on the brain if learning information in the traditional way was replaced by excessive search-engine use, and whether this could shrink the hippocampus, a region of the brain used for spatial awareness and learning.

AI as a doctor

45% of all respondents and 25% in the UK said that they would trust AI to carry out the role of their doctor. The numbers were notably higher in countries where healthcare is more expensive and harder to access.

This was less surprising to the researchers, who believe that people living in parts of the world where access to healthcare services is not readily available might rely on technology for quick answers.

However, they were cautious about the underlying algorithms, which are designed to retain the user's attention and keep them in a relaxed chat. This could be particularly harmful in the context of mental health advice, where traditional practice would be to direct the user to specific services such as The Samaritans.

AI as a companion

The highest level of trust participants were willing to place in AI came in the role of friendship. Over three quarters of people globally and over half of people in the UK said they would talk to ChatGPT as a companion.

The researchers think this is explained by a perceived sense of empathy from generative language tools, because they are designed to adapt the tone of their responses to suit the user's.

"AI tools come across as a friend who knows you well and understands you," Dr Yankouskaya explained. "ChatGPT can remember every chat it has had with a user and it feels like a private conversation between them. Nowadays people can be very sensitive to being judged and AI tools are designed to be non-judgemental. This means they can provide the sense of security people need," she continued.

Dr Yankouskaya and the team concluded that as AI playing a bigger role in people's lives moves from a theoretical prospect to reality, there needs to be more awareness within societies about how generative AI tools work and their limitations. The lack of knowledge about the long-term effects on memory means caution needs to be applied before these tools take over roles in education in particular.
