Can chatbots help young people with mental health problems? A group of researchers will try to find out.

Can artificial intelligence help teenagers with depression?

Researchers will test whether ChatGPT can provide answers to questions young people have about mental health problems.

The chatbot ChatGPT quickly became a topic of conversation when it was launched in late 2022.

Now, a group of researchers from the University of Oslo will investigate whether the chatbot can be used to help young people with depression.

Information needs

Previously, the researchers conducted a study where they collected questions about depression that had been submitted to Ung.no, a website where young people can ask questions to various professionals. They then analysed the questions to find out what young people are curious about regarding depression.

Kim Kristoffer Dysthe, the lead researcher behind this new study, told Dagens Medisin that many of the young people expressed a great need for information.

The study shows that young people hesitate to talk openly about mental health. The researchers also write that stigma and social shame make it difficult to motivate young people to seek treatment.

Believe that chatbots can help

Next, the researchers plan to put the questions they collected to ChatGPT and compare its answers with those given by the experts at Ung.no.

Dysthe told Dagens Medisin that he sees potential in chatbot technology, but points out that ChatGPT has some clear weaknesses. For example, if it does not know the answer to a question, it may invent one, and that answer is not always accurate.

Tine Nordgreen also believes that chatbots can be helpful for young people with mental health problems. She is head of the Research Center for Digital Health Services at Haukeland University Hospital and an associate professor at the University of Bergen. She researches digital health services for mental health problems.

Nordgreen explains that her research team has recently conducted some interviews with young people to understand their needs in digital health services.

They found that many young people get information about mental health on TikTok, but that the information is not necessarily accurate or reliable.

“Many of them say that they need evidence-based information. If that information can be provided through chatbots or similar technologies, developed by a combination of technology and expertise, then I think that is really good,” she tells sciencenorway.no.

Lower threshold

One possible advantage of chatbots is that many people feel that the threshold for contacting a chatbot is lower than for contacting a human. Nordgreen says that research shows that it is perceived to be easier to report suicidal thoughts to a machine than to a human.

As part of the Social Health Bots project, a study was published in 2021 where researchers invited young people between the ages of 16 and 21 to use and reflect on chatbots as a source of social support.

The focus of the study was not on young people with symptoms of mental health problems, but on examining whether social support from a chatbot can be beneficial for everyone.

Participants in the study reported that the threshold for communicating with a chatbot was lower than for communicating with another human, because they did not feel that the chatbot would judge them for their weaknesses and problems.

Several participants also reported using the chatbot from their bed at home, which they considered a safe context in which to talk about difficult problems.

Evidence-based content

Nordgreen emphasises that an important prerequisite for chatbots to work is that they have evidence-based content.

“And if it’s intelligent and starts producing its own answers, it’s important to have a good moderation function. We have seen how intelligent chatbots can go off track and start becoming both rude and offensive,” she says.

She adds that it is also important for people to understand that they are talking to a bot and not a human.

“It has to be genuine enough to create a psychological process in the recipient, but at the same time not pretend to be a human,” Nordgreen says.

———

Translated by Alette Bjordal Gjellesvik.

Read the Norwegian version of this article on forskning.no

References:

Aukrust, Ø. Forskere skal teste kunstig intelligens for ungdommer med depresjon (Researchers will test artificial intelligence for young people with depression), Dagens Medisin, 2023.

Brandtzaeg et al. When the Social Becomes Non-Human: Young People’s Perception of Social Support in Chatbots, CHI Conference on Human Factors in Computing Systems (CHI ’21), 2021. DOI: 10.1145/3411764.3445318

Dysthe et al. Analyzing User-Generated Web-Based Posts of Adolescents’ Emotional, Behavioral, and Symptom Responses to Beliefs About Depression: Qualitative Thematic Analysis, Journal of Medical Internet Research, vol. 25, 2023. DOI: 10.2196/37289
