You can get an appointment with an AI therapist 24/7 and talk through your problems on your own comfy couch. Any downsides to that?

An AI therapist resembling Sigmund Freud on a phone screen. (Illustration by News Decoder)

Young people are turning to AI-based chatbots for immediate and inconspicuous support for mental health challenges. But their use for therapy is fraught with ethical and regulatory conundrums.

Dr. Fan Yang, assistant professor in the School of Social Work at the University of Illinois Urbana-Champaign, said that AI can be very efficient. “It’s available anytime,” Yang said. “It can let people talk with the machine without thinking about stigma.”

In one ongoing project, the Addictions and Concurrent Disorders Research Group at the University of British Columbia surveyed 423 university-aged people who had recently used ChatGPT. Rishika Daswani, a clinical research assistant for the study, said just over half had used the chatbot for mental health support.

“When comparing it to traditional support, a lot of our respondents said that it was actually similar, and a small but significant portion of people noted its superiority,” Daswani said.

AI-based mental health apps and chatbots are intended to provide interim support that bridges the gap created by financial barriers and long waitlists for in-person care, said Dr. Bryanna Moore, assistant professor in the Department of Health Humanities and Bioethics at the University of Rochester.

There’s an app for that.

Still, a study led by Yang and colleagues in JMIR mHealth and uHealth showed that even high-quality apps carried financial barriers to access in the form of subscription or one-time fees.

“In the future, we need to be careful about the word ‘availability’,” Yang said. “We can distinguish technological availability versus financial availability.”

Daswani said the most common drawback of AI use for therapy identified by participants in the group’s study was that AI lacked emotional tone and depth. While a therapist might challenge a person’s thoughts and help them reflect critically, chatbots tend to regurgitate information and act as echo chambers that reinforce pre-existing beliefs, Daswani said.

Moore described AI therapy chatbots as sycophantic. “They are designed to draw you in to keep you clicking and engaged for as long as possible,” Moore said. “The responses they give are meant to make you feel good or seen or validated.”

Loneliness and social isolation are among the root causes of many of the mental health issues for which young people seek support from chatbots.

“I don’t think it’s a leap to say that for some people connecting with a therapy bot or an online persona, [it could] promote the development of coping skills, but for others, it could really erode that,” Moore said.

When children turn to AI therapists

While most of the discussion around AI use for therapy has centered on adults, Moore said specific considerations need to be taken into account for young children and adolescents.

“Children are developmentally, morally, socially and legally distinct from adults,” Moore said. “The use of AI-based apps for mental health care by children and adolescents might impact their social and cognitive development in ways that it doesn’t for adults.”

Childhood and adolescence are pivotal times when people cement their understanding of what it means to have friendships or relationships and learn to pick up on social and emotional cues. Chatbots often fail to fully understand a child within the context of their environment, Moore said.

“Especially when it comes to things like mental health care, the environmental stressors on the child are central to understanding how their symptoms are presenting and identifying effective avenues of intervention,” Moore said.

Therapeutic interventions usually involve shared decision-making with the child, caregiver and clinician to fully explore the benefits, risks and alternatives of each option. However, mental health apps can short-circuit these essential conversations, Moore said.

Putting trust in technology

In their survey of 27 mental health apps, Yang and colleagues identified several user design concerns for a youth target audience.

Many apps featured dark colors and scored poorly on readability, with in-app content written at an average sixth-grade reading level and app store descriptions at a ninth-grade level. While all of the apps were text-based, Yang said adding non-text formats would make them more youth-friendly, especially for non-English speakers.

Daswani cautioned that while AI may seem to have lowered the barrier for access to mental health care, it may be slow to gain acceptance in communities with low institutional trust in technology and authority.

“Western language has specific emotional frameworks which may not fully capture other cultures’ ways of expressing distress,” Daswani said. “If AI tools don’t recognize these culturally encoded expressions, then you have a risk of misunderstanding and your needs not being met.”

Moore and other experts worry that reliance on AI for mental health support could perpetuate the pervasive notion that mental illness is something people deal with on their own.

“If it’s as simple as downloading and jumping on an app once a day or once a week, there’s this idea that the barriers to having good mental health are gone,” Moore said.

The value of human interaction

That reliance could normalize turning to technology as the best, easiest and most appropriate avenue for support when someone is struggling. “I don’t think there’s anything inherently good or bad about the technology,” Moore said. “My big worry is, will it become a substitute for also seeking out meaningful human interactions and developing those skills and coping mechanisms?”

If these chatbots are truly treatments, they must be subject to the same regulations as other treatments, Moore said. For now, there is a lack of regulation and clear guidelines about who is responsible for assuming the risks involved in using AI for therapy.

“It’s just such an unregulated space, and I think placing the responsibility on children, adolescents, parents and caregivers, and even individual clinicians to navigate this quagmire is really unfair,” Moore said.

In the study by Yang and colleagues, many of the apps lacked detailed privacy policies, aside from the baseline information provided on the app store. How the apps handle personal data and information about traumatic experiences was not explicitly stated.

It is also currently unclear how best to integrate these apps into clinical practice. Moore said a logical starting point is for clinicians to ask patients about their digital intake and understand how much time they are spending on these apps.

Daswani said that integrating AI literacy into mental health education can help people understand the benefits and limitations of these apps. “We’re not saying that it’s to replace a therapist,” Daswani said. “But that doesn’t mean that we want to discredit it completely.”

What’s needed now, Yang said, is to improve the quality of the apps. “So hopefully one day we can have human-centered treatment plans for people, with AI being some supplemental treatment support,” Yang said.


Questions to consider:

1. What is an advantage of a therapy app?

2. What are some concerns health professionals have about children relying on AI therapy?

3. Why might you feel more comfortable talking to a digital tool than a human?

Jasmine Ryu Won Kang is an MD/PhD student at the Temerty Faculty of Medicine at the University of Toronto.
